Hello. I am developing an HPLC method for the quantitation of a substance at a particular wavelength. Since this is a new product, I had no idea what the limit of detection might be, so I ran 5 calibration standards: 0 ppm (solvent blank), 62.5 ppm, 125 ppm, 250 ppm, and 500 ppm. Using the linear calibration curve fitted to the responses, and the formulae LOD = 3 * (standard deviation / slope) and LOQ = 10 * (standard deviation / slope), I calculated my LOD and LOQ to be 102 ppm and 340 ppm, respectively.
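In case it helps to see exactly how I'm doing the math, here is a minimal Python sketch of the calculation as I understand it. I'm assuming "standard deviation" means the residual standard deviation of the regression (S_y/x), since that's one common reading of the ICH calibration-curve approach; the response values below are made-up placeholders, not my actual peak areas:

```python
import numpy as np

# Calibration concentrations from my run (ppm).
conc = np.array([0.0, 62.5, 125.0, 250.0, 500.0])

# Placeholder peak areas -- substitute your own measured responses.
resp = np.array([0.0, 10.2, 21.1, 40.5, 82.3])

# Ordinary least-squares fit: response = slope * conc + intercept
slope, intercept = np.polyfit(conc, resp, 1)

# Residual standard deviation of the regression (S_y/x),
# i.e. the "standard deviation" term in the formulas above.
residuals = resp - (slope * conc + intercept)
s_yx = np.sqrt(np.sum(residuals**2) / (len(conc) - 2))

lod = 3 * s_yx / slope    # LOD = 3 * (standard deviation / slope)
loq = 10 * s_yx / slope   # LOQ = 10 * (standard deviation / slope)

print(f"slope = {slope:.4f}, S_y/x = {s_yx:.4f}")
print(f"LOD = {lod:.1f} ppm, LOQ = {loq:.1f} ppm")
```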
These seem exorbitantly high compared with the smallest non-blank standard, 62.5 ppm, which gave me a well-behaved chromatogram. My boss wants to know why the LOD and LOQ come out so far above the 62.5 ppm standard, and I have no answer for him. I don't know whether I'm doing something wrong or whether I just lack the statistical knowledge to explain why my work is correct (if it is). The software I'm using is old and offers no automatic LOD determination. Given all this, can someone tell me whether those LOD and LOQ numbers are OK, or did I mess up somewhere?