I am a little lost about the use of the origin/zero in calibration versus using a blank solution when making a calibration curve. I am not an analytical chemist, but I use these methods all the time.
I use visible spectrophotometry to determine protein concentration. I prepare a series of standards of known protein concentration, e.g. 0.0, 0.1, 0.2, 0.3, 0.4 and 0.5 mg/mL, then add the reagents and measure the absorbance. Before reading the standards, I zero the instrument with the 0 mg/mL standard (the blank) and read all the others against it. For example, I have the following data as (concentration in mg/mL, absorbance): (0.0, 0.000), (0.1, 0.122), (0.2, 0.254), (0.3, 0.382), (0.4, 0.512), (0.5, 0.624)
So my question is: since the instrument has been "told" that the reagent mixture without analyte gives an absorbance of 0.000, should we include (0.0, 0.000) as one of the points when making the standard curve?
It's not that I have samples with absorbance lower than 0.122; it's that the instrument has been set so that absorbance 0 corresponds to the complete reagent mixture without analyte. If we don't include the (0.0, 0.000) point in the linear regression calculation, I thought the absorbances of the other standards should instead be read without zeroing (blanking) the instrument first.
What actually is the difference between what I am doing and forcing the calibration curve through the origin/zero?
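A quick numeric check may make the question concrete. This is a minimal sketch (my own illustration, using the data from the post and plain ordinary least squares, not any particular software's calibration routine) comparing three choices: fitting with the blank point, fitting without it, and forcing the line through the origin.

```python
def ols(xs, ys):
    """Ordinary least-squares slope and intercept for y = slope*x + intercept."""
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxy = sum(x * y for x, y in zip(xs, ys))
    sxx = sum(x * x for x in xs)
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    intercept = (sy - slope * sx) / n
    return slope, intercept

conc = [0.0, 0.1, 0.2, 0.3, 0.4, 0.5]              # mg/mL
absb = [0.000, 0.122, 0.254, 0.382, 0.512, 0.624]  # absorbance vs. blank

# 1) include the blank point (0.0, 0.000) in the regression
m1, b1 = ols(conc, absb)
# 2) drop the blank point and fit only the non-zero standards
m2, b2 = ols(conc[1:], absb[1:])
# 3) force the line through the origin: slope = sum(x*y) / sum(x*x)
m3 = sum(x * y for x, y in zip(conc, absb)) / sum(x * x for x in conc)

print(f"with blank:    A = {m1:.4f}*c + {b1:.5f}")
print(f"without blank: A = {m2:.4f}*c + {b2:.5f}")
print(f"through zero:  A = {m3:.4f}*c")
```

With these particular data all three fits give essentially the same slope (about 1.26 AU·mL/mg) and a near-zero intercept, because blanking the instrument has already subtracted the reagent background, so the free-fit intercept comes out close to zero anyway.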
I hope everyone can follow what I am trying to say. Sorry for the bad English.
PS: To Borec: sorry for bringing this topic up again; I just wanted a clearer explanation.