Okay, let me rephrase and condense my first post. This is about the continuum background correction used in atomic absorption spectroscopy.
I'm approaching this from a theoretical point of view, as I haven't had the chance to do labs this term.
What I do understand is the following:
• When a hollow cathode lamp is used, we measure the absorbance at a resonance wavelength of, for example, 230 nm. The absorbance that is measured is the sum of the absorbance by the analyte and the absorbance by the background (molecules that did not atomize, ...).
• When a deuterium lamp (continuum source) is used, we also measure the absorbance at the same resonance wavelength of 230 nm. The absorbance measured is approximately equal to the absorbance by the background alone. This is because the monochromator passes a whole band of the continuum to the detector, while the analyte absorbs only over its very narrow atomic line, so less than 1% of the measured absorbance is due to the analyte (see the numerical sketch after this list).
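To convince myself of that last point, I wrote a small numerical sketch (all numbers are my own toy assumptions, not from a textbook): if the atomic line covers only 0.002 nm of a 0.2 nm bandpass, then even strong absorption at the line removes less than 1% of the light, which shows up as only a tiny absorbance offset on top of the background.

```python
import numpy as np

bandpass = 0.2     # nm, spectral bandwidth reaching the detector
linewidth = 0.002  # nm, width of the atomic resonance line
A_line = 1.0       # assumed analyte absorbance at the line itself (toy value)
A_bg = 0.5         # assumed flat background absorbance over the band (toy value)

# Fraction of the bandpass occupied by the atomic line
f = linewidth / bandpass  # = 0.01

# Transmittance integrated over the bandpass (uniform lamp intensity assumed):
# outside the line only the background absorbs; inside the line both do.
T = (1 - f) * 10**(-A_bg) + f * 10**(-(A_bg + A_line))

A_measured = -np.log10(T)
print(f"background alone: {A_bg:.4f}, measured with continuum source: {A_measured:.4f}")
# -> measured ~ 0.5039: the analyte adds only ~0.004 (under 1% of the reading),
#    even though its absorbance at the line itself is 1.0.
```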
What I'm struggling with is the following:
Suppose the bandwidth of the light hitting the detector is 0.2 nm and the linewidth of the resonance line is 0.002 nm. This band of light consists of many wavelengths, and when they all hit the detector we measure a single absorbance value. Is it necessary that the intensities of all these wavelengths are (both initially and after absorption) approximately the same?
That measured absorbance comes from many wavelengths hitting the detector, not just the resonance line. How is it an estimate of the background absorbance at the resonance line? Is it perhaps an average absorbance over the whole bandwidth? I've tried to make this concrete in the sketch below.
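Here is a toy model of what I imagine the detector "sees" (uniform lamp intensity and an invented, slowly varying background absorbance; none of these numbers come from a real instrument):

```python
import numpy as np

wl = np.linspace(229.9, 230.1, 2001)  # 0.2 nm bandpass centred on 230 nm
A_bg = 0.30 + 0.5 * (wl - 230.0)      # invented background, varying slowly

I0 = np.ones_like(wl)                 # incident continuum intensity (uniform)
I = I0 * 10**(-A_bg)                  # transmitted intensity at each wavelength

# What the detector reports: one absorbance computed from the integrated powers
A_detector = -np.log10(I.mean() / I0.mean())

# Versus a plain average of the absorbance across the band
A_mean = A_bg.mean()

print(f"detector reading: {A_detector:.5f}, mean A over band: {A_mean:.5f}")
# -> ~0.29904 vs 0.30000: nearly identical because A varies slowly over 0.2 nm.
```

If this picture is right, the detector reading is strictly -log10 of the band-averaged transmittance, which only coincides with the band-averaged absorbance when the background varies slowly across the bandpass. Is that the sense in which the continuum measurement "averages" the background at the resonance line?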