

Topic: How was chemical shift measured in classic NMR instruments?


Offline Bidagdha_TADIR

How was chemical shift measured in classic NMR instruments?
« on: April 26, 2014, 03:25:17 AM »
I am learning the basics of NMR this year. One of the books recommended to me is "Introduction to Spectroscopy" by Donald L. Pavia. While reading the book, I learned that older instruments were designed to keep the RF frequency constant and vary the external magnetic field strength to cause absorption of energy by the different hydrogen nuclei.
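(For reference, the resonance condition that links the applied field and the absorption frequency is the Larmor relation; the proton value below is the standard one and is only included for orientation, it is not taken from the book.)

[tex] \nu = \frac{\gamma}{2\pi} B_0 \approx 42.58~\mathrm{\tfrac{MHz}{T}} \times B_0 \quad (\mathrm{for\ ^1H}) [/tex]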
But chemical shift is measured on the delta scale as

[tex] \frac{\nu_s - \nu_{TMS}}{\text{spectrometer frequency in MHz}} [/tex]

where,
[tex] \nu_s = \text{resonance frequency of the nucleus in question} [/tex]
[tex] \nu_{TMS} = \text{resonance frequency of the hydrogen nuclei in TMS} [/tex]
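As a quick numerical illustration of this definition (my own example numbers, not from the book): on a 60 MHz spectrometer, a proton resonating 300 Hz above the TMS protons would have

[tex] \delta = \frac{300~\mathrm{Hz}}{60~\mathrm{MHz}} = 5.0~\mathrm{ppm} [/tex]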

Every term in this equation is a frequency. So how would an instrument that measures the magnetic field strength required to cause absorption calculate the chemical shift from the definition above? Would something like the following equation be used instead?

[tex] \frac{B_s - B_{TMS}}{\text{spectrometer frequency in MHz}} [/tex]

where,

[tex] B_s = \text{magnetic field required to cause absorption by the nucleus in question} [/tex]
[tex] B_{TMS} = \text{magnetic field required to cause absorption by the hydrogen nuclei in TMS} [/tex]

This seems unlikely, because the change in the magnetic field required would be very small (a fraction of a gauss, I think).
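If I try a rough back-of-the-envelope estimate (assuming a 60 MHz instrument, so B_0 ≈ 1.41 T ≈ 14,100 G for protons by the Larmor relation above), a 10 ppm proton range corresponds to a field sweep of only about

[tex] \Delta B \approx 10 \times 10^{-6} \times 14{,}100~\mathrm{G} \approx 0.14~\mathrm{G} [/tex]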

(I am extremely sorry if the equations do not appear in a legible form; I tried the preview, but the LaTeX would not render there.)
