MRI signal-to-noise ratio (SNR) is the key determinant of image quality. Conventionally, SNR is proportional to nuclear spin polarization, which scales linearly with magnetic field strength, yet ever-stronger magnets present substantial technical and financial obstacles. Low-field MRI can mitigate these constraints while achieving equivalent SNR through non-equilibrium 'hyperpolarization' schemes, which increase polarization by orders of magnitude independently of the magnetic field. Here, theory and experimental validation demonstrate that combining field-independent polarization (e.g. hyperpolarization) with frequency-optimized MRI detection coils (i.e. multi-turn coils using the maximum allowed conductor length) yields low-field MRI sensitivity approaching, and even rivaling, that of high-field MRI. Four read-out frequencies were tested using samples with identical numbers of ¹H and ¹³C spins. Experimental SNRs at 0.0475T were ∼40% of those obtained at 4.7T. Conservatively, theoretical SNRs at 0.0475T that were 1.13-fold higher than those at 4.7T were possible despite an ∼100-fold lower detection frequency, indicating the feasibility of high-sensitivity MRI without technically challenging, expensive high-field magnets. The data at 4.7T and 0.0475T were obtained from different spectrometers with different RF probes. The SNR comparison between the two field strengths accounted for differences in parameters such as system noise figures and variations in the probe detection coils, including Q factors and coil diameters.
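As a rough illustration of why removing the field dependence of polarization changes the low-field picture, the following sketch compares the two SNR regimes using textbook Hoult-Richards-type scalings (induced signal ∝ ω, coil-dominated noise ∝ ω^(1/4), thermal polarization ∝ B₀). These exponents are standard approximations, not the paper's specific model, which additionally accounts for frequency-optimized multi-turn coils, noise figures, Q factors, and coil diameters.

```python
# Illustrative comparison of SNR scaling at the two fields discussed in
# the text. Assumed scalings (textbook approximations, not the paper's
# full model): signal EMF ~ omega * P, coil noise ~ omega**(1/4) from
# skin-effect resistance, thermal polarization P ~ B0.

B_low, B_high = 0.0475, 4.7  # tesla, the two field strengths compared

# Conventional MRI: P ~ B0, so SNR ~ B0**(7/4)
thermal_ratio = (B_low / B_high) ** (7 / 4)

# Hyperpolarized MRI: P is field-independent, so only the detection
# term SNR ~ B0**(3/4) remains (before any coil re-optimization)
hyper_ratio = (B_low / B_high) ** (3 / 4)

print(f"thermal-equilibrium SNR ratio (0.0475T / 4.7T): {thermal_ratio:.1e}")
print(f"hyperpolarized SNR ratio (same coils assumed):  {hyper_ratio:.3f}")
```

Under these simplified assumptions the hyperpolarized low-field SNR is a few percent of the high-field value rather than a few hundredths of a percent; the remaining gap up to the ∼40% measured (and >1 predicted) ratio is what the frequency-optimized detection coils described in the abstract are meant to close.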
Copyright © 2013 Elsevier Inc. All rights reserved.