FMCG/Data Acquisition with LabVIEW Software
This part of the guide covers the actual data acquisition with the LabVIEW software. It assumes the user has N operational magnetometers that have been field-nulled and optimized. The first part explains how to use the program to acquire data in DC-SERF mode, the second part in Z-mode.
DC-SERF mode data acquisition
In DC-SERF mode, the magnetometers are sensitive to magnetic fields in the Y direction. The program is set up to calibrate the magnetometers' response using the shell-mounted Y coils, and then collect data.
- From the FPGA Magnetometer.lvproj project, under Main Programs, open the Magnetometer_16.1_chirp-calibration.vi program.
- The front panel can be somewhat intimidating, but much of it is not used. We'll hit only on the parts that are necessary to take a measurement.
- In the top left, under Y Channel Settings, are the controls for the calibration. Depress the yellow Select? button under each channel that you wish to measure. The other three inputs tell the program how large a chirp signal should be sent to the magnetic field coils and let it calculate the size of the applied field in magnetic field units for calibration purposes.
- The Amp. (V) determines the amplitude of the chirp. This voltage will be output from the FPGA to the current supplies. It must be at least 0.02 V for noise reasons. To keep electronic noise down, a large SR570 I-V gain is generally used, so too large a calibration signal will cause the magnetometer to rail during calibration. I generally keep this at 0.02 V.
- The R(out) value should match the chosen output resistor on the Y shell coils, most likely 5000 Ω.
- The Field Coils input simply points the LabVIEW program to a particular coil calibration constant, which it uses to calculate how large a calibration field (in T) is being applied. This should almost always be left on Printed Y - Coil, which tells the program to use a coil calibration of 4.3 × 10⁻⁵ T/A (see the sketch below for how these three inputs combine).
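To make these three inputs concrete, here is a minimal sketch (in Python, not part of the LabVIEW program; the function name and structure are illustrative assumptions) of the arithmetic presumably performed: the chirp amplitude and output resistor give a coil current via Ohm's law, and the coil calibration constant converts that current to a field.

```python
def calibration_field(amp_v, r_out_ohm, coil_cal_t_per_a):
    """Peak applied calibration field in tesla.

    amp_v            -- chirp amplitude output by the FPGA, in volts
    r_out_ohm        -- output resistor on the shell Y coils, in ohms
    coil_cal_t_per_a -- coil calibration constant, in T/A
    """
    current_a = amp_v / r_out_ohm        # Ohm's law: I = V / R
    return coil_cal_t_per_a * current_a  # B = k * I

# With the typical settings above: 0.02 V, 5000 Ω, 4.3e-5 T/A
b = calibration_field(0.02, 5000.0, 4.3e-5)
print(f"{b:.2e} T")  # ~1.72e-10 T, i.e. roughly 170 pT peak
```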
- The only other settings that may need to be adjusted (though the default values are normally fine) are those under Run/Calib Settings.
- Regardless of the number of channels you are using, always leave Max Chans at 4.
- Sample Rate (Hz) will set the FPGA acquisition rate. Sampling faster and then later downsampling (see below) can give better averaging and lower noise, but generally this type of noise is not what limits a measurement. 100000 Hz is usually sufficient.
- Downsample (Hz) sets the downsampling rate. To save space and take advantage of signal averaging, the signals collected by the FPGA are downsampled by an averaging procedure which takes blocks of N points (where N = fsample/fdownsample) and averages each block to a single value (see the first sketch after this list). The Nyquist frequency of the downsampled rate must still be larger than our bandwidth of interest (DC-100 Hz), so we have generally set this to 1000 Hz.
- The Run Time (s) determines how long a post-calibration time series is recorded. For noise-measurement purposes, a short scan (6-10 s) is usually used. For a patient measurement, a much longer record (60-180 s) is more common.
- Chirp Iterations sets the number of calibration chirps sent to each magnetometer during the calibration. The magnetometer responses to these chirps are averaged when calculating the calibration (see the second sketch after this list). Historically we have used 5 chirps for this purpose.
- I'm not 100% sure what Calib Freq Max does, but I think it should be set equal to the Downsample frequency, or 1000 Hz.
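As a concrete illustration of the block-averaging downsampling described under Downsample (Hz), here is a minimal Python sketch. It follows the procedure as described above and is not the actual LabVIEW implementation:

```python
import numpy as np

def block_average_downsample(x, f_sample, f_down):
    """Average blocks of N = f_sample / f_down points down to single values."""
    n = int(round(f_sample / f_down))  # block size, e.g. 100000 / 1000 = 100
    m = (len(x) // n) * n              # drop any incomplete trailing block
    return x[:m].reshape(-1, n).mean(axis=1)

# Example: a 6 s record sampled at 100 kHz reduced to 1 kHz
fs, fd, run_time = 100_000, 1_000, 6
raw = np.random.randn(fs * run_time)          # stand-in for FPGA data
down = block_average_downsample(raw, fs, fd)
print(down.shape)                             # (6000,) -> 6 s at 1000 Hz
```

Note that averaging each block (rather than simply decimating) is what provides the signal-averaging benefit mentioned under Sample Rate (Hz).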
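Similarly, the averaging performed over the Chirp Iterations can be sketched as follows. The chirp parameters and noise level here are purely illustrative assumptions; in practice the drive is generated by the FPGA and the responses come from the magnetometer channels:

```python
import numpy as np
from scipy.signal import chirp  # used only to synthesize an illustrative drive

# An illustrative 1 s linear chirp at the 1 kHz downsampled rate
fd = 1_000
t = np.arange(0, 1.0, 1.0 / fd)
drive = 0.02 * chirp(t, f0=1.0, f1=100.0, t1=t[-1], method="linear")

# Pretend we recorded 5 responses to this chirp (drive plus noise);
# in reality these come from the FPGA acquisition of each channel.
responses = [drive + 0.005 * np.random.randn(t.size) for _ in range(5)]

# Averaging the iterations suppresses uncorrelated noise by ~sqrt(5)
# before the magnetometer response is calculated from the result.
avg_response = np.mean(responses, axis=0)
```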