FMCG/Data Acquisition with LabVIEW Software
This part of the guide will cover the actual data acquisition with the LabVIEW software. It will assume the user has N operational magnetometers which have been field-nulled and optimized. The first part will explain how to use the program to acquire data in DC-SERF mode, the second part in Z-mode.
DC-SERF mode data acquisition
In DC-SERF mode, the magnetometers are sensitive to magnetic fields in the Y direction. The program is set up to calibrate the magnetometers' response using the shell-mounted Y coils, and then collect data.
- From the FPGA Magnetometer.lvproj project, under Main Programs, open the Magnetometer_16.1_chirp-calibration.vi program.
- The front panel can be somewhat intimidating, but a lot of it is not used. We'll hit on the parts that are necessary to take a measurement.
- In the top left, under Y Channel Settings, are the controls for the calibration. Depress the yellow Select? button under each channel that you wish to measure. The other three inputs tell the program how large a chirp signal to send to the magnetic field coils, and are used to calculate the size of the applied field in magnetic field units for calibration purposes.
- The Amp. (V) control sets the amplitude of the chirp; this voltage is output from the FPGA to the current supplies. It must be at least 0.02 V for noise reasons, but because a large SR570 I-V gain is generally used to keep electronic noise down, too large a calibration signal will cause the magnetometer output to rail during calibration. Keeping this at 0.02 V generally works well.
- The R(out) value should match the chosen output resistor on the Y shell coils, most likely 5000 Ω.
- The Field Coils input selects the coil calibration constant the LabVIEW program uses to calculate how large a calibration field (in T) is being applied. This should almost always be left on Printed Y - Coil, which tells the program to use a coil calibration of 4.3 × 10⁻⁵ T/A (a worked example of the resulting field amplitude follows below).
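As a quick sanity check on these settings, the calibration field amplitude follows from Ohm's law and the coil constant. The short Python sketch below works through the arithmetic for the default values quoted above (0.02 V, 5000 Ω, 4.3 × 10⁻⁵ T/A); the variable names are illustrative and are not part of the LabVIEW program.

```python
# Illustrative arithmetic only -- not part of the LabVIEW program.
# Peak calibration field implied by the Y Channel Settings above.

amp_v = 0.02                 # Amp. (V): chirp amplitude output by the FPGA
r_out_ohm = 5000.0           # R(out): output resistor on the Y shell coils
coil_const_t_per_a = 4.3e-5  # Printed Y - Coil calibration constant (T/A)

current_a = amp_v / r_out_ohm             # peak current into the coil
field_t = current_a * coil_const_t_per_a  # peak calibration field

print(f"Peak coil current: {current_a * 1e6:.1f} uA")       # 4.0 uA
print(f"Peak calibration field: {field_t * 1e12:.0f} pT")   # ~172 pT
```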
- The only other settings that may need to be adjusted (though the default values are normally fine) are those under Run/Calib Settings.
- Regardless of the number of channels you are using, always leave Max Chans at 4.
- Sample Rate (Hz) will set the FPGA acquisition rate. Sampling faster and then later downsampling (see below) can give better averaging and lower noise, but generally this type of noise is not what limits a measurement. 100000 Hz is usually sufficient.
- Downsample (Hz) sets the downsampling rate. To save space and take advantage of signal averaging, the signals collected by the FPGA are downsampled by an averaging procedure that takes blocks of N points (where N = f_sample/f_downsample) and averages each block to a single value. The Nyquist frequency of the downsample rate must still be larger than our bandwidth of interest (DC-100 Hz), so we have generally set this to 1000 Hz (a short sketch of the block averaging follows this list of settings).
- The Run Time (s) determines how long a post-calibration time series is recorded. For noise-measurement purposes, a short scan (6-10 s) is usually used. For a patient measurement, a much longer run (60-180 s) is more common.
- Chirp Iterations sets the number of calibration chirps sent to each magnetometer during the calibration; the magnetometer responses to these chirps are averaged when calculating the calibration. Historically we have used 5 chirps for this purpose (the sketch at the end of this section illustrates the idea).
- I'm not 100% sure what Calib Freq Max does, but I think it should be set equal to the Downsample frequency, or 1000 Hz.
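The downsampling described above is simple block averaging. The sketch below shows the equivalent operation in Python (NumPy) for the typical 100 kHz sample rate and 1 kHz downsample rate; it is meant to illustrate the procedure, not to reproduce the FPGA code.

```python
import numpy as np

# Illustration of the block-averaging downsample described above;
# the actual operation is performed on the FPGA, not in Python.

f_sample = 100_000       # Sample Rate (Hz)
f_down = 1_000           # Downsample (Hz)
n = f_sample // f_down   # points averaged into each output sample (here 100)

# The Nyquist frequency of the downsampled data must exceed the DC-100 Hz band.
assert f_down / 2 > 100

def downsample(x, n):
    """Average consecutive blocks of n points into single values."""
    x = x[: len(x) // n * n]           # drop any partial block at the end
    return x.reshape(-1, n).mean(axis=1)

# Example: 1 s of fake data at f_sample reduces to f_down points.
raw = np.random.default_rng(0).normal(size=f_sample)
print(downsample(raw, n).shape)   # (1000,)
```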
- That's all the setup for now! Click the Run arrow to start the program.
- The first prompt you'll see is one asking whether the heater PID circuit has been disabled. This is a relic of a previous-generation heater circuit, so you can always click Yes here.
- The next couple of prompts will ask the user to calibrate the device. If a calibration run has been completed already that day, the sequence will first ask the user if they'd like to re-calibrate the sensors. However, if it's the first time the program has been run that day, the calibration must be done before proceeding.
- The first prompt will tell you to calibrate the OPAMP_Y. During this stage of the calibration, the calibration chirp is applied to the coil current supply circuit, and the monitor voltage from that circuit is read back. First, ensure on the main box that the Z-Mode switch is set to OFF and the Chirp switch is set to Y. Now, on the FPGA breakout box, flip the switches under the green FPGA monitor BNC ports UP. This connects the FPGA input to the monitor output on the Y current supply. Once this is done, click OK on the LabVIEW prompt and the sequence of 5 (or however many you selected) chirp signals will be applied sequentially to the current supplies and the circuits' responses will be recorded.
- The next prompt will tell you to calibrate the MAGNETOMETER_Y. The chirp will be applied to the same current supply, except now the FPGA input should be connected to the magnetometer signals from the photodiodes (via the I-V converters). Flip the switches below the green ports DOWN to make that happen. Then click OK on the prompt. You should see the chirp signals appear on the scope. If the chirp signals are too large or a DC field has caused the magnetometer signal to swing such that the output rails during the calibration, it will have to be redone later. For now, just assume that it went fine.
- Lastly, a Noise/Heartbeat Measurement prompt will show up. No further switching is necessary (besides maybe adjusting the room or coil fields) before hitting OK. A time series of length Run Time will be collected.
- When the time series has been collected, the progress bar will be completely lit up and the message DONE will flash above it. Collection complete!
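For readers curious about what happens to the recorded chirps, the sketch below outlines the general idea of a chirp calibration: the same chirp is applied Chirp Iterations times, the recorded responses are averaged, and the overall response (output volts per volt of drive) is estimated over the chirp band. This is a simplified stand-in with made-up waveform parameters and fake responses, not the actual LabVIEW routine.

```python
import numpy as np
from scipy.signal import chirp

# Simplified illustration of a chirp calibration -- not the LabVIEW code.
# Assumed/hypothetical: the chirp band, duration, and the simulated responses.

f_down = 1_000                        # downsampled rate (Hz)
t = np.arange(0, 2.0, 1.0 / f_down)   # 2 s chirp, for illustration
drive = 0.02 * chirp(t, f0=1.0, t1=t[-1], f1=100.0)   # Amp. (V) = 0.02 V

n_iter = 5   # Chirp Iterations
# Stand-in for the recorded responses to each chirp (V); in reality these
# come from the FPGA inputs (monitor voltage or magnetometer signal).
rng = np.random.default_rng(1)
responses = [3.0 * drive + 0.001 * rng.normal(size=t.size) for _ in range(n_iter)]

avg_response = np.mean(responses, axis=0)   # average over the chirp iterations

# Least-squares estimate of the flat gain (V out per V of drive) over the band.
D = np.fft.rfft(drive)
R = np.fft.rfft(avg_response)
freqs = np.fft.rfftfreq(t.size, 1.0 / f_down)
band = (freqs > 1.0) & (freqs < 100.0)
gain = np.real(np.sum(R[band] * np.conj(D[band])) / np.sum(np.abs(D[band]) ** 2))
print(f"Estimated response: {gain:.2f} V per V of drive")   # ~3.0 for this fake data
```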