Channel calibration


I have a JKI state machine that I am attempting to code for general data acquisition.  I have it to the point where I can select the physical channels and write to a CSV, but I want the option to calibrate individual physical channels and select the trigger channel.

 

Is there a conventional way to calibrate programmatically with the DAQmx API?  For example, using a FOR loop over the number of channels with an array of calibration values.

Message 1 of 4
Solution
Accepted by DePeppers

You don't say anything about the DAQ hardware you are using.  There are (at least) two dimensions to the concept of "channel calibration": "gain" (the "Y dimension" of the signal, to which you might also add "offset") and "time" (the "X dimension" of the signal).

 

Consider taking "vibration" (or "movement") signals from a 3-D accelerometer.  Suppose you want to take data (continuously) at 1 kHz, and the specification for the three axes (which are arranged to be accurately orthogonal to each other) is a "gain" of 1 V/g and an offset of 0 V, with tolerances of ±0.3 V/g on the gain and ±0.3 V on the offset.  So if you read a signal of 0 V from the Z channel, the "true" acceleration can range from -0.3 g to +0.3 g.  [I actually asked some BME grad students how they were going to handle this question -- it seems they never learned about "calibrating the instrument".]  Hint -- for a simple triaxial accelerometer, designing a "calibration routine" is really simple, and only depends on being able to hold the accelerometer "relatively motionless" for a second and take 6 readings at 6 different (spatial) orientations.
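
Here, for what it's worth, is a minimal sketch of that 6-position routine in a text language (Python, since the arithmetic is easier to post than a LabVIEW block diagram).  The voltages are made-up placeholder numbers; the point is just that a +1 g and a -1 g reading on each axis give you both gain and offset:

```python
# 6-position accelerometer calibration: for each axis, record the output
# voltage with that axis pointing straight up (+1 g) and straight down (-1 g)
# while the device is held motionless.  Values below are hypothetical.
readings = {                 # axis: (V at +1 g, V at -1 g)
    "X": (+1.12, -0.88),
    "Y": (+1.05, -0.95),
    "Z": (+0.98, -1.06),
}

for axis, (v_plus, v_minus) in readings.items():
    gain = (v_plus - v_minus) / 2.0    # V per g
    offset = (v_plus + v_minus) / 2.0  # V at 0 g
    print(f"{axis}: gain = {gain:.3f} V/g, offset = {offset:+.3f} V")
    # A later reading v converts to acceleration as a = (v - offset) / gain.
```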

 

But what about "time" calibration?  How do you ensure your PC can acquire data at 1 kHz with (relatively) high precision and accuracy?  Here the "good news" is that most good DAQ hardware takes care of this for you.  You acquire the data using the hardware clock in the DAQ device, letting its internal buffer hold, say, 1000 samples (which, at a sampling rate of 1 kHz, takes 1.000 seconds); once acquired, the buffer is dumped into PC memory when the code does a "DAQmx Read" request.  The timing accuracy is managed by the timing chip on the DAQ device, and isn't affected by what is happening on the PC (such as a virus scan, display update, or other activities).
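
If you ever prototype this outside LabVIEW, the same pattern looks like the sketch below using NI's nidaqmx Python API (the DAQmx palette VIs -- Create Channel, Timing, Read -- map essentially one-to-one onto these calls).  The device name "cDAQ1Mod1" is a placeholder for whatever NI MAX shows for your hardware:

```python
import nidaqmx
from nidaqmx.constants import AcquisitionType

# Hardware-timed continuous acquisition: the sample clock runs on the DAQ
# device, so PC-side activity (virus scans, display updates) cannot distort
# the 1 kHz timing.
with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan("cDAQ1Mod1/ai0")  # placeholder name
    task.timing.cfg_samp_clk_timing(
        rate=1000.0,                             # 1 kHz hardware clock
        sample_mode=AcquisitionType.CONTINUOUS,
        samps_per_chan=1000,                     # buffer sizing hint
    )
    task.start()
    # Each read pulls one full second of buffered, hardware-timed samples.
    data = task.read(number_of_samples_per_channel=1000)
```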

 

So good DAQmx code handles time calibration for you, and all you need to do is take measurements using known inputs to calibrate the "non-time" dimensions of your signal.

 

Bob Schor 

 

 

Message 2 of 4

Thank you Bob_Schor for your thorough explanation of the dimensions of calibration.  Your analogy to your previous work with accelerometers reminds me of a force/torque sensor I used with the DAQ Assistant.  An initial reading was made to zero the balance, and a calibration matrix was used to calibrate the gain.
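
(For anyone reading along, that zero-then-matrix step is plain linear algebra.  A small NumPy sketch -- the matrix and readings are placeholder values standing in for a real sensor's datasheet numbers:)

```python
import numpy as np

# Zeroing plus matrix calibration for a multi-axis sensor.
# C and the readings are made-up placeholder values.
C = np.array([[1.02, 0.01, 0.00],     # calibration (gain) matrix
              [0.00, 0.98, 0.02],
              [0.01, 0.00, 1.01]])
zero = np.array([0.05, -0.02, 0.01])  # initial no-load reading

raw = np.array([1.23, -0.45, 0.88])   # a later raw reading
calibrated = C @ (raw - zero)         # remove offset, then apply gains
print(calibrated)
```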

 

What I am working with now is a cDAQ-9185 chassis with NI-9205, NI-9223, and NI-9215 (BNC) modules for differential measurement.  The NI-9205 is my trigger-enabled module and the rest will be used for voltage measurement.

 

What I should have said in my initial question is: how do I scale channels in a preexisting task and modify trigger settings?  I would like to read and write the properties of the task, or generate them through the VI.  (A sketch of what I mean is below.)
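
(Sketch using the nidaqmx Python API, since it's easier to post than a block diagram -- the module names, trigger terminal, and scale values are all placeholders.  In LabVIEW the equivalents would be the DAQmx Create Scale VI, the custom-scale input of DAQmx Create Virtual Channel, and the DAQmx Trigger VI or a trigger property node:)

```python
import nidaqmx
from nidaqmx.constants import Edge, VoltageUnits
from nidaqmx.scale import Scale

# Placeholder channel list and per-channel (slope, intercept) calibrations.
channels = ["cDAQ1Mod2/ai0", "cDAQ1Mod2/ai1", "cDAQ1Mod3/ai0"]
cal = [(1.02, 0.05), (0.98, -0.02), (1.01, 0.00)]

with nidaqmx.Task() as task:
    # The FOR loop from my first post: one custom linear scale per channel.
    for chan, (slope, intercept) in zip(channels, cal):
        name = "scale_" + chan.replace("/", "_")
        Scale.create_lin_scale(name, slope, y_intercept=intercept)
        task.ai_channels.add_ai_voltage_chan(
            chan,
            units=VoltageUnits.FROM_CUSTOM_SCALE,
            custom_scale_name=name,
        )
    # Digital-edge start trigger; "/cDAQ1/PFI0" is a placeholder terminal.
    task.triggers.start_trigger.cfg_dig_edge_start_trig(
        "/cDAQ1/PFI0", trigger_edge=Edge.RISING
    )
```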

 

Since I do not have the fundamentals of LabVIEW, I'll go ahead and just create tasks for my desired purposes.

Message 3 of 4

@Bob_Schor wrote:

[I actually asked some BME grad students how they were going to handle this question -- it seems they never learned about "calibrating the instrument".]  Hint -- for a simple triaxial accelerometer, designing a "calibration routine" is really simple, and only depends on being able to hold the accelerometer "relatively motionless" for a second and take 6 readings at 6 different (spatial) orientations.

 
reminds me of https://www.howtogeek.com/519142/how-to-calibrate-the-compass-on-android-to-improve-device-location-...

"Move your phone in a figure-eight pattern until the "Compass Accuracy" says "High." Now you can tap 'Done.""

Message 4 of 4