05-17-2010 09:55 AM
I am trying to calibrate two sensors that I will be continuously monitoring in real time. My code monitors the two sensors inside a while loop. To keep them calibrated and properly monitor the values despite changes in temperature, I need to know the initial voltage values they start at so I can adjust for thermal drift.
How would I take the first value acquired in the loop at startup and store it, so that I have an initial value to subtract from my continuous final values? I'm rather new to LabVIEW and don't know whether this should be done with a nested for loop, a while loop, or a Case structure.
Thanks
-Dan
05-17-2010 10:03 AM
There are a couple of problems with your VI:
There are a couple of ways to accomplish the task of getting the initial values and holding them as "calibration" values. Note that I use the word "calibration" in quotes since this isn't really a calibration, but an offset from some arbitrary value.
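Since a LabVIEW block diagram can't be shown inline here, a rough Python sketch of one such pattern follows: capture the readings on the first pass through the loop, then subtract them from everything after. Here `read_sensors` is a hypothetical stand-in for the DAQ read, and the for loop stands in for the monitoring while loop.

```python
def read_sensors(i):
    """Fake DAQ read: two voltages that drift slowly with iteration i."""
    return (1.00 + 0.01 * i, 2.00 + 0.02 * i)

def monitor(n_iterations):
    initial = None                      # plays the role of a shift register
    offsets = []
    for i in range(n_iterations):       # stands in for the outer while loop
        v1, v2 = read_sensors(i)
        if initial is None:             # fires only on the first pass,
            initial = (v1, v2)          # freezing the startup voltages
        offsets.append((v1 - initial[0], v2 - initial[1]))
    return offsets

print(monitor(3)[0])   # (0.0, 0.0) -- later readings are relative to startup
```

In LabVIEW terms, the `initial is None` test is what the First Call? primitive (or a comparison against the iteration terminal) gives you on the diagram.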
05-17-2010 10:35 AM
I'm using the for loop to acquire 10 samples in under a second so that I have a more stable value every time the while loop iterates. At least I thought this would give me a more stable reading in under a second than simply asking the DAQ to return 10 samples.
I don't understand this "First Call?" primitive you are referring to. I still don't understand how I would "freeze" the first values obtained so that I can use them for a "calibration."
Thanks
-Dan
05-17-2010 10:43 AM
I think I got it.
Is this along the lines of what you were referring to?
Thanks
05-17-2010 12:58 PM
I don't think this will work. On the first loop iteration (i=0), the DAQ values are put into the shift registers on the right side, but the left-side terminals are not initialized and hold unknown values, so you are storing unknown values. Try this VI:
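In Python terms (a rough textual stand-in, since the real fix lives on the block diagram): initialize the shift register to a known value before the loop, then overwrite it only when i equals 0, the way a Case structure wired to the iteration terminal would.

```python
def monitor(readings):
    first = 0.0                    # initialized shift register: known value
    out = []
    for i, v in enumerate(readings):
        if i == 0:                 # Case structure: True only on iteration 0
            first = v              # store the startup reading
        out.append(v - first)      # every value is now relative to startup
    return out

print(monitor([5.0, 5.5, 6.0]))   # [0.0, 0.5, 1.0]
```

Because `first` is initialized before the loop, there is no iteration where an unknown value can be read back out of it.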