08-22-2005 12:02 PM
Hello,
I believe the problem here is that the "Simulate Signal" Express VI produces a
time axis that runs from 0 to (x-1)*dt, where x is the number of samples you are
acquiring. Each time your loop iterates it generates x new samples whose time
axis starts over at 0, which is why the data appears to roll over. When I run
the same test using real data from my DAQ card, it does not roll over every
x samples. I would try simulating a DAQ card (available with NI-DAQmx 7.4 or
higher) and running the DAQ Assistant exactly as you plan to do in the lab.
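If it helps to see the idea outside of LabVIEW, here is a rough sketch in ordinary Python (just an illustration of the two behaviors; the function names are made up and none of this is LabVIEW or DAQmx code):

# Illustration only, not LabVIEW code.
# "Simulate Signal" style: the time axis restarts at 0 on every loop iteration.
# A real DAQ read keeps t0 moving forward, so the time axis never rolls over.

def simulate_signal_style(num_iterations, x, dt):
    # Each iteration yields timestamps 0, dt, ..., (x-1)*dt.
    for _ in range(num_iterations):
        yield [i * dt for i in range(x)]

def daq_read_style(num_iterations, x, dt):
    # t0 advances by x*dt every iteration, so time keeps increasing.
    t0 = 0.0
    for _ in range(num_iterations):
        yield [t0 + i * dt for i in range(x)]
        t0 += x * dt

for t in simulate_signal_style(3, x=4, dt=0.25):
    print(t)   # starts over at 0.0 every time
for t in daq_read_style(3, x=4, dt=0.25):
    print(t)   # keeps counting up, 0.0 through 2.75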
I hope this helps. If you have any other questions, feel free to let us know.
For questions about setting up your actual acquisition, the engineers and
customers in the Multifunction DAQ Forum may be a great resource.
Thanks for posting and have a great afternoon!
Regards,
Travis M
Applications Engineer
National Instruments
08-23-2005 09:17 AM
Hello Sherri,
I'm not exactly sure what you mean when you say that dt is changing in the
formula node; I don't see any place where dt is used directly by it. If you
extract the waveform components of your simulated signal and examine the value
of dt, it does not appear to deviate (see the simplified VI attached).
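For what it is worth, the same check can be written out as a rough Python sketch (the waveform is modeled here as a simple t0/dt/Y record, roughly what extracting the waveform components gives you; the names are made up and this is only an illustration):

# Illustration only: a waveform modeled as a (t0, dt, Y) record, similar to
# what you get when you extract the waveform components in LabVIEW.

def acquire_simulated(dt, x):
    # Stand-in for one loop iteration of the simulated signal.
    return {"t0": 0.0, "dt": dt, "Y": [0.0] * x}

dt_values = []
for _ in range(10):                      # stand-in for the while loop
    wfm = acquire_simulated(dt=0.001, x=100)
    dt_values.append(wfm["dt"])          # examine dt on each iteration

print(set(dt_values))                    # a single distinct value: {0.001}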
With regards to NI-DAQmx: if you are using one of our National Instruments data
acquisition (DAQ) cards, this is most likely the driver software you are using
in LabVIEW to control the parameters of your acquisition. Beginning with
NI-DAQmx 7.4, and now with 7.5 as well, you can simulate an entire DAQ card
rather than being limited to simulating a signal.
Again, thanks for posting, and let us know if we can help out further.
Travis M
NI
08-25-2005 02:27 PM
Good afternoon Sherri,
The first question I have for you is about your acquisition system. I
understand that you are reading voltages and converting them to temperatures.
If you are using thermocouples or RTDs for this, you can use some of the
built-in thermocouple and RTD measurement types in NI-DAQmx and the DAQ
Assistant, which handle the conversion to temperature for you.
With regards to the dt being changed, this is caused by your Formula Express VI.
The Formula Express VI operates on a single datatype, and any other datatype
wired to it is converted to that datatype in some way. Since you are wiring a
"signals" (dynamic data) input into the Formula VI, LabVIEW converts the other
parameters to the "signals" datatype as well. When this conversion takes place,
it essentially constructs a constant waveform. The Formula VI appears to try to
be "smart" about how it constructs this waveform; that is, it seems to try to
give any new waveform the same dt value as the one wired to it. This does not
always happen, though, and when it does not, you can simply replace the dt value
in the resulting waveform with the dt from the original. The attached zip file
contains a VI that demonstrates the Formula VI working correctly, behaving
strangely, and a workaround for when it behaves strangely. I would use the
workaround for now. In the meantime, I am going to look further into why the
Formula VI sometimes changes the dt values of its inputs unnecessarily.
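If it helps to picture the workaround outside of LabVIEW, here is a rough sketch in ordinary Python (the waveform is again modeled as a t0/dt/Y record, and formula_like is just a made-up stand-in for the Formula Express VI, not a real function):

# Illustration only. The idea of the workaround: whatever dt comes back out of
# the Formula step, overwrite it with the dt from the original waveform.

def formula_like(waveform, gain, offset):
    # Made-up stand-in for the Formula Express VI: scales Y, but (to mimic the
    # misbehavior) hands back a waveform with a different dt.
    return {
        "t0": waveform["t0"],
        "dt": 1.0,                               # wrong dt, as sometimes seen
        "Y": [y * gain + offset for y in waveform["Y"]],
    }

original = {"t0": 0.0, "dt": 0.001, "Y": [1.2, 1.3, 1.4]}

result = formula_like(original, gain=100.0, offset=-50.0)
result["dt"] = original["dt"]   # the workaround: put the original dt back

print(result["dt"])             # 0.001 again, matching the source waveform
print(result["Y"])              # the scaled samples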
Hope this helps.
Regards,
Travis M
Applications Engineer
National Instruments