03-12-2020 09:10 AM
Hi Friends,
1. I am a bit confused about which clock a While loop uses while running the simulation. For example, we can plot a sine-wave graph and control the while-loop iteration using the Wait (ms) function. Questions: a. Does the While loop run at the clock speed of the CPU/processor? If so, can we convert our loop so that we measure time in seconds (real time), rather than CPU clock time or while-loop execution time? b. For my project, I don't need to know how long the while loop takes to execute; I would like to know the time at which each instantaneous voltage sample is acquired, so I can plot my amplitude-vs-time graph correctly. The X axis of the graph must be the time at which the voltage was recorded. If there is a delay in such a measurement, what are the possible causes, and can it be reduced?
2. When we use Wait (ms), does it slow down the loop by 100 ms, or does it measure each voltage (voltage digitization) with an uncertainty on the order of 0.001 s?
3. Currently I am acquiring the analog input from a gas flow meter. In that case, I don't understand whether I should choose N Samples or 1 Sample (On Demand). What are their advantages and disadvantages?
4. What is dead time? How can we reduce it during data acquisition?
5. There are two modes, DAQ input and DAQ output. What do they mean? In my case I have an analog input from the device.
6. How can I improve my analog-to-digital data acquisition? Any suggestions?
--
03-12-2020 09:29 AM - last edited on 12-19-2024 10:21 PM by Content Cleaner
Hi skdubey,
are you related to this question?
@skdubey wrote:
Hi Friends,
1. I am a bit confused about which clock a While loop uses while running the simulation. For example, we can plot a sine-wave graph and control the while-loop iteration using the Wait (ms) function. Questions: a. Does the While loop run at the clock speed of the CPU/processor? If so, can we convert our loop so that we measure time in seconds (real time), rather than CPU clock time or while-loop execution time? b. For my project, I don't need to know how long the while loop takes to execute; I would like to know the time at which each instantaneous voltage sample is acquired, so I can plot my amplitude-vs-time graph correctly. The X axis of the graph must be the time at which the voltage was recorded. If there is a delay in such a measurement, what are the possible causes, and can it be reduced?
When you want a (mostly) accurate timestamp for your AI signals, you should depend on hardware timing instead of relying on some "clock" or CPU cycles: does your DAQ device support hardware-timed acquisition?
@skdubey wrote:
2. When we use Wait (ms), does it slow down the loop by 100 ms, or does it measure each voltage (voltage digitization) with an uncertainty on the order of 0.001 s?
3. Currently I am acquiring the analog input from a gas flow meter. In that case, I don't understand whether I should choose N Samples or 1 Sample (On Demand). What are their advantages and disadvantages?
4. What is dead time? How can we reduce it during data acquisition?
5. There are two modes, DAQ input and DAQ output. What do they mean? In my case I have an analog input from the device.
6. How can I improve my analog-to-digital data acquisition? Any suggestions?
2. Wait (ms) waits for the number of milliseconds you specify. It does not measure any voltages…
3. How is this related to while loops and wait functions? It depends on your requirements…
4. What do you mean when you bring up the word "dead time"?
5. One is signal input and the other is signal output. What is unclear about this?
6. Learn DAQmx basics…
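To make point 2 concrete: a software wait only guarantees a *minimum* delay, so the loop period (and therefore the spacing between software-timed samples) jitters with the OS scheduler. Since a LabVIEW diagram can't be shown as text, here is a minimal Python stand-in for the pattern; the function name and parameters are illustrative, not any NI API:

```python
import time

def software_timed_loop(n_iterations=5, wait_s=0.1):
    """Simulate a while loop paced only by a software Wait (ms).

    Records a timestamp each iteration and returns the measured
    intervals; scheduler jitter means these are only *approximately*
    wait_s, and never less than it.
    """
    timestamps = []
    for _ in range(n_iterations):
        timestamps.append(time.perf_counter())
        time.sleep(wait_s)  # analogous to Wait (ms) with 100 ms
    return [t1 - t0 for t0, t1 in zip(timestamps, timestamps[1:])]

intervals = software_timed_loop()
print(intervals)
```

Running this typically prints intervals slightly above 0.1 s, with the overshoot varying from iteration to iteration; that variation is exactly the timing uncertainty a hardware sample clock avoids.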
03-12-2020 10:45 AM - last edited on 12-19-2024 10:21 PM by Content Cleaner
It sounds like you're making a single measurement in each iteration using On-Demand software timing.
As GerdW said, you'd be much better off making hardware-timed measurements if your device supports it (use the DAQmx Timing node).
You can read about DAQmx here: Learn 10 Functions in NI-DAQmx and Handle 80 Percent of Your Data Acquisition Applications
This would give you a very reliable dt value between measurements, but getting an "absolute" time value is still potentially tricky. But really, why do you need an absolute value?
If it's to compare with some other measurement, you only need to know the relative time between their zero values, right? In that case you can either share a trigger (and have their start times synchronise, if on the same device) or measure the skew between them and correct for it (if on different devices).
Depending on the hardware you're using, it's also possible in some cases to use GPS devices to obtain an "absolute" time value as an output of the measurement. This can be paired with your voltage readings etc. (but is perhaps more detail/cost than you want?)
To address your Q2, I'm going to hazard that you mean "what is the relationship between the start and end of a loop iteration, and the time at which a single voltage measurement is made?", to which I'd say: it depends. If you're using On-Demand timing, the measurement happens strictly between the start and the end of the iteration, but beyond that it's unclear.
Probably it happens shortly after the start (i.e. roughly one measurement time after the start), and then the loop waits for everything else to finish (likely the Wait takes the most time), but this isn't guaranteed.
If using hardware timing, it will depend on the time since the previous measurement, but they'll be evenly spaced (which is usually what you'd need).
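To illustrate why hardware timing answers the original X-axis question: with a hardware sample clock, dt = 1/sample_rate is fixed by the device, so each sample's timestamp is just the task start time plus n·dt. A small Python sketch of that arithmetic (the helper function is hypothetical, not part of any NI API):

```python
from datetime import datetime, timedelta

def sample_timestamps(t0, sample_rate_hz, n_samples):
    """Timestamp of sample n in a hardware-timed acquisition:
    t_n = t0 + n / sample_rate_hz (dt is fixed by the sample clock)."""
    dt = 1.0 / sample_rate_hz
    return [t0 + timedelta(seconds=n * dt) for n in range(n_samples)]

start = datetime(2020, 3, 12, 9, 10, 0)       # when the DAQ task started
stamps = sample_timestamps(start, 1000.0, 4)  # 1 kHz clock, 4 samples
for t in stamps:
    print(t.time())  # timestamps spaced exactly 1 ms apart
```

The only remaining uncertainty is in t0 itself (when the task actually started), which is where the trigger-sharing or skew-correction ideas above come in.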