High Speed Synchronized CAN/DAQ

1) What it does: record CAN and analog data (hopefully synchronized) at a very high sample rate (~9 kHz)
2) How I did it: the left side initializes the array and acquisition parameters. The while loop replaces elements in the array: the top half collects the data, and the bottom half is meant to get me a time channel in microseconds.
3) Questions:
-Are my analog and CAN data actually synchronized? Most CAN/DAQ sync examples I've seen use waveforms and align their timestamps with the "Align Waveform Timestamps" VI. I am not doing that, mostly because of issues I ran into trying to keep my acquisition rate up. (I've only been at LabVIEW for three weeks, and storing clusters/bundles in arrays for later writing to file escapes me, so the only way I knew to get a timestamp into the measurement file was to put the Write To Measurement File VI inside the while loop, which slowed down the acquisition.)
 
-Is my time channel accurate and synchronized with the other two functions? It uses the 20 MHz clock of the 6031E (analog card), which is the same clock the CAN board uses via RTSI (if I understand things correctly), so is the loop-to-loop delta I'm calculating accurate? Would it be better if I used the counter on the 6723 card?
 
Ideally, I only need a 2.5 kHz sample rate, but I have not figured out how to get the while loop to run at exactly that timing. As is, my while loop is free-wheeling, so the rate lands anywhere between 3 and 18 kHz.
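(For reference: once an acquisition is hardware-timed, every sample is paced by the board's sample clock, so a time channel can be computed from the sample index alone rather than from per-iteration clock reads. Below is a minimal sketch of that arithmetic, written in Python purely as an illustration; it is not the poster's LabVIEW VI, and the names are hypothetical.)

    # Sketch: time channel for hardware-timed, equally spaced samples.
    SAMPLE_RATE_HZ = 2500.0  # the 2.5 kHz rate the poster ultimately wants

    def time_channel_us(num_samples, rate_hz=SAMPLE_RATE_HZ):
        """Timestamps in microseconds for clock-paced samples."""
        return [i * 1e6 / rate_hz for i in range(num_samples)]

    print(time_channel_us(4))  # [0.0, 400.0, 800.0, 1200.0]

Because the sample clock, not the software loop, sets the spacing, the loop rate no longer has to be steady for the timestamps to be exact.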
 
System info: PCI bus with a 6031E analog card and a PCI-CAN/2 (also a 6723 on the RTSI bus).
 
Thanks in advance for any advice you can lend!
Message 1 of 6
Sorry guys, apparently I lost my attachment by previewing my post....
Message 2 of 6
Maybe bumping this to the top early on a Tuesday will work better than adding it late on a Friday? Or have I posted it in the wrong forum?
Message 3 of 6
Hello,

I took a look at your VI, and your analog acquisition and CAN acquisition should be in sync the way you have them set up. I am not sure I understand your second question about the time channel, though. Could you please be a little more specific about what you are trying to do in the bottom half? Thanks!

Regards,
Ebele O.
National Instruments
Message 4 of 6

Thank you Ebele,

In the bottom half of the VI I am generating a time channel to go into my data file. I believe it is producing an accurate time signal in microseconds, but I am not certain (perhaps I am compounding too much roundoff error). My loop time is fast enough that "Get Date/Time In Seconds" does not give me enough resolution (I'm collecting ~65,000 data points in ~7 seconds), and the same is true of "Tick Count (ms)" (I get roughly nine data points with the same "time"), so this is what I came up with.
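(A quick check of those figures bears this out. Using only the numbers from the post, in a short Python sketch:)

    # Sanity check of the resolution problem, with the figures from the post.
    points, seconds = 65_000, 7.0
    rate_hz = points / seconds          # ~9286 Hz effective loop rate
    period_us = 1e6 / rate_hz           # ~108 us between data points
    points_per_ms = 1000 / period_us    # ~9.3 points share each 1 ms tick
    print(f"{rate_hz:.0f} Hz, {period_us:.0f} us/point, {points_per_ms:.1f} points/ms")

At ~108 µs per point, roughly nine points fall inside each 1 ms tick, which matches the repeated "time" values described.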

My belief is that both "Get Date/Time In Seconds" and "Tick Count (ms)" use the same source I am using (the 20 MHz timebase), and that my calculation should therefore be accurate, but I would really appreciate any advice.
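(The tick-to-microseconds conversion being described is simple arithmetic; here is a hedged Python sketch of it, with hypothetical names, assuming raw counter reads off the 20 MHz timebase:)

    TIMEBASE_HZ = 20_000_000  # 20 MHz timebase on the 6031E, per the post

    def delta_us(prev_count, curr_count):
        """Elapsed microseconds between two raw counter reads (no rollover handling)."""
        ticks = curr_count - prev_count   # 1 tick = 0.05 us at 20 MHz
        return ticks * 1e6 / TIMEBASE_HZ

On the roundoff worry: computing each timestamp from the difference against the first count (or converting each raw delta once, as above) keeps rounding error from accumulating across ~65,000 iterations, as summing already-rounded per-loop deltas would.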

Also, any ideas on how to limit my while loop to a rock-steady 2.5 kHz? Thanks again for your help.

Message 5 of 6
Hello,

I believe what you have configured does give you access to a clock with ~20 MHz resolution. However, the code relies on you repeatedly calling the DAQmx Read VI: only when that VI is called does the counter report its current count value for your time-difference calculation. There is an inherent delay between the VI call and the register read on the counter, and the loop's execution time will not be the same from one iteration to the next. So, while you may get a different value for each data point using this clock, I would not necessarily trust the accuracy of this method.

Whenever you run a loop, there is some amount of what is called jitter: variation in the time it takes to execute the code from one iteration to the next. Because Windows is not a real-time operating system, this jitter can be relatively large compared to a true real-time OS, which is why the timing VIs in LabVIEW for Windows are limited to millisecond resolution. If you need the most accurate loop rates, I would suggest running your code on an RT target (such as a PXI controller); there you can use the Timed Loop structure to obtain a truly accurate 2.5 kHz rate. I hope this was useful for you!
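(To make the jitter described above concrete, here is a small Python sketch, purely as an illustration and not NI code, that timestamps successive iterations of a free-running loop and reports the spread:)

    import time

    # Time successive iterations of a plain software loop and report the spread.
    stamps = [time.perf_counter_ns() for _ in range(10_000)]
    periods_us = [(b - a) / 1000 for a, b in zip(stamps, stamps[1:])]

    print(f"min {min(periods_us):.1f} us, max {max(periods_us):.1f} us")
    # A steady 2.5 kHz loop needs exactly 400 us per iteration; the min/max
    # spread (the jitter) shows why a free-running loop on a desktop OS
    # cannot hold that rate without a hardware clock or an RT target.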

Mike D.
National Instruments
Applications Engineer

Message 6 of 6