07-28-2020 01:09 PM
Good day all!
I created a LabVIEW project to capture multiple channels of voltage data over a user input period of time. I'm having an issue with writing to a text file at “consistent” time intervals at the very beginning of the test.
Setup details: NI cDAQ-9185 chassis with an NI 9205 module, LabVIEW 2019, HP Z Book laptop (not exactly new) running Windows 10. I’m capturing one sample on each of eight enabled channels (ai0:7), at a 1 kHz sample rate. I used the Producer/Consumer architecture (see the attached zipped project, saved in LabVIEW 2018).
When I capture a row of data (one sample from each of the eight channels), I also write a sequential Capture Number (red column in the attached Data Sheet Screen Shot.png) and the approximate seconds after start of test (blue column in the screen shot) to the file. This makes post-test correlation with other data easier.
As a check, I graphed the first two columns (Capture Number on the X axis, seconds after start of test on the Y axis). I expected to see a straight line since the time between each capture and write should be relatively constant. However, the first ~60 captures (~50 ms) show a consistent, nonlinear behavior. (See the attached Graph Screen Shot.png.) Beyond those first captures, the capture intervals written to the drive are well behaved for the duration of the test, which can run from seconds to many minutes.
I changed the sample rate to 500 Hz to see if it affected the behavior. The shape and timing stayed the same, so I don't think it has anything to do with the cDAQ or its settings.
Could this be an artifact of writing to the magnetic hard drive? Is there a way to reduce or eliminate this? It’s probably not a show stopper for the test but I’d like to do better than to tell the user to ignore for the first 50 ms/60 data points.
Solved!
07-28-2020 02:03 PM
You are trying to get timing of the producer loop by measuring the consumer loop. Completely asynchronous.
Did you know that Index Array is resizable, and that there are better functions to get an array of data into a delimited, terminated string row?
The following could be reduced to a postage stamp!
Try "Array to spreadsheet string" with proper delimiters and "format into file", nothing else.
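Since LabVIEW block diagrams can't be shown in text, here is a rough Python stand-in for what "Array To Spreadsheet String" followed by "Format Into File" accomplishes: one formatting step that turns a 2D array into delimited, newline-terminated rows, ready for a single write. The function name and format string are hypothetical, chosen just to illustrate the idea.

```python
# Rough Python analogue of LabVIEW's "Array To Spreadsheet String"
# feeding "Format Into File": one formatting step, one write.
def array_to_spreadsheet_string(rows, delimiter="\t", fmt="%.6f"):
    """Format a 2D list of numbers as delimited, newline-terminated text."""
    return "".join(delimiter.join(fmt % v for v in row) + "\n" for row in rows)

rows = [[0.1, 0.2], [0.3, 0.4]]
text = array_to_spreadsheet_string(rows)
# text == "0.100000\t0.200000\n0.300000\t0.400000\n"
```

The whole "index each element, concatenate delimiters, append a line ending" chain collapses into that one call, which is why the original diagram could shrink to a postage stamp.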
Why can't you read more than one set of data? Isn't the timing exactly determined by the row number? Why even measure it?
07-28-2020 10:24 PM
You are trying to get timing of the producer loop by measuring the consumer loop. Completely asynchronous.
That's what I was afraid of; synchronous would be my preference.
Did you know that Index Array is resizable, and that there are better functions to get an array of data into a delimited, terminated string row?
Nope, apparently not. Won't forget that now, though.
Try "Array to spreadsheet string" with proper delimiters and "format into file", nothing else.
Much nicer!
Why can't you read more than one set of data?
I didn't see an obvious way to do that and still be able to add the Capture Number and the elapsed time.
Isn't the timing exactly determined by the row number? Why even measure it?
Ideally, yes, I could calculate the time based on row number. However, I didn't know if there was going to be a significant delay or other timing anomaly that would introduce an error into a calculated time, so I opted for objective proof (at least until I was confident in the capture timing). An offset is easy to subtract out, of course, but I'm not sure what to do about the odd behavior I'm seeing.
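For what it's worth, the calculated-time idea is just the capture number divided by the sample rate; a small Python sketch (function name hypothetical) of deriving each timestamp from the row number instead of measuring it in the asynchronous consumer loop:

```python
# Derive each capture's time from its row number and the sample rate,
# rather than measuring it in the (asynchronous) consumer loop.
def capture_time(row, sample_rate_hz):
    """Seconds after test start for a given 0-based capture number."""
    return row / sample_rate_hz

# At the 1 kHz rate used here, consecutive captures are 1 ms apart:
times = [capture_time(n, 1000.0) for n in range(4)]
# times == [0.0, 0.001, 0.002, 0.003]
```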
I went with the Producer/Consumer format because a previous effort ran too slow. This was my attempt to move everything out of the loop containing the DAQmx Read. Should I condense the Producer/Consumer loops to a single loop without a queue?
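For readers less familiar with the pattern, here is a minimal Python sketch of Producer/Consumer with a queue (a `queue.Queue` standing in for the LabVIEW queue, a list append standing in for the file write). It shows the point of the architecture: the producer loop stays lean while the slow work happens elsewhere, and the capture number travels with the data so nothing is lost to the asynchrony. All names here are illustrative, not from the attached project.

```python
import queue
import threading

def producer(q, n_captures):
    """Lean acquisition loop: enqueue data, do nothing slow."""
    for n in range(n_captures):
        sample = [0.0] * 8      # stand-in for one DAQmx Read of ai0:7
        q.put((n, sample))      # capture number travels with the data
    q.put(None)                 # sentinel: tell the consumer to stop

def consumer(q, out):
    """Slow loop: dequeue and 'log' (here, append to a list)."""
    while True:
        item = q.get()
        if item is None:
            break
        n, sample = item
        out.append(n)           # stand-in for the file write

results = []
q = queue.Queue()
t = threading.Thread(target=consumer, args=(q, results))
t.start()
producer(q, 5)
t.join()
# results == [0, 1, 2, 3, 4] -- nothing dropped, order preserved
```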
Thanks
07-29-2020 06:25 AM
Your code is really hard to follow because it spans more than one screen.
Here are a few observations:
1. As altenbach already suggested, the timing that you're reporting is bogus. The data are being collected at regular intervals. Your timing is showing delays in writing to the file.
2. Consider collecting your data as a waveform. Waveforms include timing information.
3. Consider collecting more than one sample per read.
4. Consider using a .tdms file instead of a text file. The .tdms file is not human readable, but can be easily read with Excel using the Excel add-in. There is even a DAQmx function so that you can set your task to automatically log to .tdms.
5. Do not try to put your producer and consumer in the same loop - this will only lead to trouble.
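On point 2 above: a LabVIEW waveform essentially bundles a start time (t0), a sample interval (dt), and the data array (Y). A hypothetical Python stand-in shows how per-sample timestamps fall directly out of t0 and dt, with nothing measured in the consumer loop:

```python
from dataclasses import dataclass

# Sketch of the timing information a LabVIEW waveform carries:
# start time t0, sample interval dt, and the data array Y.
@dataclass
class Waveform:
    t0: float        # start time, seconds
    dt: float        # seconds between samples (1 / sample rate)
    y: list          # sample values

    def timestamps(self):
        """Per-sample times derived from t0 and dt, one per element of y."""
        return [self.t0 + i * self.dt for i in range(len(self.y))]

wf = Waveform(t0=0.0, dt=0.001, y=[1.0, 2.0, 3.0])
# wf.timestamps() == [0.0, 0.001, 0.002]
```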
08-05-2020 11:21 AM
Sorry for the delay getting back to you. Your responses are very much appreciated.
1. As altenbach already suggested, the timing that you're reporting is bogus. The data are being collected at regular intervals. Your timing is showing delays in writing to the file.
That's what I suspected.
2. Consider collecting your data as a waveform. Waveforms include timing information.
Not being very experienced with the waveform data type, I didn't see an obvious way to combine Mr. Altenbach's solution (the Array To Spreadsheet String and Format Into File VIs) with waveforms, and I had to opt for "quick" to get the setup to the lab. I will file that method away, though, and use it at my next opportunity, since I like the idea of having the actual time. (After a little more poking around, I found the Export Waveforms To Spreadsheet File VI (2D). Probably a good starting point.)
3. Consider collecting more than one sample per read.
Going down the one sample per read path started when I wanted to "manually" time stamp each reading. I understand, and have demonstrated to myself, the limitations this imposes. For this effort, I took Mr. A's suggestion and assumed the data was being collected at the proper intervals and simply calculated the time. For future efforts I'll move away from single reads for data logger applications.
4. Consider using a .tdms file instead of a text file. The .tdms file is not human readable, but can be easily read with Excel using the Excel add-in. There is even a DAQmx function so that you can set your task to automatically log to .tdms.
I'll check that out at my next opportunity.
5. Do not try to put your producer and consumer in the same loop - this will only lead to trouble.
Got it.