Producer Consumer Pauses at high rates

Solution
Accepted by topic author skinnert

@skinnert wrote:

I can only mark 1 thing as a solution so I did that. The McDuff


You can mark multiple posts as solutions if desired.

 


@skinnert wrote:

I like the DAQ-MX logging quite a bit. I haven't yet figured out how to enable/disable it since I don't want it recording all the time and also how to add additional header information to the resulting TDMS files... I've got some research to do but I'll get there.


If you use the Event Structure template I posted earlier, it is an easy mod. In fact, I just made the mod; see below and the attachment. Adding additional header information during acquisition is not currently supported; you can add it, but only after the acquisition finishes. You can kudo this idea.

 

snip.png

Message 21 of 29

Multiple solutions accepted and Kudos given! Thanks!

Message 22 of 29

I've been using the mcduff DAQmx logging solution for the last couple of years, and it works great. Big thanks to mcduff for the original implementation. It has handled continuous data streaming and on-demand recording really well.

 

However, my requirements have recently changed. Previously, I had a predictable event to capture. Now the event can occur at any time, and I might be waiting up to an hour for it. Logging an hour-long file just to catch a 10-second event isn't ideal.

 

What I’d like to do is:

 

  1. Continuously monitor incoming data.
  2. When I press a "Record" button, capture:
      • 5 seconds of pre-trigger data
      • 5 seconds of post-trigger data
  3. Save the total 10-second chunk to disk.
  4. Return to monitoring in case the event happens again.

This is similar to what Phantom high-speed camera users refer to as “buffered recording,” where you store data in a circular buffer, then freeze and save it when an event occurs.
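The buffered-recording scheme above can be sketched in text. LabVIEW itself is graphical, so this is only a Python model of the logic; the sample rate, chunk size, trigger point, and the `read_chunk` stand-in for DAQmx Read are all assumptions for illustration:

```python
from collections import deque

SAMPLE_RATE = 1000          # assumed rate, samples/s
CHUNK = 100                 # samples per simulated DAQmx Read
PRE_S, POST_S = 5, 5        # seconds to keep before/after the trigger

def read_chunk(i):
    """Stand-in for one DAQmx Read: returns CHUNK dummy samples."""
    return list(range(i * CHUNK, (i + 1) * CHUNK))

# Fixed-size buffer: once full, the oldest chunk falls off automatically,
# which is exactly the "lossy" circular-buffer behavior wanted here.
pre_chunks = PRE_S * SAMPLE_RATE // CHUNK
buffer = deque(maxlen=pre_chunks)

capture = None
for i in range(200):                    # monitoring loop
    chunk = read_chunk(i)
    buffer.append(chunk)
    if i == 120 and capture is None:    # "Record" pressed here (simulated)
        capture = [s for c in buffer for s in c]   # freeze pre-trigger data
        post_needed = POST_S * SAMPLE_RATE
    elif capture is not None and post_needed > 0:
        capture.extend(chunk)           # keep appending post-trigger data
        post_needed -= CHUNK
        if post_needed == 0:
            pass  # here: write `capture` to disk, then resume monitoring

assert len(capture) == (PRE_S + POST_S) * SAMPLE_RATE
```

In LabVIEW terms, the `deque` plays the role of a fixed-size lossy queue, and the "Record" branch would live in the state machine that currently handles the pause.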

 

I initially looked into tweaking the DAQmx input/output buffer sizes, but I now understand those are mostly for hardware transfer or memory allocation, not logging.

 

It seems like the right path is to build a software circular buffer that continuously stores a few seconds of recent data in memory, and then dumps it to disk when triggered. The existing architecture already uses a state machine with DAQmx logging and pause functionality, but I’m not sure how to capture and save the buffered pre-trigger data when "paused," then continue logging post-trigger data, and finally go back to monitoring.

 

Does anyone have guidance, examples, or suggestions for:

  • Managing a software circular buffer in LabVIEW with DAQmx logging and pausing?
  • Cleanly transitioning between buffering, triggering, and post-event logging?

Thanks!

Message 23 of 29

A fixed-size queue with a lossy enqueue makes a simple circular buffer that works really well. You can use Get Queue Status (with "return elements" set to true) to read all of the buffered data.
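In text form (Python standing in for the LabVIEW queue primitives, purely as a sketch): `deque(maxlen=…)` behaves like a fixed-size queue with Lossy Enqueue, and `list(q)` behaves like reading all elements via queue status.

```python
from collections import deque

q = deque(maxlen=3)          # fixed-size queue, capacity 3
for x in [1, 2, 3, 4, 5]:
    q.append(x)              # lossy enqueue: oldest element drops when full

snapshot = list(q)           # like Queue Status: peek at all buffered data
```

After the loop, `snapshot` holds only the three newest elements; the two oldest were discarded without any explicit dequeue.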

Message 24 of 29

Good news: your data requirements decreased 🙂

 

A fixed size queue should work great. You could make it out of individual datapoints but you'd have to enqueue each one, so I think I'd make a queue of waveforms. Each waveform would be one "read" from your DAQmx function.

 

Also, I can't find it now but I'm almost certain Kevin Price had some neat buffer tricks to handle this exact situation.

 

Edit: Also, your card may support analog Start Triggers, which would handle all of this for you.

Message 25 of 29

Can't edit the reply, but I (finally) found the post I was remembering:

 

https://forums.ni.com/t5/LabVIEW/A-should-be-simple-DAQmx-question-take-finite-samples-in/m-p/361991...

 

I remembered slightly wrong. Kevin's solution is for when something other than the signal itself dictates when you need to grab the data. So, if you need to read an analog channel and decide if "the thing" happened, then use a queue/shift register/etc. If you have something else that comes in to tell you when "the thing" happened (like a user clicking a button), then use Kevin's post. It will continually read samples, and when you actually ask for data, it will immediately give you the last n samples. I'm sure you could configure it to return more samples too, since you need 5 seconds back and 5 seconds forward- just fiddle with the offsets and read positions.
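Kevin's trick configures the DAQmx read position relative to the most recent sample with a negative offset on the hardware buffer. A pure-software analog of that "give me the last n samples, now" behavior can be sketched with a ring buffer (the `RingTap` class and its names are hypothetical, not a DAQmx API):

```python
from collections import deque

class RingTap:
    """Keeps the newest `capacity` samples; `last(n)` returns the newest n,
    the software analog of reading at a negative offset relative to the
    most recent sample."""
    def __init__(self, capacity):
        self.buf = deque(maxlen=capacity)

    def write(self, samples):
        self.buf.extend(samples)      # continuous acquisition keeps writing

    def last(self, n):
        return list(self.buf)[-n:]    # on-demand fetch of the newest n

tap = RingTap(capacity=1000)
for i in range(5000):                 # stand-in for the continuous read loop
    tap.write([i])
recent = tap.last(100)                # immediate access to the last 100 samples
```

For the 5-seconds-back / 5-seconds-forward case, you would combine this with continuing to collect new samples after the trigger, as discussed above.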

Message 26 of 29

I like these suggestions and I can see how they should work in principle, but I'm not sure how to incorporate them into the existing VI (my actual version attached).

 

On the queue side, since this is DAQmx logging, does the saved file even care about the queue? I don't actually need the data-out wire from DAQmx Read except for the waveform graph. As far as I can tell, all the logging is internal, so how would a queue be implemented here? Task in/out?

 

With Kevin's solution, I think that could work except I've already got a specific sample interval tied to the number of samples per channel and that triggers the event...

 

OK, maybe I can add another step to the event structure that activates when I release the pause and records all the pre-trigger data using Kevin's method. Once that completes, it steps over into the Every N Samples event?

Message 27 of 29

Alright, this didn't quite work.

 

I went ahead and duplicated Kevin's code and verified that it works in my application.

 

I then added in the DAQmx logging and got some errors. After some troubleshooting, it appears that negative offsets will not work with DAQmx logging. In hindsight, this makes sense: since the logging starts immediately, there is no pre-trigger data at a negative offset to collect.

 

skinnert_0-1753471481834.png


So, I'm curious how to get queues working with DAQmx logging... I'm on the hunt.

Message 28 of 29

I'm mostly off-grid until next week and have just a few minutes now.

 

Looks like "my" method can't mix with TDMS logging. The suggested approach -- continuous reads, software processing to detect the moment of interest, and lossy queues -- will be the better one.

 

 

-Kevin P

ALERT! LabVIEW's subscription-only policy came to an end (finally!). Unfortunately, pricing favors the captured and committed over new adopters -- so tread carefully.
Message 29 of 29