LabVIEW


Producer/Consumer timed

I'm trying to sample voltages with an NI 9223 and save the data to a Word file for later analysis. But with this code I'm only getting about a fourth of the data points; for instance, sampling at 100 Hz for two minutes, I get about 3 million data points when I should have 12 million. Any suggestions would help.

Message 1 of 6

Correction: the sample rate is 100 kHz, not 100 Hz.

Message 2 of 6

Don't release your queue until after all of the data has been saved.  The easiest way to do this is to move the Release Queue to after the Merge Errors.  You should also come up with a way to let the consumer know that the producer is complete.  I usually use a command of some sort in the queue.
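For anyone less familiar with the pattern, here is the same idea in text form, a minimal Python sketch rather than LabVIEW G code; every name in it is illustrative and not taken from the poster's VI:

# Minimal Python sketch of the producer/consumer pattern described above.
# The original is a LabVIEW diagram, so this is only an analogy.
import queue
import threading

STOP = None                      # sentinel "command" telling the consumer to finish

def producer(q, n_blocks=10):
    for i in range(n_blocks):
        samples = [i] * 1000     # stand-in for one block of acquired samples
        q.put(samples)           # enqueue every block of data
    q.put(STOP)                  # tell the consumer the producer is complete

def consumer(q, path="data.txt"):
    with open(path, "w") as f:
        while True:
            block = q.get()
            if block is STOP:    # keep draining until the stop command arrives
                break
            f.write(" ".join(map(str, block)) + "\n")

q = queue.Queue()
t_prod = threading.Thread(target=producer, args=(q,))
t_cons = threading.Thread(target=consumer, args=(q,))
t_prod.start(); t_cons.start()
t_prod.join(); t_cons.join()     # only "release" the queue after the consumer
                                 # has finished writing everything to disk

The key point is the stop command: the consumer keeps draining the queue until it sees it, so nothing that was enqueued is lost when the program shuts down.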



There are only two ways to tell somebody thanks: Kudos and Marked Solutions
Unofficial Forum Rules and Guidelines
"Not that we are sufficient in ourselves to claim anything as coming from us, but our sufficiency is from God" - 2 Corinthians 3:5
Message 3 of 6

@crossrulz wrote:

Don't release your queue until after all of the data has been saved.  The easiest way to do this is to move the Release Queue to after the Merge Errors.  You should also come up with a way to let the consumer know that the producer is complete.  I usually use a command of some sort in the queue.



The queue release is not the problem (Force Destroy = F), so what we have here is an infinite loop in the consumer!

 

Looking at that consumer loop again: we are storing the data into two files, one of them using an X VI. I would strongly suspect that the write to the measurement file is slowing down the consumer.


"Should be" isn't "Is" -Jay
Message 4 of 6

No, the queue was only obtained once.  Regardless, the producer and consumer are using the same queue reference.  That reference is being released, which throws an error in the producer and causes it to quit.  There is still data in the queue when the reference is released, which is why only part of the data is written to disk.



There are only two ways to tell somebody thanks: Kudos and Marked Solutions
Unofficial Forum Rules and Guidelines
"Not that we are sufficient in ourselves to claim anything as coming from us, but our sufficiency is from God" - 2 Corinthians 3:5
Message 5 of 6

I initially did not notice the second file being written.  But here is a possible alternative.  You can configure the DAQmx task to stream straight to a TDMS file.  This will eliminate a lot of this complexity.  To do this, look at the DAQmx Configure Logging.vi (it is in the DAQmx Advanced Task Options palette).
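For reference, here is roughly what that looks like when scripted with the nidaqmx Python package instead of LabVIEW (a sketch only; the device name, channel, rate, and file path are assumptions, not values from the original VI):

# Rough sketch of DAQmx TDMS logging via the nidaqmx Python package.
# Device, channel, rate, and file path are illustrative assumptions.
import nidaqmx
from nidaqmx.constants import AcquisitionType, LoggingMode, LoggingOperation

with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan("cDAQ1Mod1/ai0")
    task.timing.cfg_samp_clk_timing(
        100_000,                             # 100 kHz sample clock
        sample_mode=AcquisitionType.CONTINUOUS,
    )
    # Equivalent of DAQmx Configure Logging.vi: stream straight to TDMS.
    task.in_stream.configure_logging(
        "C:/data/voltages.tdms",
        LoggingMode.LOG_AND_READ,            # log to disk and still allow reads
        operation=LoggingOperation.CREATE_OR_REPLACE,
    )
    task.start()
    for _ in range(120):                     # roughly two minutes of data
        task.read(number_of_samples_per_channel=100_000)
    task.stop()

With logging configured on the task, DAQmx streams the samples to the TDMS file itself, so there is no need for a producer/consumer queue or a separate file-writing loop.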



There are only two ways to tell somebody thanks: Kudos and Marked Solutions
Unofficial Forum Rules and Guidelines
"Not that we are sufficient in ourselves to claim anything as coming from us, but our sufficiency is from God" - 2 Corinthians 3:5
Message 6 of 6