LabVIEW


Lost data in consumer loop

Solved!

@Witschas82 wrote:

Thanks a lot for your comment, which I will of course test. When should I do that? On every iteration?


 

I assume you always want to write all data accumulated in the queue so far. Yes, it will work equally well if the queue has zero or only one element.

 

(Note that the shift register is important for the file reference if there is a chance that the FOR loop iterates zero times; otherwise the reference would become invalid. The same applies to the error wire, which would get cleared.)
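The drain-everything pattern described above can be sketched in Python (standing in for the block diagram): `queue.Queue` plays the role of the LabVIEW queue, and the open file handle plays the role of the file reference carried in the shift register. The function name is hypothetical.

```python
import queue

def drain_and_write(q, f):
    """Drain everything currently in the queue and write it in one batch.

    Mirrors the 'write all accumulated data' idea: works equally well
    when the queue holds zero, one, or many elements. The file handle
    `f` stays open across calls (the analogue of carrying the file
    reference in a shift register).
    """
    batch = []
    while True:
        try:
            batch.append(q.get_nowait())   # take whatever is there, no waiting
        except queue.Empty:
            break
    if batch:                              # zero elements -> nothing to write
        f.write(bytes(batch))
    return len(batch)
```

An empty queue simply produces a zero-length batch and no file write, matching the "zero or only one element" remark.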

Message 11 of 29

@altenbach wrote:

Instead of writing only the oldest element, why not flush the queue and write everything that's in the queue?


I was once told that Flush Queue also clears out the queue's memory. For this reason, it may be better to do the dequeue in a small FOR loop (dequeue X times or until a timeout occurs). You can then process the array of data as a whole instead of one element at a time.
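A textual sketch of that dequeue-in-a-loop idea, in Python; the count `max_items` and the timeout are illustrative parameters, not values from the thread:

```python
import queue

def dequeue_batch(q, max_items, timeout_s=0.05):
    """Dequeue up to `max_items` elements, stopping early on timeout.

    The 'small FOR loop' alternative to Flush Queue: each get waits at
    most `timeout_s`; the collected batch is then processed as one
    array instead of element by element.
    """
    batch = []
    for _ in range(max_items):
        try:
            batch.append(q.get(timeout=timeout_s))
        except queue.Empty:
            break                  # timeout occurred: stop with what we have
    return batch
```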


Message 12 of 29

@crossrulz wrote:

@altenbach wrote:

Instead of writing only the oldest element, why not flush the queue and write everything that's in the queue?


I was once told the Flush Queue also clears out the memory of the queue.


I don't really know what that actually means. Are you talking about deallocation? Since a queue can grow and shrink, its memory is probably somewhat dynamic anyway.

 

In any case, the help is silent about that and I have never used it. No expert here. 😄

Message 13 of 29

I just have one question: How do you read the files?

Message 14 of 29

Also, have you tried flushing the file before closing it?
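For illustration, this is what "flush before close" looks like in Python; in LabVIEW the analogous primitive is the Flush File function, but the calls below are Python's own:

```python
import os
import tempfile

# Sketch: explicitly push buffered data to disk before closing, so a
# crash between the last write and the close loses as little as possible.
fd, path = tempfile.mkstemp()
os.close(fd)
with open(path, "wb") as f:
    f.write(b"\x01\x02\x03")
    f.flush()              # push Python's internal buffer to the OS
    os.fsync(f.fileno())   # ask the OS to commit the data to disk
```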

Message 15 of 29

@dkfire wrote:

I just have one question: How do you read the files?


Well, it is just flat binary and we know the datatype and byte order, so we can read it back as one huge blue array (or read it in sections).

Message 16 of 29

@altenbach wrote:

@dkfire wrote:

I just have one question: How do you read the files?


Well, it is just flat binary and we know the datatype and byte order, so we can read it back as one huge blue array (or read it in sections).


I know, but the OP hasn't shown that part of the code.

Just to make sure the bug is not in the reading.

Message 17 of 29

First, I want to thank the group here for the valuable comments and discussion. This is great help.

Unfortunately, flushing the queue did not solve my problem. 

I guess I read the data correctly, but it is a very good idea to double-check.

As my DAQ data is I8, I also convert my header data to I8 values before saving; e.g., an I32 value results in an I8 array of length 4. Hence, reading my binary file looks as follows:

Witschas82_0-1755120139527.png

 

All values are correct, from both the header and the data file. It is just that data is missing whenever a new file is opened. When I write larger files, e.g. 8 GByte, I can read continuous data for 8 GByte before data is missing again. Can this be an issue of wrong reading?
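The reading scheme described (header values stored as I8 bytes, followed by raw I8 samples) can be sketched in Python. The header layout below is purely illustrative; a little-endian I32 sample count is assumed, which may not match the actual file format:

```python
import struct

def read_record(f):
    """Read one record: a 4-byte header followed by raw I8 samples.

    Assumes (for illustration only) that the header is a single
    little-endian I32 giving the number of samples that follow,
    stored as four I8 bytes exactly as described in the post.
    """
    header_bytes = f.read(4)                       # the I32 written as 4 x I8
    (n_samples,) = struct.unpack("<i", header_bytes)
    data = f.read(n_samples)                       # raw I8 samples
    return n_samples, data
```

If the values read back are wrong, the first things to check are the byte order (`<` vs `>`) and whether the header really occupies the bytes assumed here.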

 

Thanks again and all the best, Benjamin

Message 18 of 29

Arrays are limited to 2,147,483,647 elements (the index is I32), while files have no such limit. Here's your problem.
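A Python sketch of the consequence: a file larger than the I32 index limit has to be read in sections, never as one array. The chunk size here is an arbitrary choice, and the function name is hypothetical:

```python
CHUNK = 1 << 20  # 1 MiB per read; keeps each array far below the I32 index limit

def read_in_sections(f, process):
    """Read a file of arbitrary size in fixed-size sections.

    An array indexed by I32 tops out at 2_147_483_647 elements, so a
    multi-gigabyte file of I8 samples must be consumed in pieces.
    """
    total = 0
    while True:
        chunk = f.read(CHUNK)
        if not chunk:          # end of file
            break
        process(chunk)         # handle one section at a time
        total += len(chunk)
    return total
```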

 

Your truncated code snipped makes very little sense. I cannot comment further until I see the entire code.

Message 19 of 29

I agree, a small picture does not help. 

 

But I found one problem in your producer loop: 

dkfire_0-1755152862529.png

That Insert Into Array will destroy/remove the first 256 samples of your data.

That should have been a Build Array.
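A rough textual analogue of the difference, with Python lists standing in for LabVIEW arrays; the 256-sample batch size is taken from the post above, and the variable names are hypothetical:

```python
# Hypothetical illustration: 'acc' accumulates batches of 256 samples each.
batch1 = list(range(256))
batch2 = list(range(256, 512))

# Wrong: writing every batch to the same starting index overwrites
# whatever was there before -- the first 256 samples are lost.
acc_wrong = [0] * 256
acc_wrong[0:256] = batch1
acc_wrong[0:256] = batch2          # clobbers batch1

# Right: appending (the analogue of Build Array) keeps all samples.
acc_right = []
acc_right.extend(batch1)
acc_right.extend(batch2)
```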

 

And your math converting the 2D array to a 1D array does not make much sense to me.

Do you know how much data you get in each batch of reading?

Is it a fixed size?

 

The producer loop, for reference:

dkfire_1-1755153155968.png

 

 

Message 20 of 29