LabVIEW


Lost data in consumer loop

Solved!
Go to solution

Dear Labview-Experts, 

Currently, I am encountering a problem with storing my data that I have not been able to solve. I would very much appreciate any thoughts on this issue.

I am using an NI 5162 digitizer to acquire a signal at a 750 Hz trigger rate, sampling at 625 MS/s with a record length of 62,500 samples. This produces a large amount of data (~1 GB per minute). I generate the data in a producer loop and transfer it to a consumer loop. This transfer works well: I have verified it with a 100 kHz reference clock that I save along with each record, and the spacing between records is a constant 1.33 ms, which is correct for the 750 Hz trigger rate.

For saving the data, I write files of a given size, e.g. 2 GB. Once the 2 GB file size is reached, I automatically create the next file.

When the file is smaller than 2 GB, I simply store the data:

Witschas82_0-1755073271397.png

When the file reaches 2 GB, I close it, open a new one, and then continue storing:

Witschas82_1-1755073325624.png

However, while doing this, I lose data. This becomes obvious when plotting the 100 kHz reference clock from the stored data:

Witschas82_2-1755073396132.png

I have tried many things, such as adding wait functions, but nothing helped. I can plot the 100 kHz reference clock in the consumer loop and see that data arrives continuously. But whenever I create the new file, data is lost, leading to the gap shown above.

Is there a way to pause the queue correctly, so that it waits until the new file is ready to be written?

Is there any other way to solve this issue?

 

Any idea would be highly appreciated.

If I forgot to mention any important details, please let me know. I can of course also share my code; however, it currently contains a lot of things that are not part of the problem.

 

Thanks a lot for your time and all the best,

 Benjamin

 

 

Message 1 of 29

Hi Benjamin,

 


@Witschas82 wrote:

I have tried many things, such as adding wait functions, but nothing helped. I can plot the 100 kHz reference clock in the consumer loop and see that data arrives continuously. But whenever I create the new file, data is lost, leading to the gap shown above.

Is there a way to pause the queue correctly, so that it waits until the new file is ready to be written?

Is there any other way to solve this issue?


  • How should placing even more delays (wait function) help with processing data more quickly?
  • Why do you need to fiddle with setting file position for a newly created file?
  • I would place the code to "close the old file, then open/create a new one" into a subVI…
  • What is the (max) size of the queue?
  • You don't "pause" the queue, you just fetch elements - or you don't fetch elements…
  • I recommend handling file sizes in multiples of 2^9 (512) or even 2^12 (4096); right now you calculate the file size as a multiple of 10^6… (A MiB is 2^20.)
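To illustrate the last point numerically (a small Python sketch; the limits are example values, not taken from your VI):

```python
MB = 10**6   # "megabyte" as a decimal multiple (what the VI uses now)
MiB = 2**20  # mebibyte, a power-of-two unit aligned with disk sectors
GiB = 2**30

decimal_limit = 2000 * MB  # 2 "GB" computed as 2 * 10^9 bytes
binary_limit = 2 * GiB     # 2 GiB = 2 * 2^30 bytes

# The two limits differ by roughly 7%, and only the binary one is an
# exact multiple of the 512-byte / 4096-byte sector sizes mentioned above.
assert binary_limit % 4096 == 0
assert decimal_limit % 4096 != 0
```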
Best regards,
GerdW


using LV2016/2019/2021 on Win10/11+cRIO, TestStand2016/2019
Message 2 of 29

Dear Gerd, 

thanks a lot for your fast reply.

Actually, I added the "pause" because I read that opening new files takes some time. My idea was to let the queue fill up while creating the new file and to empty it afterwards.

The queue size is infinite (-1), but it does not fill up at all, so I assumed the consumer is faster than the producer. When I plot the number of elements in the queue, I never see any value other than "0" (except when I add the wait function, when it increases to a few hundred until the pause is over).

I also tried different file size limits, which did not solve my problem.

But I will try to create a subVI and see if this changes anything.

 

So far, I do not really understand what is happening with the data. I can see that it is transferred and read correctly (when plotting the clock), but it does not find its way into the binary file. Why?
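One way I could imagine losing exactly one chunk per file change (just a guess on my part, sketched as a Python toy model rather than my actual VI): if the file-switch case replaces the write for that iteration instead of happening before it, the element dequeued during the switch is silently dropped:

```python
from queue import Queue

def consume_buggy(q, n, rotate_every):
    """Toy consumer: dequeues n elements, 'rotating' the file every
    rotate_every writes. The buggy variant spends the rotation iteration
    only on the file switch and never writes the element it dequeued."""
    written = []
    for i in range(n):
        elem = q.get()
        if i and i % rotate_every == 0:
            pass  # BUG: the file switch replaces the write for this element
        else:
            written.append(elem)
    return written

def consume_fixed(q, n, rotate_every):
    """Fixed variant: the file switch happens *in addition to* the write."""
    written = []
    for i in range(n):
        elem = q.get()
        if i and i % rotate_every == 0:
            pass  # close the old file, open the new file here ...
        written.append(elem)  # ... then write the element regardless
    return written
```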

 

Thanks a lot and best regards,

 Benjamin 

Message 3 of 29

Hi Benjamin,

 

please post the code that saves to file.

(Please downconvert when using recent LabVIEW versions, I prefer LV2019.)

 

And please clean up before posting; there shouldn't be any backward wires, at least…

Best regards,
GerdW


Message 4 of 29

Dear Gerd, 

Sure, please find my condensed code attached. I use LV18, so this should be fine (is it?).

I tried to delete all parts that are not important, but I verified that the data loss problem still exists.

If you need any further explanation, just let me know. 

Your help is highly appreciated. Thanks a lot and best regards,

 Benjamin 

Message 5 of 29

Hi Benjamin,

 

I asked for cleaned-up code…

 

  • Suggestion:
    There are better ways to create a filepath - by using file functions instead of manipulating strings!
  • No need to set the fileposition to "current"…
  • No need for local variables in this loop: use shift registers!
  • Do you really need the ClearErrors inside the case structure? Which errors do you need to clear?
Best regards,
GerdW


Message 6 of 29

Hi Benjamin,

 

one more suggestion:

Place one more item into the cluster in the queue: the loop iterator value of the DAQ loop.

(You need to change the cluster type definition, too.)

 

This way you can easily check in the consumer loop for consecutive messages: the iterator value must increment by 1 with each new queue element. This will help you determine the source of the "lost data": is the consumer missing some messages, or does the producer miss some data?
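In textual form, the consecutive-counter check could look like this (a Python stand-in for the LabVIEW queue; all names are placeholders):

```python
from queue import Queue

def producer(q, n_records):
    """Enqueue (iteration, payload) pairs, mimicking the DAQ loop's cluster
    of {iterator value, waveform data}."""
    for i in range(n_records):
        q.put((i, b"waveform"))

def consumer(q, n_records):
    """Dequeue and flag any gap in the iteration counter. A non-empty
    result means elements went missing between producer and file."""
    expected = 0
    gaps = []
    for _ in range(n_records):
        i, _payload = q.get()
        if i != expected:
            gaps.append((expected, i))  # (value expected, value received)
        expected = i + 1
    return gaps
```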

Best regards,
GerdW


Message 7 of 29

Dear Gerd, thanks a lot for your helpful replies which I tried immediately.

Adapting the data storage looks much better now, but yields the same result. Hence, I also added the index from the producer loop. When I plot the index in the consumer loop, I get every single value (continuously), so this works as expected:

Witschas82_0-1755100171273.png

However, when I add the index data to the stored file and plot it, I see the same problem: whenever the new file opens, data is missing:

Witschas82_1-1755100219831.png

So it looks like the issue still remains. Really strange. Does anybody have any other ideas to test?

Thanks a lot and all the best,

Benjamin 

 

Message 8 of 29

Instead of writing only the oldest element, why not flush the queue and write everything that's in the queue?

 

altenbach_0-1755101310773.png
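A rough textual equivalent of Flush Queue (a hedged Python sketch, since the snippet above is a LabVIEW image): drain everything that is currently enqueued and hand it to a single write, instead of dequeuing one element per consumer iteration:

```python
from queue import Queue, Empty

def drain(q):
    """Remove and return every element currently in the queue, in order,
    similar in spirit to LabVIEW's Flush Queue. Returns [] if the queue
    is empty, so the consumer can simply skip the write in that case."""
    items = []
    while True:
        try:
            items.append(q.get_nowait())
        except Empty:
            return items
```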

 

Message 9 of 29

Thanks a lot for your comment, which I will of course test. When should I do that? On every iteration?

Thanks and all the best, Benjamin 

Message 10 of 29