08-13-2025 03:29 AM
Dear LabVIEW experts,
I am currently encountering a problem with storing my data that I cannot get solved. I would very much appreciate any thoughts on this issue.
I am using an NI-5162 to acquire a signal at a 750 Hz trigger rate, sampled at 625 MS/s with a record length of 62,500 samples. This leads to a large amount of data (~1 GB per minute). I generate this data in a producer loop and transfer it to a consumer loop. This transfer works very well, which I have verified with a 100 kHz reference clock that I save with each data stream. This consistently yields 1.33 ms, which is correct considering the 750 Hz trigger rate.
For saving the data, I produce files with a size limit of e.g. 2 GB. Once the 2 GB file size is reached, I automatically create the next file.
While the file is smaller than 2 GB, I simply store the data:
When the file reaches 2 GB, I close the file, open a new one, and then proceed with storing:
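In text form, the rollover logic described above corresponds roughly to the following sketch (Python standing in for the LabVIEW block diagram; the file names, the sentinel element, and the 2 GB constant are illustrative, not taken from the attached code):

```python
import queue

MAX_FILE_SIZE = 2 * 1024**3  # 2 GB rollover limit, as in the post (illustrative)

def consumer(data_queue: "queue.Queue[bytes]", base_name: str) -> None:
    """Dequeue records and write them to a sequence of binary files."""
    file_index = 0
    f = open(f"{base_name}_{file_index:04d}.bin", "wb")
    try:
        while True:
            record = data_queue.get()      # blocks until the producer enqueues data
            if record is None:             # sentinel element: producer has finished
                break
            if f.tell() >= MAX_FILE_SIZE:  # rollover: close the full file first,
                f.close()
                file_index += 1
                f = open(f"{base_name}_{file_index:04d}.bin", "wb")
            f.write(record)                # then write the current record, so the
                                           # element dequeued during rollover is kept
    finally:
        f.close()
```

The key ordering detail in such a scheme is that the record dequeued during the rollover iteration must still be written to the new file; otherwise exactly one element is lost at every file switch.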
However, while doing that, I lose data. This is very obvious when plotting the 100 kHz reference clock from the stored data:
I have tried many things, such as adding wait functions, but nothing helped. I can plot the 100 kHz reference clock in the consumer loop and see that data is continuously being produced. But whenever I create the new file, data is lost, leading to the gap shown above.
Is there any way to pause the queue so that it waits until the new file is ready to be written to?
Is there any other way to solve this issue?
Any idea would be highly appreciated.
If I forgot to mention any important details, please let me know. I can of course also share my code; however, it currently contains a lot of things that are not part of the problem.
Thanks a lot for your time and all the best,
Benjamin
08-13-2025 03:56 AM - edited 08-13-2025 03:57 AM
Hi Benjamin,
@Witschas82 wrote:
I have tried many things, such as adding wait functions, but nothing helped. I can plot the 100 kHz reference clock in the consumer loop and see that data is continuously being produced. But whenever I create the new file, data is lost, leading to the gap shown above.
Is there any way to pause the queue so that it waits until the new file is ready to be written to?
Is there any other way to solve this issue?
08-13-2025 04:04 AM
Dear Gerd,
Thanks a lot for your fast reply.
Actually, I added a "pause" because I read that opening new files takes some time. My idea was to "fill" the queue while creating the new file and to empty it afterwards.
The queue size is infinite (-1), but it does not fill at all, so I assume the consumer is faster than the producer. I never see any number other than "0" when I plot the number of elements in the queue (except when I add the wait function; then it increases to a few hundred until the pause is over).
I also tried different file size limits, which did not solve my problem.
But I will try to create a subVI and see if this changes anything.
So far, I do not really understand what is happening with the data. I can see that it is transferred and read correctly (when plotting the clock), but it does not find its way into the binary file. Why?
Thanks a lot and best regards,
Benjamin
08-13-2025 04:16 AM
08-13-2025 04:34 AM
Dear Gerd,
Sure, please find my condensed code attached. I use LV18, so this should be fine (is it?).
I removed all parts that are not relevant and verified that the data-loss problem still occurs.
If you need any further explanation, just let me know.
Your help is highly appreciated. Thanks a lot and best regards,
Benjamin
08-13-2025 05:03 AM - edited 08-13-2025 05:06 AM
Hi Benjamin,
I asked for cleaned-up code…
08-13-2025 07:14 AM - edited 08-13-2025 07:17 AM
Hi Benjamin,
One more suggestion:
Place one more item into the cluster in the queue: the loop iterator value of the DAQ loop.
(You need to change the cluster type definition, too.)
This way you can easily check in the consumer loop for consecutive messages: that iterator value needs to increase by 1 with each new queue element. This will help you determine the source of the "lost data": is the consumer missing some messages, or does the producer skip some data?
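As a rough text-form illustration of this check (a Python sketch of the idea; the LabVIEW cluster becomes a tuple here, and all names are illustrative):

```python
import queue

def consumer(data_queue: "queue.Queue[tuple[int, bytes]]") -> None:
    """Verify that queue elements arrive with consecutive iteration numbers."""
    expected = 0
    while True:
        item = data_queue.get()
        if item is None:          # sentinel element: producer has finished
            break
        seq, record = item        # (DAQ-loop iteration value, data record)
        if seq != expected:
            print(f"gap in consumer: expected element {expected}, got {seq}")
        expected = seq + 1
        # ... write record (and seq) to the binary file here ...
```

If the consumer sees consecutive numbers but the file does not contain them, the loss happens in the file-writing code rather than in the queue transfer.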
08-13-2025 10:51 AM
Dear Gerd, thanks a lot for your helpful replies, which I tried immediately.
Adapting the data storage looks much better now but yields the same result. Hence, I also added the index from the producer loop. When I plot the index in the consumer loop, I get every single value (continuously), so this works as expected:
However, when I add the index data to the stored file and plot it, I see the same problem: whenever the new file opens, data is missing:
So it looks like the issue still remains. Really strange. Does anybody have any other ideas I could test?
Thanks a lot and all the best,
Benjamin
08-13-2025 11:08 AM - edited 08-13-2025 11:09 AM
Instead of writing only the oldest element, why not flush the queue and write everything that's in the queue?
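In LabVIEW this is the Flush Queue function, which returns every queued element at once. A rough Python analogue of the idea (illustrative names, draining with get_nowait):

```python
import queue

def flush(data_queue: "queue.Queue[bytes]") -> list[bytes]:
    """Drain all elements currently in the queue, like LabVIEW's Flush Queue."""
    items = []
    while True:
        try:
            items.append(data_queue.get_nowait())
        except queue.Empty:
            return items

# In the consumer loop, write everything pending in one go, e.g.:
#   records = flush(data_queue)
#   if records:
#       f.write(b"".join(records))
```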
08-13-2025 11:19 AM
Thanks a lot for your comment, which I will of course test. When should I do that? In every iteration?
Thanks and all the best, Benjamin