LabVIEW


Does anybody have a solution for handling large data?

Does anybody have an idea how to handle large data?
I have maybe 200 MB of data and I read it out step by step. This works, but when I put all these parts together into one array, the memory is killed. My main task is to show the complete data in one graph. Maybe one solution is to show it as a picture-like plot, but for that I would need the complete array. Or can I feed the data into a plot successively, step by step as I read it out, so that I don't need the complete array (only the array from the current read-out step) to show the whole data in one plot? Does anybody have an idea how I can show the complete data?

THX
Message 1 of 9
Hi!
   I think it's impossible to represent ALL 200 MB of data in a user-friendly manner... maybe you could average them... It depends on what kind of data it is!

   Anyway, in these cases you could turn to databases, which give you more powerful means for handling large amounts of data... but that too depends on what you have.

   Let me know if this helps, if not, please provide further details!

   Have a nice day!

graziano
Message 2 of 9
This tutorial and the associated VIs can probably be of help to you.


http://zone.ni.com/devzone/cda/tut/p/id/3625

Cheers
Alipio
---------------------------------------------------------
"Qod natura non dat, Salmantica non praestat"
---------------------------------------------------------
Message 3 of 9
I have maybe 200 MB of data. It contains measurement data, which means it is a 1D array of voltage values. When I read this data out completely, there is not enough memory for it, because the 1D array becomes too large. So I read it out step by step, in maybe 1 MB steps, and append it in memory, so that at the end I have one complete 1D array with the whole data inside. But when I then show the data as a graph, the PC is really, really slow, and the appending into memory also becomes very slow towards the end. So I had the idea to save it in a 2D array, so that every step is in its own row and the read-out steps sit one after another in this 2D array. This is really faster, but then I have the problem that I can't show the data completely in one graph; that also becomes very slow. Do you have any solution? Maybe I can plot it differently?
When I have one 1D array with the whole data, could I plot it so that every step is plotted after the step before, and the data from the step before is discarded?
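Just to make clearer what I mean by the step-by-step read-out, here is a rough sketch in Python terms (LabVIEW is graphical, so this is only an illustration; the file name, the 1 MB step size and the float64 sample format are assumptions, not my real setup):

```python
# Illustration only: read a big measurement file in ~1 MB steps and keep
# each step as its own row (the "2D array" idea from above).
import numpy as np

CHUNK_BYTES = 1_000_000                       # one read-out step, ~1 MB
DTYPE = np.float64                            # assumed sample format
samples_per_step = CHUNK_BYTES // DTYPE().nbytes

steps = []                                    # one entry per read-out step
with open("voltages.bin", "rb") as f:         # assumed file name
    while True:
        step = np.fromfile(f, dtype=DTYPE, count=samples_per_step)
        if step.size == 0:                    # end of file reached
            break
        steps.append(step)

# Joining everything into one 1D array brings the memory problem back:
# full_data = np.concatenate(steps)
```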

I don't know what I can do to make this faster.

THX
Message 4 of 9
Not sure I understood....

I'd suggest two possibilities (a sketch of both follows below):
   A) Use decimation (possibly with low-pass filtering).

   B) Plot the data in windows, e.g. if you have data from sample 0 to sample 1M, you plot 1k of data per window (i.e. per graph) and let the user select (scroll) this window. This may be confusing, so I'd prefer to decimate, i.e. to show fewer data points, but ones that are significant to the user... really, I can't imagine a plot with 2M samples...
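Here is what I mean, sketched in Python terms purely as an illustration (LabVIEW is graphical, so take this as pseudocode; `data` stands for your full 1D array of samples):

```python
# Illustration only: two ways to show a huge 1D array without plotting every sample.
import numpy as np

def decimate(data, factor):
    """(A) Block-average every `factor` samples; the averaging also acts as a
    crude low-pass filter before samples are thrown away."""
    n = (len(data) // factor) * factor           # drop the incomplete last block
    return data[:n].reshape(-1, factor).mean(axis=1)

def window(data, start, width=1000):
    """(B) Return one scrollable window of `width` samples starting at `start`;
    the user moves `start` with a scrollbar and only this slice is plotted."""
    return data[start:start + width]
```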

   Have a nice day!

graziano

  
Message 5 of 9
I agree with you, but the question for me is how I can show the data in one plot.
OK, scrolling is a good idea. But can you tell me how I can unpack the 2D array so that I have the complete data in one graph with scrolling?

THX
Message 6 of 9
If I understand right, you're just asking how to unpack a 2D array into rows (which are 1D arrays).

I attach a silly VI to demonstrate this; in text form the same idea is sketched below. Then you have to add each row (1D array) to your plot somehow. Good luck!
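A minimal Python sketch, purely as an illustration (the `add_row_to_plot` helper and the array sizes are placeholders, not a real API):

```python
import numpy as np

def add_row_to_plot(row):
    """Placeholder for however you append a 1D array to your graph."""
    print(f"plotting {row.size} samples")

data_2d = np.random.rand(5, 1000)    # placeholder: 5 read-out steps of 1000 samples each
for row in data_2d:                  # each row of the 2D array is one 1D read-out step
    add_row_to_plot(row)
```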

graziano
Message 7 of 9
My solution to this problem is to use the following process (a rough sketch follows below):
 
1. Divide the number of data samples by double the width of your computer screen in pixels.
2. Read data from the file in chunks of this size.
3. As each chunk is read, use one of the following methods:
         Calculate the mean of the data in each chunk (this averages the data shown in each pixel),
                 or
         Calculate the minimum and maximum values in each chunk (this shows all peaks in the data, which is typically ideal).
4. Feed each resulting data point into an array (if the minimum/maximum method is used, I place both points in the array in the order in which they occur).
 
Using this method lowers the amount of data stored in system memory at any one time. Because there are still more data points available than the width of the screen, you will essentially see the same data as if all of it were loaded onto the graph.
 
Note that if you zoom in on the data, the process has to be repeated for that range of data to make sure you have the proper resolution.
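Here is a rough sketch of that process in Python terms, only as an illustration (in LabVIEW the same steps roughly map to a read loop with Array Max & Min; the file name, float64 sample format and screen width below are assumptions):

```python
# Illustration only: min/max reduction so the graph gets a few thousand points
# instead of millions.
import os
import numpy as np

SCREEN_WIDTH_PX = 1920                                          # assumed screen width
DTYPE = np.float64                                              # assumed sample format

total_samples = os.path.getsize("voltages.bin") // DTYPE().nbytes
chunk_size = max(1, total_samples // (2 * SCREEN_WIDTH_PX))     # step 1

display = []
with open("voltages.bin", "rb") as f:
    while True:
        chunk = np.fromfile(f, dtype=DTYPE, count=chunk_size)   # step 2
        if chunk.size == 0:
            break
        lo, hi = chunk.min(), chunk.max()                       # step 3 (min/max method)
        # step 4: keep min and max in the order they occur inside the chunk,
        # so rising and falling peaks keep their shape
        display.extend([lo, hi] if np.argmin(chunk) < np.argmax(chunk) else [hi, lo])

display = np.array(display)   # roughly 4 * SCREEN_WIDTH_PX points to plot
```

When the user zooms in, you run the same loop again over just the visible index range (with a correspondingly smaller chunk size), as noted above.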
Message 8 of 9
Hi chefcommander,

please read the knowledge base article and the manual on handling large chunks of data!

And remember:
When you send data to an indicator, you create a copy of the data! So even when you just read in your 100 MB of data, you create another 100 MB chunk in memory when displaying it in a graph (and it is rather silly anyway to display 100M values when only roughly 500k pixels are available...).
Best regards,
GerdW


using LV2016/2019/2021 on Win10/11+cRIO, TestStand2016/2019
Message 9 of 9