
Decimate a 1 GB array to meaningful data and apply a filter


Hey Guys,

 

So I used a high-precision laser interferometer to collect data at 10 MHz, resulting in a 1 GB file of raw position data. I converted this position data to velocity data for analysis, but I still have a 1 GB file with billions of lines to process. I am expected to extract all the frequencies below 1500 Hz from this data in the time domain for low-frequency analysis. I have no clue how to proceed. Any help is greatly appreciated.

 

Also, due to the large file size, LabVIEW runs into out-of-memory errors; that is why I want to decimate the 1 GB file down to meaningful data.

Message 1 of 21

What is the datatype of the array?

What is the format of the file (hopefully binary, but you are talking about "lines")?

Do you have LabVIEW 32bit or 64bit?

Why are you collecting at 10 MHz if you are only interested in frequencies below 1.5 kHz?

You probably need to apply some low-pass filtering to avoid seeing alias frequencies.
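
A minimal sketch of that anti-alias step, in Python/SciPy rather than LabVIEW (the thread's code is graphical, so this is only illustrative; the rates, the staged factors, and the use of scipy.signal.decimate are assumptions, not the poster's code):

```python
# Illustrative only: anti-alias filter + downsample 10 MHz data so that
# content below 1.5 kHz survives. All numbers here are assumptions.
import numpy as np
from scipy import signal

x = np.random.randn(1_000_000)   # stand-in for one chunk of velocity samples

# scipy.signal.decimate applies an anti-alias low-pass filter before it
# keeps every q-th sample; SciPy recommends splitting factors above ~13
# into multiple stages.
y = x
for q in (10, 10, 10):           # overall factor 1000: 10 MHz -> 10 kHz
    y = signal.decimate(y, q, ftype='fir', zero_phase=True)
# The 10 kHz result still has a 5 kHz Nyquist, comfortably above 1.5 kHz.
```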

Message 2 of 21

The datatype of the array is DBL. 

I currently have LabVIEW 32 bit.

We collected data at 10 MHz to get better frequency resolution but did not expect to run into out-of-memory errors.

 

Also, I recently collected data at 1 MHz, which also needs to be decimated to an array containing just the lower frequencies (<1.5 kHz) to be plotted in the time domain.

Message 3 of 21

Are you collecting the entire array into memory or streaming it to a binary file?

Message 4 of 21

Well, to solve the reading part of the program without running out of memory, I did the following: read the array, split it into small chunks, write these chunks to TDMS files, and store them in a temp location.

 

While reading for plotting and the actual analysis, I read these chunks back in the same order they were written to the files (this is where the problem starts, since this is where I stream everything into memory), and the error comes up.

 

Right now my big concern is that I have to plot the velocity profile data to look for velocity ripples (changes in velocity) under 1.5 kHz, which involves decimating the array; I think that should also solve the out-of-memory error.
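
For reference, a sketch of that chunked, piecewise reading pattern in Python (hypothetical: it reads raw float64 samples from a flat binary file rather than TDMS, and process_chunk stands in for whatever filtering or decimation is applied):

```python
# Illustrative chunked processing: never hold the whole record in memory.
# Assumes a flat binary file of float64 samples; the poster's files are
# TDMS, but the pattern is the same.
import numpy as np

CHUNK = 1_000_000  # samples per read; 8 MB of DBLs at a time

def process_in_chunks(path, process_chunk):
    with open(path, 'rb') as f:
        while True:
            chunk = np.fromfile(f, dtype=np.float64, count=CHUNK)
            if chunk.size == 0:
                break
            process_chunk(chunk)  # filter/decimate here; keep only the small result
```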

Message 5 of 21

We probably need to see a simplified version of the code.

Are you reading these chunks and decimating them individually, reusing the same memory for each chunk?
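
A sketch of what that per-chunk decimation might look like, again as a Python/SciPy stand-in rather than the thread's LabVIEW code (the filter design and the factor of 1000 are assumptions); the key idea is carrying the filter state across chunk boundaries so the chunked result matches filtering the whole record at once:

```python
# Illustrative per-chunk low-pass + decimate, with filter state carried
# across chunk boundaries so chunking does not distort the result.
import numpy as np
from scipy import signal

fs = 10_000_000                                    # assumed 10 MHz input rate
sos = signal.butter(4, 1_500, btype='low', fs=fs, output='sos')
zi = np.zeros((sos.shape[0], 2))                   # filter state, starts at rest
Q = 1_000                                          # keep 1 sample in 1000

chunks = (np.random.randn(1_000_000) for _ in range(8))  # stand-in chunk source

pieces = []
for chunk in chunks:                               # chunk length: multiple of Q
    filtered, zi = signal.sosfilt(sos, chunk, zi=zi)
    pieces.append(filtered[::Q])                   # band-limited, so safe to thin
result = np.concatenate(pieces)                    # 1/1000 the size: easy to plot
```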

Message 6 of 21

I read the chunks first, stitch them together in a concatenating FOR loop, and make this data available for decimating and plotting.

Message 7 of 21

Well, that's the problem. You know the final size, so why would you concatenate, requiring constant memory reallocations as the array grows? Arrays need to be contiguous in memory, so you will definitely run out of contiguous memory space, even if you don't use any indicators or subVI calls. Typically you would initialize the array at the full size once, then replace elements with good data as you go, keeping track of the insert point.

 

Still, 1G is probably too much. Your problem can easily be solved piecewise.

 

Is your file 1 GB, or does your array contain 1G elements? 1G of DBLs is 8 GB of RAM, which is not even addressable in 32-bit Windows.
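
A sketch of that preallocate-and-replace pattern (a NumPy stand-in for LabVIEW's Initialize Array + Replace Array Subset; the sizes are illustrative):

```python
# Illustrative preallocate-and-replace: one allocation at the final size,
# then in-place writes (LabVIEW: Initialize Array + Replace Array Subset).
import numpy as np

chunks = [np.random.randn(250_000) for _ in range(4)]  # stand-in chunk source

total = sum(len(c) for c in chunks)
out = np.empty(total)               # single allocation of the final size
pos = 0
for c in chunks:
    out[pos:pos + len(c)] = c       # replace in place; no reallocation or copy
    pos += len(c)
# Growing with np.concatenate inside the loop would instead reallocate and
# copy the whole array on every iteration.
```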

Message 8 of 21

It's a TDMS file of 1 GB. Inside it has billions of rows of encoder position values, from which I compute my velocity profile.

Message 9 of 21

If the file is 1 GB of DBLs, you cannot have billions of rows unless the data is very redundant and highly compressed.

(One billion bytes is ~1 GB.)
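
The arithmetic, spelled out (a quick sanity check, not from the thread):

```python
# 1 GB of raw DBLs: each DBL is 8 bytes, so the row count tops out in the
# hundreds of millions, not billions.
file_bytes = 1 * 1024**3              # 1 GiB
print(file_bytes // 8)                # 134217728 -> ~134 million values
```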

Message 10 of 21