LabWindows/CVI

Large data files

I have been tasked to write an app that reads very large data sets, on the order of hundreds of megabytes to gigabytes of data per day, stored on a HDD.

After opening the files, I'm using the function call ScanFile (filehandle, "%*i[b2]>%*i", n, n, Array) to read the data into an int array.

This appears to be very slow, taking about 30 seconds per 100 MB on an up-to-date laptop.

Is there a faster way of reading the data from the HDD into arrays? Can anyone help, please?

Cheerio
Mike
Message 1 of 3
Mike, you will find that the C function "fread" is much faster.

Open the file with fopen; when reading the array, you just supply a pointer to the start of the array (i.e. the array name) along with the element size and the number of elements to read. For example:

short myarray[NDIM];
FILE *fp;

fp = fopen (datafile, "rb");               /* open the data file in binary mode */
fread (myarray, sizeof(myarray), 1, fp);   /* read the whole array in one call */
fclose (fp);

A bit more work is needed if you are not reading in the whole array sequentially in one go (which is very fast), but hopefully this will help.
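If the file is too large to hold in memory at once, one possible approach (just a sketch; the function name, chunk size, and processing step here are made up) is to read it in fixed-size chunks:

#include <stdio.h>

#define CHUNK_ELEMS 65536                  /* elements per read; tune to taste */

int process_file (const char *datafile)
{
    static short buffer[CHUNK_ELEMS];
    FILE *fp = fopen (datafile, "rb");
    size_t n;

    if (fp == NULL)
        return -1;

    /* keep reading until fread returns no more elements (end of file or error) */
    while ((n = fread (buffer, sizeof(short), CHUNK_ELEMS, fp)) > 0)
    {
        /* ... do something with the n shorts now in buffer ... */
    }

    fclose (fp);
    return 0;
}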

--Ian

Message 2 of 3
I never use the FormatIO set of functions.
They are always slow, and there is no way to optimise them when you compile your code with MSVC.
On the other hand, when you use fread and the other standard C I/O functions and compile your code with the MSVC compiler, you get the Visual C versions, which are even faster than the ones that come with CVI.
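If you want to quantify the difference on your own data, one rough way (purely illustrative; the file name is made up) is to time the fread path with clock() from the standard library and compare it against your ScanFile timings:

#include <stdio.h>
#include <stdlib.h>
#include <time.h>

int main (void)
{
    const char *datafile = "bigdata.bin";   /* made-up file name */
    FILE *fp = fopen (datafile, "rb");
    short *buf;
    long nbytes;
    clock_t start, stop;

    if (fp == NULL)
        return 1;

    fseek (fp, 0, SEEK_END);                /* find the file size */
    nbytes = ftell (fp);
    rewind (fp);

    buf = malloc (nbytes);
    if (buf == NULL)
    {
        fclose (fp);
        return 1;
    }

    start = clock ();
    fread (buf, 1, nbytes, fp);             /* one big binary read */
    stop = clock ();

    printf ("read %ld bytes in %.2f s\n", nbytes,
            (double)(stop - start) / CLOCKS_PER_SEC);

    free (buf);
    fclose (fp);
    return 0;
}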
Regards, Philippe
proud to be using LabWindows since version 1.2
Message 3 of 3