11-21-2005 07:26 AM
11-21-2005 07:40 AM
Rather than use the high-level spreadsheet read VI, you probably need to read your data a line at a time using the lower-level open - read (till EOF) - close pattern. For such big data sets you should really consider saving the data in a binary format; you will suffer less.
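In text form, that pattern looks roughly like this (a Python sketch of the open / read-until-EOF / close idea — the real thing would use LabVIEW's low-level File I/O VIs; the file name and tab delimiter are assumptions):

```python
# Sketch of the open - read (till EOF) - close pattern.
path = "big_data.txt"   # hypothetical data file

f = open(path, "r")              # Open File
try:
    for line in f:               # Read one line per loop iteration, until EOF
        fields = line.split("\t")  # parse the (assumed) tab-separated values
        # ... process this row, then let it go before reading the next ...
finally:
    f.close()                    # Close File
```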
Sheldon
11-21-2005 09:24 AM
When you open (and read) a file, you usually move it from your hard disk (permanent storage) into RAM. This lets you manipulate it at high speed using fast RAM. If you don't have enough RAM to read the whole file, you will be forced to use virtual memory (which uses swap space on the HD as "virtual" RAM), and that is very slow. Since you only have 384 MB of RAM and want to process huge files (200 MB-600 MB), you could easily and inexpensively upgrade to 1 GB of RAM and see large speed increases. A better option is to load the file in chunks: read some number of lines at a time, process that amount of data, and repeat until the file is complete. This means more programming, but it lets you use much less RAM at any one time.
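Roughly, the chunked approach looks like this (again a Python sketch rather than LabVIEW; the chunk size, file name, and process_chunk function are placeholders to tune for your data):

```python
CHUNK_LINES = 10_000          # tune so one chunk fits comfortably in RAM

def process_chunk(lines):
    # placeholder for whatever analysis you run on each block of rows
    pass

with open("big_data.txt", "r") as f:
    chunk = []
    for line in f:
        chunk.append(line)
        if len(chunk) == CHUNK_LINES:
            process_chunk(chunk)   # work on this block...
            chunk = []             # ...then free it before reading more
    if chunk:                      # leftover lines after the last full chunk
        process_chunk(chunk)
```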
Paul
11-21-2005 09:30 AM
11-21-2005 09:53 AM - edited 11-21-2005 09:53 AM
11-21-2005 11:59 AM
Hi Tomi,
I seem to recall the same value... 2GB limit.
A "fairly" large file, nevertheless... 😉
11-22-2005 07:54 AM
The 2 GB limit does not exist in LV8, provided your file system can handle larger files (e.g. FAT32 can't, NTFS can). For more tips and tricks on handling large files, check out Managing Large Data Sets in LabVIEW.
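As an illustration of working past the old limit, here is a Python sketch that walks a large binary file in fixed-size chunks (file name and chunk size are placeholders; modern file APIs use 64-bit offsets, so a file over 2 GB on NTFS poses no problem):

```python
import os

path = "huge_data.bin"           # hypothetical binary data file
CHUNK_BYTES = 64 * 1024 * 1024   # read 64 MB at a time

size = os.path.getsize(path)     # 64-bit size; fine for files > 2 GB
with open(path, "rb") as f:
    done = 0
    while done < size:
        block = f.read(CHUNK_BYTES)  # never holds more than one chunk in RAM
        if not block:                # safety stop at EOF
            break
        done += len(block)
        # ... process this block of raw bytes ...
```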