save data with small & large numbers, with header

Dear all,

I am sorry if this topic already exists (I could not find anything so far). It is probably a general topic, related to saving data.

 

I have 26 columns of data recorded over time. Time runs from 0 s to 500,000 s (about 5 days), with a data point every 1 s. The other data (columns 2 to 26) are in the 10^-9 range, or at least have decimal places.

 

I need to add headers to the recorded file.

 

So my "bad strategy" is

1- create a new file with "write to excel" using string format %s (because of headers)

2- write first row with header names (%s) on 26 columns

3- collect a DBL row of 26 columns

4- convert the row to string

5- append the string to the existing excel file

6- goto 3 (repeat)
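For illustration only, a rough sketch of this loop in Python rather than LabVIEW (the actual program uses the Write To Excel/spreadsheet VIs; the file name, column names, and the shared format code here are invented). It just shows that a single format string ends up applied to every one of the 26 values, the time column included:

    # Hypothetical sketch of steps 1-6 above, not the actual LabVIEW code.
    header = ["time_s"] + ["ch%02d" % i for i in range(2, 27)]   # 26 invented column names

    with open("run.txt", "w") as f:          # steps 1-2: create the file, write the header row
        f.write("\t".join(header) + "\n")

    def append_row(row):                     # steps 3-5: format one 26-element DBL row, append it
        with open("run.txt", "a") as f:
            # one shared format for all 26 values: the time column and the
            # 1e-9-range columns get the same number of digits here
            f.write("\t".join("%.9f" % v for v in row) + "\n")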

 

This is surely not a good strategy, because at some point the time (column 1) grows beyond 4 digits and I no longer get enough resolution. For example, 510140 s is repeated 10 times before changing to 510150 s (it should be 510141, 510142, ...). This seems to be due to a default of 4 digits, but I am not sure.

 

So at step 4, I think I should define the string format as %6.9 rather than %.9 (the 9 being for the 10^-9 values). Am I correct?

 

But ideally, I would rather write the DBL values to the Excel file using an automatic format for each value. At the end, I would read the file, convert it to strings, and insert the header. The file is almost 30 MB.

 

I know that TDMS does this job very well, but I cannot use TDMS for post-processing. I would have to convert the TDMS file to text/Excel, which is not convenient.

 

What is the correct way to write these large/small data together in a text file with a header?

Should I multiply the time by 10^-9 in order to have a total string length of 9, rather than 9+6=15 if I use %6.9? (%6 is suitable for the time in seconds and %.9 is suitable for the other data, so the combination %6.9 leads to 15 digits + the decimal point = 16 characters.)

 

Yours

Laurent

 

 

 

Message 1 of 8

Look in the LabVIEW Help, under Fundamentals, File I/O, Basics of File I/O, and File Formats.  You can also read some of the subsequent explanations of the "finer details".

 

The issue is: do you want your files to be "human readable", or is it OK to write binary files, where you can write each "element" of your data in the most "logical" way (for example, times as Time Stamps, counts as signed or unsigned integers, and measurements as DBLs)? Each "reading" will be a mixture of 26 columns of data of different types, suggesting that you organize your data in a Cluster, with each element having a human-understandable name (such as "Date/Time", "Temperature", "Weight", etc.). You can write the Cluster to a binary file, or (if you have many measurements you want to write/read all at once) you can write an Array of Clusters. [Hint: if you go this route, be sure to create a TypeDef for your Cluster and use it wherever you can.]
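As a rough illustration of this binary-record idea, here is a sketch in Python rather than LabVIEW (the field layout and names are invented, not taken from the original program): each reading is packed as a fixed-size record, with every field stored in its natural type.

    import struct

    # Hypothetical record layout: one timestamp (double), one counter (uint32),
    # and 24 measurements (double) per reading.
    RECORD = struct.Struct("<d I 24d")

    def write_reading(f, t, count, values):
        # 'values' must be a sequence of 24 floats; the record is appended as raw bytes.
        f.write(RECORD.pack(t, count, *values))

    def read_all(path):
        # Read the whole file back and unpack it record by record.
        with open(path, "rb") as f:
            data = f.read()
        return [RECORD.unpack_from(data, i) for i in range(0, len(data), RECORD.size)]

The fixed record size is what makes this compact and fast to read back, at the price of the file no longer being human readable.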

 

A compromise between this kind of "flexibility" and "Human-readable" is to use something like XML, which allows you to parse the structure of your data and save it along with the values.  Note that XML files can be quite large ...

 

Or you can use Excel.  It is possible to use the Report Generation Toolkit to write a Worksheet one cell at a time, in a format appropriate for that cell.  So you write the first row as all Strings (since they are Column Labels), then you write the elements of your Cluster one cell at a time, using the format that is appropriate for those data.  I've done this, but it requires a pretty deep dive into the RGT, and a lot of experimentation and testing to get it right.

 

My suspicion is that TDMS (which, I must confess, I've not yet used, but think I will need for my current Project) is likely to be the best compromise between "Human Readability", "file efficiency", and "ease of programming".

 

Bob Schor

Message 2 of 8

Please take a step back and tell us the real requirements. Where does the 4 digit limitation come from? That's certainly not a given. Could it be that your cosmetic display format in excel is lacking?

 

Why does it have to be Excel? Can't you just write, e.g., a tab-delimited text file and fully control the header as well as the format and number of decimal digits for each column?
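As an illustration of that suggestion (a sketch only, in Python rather than LabVIEW, with invented file name, column names, and formats): the header is simply the first line of the text file, and each column gets its own format code, so the time keeps full 1-second resolution while the small values keep their 9 digits.

    # Hypothetical sketch: tab-delimited log with a header line and per-column formats.
    columns = ["time_s"] + ["sensor%02d" % i for i in range(2, 27)]
    formats = ["%d"] + ["%.9e"] * 25      # integer seconds; scientific notation for the rest

    with open("log.txt", "w") as f:
        f.write("\t".join(columns) + "\n")

    def write_row(path, time_s, values):
        # Format the time and the 25 measurements with their own codes, then append one line.
        fields = [formats[0] % time_s] + [fmt % v for fmt, v in zip(formats[1:], values)]
        with open(path, "a") as f:
            f.write("\t".join(fields) + "\n")

    write_row("log.txt", 510141, [1.234e-9] * 25)
    # first two fields written: 510141<TAB>1.234000000e-09

Such a file opens directly in Excel, Matlab, or Python without any conversion step.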

 

Tell us more about the "post processing" that you need to do. That seems to primarily dictate the solution to all this. I don't know what "text/excel" is (text and Excel are completely different things!!!), but you should create a file that the post-processing can read directly, to eliminate the middleman.

 

Can you attach a simple VI with some typical data? Also explain what kind of program does the post processing (another LabVIEW program that you wrote? Some third party program? etc.).

Message 3 of 8

Thank you Bob for the suggestions. 

Yes, I would like a human-readable and simple file. XML is not needed, I think.

TDMS is very convenient, especially for large amounts of data. I used it in another program, but it requires a TDMS conversion to be readable with Matlab, Excel, Python, ...

So here I did not implement TDMS, because the file size is reasonable (~30 MB), not gigabytes.

Basically, what is needed is a kind of "write to Excel with headers".

And my strange problem comes only from the first column of data, which is the time in seconds, ranging from 0 to typically 500,000 s. The other columns are mostly values < 10, in some cases with 9 decimal digits.

So if I set a single format that covers the 16 digits (xxxxxx.xxxxxxxxx), I will get the time as 500000.000000000 s on every row. Because the Excel file is first initialised with the headers, the format will be "String" rather than DBL. So I convert each further row from DBL to string and then append it to the existing reference/path of the created Excel file. Probably my problem is that I do not set the DBL-to-string conversion correctly, taking the 16 digits into account (which is not needed for the first column = time).

 

Message 4 of 8

Dear  

Yes, you are totally right: "Where does the 4 digit limitation come from?"

This is the problem I have. I cannot understand where this limitation comes from in this program.

Actually, I use "write to excel" but I define the file extension as .txt, so it is not really Excel.

I think I should use the basic Write to Text File in that case.

 

Why use "write to excel"?

This is because I save 2 files: one is DBL without headers; the other is the same data but as strings with headers.

The first file, in DBL, is a kind of dummy file.

The second one, as strings, is more convenient because it allows me to write some comments during the experiment, save those comments in the first rows, and then add the DBL data converted to strings after the comments. The headers are added at the end of the comments.

This is the way I found to add text before the headers at any time during the experiment.

And it is very convenient. Now, there is no need to use an Excel structure to do that, I agree.

 

So I just need to understand why the first column (time), at some point, does not keep 6 digits and is limited to 4 digits, like xxxx00 s rather than xxxxxx s.

I will investigate today in the lab. 

As the other data look like x.xxxxxxxxx, I guess I have to set %6.9 in the DBL-to-string conversion. This setting is not correctly implemented at the moment.

 

Overall, my program works; there is only this issue with the time recording.

The program is stable; it runs continuously for 5 or 7 days. Another problem I solved was the untimely Windows 10 update that kept restarting the computer during the experiment! It never happened with Win7, but I fixed it by opening the Windows Update panel through LabVIEW and delaying updates for 7/14 days (through a call to the Windows command prompt).

 

Post-processing is performed in Matlab, with about 2000 lines of code that output 80 graphs in PowerPoint format, giving a deep insight into all the recorded data... Fortunately, I do not use time-dependent analysis such as plot(time, data), but rather plot(data1, data2), so this time-recording problem has no impact on my data analysis. That is why I did not find it earlier. But yesterday I decided to plot data against time, and I found this lack of digits in the time column. I did not expect this problem.

 

I will send a typical set of data, truncated because the original data is a 20 MB file!

 

Message 5 of 8

@arienai wrote:

As the other data look like x.xxxxxxxxx, I guess I have to set %6.9 in the DBL-to-string conversion. This setting is not correctly implemented at the moment.

...

I will send a typical set of data, truncated because the original data is a 20 MB file!

 


I think you have a basic misunderstanding of format codes. For example, %6.9f makes no sense: you cannot have a width of 6 and 9 decimal digits. Typically, the width is larger than the number of decimal digits. (Still, LabVIEW will probably gloss over the mistake ;))
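To illustrate with plain printf-style codes (shown here in Python, which follows the same C conventions as LabVIEW's format codes for these cases): the number before the dot is the minimum total field width, the number after the dot is the count of decimal digits, and the field simply grows when the value needs more room than the width allows.

    x = 510141.25
    print("%6.9f" % x)       # '510141.250000000' -- width 6 is smaller than the result, so it is ignored
    print("%.9f" % x)        # '510141.250000000' -- same output; the precision alone is enough
    print("%16.9f" % x)      # '510141.250000000' -- exactly 16 characters here, so no padding either
    print("%16.9f" % 1.2e-9) # '     0.000000001' -- shorter values are padded on the left to width 16

So %.9f (or a wider field such as %16.9f) already keeps the full 6-digit seconds together with 9 decimal digits.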

 

Yes, please attach some typical (but truncated) data.

Message 6 of 8

Yes, you are right; at the time of the post I had not looked into the format definition. I wrote %6.9 without verifying the structure of this code. Thank you for refreshing my memory.

  

What I wanted to say is that I need 6 digits on the left side of the decimal point and 9 digits on the right side. So I should write %16.9

Message 7 of 8

Dear Altenbach

I have attached truncated data (beginning, middle, and end of the experiment).

We can see that there were only 4 digits in the first column.

So I read the help again, and I have set the string format to %#9 for now. I hope it will be OK for handling data like those typically shown in the attached document.

Yours

 

Message 8 of 8