LabVIEW


Time-dependent applications

I am writing an application that needs to generate a waveform on screen and also log the data to file. This has to be a plain-text file.

I'd like to capture something like 10 readings a second from the device that is used as the measurement input, log them to file, and display them on the waveform.

I've put the program together and noticed that the output is on average 1 second apart. The portion of the program that times the input waits for 10 ms, which should (in theory) allow data to be gathered every 10 ms or so.

What could be causing this massive drop in rate (relative to the ideal time spacing of measurements recorded to the log file) between each capture and store of the input device's data?

I have even gone to the length of collecting a specific amount of data in a buffer before writing it to the file, to save on disk access; this made no change.

The device I'm reading from is simple: it returns a couple of numbers over an RS-232 connection.
Message 1 of 2
Post your code and we can probably give you a better idea.

Some things to watch out for:

Don't write to the file every time with the high-level file VIs. The standard Write Characters to File opens the file, writes, and then closes the file on every call, a VERY time-consuming process. Instead, stream to the file: open it once at the beginning, write to it continually inside your loop, and close it only when you are done.
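Since LabVIEW diagrams can't be pasted as text, here is a rough Python sketch of the two patterns just to show the difference; the file name, sample source, and 10 ms pacing are made up for the example, not taken from your program:

import time

LOG_PATH = "readings.txt"  # hypothetical log file name

def read_sample():
    # Stand-in for the RS-232 read; returns a dummy value here.
    return 42.0

# Slow pattern: every iteration opens, writes, and closes the file,
# the equivalent of calling Write Characters to File inside the loop.
def log_slow(n_samples):
    for _ in range(n_samples):
        with open(LOG_PATH, "a") as f:   # open + close on EVERY sample
            f.write("%f\t%f\n" % (time.time(), read_sample()))
        time.sleep(0.01)                 # 10 ms between samples

# Streaming pattern: open once, write inside the loop, close once.
def log_streaming(n_samples):
    with open(LOG_PATH, "a") as f:       # opened a single time
        for _ in range(n_samples):
            f.write("%f\t%f\n" % (time.time(), read_sample()))
            time.sleep(0.01)

The open/close overhead in the slow version dominates the 10 ms wait, which is exactly the kind of slowdown you're describing.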

Don't reconfigure your data acquisition each time either. This includes the configuration of the serial port. Again, you want to open and configure the port once, read from it periodically, and close it when you are done.
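The same idea in sketch form, using the third-party pyserial package purely for illustration; the port name, baud rate, timeout, and the two-number line format are assumptions about your device:

import serial  # third-party pyserial package, used here only to illustrate the pattern

# Configure the port ONCE, before the acquisition loop (assumed settings).
ser = serial.Serial("COM1", baudrate=9600, timeout=0.1)

try:
    with open("readings.txt", "a") as log:   # file is also opened once
        for _ in range(100):                 # roughly 10 s of data at 10 Hz
            line = ser.readline().decode(errors="replace").strip()
            if line:                         # e.g. "12.3,45.6" from the device
                log.write(line + "\n")
finally:
    ser.close()  # close the port once, when you are done

If your LabVIEW diagram re-runs the serial configuration (or the file open) on every loop iteration, moving those nodes outside the loop should get you back near your 10 ms target.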
Message 2 of 2