LabVIEW


Timing Loop Local Variable

Hello All,
 
Attached is a program I developed. It uses two loops: one for timing and one for writing the data. The program works perfectly apart from a small timing issue, which is related to a local variable. Whenever I write the measurement file using the enable button that is sent from the timing loop to acquire, I get a different number of points in the data file each time. I am sure this is caused by timing discrepancies in when the local variable sent from one loop is read by the other on each acquisition, and running on Windows does not help. Is there a way to rectify this problem, i.e. apply a known time delay to when the local variable is read to stop the acquisition, so I get the same number of data points? If not, I can always remove the extra points in post-processing.
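The race described above can be sketched in a text language. This is a hypothetical Python stand-in for the two LabVIEW loops (all names are illustrative, not from the attached vi): one thread polls a shared flag, the way an acquisition loop polls a local variable, while a timing thread clears the flag after a fixed interval. The exact moment the poll sees the cleared flag varies with OS scheduling, so the sample count differs between runs.

```python
import threading
import time

def run_once(acquire_ms=50, sample_period_ms=1):
    """One acquisition run: a loop polls a shared flag, like a LabVIEW local variable."""
    acquiring = threading.Event()
    samples = []

    def acquisition_loop():
        # Keeps sampling as long as the flag (the "local variable") is set.
        while acquiring.is_set():
            samples.append(0.0)                  # stand-in for one DAQ read
            time.sleep(sample_period_ms / 1000)

    acquiring.set()
    t = threading.Thread(target=acquisition_loop)
    t.start()
    time.sleep(acquire_ms / 1000)                # "timing loop" decides when to stop
    acquiring.clear()                            # the flag is read asynchronously...
    t.join()
    return len(samples)                          # ...so this count can vary run to run

counts = {run_once() for _ in range(3)}
print(counts)  # on a desktop OS this set often holds more than one value
```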
 
To use the program, please enter some values for velocity and forcing frequency.
 
 
Cheers
Alloush
 
 
Message 1 of 8
Hi Alloush,

I would not call your vi 'Final' ;-)

I made some changes to your vi to remove unnecessary wiring. Have you heard of 'auto-indexing' before? It's a basic principle of LabVIEW coding. You should also read the development style guide! And please check the comments I left in your code...
At the moment you write a simulated signal, so you should get the same data on every run...

Message Edited by GerdW on 08-01-2007 04:39 PM

Best regards,
GerdW


using LV2016/2019/2021 on Win10/11+cRIO, TestStand2016/2019
Message 2 of 8

Hi, thanks for that,

Why should I place a time delay in the acquisition loop?

 

Cheers,

Alloush

Message 3 of 8
Hi Alloush,

you should place a delay because:
1) the style guide recommends it
2) at the moment you just simulate a signal, so this loop is busy all the time (hogging the CPU)
3) your acquisition step in the upper loop takes at least 1 s (100 × 10 ms), so you only need to check the 'acquiring' local every ~500 ms
4) there is no proper connection between the upper loop and the lower loop; you should use other techniques to control the acquisition of data when you need repeatable timing. One way would be to set up a producer/consumer pattern, using a queue to send commands to a state machine...

Remember: hogging the CPU will not give you proper timing...
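The producer/consumer pattern in point 4 can be sketched in Python as a stand-in for the two LabVIEW loops (names and sizes here are illustrative assumptions, not from the attached vi). The key property: the acquisition loop enqueues data chunks and finally a sentinel, and the file loop dequeues them, so every chunk produced is saved exactly once and the file size no longer depends on OS scheduling jitter.

```python
import queue
import threading

# Minimal producer/consumer sketch: the "DAQ loop" produces chunks into a
# queue; the "file loop" consumes them. A sentinel value signals a clean stop.
data_queue = queue.Queue()
SENTINEL = None

def producer(n_chunks=100):
    for i in range(n_chunks):
        data_queue.put([i] * 10)     # stand-in for one 10-sample DAQ read
    data_queue.put(SENTINEL)         # tells the consumer to stop cleanly

def consumer(out):
    while True:
        chunk = data_queue.get()
        if chunk is SENTINEL:
            break
        out.extend(chunk)            # stand-in for writing to the data file

saved = []
p = threading.Thread(target=producer)
c = threading.Thread(target=consumer, args=(saved,))
p.start(); c.start()
p.join(); c.join()
print(len(saved))  # always 1000: 100 chunks × 10 samples, no lost or extra points
```

Unlike polling a local variable, the queue never drops a command or a chunk, which is why the point count becomes deterministic.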

Best regards,
GerdW


Message 4 of 8
Hi GerdW,

I understand the style issue and the hogging of the CPU. In my example I only simulated a signal, whereas in reality I will be acquiring data at a rate of 10 kHz. Therefore, if I put in a time delay, that delay will control my acquisition rate rather than the acquisition rate specified by the user (correct?), depending on how many samples I read, etc. I read the memory management guide and read a lot about queues, etc. Will this be easy to implement in my code, with the condition that when I first run the code it begins acquisition so I can spot-check the data, but does not save until I run the timing sequence (just like it does now)? If this does not take a lot of development time, I am willing to develop my application further for some more timing accuracy.

Cheers,
Alloush
Message 5 of 8
Hi Alloush,

ok - now you have given some more information on your task...
You use Windows and need proper timing! Please keep this in mind.

Your DAQ runs at 10 kHz, giving a resolution of 100 µs.
How accurate does your timing need to be?
How big is your "small timing issue"?
How do you measure the time? (Which timer is used? You should know the standard timer has a resolution of ~16 ms...)
Best regards,
GerdW


Message 6 of 8
Hi GerdW,

I understand that Windows is not a real-time operating system; this has been discussed at length in these forums, with hints on how to improve its performance. I will be sampling 11 AI channels. As far as timing accuracy is concerned, I just need the saving of data to start at the same time for all the channels (which it will do); the discrepancy arises when the data stops saving on each run, i.e. I get files with slightly different sizes due to the delay before the local variable turns acquisition off. As for measuring time, I have not done that; the only reason I would like to do it accurately is to satisfy my LabVIEW curiosity and improve my development capabilities. So if I am using DAQ in the while loop, I should not place a time delay, correct? If acquiring at 10 kHz.


Cheers,
Alloush
Message 7 of 8
Hi Alloush,

yes, with some actual DAQ in the loop you don't need an additional delay.
I made that comment because I thought you would do the DAQ in the upper loop and only use the lower one for saving data...
Best regards,
GerdW


Message 8 of 8