time loop problem

Hi all,

I would really appreciate it if you could give me a clue about a problem I have.
I do an acquisition divided into several tasks, each of which runs in a while loop. The acquisition for each task should take 10 seconds, and a sound generator should prompt me once per second.
What happens is that the time interval (of 10 s) shrinks from task to task (not even the first task takes 10 s), and the sound generator doesn't beep once per second as it is supposed to, but faster.
So in the end, for instance, task 1 gives me recordings for 7 s, task 2 for 5 s, and task 3 for 3 s. I'll have more than 10 tasks, so after only a few there will be no more recordings because of this decreasing timing.
Do you see any reason for this? I can't so far.
I attached the VI.
Thanks a lot,
Dana
Message 1 of 4
Hello, it's me again.
To save you some effort in understanding the problem, here is new information, slightly revised after some further investigation on my part.
The erratum to the previous post is that the time doesn't decrease; it just varies, depending on how quickly I type in the name of the file into which the recordings are made.
So the solution would be to separate (in time) the file creation from the rest of the code (sound generator + data acquisition): in one frame the file is created, and in the next the sound generator starts producing the sound at the set intervals while the acquisition runs in parallel, so the two parts never overlap in time.
Is this done with a Flat Sequence structure?
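To make the ordering I mean concrete, here is a minimal sketch in plain C (not LabVIEW; the file-name prompt and the loop body are placeholder assumptions): everything that waits on the user finishes before the timed part starts.

```c
/* Minimal sketch (plain C, not LabVIEW) of the intended ordering:
 * create the file first, then run the timed beep/acquisition loop,
 * so typing speed cannot eat into the 10-second window. */
#include <stdio.h>

int main(void)
{
    char name[256];

    /* Frame 1: everything that waits on the user happens here. */
    printf("File name for this task's recordings: ");
    if (scanf("%255s", name) != 1)
        return 1;

    FILE *f = fopen(name, "w");
    if (f == NULL)
        return 1;

    /* Frame 2: only now does the timed part begin. */
    /* ... 10-second beep + acquisition loop writes into f ... */

    fclose(f);
    return 0;
}
```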

Thanks again,
Dana
Message 2 of 4
Hi Dana,

I can see at least two issues (a small C sketch follows the list):

(1) The input to your "Wait (ms)" function is zero. It should be 1000; then you would get a beep every second.
(2) You should not use the "Tick Count (ms)" function here. The value of the millisecond timer wraps from 2^32 - 1 back to 0, and its base time is undefined. Please read the LabVIEW Help for this function. I would rather use the "Get Date/Time In Seconds" function.
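To illustrate point (1) outside LabVIEW, here is a minimal C sketch (assumes POSIX usleep; the printf is a stand-in for the sound generator): the loop fires once per second for 10 seconds only because the wait inside it is 1000 ms.

```c
/* Minimal C sketch of point (1): the loop paces itself only because the
 * wait inside it is 1000 ms. With a 0 ms wait it would spin flat out. */
#include <stdio.h>
#include <unistd.h>   /* usleep(), POSIX */

int main(void)
{
    for (int i = 1; i <= 10; i++) {
        printf("beep %d\n", i);  /* stand-in for the sound generator */
        usleep(1000 * 1000);     /* 1000 ms, like wiring 1000 to Wait (ms) */
    }
    return 0;
}
```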

As a side note, some level of documentation on your VI would be helpful.

Hope this helps.
Message 3 of 4


Prashant Oswal wrote:
(2) You should not use the "Tick Count (ms)" function here. The value of the millisecond timer wraps from 2^32 - 1 back to 0, and its base time is undefined. Please read the LabVIEW Help for this function. I would rather use the "Get Date/Time In Seconds" function.

Unless the time to be measured exceeds 2^32 ms (~50 days), there is absolutely no problem using "Tick Count (ms)" this way. The measurement will be correct even if the tick count wraps between readings; that's just how unsigned integer math works. 🙂 Try it!

The use of Tick Count (ms) is correct here. 😄
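For anyone unconvinced, here is a small plain-C demonstration of the modulo-2^32 arithmetic (not LabVIEW; uint32_t plays the role of the millisecond timer, and the two tick values are made-up examples straddling the wrap):

```c
/* Demonstration that unsigned subtraction gives the right elapsed
 * time even when a 32-bit millisecond counter wraps around. */
#include <stdio.h>
#include <stdint.h>

int main(void)
{
    uint32_t start = 0xFFFFFFF0u;  /* 16 ms before the wrap */
    uint32_t end   = 0x00000014u;  /* 20 ms after the wrap  */

    /* Modulo-2^32 arithmetic recovers the true delta despite the wrap. */
    uint32_t elapsed = end - start;
    printf("elapsed = %u ms\n", elapsed);  /* prints 36 */
    return 0;
}
```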
Message 4 of 4