Set duration with Tick Count

Solved!

Hi,

I'm new to LabVIEW and taking my first steps in a university project.

In it I would like to modify a constant "Stress" so that it takes a very small value (the baseline) for a defined period (e.g. 10 s) and then a very high value (the peak) for a very short duration (0.2 s).

I would like to run this cycle for a certain number of repetitions (as in the example) or over a certain period.

For this I built an example VI, about which I have a few questions.

 

Why is the difference between the two millisecond timers (Tick Count) sometimes 3001 ms, although I set it to 3000 ms? What could cause this?

Furthermore, I would like to measure how long the entire process / For Loop takes, as a check. For this I placed one Tick Count before and one after the For Loop and calculate the difference. However, if you let the process run for a long time, this result deviates greatly from the time calculated inside the For Loop.

I know it has something to do with dataflow and how Tick Count works, but unfortunately I'm a little stuck here.

 

Many thanks!

Message 1 of 4

Getting a loop to run at 1kHz on Windows is fairly unlikely, and certainly not reliable.

As a result, sometimes the wait will be a little longer than 1ms in a given iteration.

If this happens in the last iteration (but not previous iterations with your current method) then it will cause the time to be longer.

(If it happens in a previous iteration, you'll have fewer iterations)
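To see why, here's a minimal sketch of the effect in Python (a text stand-in, since a LabVIEW diagram can't be pasted here); the 1 ms wait and 3000 iterations are assumptions based on your description:

import time

N = 3000                      # iterations, nominally 1 ms each -> 3000 ms total
start = time.perf_counter()
for _ in range(N):
    time.sleep(0.001)         # like Wait (ms): the OS guarantees only a *minimum* delay
elapsed_ms = (time.perf_counter() - start) * 1000
print(f"expected 3000 ms, measured {elapsed_ms:.0f} ms")   # typically a little over 3000

Whether that extra millisecond lands before or after your second Tick Count read is what flips the result between 3000 and 3001 ms.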

 

Presumably you're planning on outputting these values via some sort of DAQ board, perhaps as an analog output?

If so, a much better and more reliable timing method would be to use DAQmx (or an equivalent driver, if you're not using NI hardware).

 

You can set a pattern of low-low-low-low...-high-high-low-low-low...-high-high etc. using DAQmx and then have the board's sample clock control the timing.

This will make it much more reliable than trying to use a Windows For/While loop at 1kHz.
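For illustration, a rough sketch of that idea with the nidaqmx Python package (the concepts map directly onto LabVIEW's DAQmx VIs); the device name "Dev1/ao0", the two voltage levels and the 1 kHz rate are placeholder assumptions:

import nidaqmx
from nidaqmx.constants import AcquisitionType

BASELINE_V, PEAK_V = 0.1, 5.0                       # placeholder output levels
RATE = 1000                                         # sample clock rate in S/s
pattern = [BASELINE_V] * 10_000 + [PEAK_V] * 200    # 10 s baseline, 0.2 s peak
with nidaqmx.Task() as task:
    task.ao_channels.add_ao_voltage_chan("Dev1/ao0")
    # The board's hardware sample clock paces the output, not a Windows loop.
    task.timing.cfg_samp_clk_timing(RATE,
                                    sample_mode=AcquisitionType.FINITE,
                                    samps_per_chan=len(pattern))
    task.write(pattern, auto_start=True)
    task.wait_until_done(timeout=15.0)

Repeating the cycle is then just a matter of tiling the pattern (or using continuous regeneration) rather than timing anything in software.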

 


@feld123 wrote:

Furthermore, I would like to measure how long the entire process / For Loop takes, as a check. For this I placed one Tick Count before and one after the For Loop and calculate the difference. However, if you let the process run for a long time, this result deviates greatly from the time calculated inside the For Loop.


Not sure exactly what you mean here - I see the time you're calculating, but what are you comparing it to? Your clock/watch? Or the expected time of iterations*(seconds1+seconds2)?

What is "a long time"?


Message 2 of 4

Hi,

thanks a lot for your quick answer. I had not thought about Windows, but yes, that makes sense.

I work with VISA and control a motor via this interface.

In terms of speed, I am trying to drive two levels with the same pattern I described before. Is there a way to program a reliable timer for this?

Cheers

Message 3 of 4
Solution
Accepted by topic author feld123

Hi,

 

Sorry for the delayed response - I haven't had a reliable internet connection for a few days.

I'm not really sure what you mean regarding your "drive two levels" and the pattern you described.

 

It should be possible to control multiple motors with Windows provided you don't require a very fast loop iteration speed.

This can be helped with the idea I described above, namely:

Presumably you're planning on outputting these values via some sort of DAQ board, perhaps as an analog output?

If so, a much better and more reliable timing method would be to use DAQmx (or an equivalent driver, if you're not using NI hardware).
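Since you mentioned VISA: a software-timed loop can still work for a 10 s / 0.2 s pattern if you schedule each step against an absolute deadline, so the small wait overshoots don't accumulate over many cycles. A rough sketch with the pyvisa Python package - the resource name and the motor commands are placeholders for whatever your controller actually expects:

import time
import pyvisa

BASELINE_S, PEAK_S, CYCLES = 10.0, 0.2, 5

rm = pyvisa.ResourceManager()
motor = rm.open_resource("ASRL1::INSTR")   # placeholder VISA resource name
t_next = time.perf_counter()
for _ in range(CYCLES):
    motor.write("SET_STRESS LOW")          # placeholder command
    t_next += BASELINE_S
    time.sleep(max(0.0, t_next - time.perf_counter()))
    motor.write("SET_STRESS HIGH")         # placeholder command
    t_next += PEAK_S
    time.sleep(max(0.0, t_next - time.perf_counter()))
motor.write("SET_STRESS LOW")
motor.close()

Windows jitter is typically a few milliseconds to a few tens of milliseconds, so the 0.2 s peak will be approximate, but the long-run total won't drift the way back-to-back waits do.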

 


Message 4 of 4