Need microsec. timed while loop

Hello,

I'm making a VI where I read some data and send some signals to a chip.

While practicing, I set the while loop to run every 1000 ms (so I could watch the changes happen).

But I need the while loop to run at microsecond intervals, and I don't know how.

Any help?

Here is a screenshot of the loop; nothing in it is really important to understand except the ms timer.

If the part I circled could simply be 1 microsecond, my VI would be complete.

PS: Sorry for my English, I'm still practicing 🙂

Thanks in advance!

Message 1 of 5

Hi Gaizka,

 

When you need loops iterating in the (low) microsecond range, you should use an RT target or, even better, an FPGA target…

 

Why do you need iteration times of microseconds? What is the purpose of this loop?

Which DAQ devices do you use? Can you employ hardware-timed DAQ tasks?

Best regards,
GerdW


using LV2016/2019/2021 on Win10/11+cRIO, TestStand2016/2019
Message 2 of 5

Windows is not capable of consistent microsecond timing. I wouldn't even trust it for consistent millisecond timing. Since you only attached a partial picture of your code (and not the code itself), I'm left to guess at why you think you need microsecond timing. I'm guessing that you're taking a single data point each time your loop iterates. You would be much better off taking multiple data points using hardware timing and then operating on arrays of data. Also, try learning the DAQmx functions; they're not hard and are much more powerful than the DAQ Assistant.
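
To make the DAQmx suggestion concrete, here is a minimal sketch of the idea using NI's nidaqmx Python package (it sits on the same DAQmx driver as the LabVIEW DAQmx VIs, so the concepts carry over one-to-one). The device name "Dev1", the channel, and the rates are assumptions; adjust them to your hardware:

```python
# Minimal sketch: hardware-timed finite acquisition with DAQmx.
# The board's sample clock paces the measurement; Windows only collects
# the finished buffer, so sample spacing does not depend on OS timing.
import nidaqmx
from nidaqmx.constants import AcquisitionType

with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan("Dev1/ai0")  # assumed device/channel
    task.timing.cfg_samp_clk_timing(
        rate=100_000,                        # 100 kS/s = one sample every 10 us
        sample_mode=AcquisitionType.FINITE,
        samps_per_chan=5_000,                # 50 ms worth of data
    )
    data = task.read(number_of_samples_per_channel=5_000)
    # 'data' is 5,000 points spaced exactly 10 us apart; operate on the
    # whole array instead of taking one point per while-loop iteration.
```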

 

Finally, I recommend that you do a diagram cleanup. You've got lots of hidden wires and wires running backwards, in addition to a block diagram that is much larger than necessary.

Message 3 of 5

As others have said, only hardware timing will solve your problem if you want to run under Windows. As a first step, get rid of the Express VIs (assistants).
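
If you want to see for yourself why software timing won't get you to microseconds, here is a quick jitter experiment, sketched in Python for brevity (LabVIEW's Wait (ms) primitive shows the same behaviour on Windows): it repeatedly requests a 1 ms wait and reports what it actually got.

```python
# Quick demo of software-timer jitter on a desktop OS: request a 1 ms
# sleep many times and measure the actual elapsed time of each one.
import time
import statistics

actual_ms = []
for _ in range(1000):
    t0 = time.perf_counter()
    time.sleep(0.001)                                  # ask for 1 ms
    actual_ms.append((time.perf_counter() - t0) * 1e3)

print(f"requested 1.000 ms -> mean {statistics.mean(actual_ms):.3f} ms, "
      f"min {min(actual_ms):.3f} ms, max {max(actual_ms):.3f} ms")
# On a typical Windows machine the worst case is often several
# milliseconds, and a microsecond loop needs timing 1000x finer.
```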

 

Not sure why you have an indicator for the "millisecond timer value"; that value is of no interest.

Message 4 of 5

I guess you could technically set up some DAQ hardware to sample at 1 MS/s (if you have such good hardware), or a counter, and read one sample at a time, but I'm uncertain whether that would actually work out.

Usually you batch such tasks to work in the 1/10 s to 1/100 s range, so if you actually did sample at 1 MS/s, you'd ask for 10,000 samples at a time and process them as a block.
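
As a rough sketch of what that batching looks like (nidaqmx in Python again; "Dev1/ai0" and hardware capable of 1 MS/s are assumptions):

```python
# Hypothetical sketch: continuous acquisition at 1 MS/s, pulling
# 10,000-sample blocks (10 ms of data) per loop iteration.
import nidaqmx
from nidaqmx.constants import AcquisitionType

with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan("Dev1/ai0")
    task.timing.cfg_samp_clk_timing(
        rate=1_000_000,                      # 1 MS/s: one sample per microsecond
        sample_mode=AcquisitionType.CONTINUOUS,
        samps_per_chan=1_000_000,            # ~1 s of buffer headroom
    )
    task.start()
    for _ in range(100):
        # Each read is a comfortable 10 ms loop rate for the OS, while
        # adjacent samples within the block are still exactly 1 us apart.
        block = task.read(number_of_samples_per_channel=10_000)
        # ... process the 10,000-sample block here ...
```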

/Y

G# - Award winning reference based OOP for LV, for free! - Qestit VIPM GitHub

Qestit Systems
Certified-LabVIEW-Developer
Message 5 of 5