external trigger/clock source

I need real-time control over the iterations of a for loop, and it would be great if I could get it without resorting to LabVIEW RT. Will it work if I put a case structure inside the for loop that runs a while loop until the input (an external signal) is high? I know that the unregulated execution time of the for loop is faster than I need, but can the software respond fast enough to the clock signal? What is the smallest regular time interval I could achieve between iterations? (I need the loop to run at 10 MHz.)

Thanks in advance for your input.
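
As a rough sanity check on the numbers involved, the ceiling of any software-timed scheme is easy to benchmark: time a loop whose body does nothing and see how many iterations per second it manages before any hardware I/O is added. A minimal sketch (plain Python rather than LabVIEW G, purely to illustrate the order of magnitude):

```
import time

def measure_loop_rate(iterations=1_000_000):
    """Time an empty software-timed loop; any real poll of an external
    signal inside the body can only make this slower."""
    start = time.perf_counter()
    for _ in range(iterations):
        pass  # stand-in for "check whether the external signal is high"
    elapsed = time.perf_counter() - start
    return iterations / elapsed

print(f"~{measure_loop_rate():,.0f} empty iterations/s")
```

Even this best case says nothing about responding to an external edge within one 10 MHz clock period; it only bounds how often the loop can spin at all.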
Message 1 of 13
If you need real time, then you need real time. If you don't need true real time, then by all means fake it. Of course, running at 10 MHz is asking a lot of anything, including real time.

If you could provide some more detail as to what you are trying to accomplish, someone may be able to help.

As for your implementation: getting your software to synchronize with hardware at 10 MHz is impossible. The time it takes just to get the signals from the hardware would probably cut you down to less than 5 kHz, and likely a lot worse.

Also, I don't believe LabVIEW RT is capable of 10 MHz yet, but don't take my word for it.

If you want to avoid buying LabVIEW RT and instead waste more time and money, there is a way to do real time without LabVIEW RT. Get a copy of Gary Johnson's "Power Programming," third edition. Around chapter 22 it discusses downloading LabVIEW to Linux on an embedded system, including real-time Linux. However, this means buying a copy of LabVIEW for Linux, so you may as well just get LabVIEW RT.

Of course, the real question here is: do you really need real time, or do you just need "real fast"? There may be a way to get close to 10 MHz in hardware, but you can't do it in software. Windows just has way too much overhead, and its millisecond timer is only accurate to about 55 ms!
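
That 55 ms figure is the classic Windows 9x timer tick (about 18.2 ticks per second), and the effective granularity on any given machine is easy to measure by requesting a short sleep and timing what you actually get. A stdlib-only Python sketch, standing in for a Wait (ms) call:

```
import time

def sleep_granularity(requested_s=0.001, trials=200):
    """Request a nominal 1 ms sleep repeatedly and report the average
    actual delay; the excess is scheduler/timer overhead."""
    total = 0.0
    for _ in range(trials):
        t0 = time.perf_counter()
        time.sleep(requested_s)
        total += time.perf_counter() - t0
    return total / trials

print(f"requested 1.000 ms, got {sleep_granularity() * 1e3:.3f} ms on average")
```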
Message 2 of 13
Not that I think you have a chance of reaching this rate, but give us a clue about your hardware. Usually the tough part isn't the software loop rate, but rather reading single hardware values fast enough in a loop. Have you tried simply reading single points from it into a while loop? That would give you a clue as to your maximum. What exactly is the "clock signal" you mention?
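
That suggested experiment translates directly into code. A sketch using today's nidaqmx Python bindings as a stand-in for a DAQ Read VI in a while loop (the device and channel names are placeholders):

```
import time
import nidaqmx  # NI-DAQmx Python API; "Dev1/ai0" below is a placeholder

def single_point_read_rate(seconds=2.0):
    """Read one software-timed sample per loop iteration and report the
    achieved rate; this bounds any point-by-point polling scheme."""
    with nidaqmx.Task() as task:
        task.ai_channels.add_ai_voltage_chan("Dev1/ai0")
        count = 0
        t0 = time.perf_counter()
        while time.perf_counter() - t0 < seconds:
            task.read()  # one sample, one driver round-trip
            count += 1
    return count / seconds

print(f"~{single_point_read_rate():,.0f} single-point reads/s")
```

The driver round-trip on every call is exactly the "time it takes to get the signals from the hardware" mentioned above, and it typically lands in the kilohertz range.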

Regards,

Doug
Message 3 of 13
Please answer true or false:

Do you need to respond to an external condition within 100 ns?

If yes, LabVIEW cannot do this, real time or not!

What are you trying to do?

I can read from an input device at 10 MHz, display the data, and save it to disk. Along the way I can subtract an offset value, and to accomplish the subtraction I loop at better than 10 MHz. But I can only expect to collect the readings from my DAQ device at 40 kHz or less (and that is running in RT).
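
Those two numbers can coexist because they measure different things: the board clocks samples into a buffer at its own hardware rate, and the software only has to keep up with the average data rate in blocks, never with individual samples. A sketch of that block-wise pattern, with numpy standing in for the offset subtraction and a synthetic source in place of a real DAQ read:

```
import numpy as np

OFFSET = 0.12       # assumed calibration offset, volts
BLOCK = 100_000     # samples pulled from the driver buffer per loop

def process_stream(read_block, n_blocks=100):
    """Fetch large blocks from the acquisition buffer and subtract an
    offset from every sample; per-sample throughput here can exceed
    10 MHz even though no single sample gets an individual response."""
    for _ in range(n_blocks):
        block = read_block(BLOCK)   # hardware filled this at its own clock
        yield block - OFFSET        # vectorized over the whole block

# demo with synthetic data standing in for the DAQ driver:
fake_read = lambda n: np.random.default_rng().normal(OFFSET, 0.01, n)
for chunk in process_stream(fake_read, n_blocks=3):
    print(f"block mean after correction: {chunk.mean():+.4f} V")
```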

Please tell us what you are trying to do and we will be able to help direct you.

Ben
Message 4 of 13
Hi,

Thanks to all of you who responded to my post. I just read over it again and realized how convoluted and vague it was. I am new to LabVIEW and relatively inexperienced with hardware/software interfacing so I may not be asking the right questions. I'll try to describe more clearly what I am trying to do.

We want to implement a stand-alone test system for MRI gradient coils. This involves generating trapezoidal waveforms and sending them to the gradient amplifiers/drivers. The drivers need to receive each datapoint in the waveform at a rate of 250 kHz, and each datapoint must be transmitted (as a 16-bit word) in sync with a 10 MHz clock. I thought the easiest way to do this was with a pair of nested for loops, the outer one generating the datapoints (at 250 kHz) and the inner one outputting each datapoint bit by bit (at 10 MHz). The clock signal would come from an NI 653x card. So in reply to Ben's question: yes, I do need a response time of 100 ns.
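
For what it's worth, this is the classic case for moving the bit-level timing into the board itself: the 653x family does hardware-timed pattern I/O, so software only computes the waveform, serializes it into bits, and hands the whole buffer to the driver, which then clocks it out against the 10 MHz clock with no software in the loop. A sketch of that division of labor using the modern nidaqmx Python bindings (device, line, and PFI terminal names are placeholders, and a real link would need framing/idle bits to hold the 250 kHz word rate; in the LabVIEW of this era it would be the DIO pattern-generation VIs):

```
import numpy as np
import nidaqmx
from nidaqmx.constants import AcquisitionType

BIT_CLOCK = 10_000_000  # external 10 MHz serial clock, Hz

def trapezoid(n, rise=0.2, fall=0.2):
    """Trapezoidal waveform scaled to unsigned 16-bit codes."""
    t = np.linspace(0.0, 1.0, n)
    up = np.clip(t / rise, 0.0, 1.0)
    down = np.clip((1.0 - t) / fall, 0.0, 1.0)
    return (np.minimum(up, down) * 0xFFFF).astype(np.uint16)

def words_to_bits(words):
    """Serialize each 16-bit word MSB-first, one bit per clock tick."""
    bits = (words[:, None] >> np.arange(15, -1, -1)) & 1
    return bits.astype(bool).ravel()

bits = words_to_bits(trapezoid(1000))
with nidaqmx.Task() as task:
    task.do_channels.add_do_chan("Dev1/port0/line0")  # placeholder names
    task.timing.cfg_samp_clk_timing(
        rate=BIT_CLOCK,
        source="/Dev1/PFI2",                 # external 10 MHz clock input
        sample_mode=AcquisitionType.FINITE,
        samps_per_chan=bits.size)
    task.write(bits.tolist(), auto_start=True)  # buffer handed to hardware
    task.wait_until_done(timeout=10.0)
```

Once the buffer is downloaded, the 100 ns deadline is met by the board's clocking circuitry, not by any loop in software.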

Your help is greatly appreciated.

Lei
Message 7 of 13
Also, I know that when left to run without any timing control, my program will run at speeds much faster than needed.
Message 8 of 13
Lei:

If you are only trying to capture some data, may I suggest that you get a GPIB-controlled high-speed digital oscilloscope or logic analyzer with a lot of memory? Agilent and Tektronix make models that go into the gigahertz range. Let the dedicated instrument do the high-speed capture and then retrieve the results with the host PC. You will not be able to capture data at 10 MHz rates with a PC directly, especially with LabVIEW.
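
The retrieval side of that scheme is ordinary message-based instrument I/O: arm the instrument, let it capture at its own sample rate, then pull the deep record over GPIB at whatever pace the bus allows. A minimal modern sketch with PyVISA (the GPIB address and the command strings are generic placeholders; real scopes differ in their SCPI dialects):

```
import pyvisa

rm = pyvisa.ResourceManager()
scope = rm.open_resource("GPIB0::7::INSTR")  # placeholder GPIB address
scope.timeout = 30_000                       # ms; deep records transfer slowly

print(scope.query("*IDN?"))                  # confirm the instrument answers
scope.write(":SINGLE")                       # arm one high-speed acquisition
# ...wait for the capture to complete, then fetch the stored record:
points = scope.query_ascii_values(":WAVEFORM:DATA?")  # syntax varies by vendor
print(f"retrieved {len(points)} points")
scope.close()
```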

Douglas De Clue
LabVIEW developer
ddeclue@bellsouth.net


Message 5 of 13
I believe that NI has recently released a 10 MHz MIO board.

Ben
Message 6 of 13
Hi,

When I asked "does it have to respond...", I meant:

Does your app require that you read a value, do a computation, and then output a control signal (or whatever), where the time between reading the input and driving the output has to be 100 ns or less?

From what you have said already, it sounds like you could just use a high-speed DAQ board that supports external triggering.
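
With an external trigger and a hardware sample clock, the 100 ns requirement never touches the software at all: the board arms, waits for the edge, and clocks samples on its own. A sketch with the nidaqmx Python bindings (device name, PFI terminal, and the 1 MS/s rate are assumptions):

```
import nidaqmx
from nidaqmx.constants import AcquisitionType, Edge

N = 100_000
with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan("Dev1/ai0")  # placeholder device
    task.timing.cfg_samp_clk_timing(
        rate=1_000_000,                     # board's own hardware clock
        sample_mode=AcquisitionType.FINITE,
        samps_per_chan=N)
    # Acquisition starts on the external edge; software never has to
    # "respond" to the trigger, it only collects the buffered result.
    task.triggers.start_trigger.cfg_dig_edge_start_trig(
        "/Dev1/PFI0", trigger_edge=Edge.RISING)
    data = task.read(number_of_samples_per_channel=N, timeout=60.0)

print(f"collected {len(data)} hardware-timed samples")
```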

Ben
Message 10 of 13