05-10-2015 05:36 AM
Hello everyone,
Could someone help me? I don't understand the Timed Sequence in this .vi.
05-10-2015 08:34 AM
05-10-2015 08:55 AM
Both of them. I don't understand; please help me. Also the code I sent.
05-10-2015 09:04 AM
05-10-2015 09:13 AM
On the myDAQ.
05-10-2015 09:16 AM
My myDAQ and PC.
05-10-2015 09:36 AM
The myDAQ is a LabVIEW device that you communicate with using LabVIEW drivers -- it doesn't strictly run LabVIEW code. The code that you show runs on your PC.
You show two things -- a (single) read of N points from (apparently two) analog input channels, and a Timed Sequence (which I've never seen used, nor have I ever used one myself) that appears to toggle a single digital output line on, then off.
Here's what I would expect to happen when you run this code. You will get two N-element arrays in AI_1 and AI_O (note that you probably want to name that AI_0, with a zero rather than a capital O), whose values will depend on what is connected to the two inputs of your myDAQ. The time it takes to do this will depend on the sampling rate you set in your Task as well as the number of points you sample. It is even possible that nothing will happen, as you don't appear to have started the Task (but it might be auto-starting).
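For reference, here is a rough text sketch of that acquisition using the nidaqmx Python package (this is not your VI, which is graphical LabVIEW; the device name "myDAQ1", the channel numbers, the sample rate, and N are all assumptions):

    import nidaqmx
    from nidaqmx.constants import AcquisitionType

    N = 100        # assumed number of samples per channel
    RATE = 1000.0  # assumed sample rate in Hz

    with nidaqmx.Task() as ai_task:
        # Two analog input channels, matching AI_0 and AI_1 in the VI
        ai_task.ai_channels.add_ai_voltage_chan("myDAQ1/ai0")
        ai_task.ai_channels.add_ai_voltage_chan("myDAQ1/ai1")
        # Finite acquisition: it takes roughly N / RATE seconds to complete
        ai_task.timing.cfg_samp_clk_timing(
            rate=RATE,
            sample_mode=AcquisitionType.FINITE,
            samps_per_chan=N,
        )
        # read() auto-starts the task if it was never started explicitly
        ai_0, ai_1 = ai_task.read(number_of_samples_per_channel=N)
        # ai_0 and ai_1 are two N-element lists of voltages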
I'm not as confident in predicting what will happen with the Timed Sequence. I notice you did not wire any timing information controlling the first frame, so its timing will depend on the default values. However, the timing might not be what you expect: if these frames act like all other LabVIEW timing functions, they are "wait until" functions, not "don't do until" functions.
So if (for the sake of argument) the default was 1 tick of a 1 kHz clock, and you wired in a period of 1000 (so a 1-second period), I would expect the Timed Sequence to behave as follows: the first frame turns the digital line On almost immediately (after the default ~1 ms), the second frame turns it Off right afterwards, and the structure then waits out the rest of the 1000 ms period before it finishes the second frame and exits.
Thus the Period that you wire in will have no effect on the timing of the On/Off sequence, only on how long it takes to get out of the second frame.
If what you want to do is toggle the digital bit on and off, I would recommend using a Timed Loop and a shift register. Wire the half-period into the dt input and the initial setting (True) into the shift register. Inside the loop, wire the value on the shift register into the DO function, and also wire it through a Not operator (to turn On into Off and Off into On) to the right-hand shift-register terminal. If you want to do a single toggle (i.e., run through the loop exactly twice), test the loop index, "i", and when it equals 1 (meaning the loop has done the "0" and "1" iterations), wire that comparison to the Stop terminal of the Timed Loop. A rough text sketch of this logic follows below.
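For what it's worth, here is the same toggle logic sketched with the nidaqmx Python package (the line name "myDAQ1/port0/line0" and the half-period are assumptions; in LabVIEW you would build this as the Timed Loop described above):

    import time
    import nidaqmx

    HALF_PERIOD = 0.5  # assumed half-period in seconds (the Timed Loop's dt)

    with nidaqmx.Task() as do_task:
        do_task.do_channels.add_do_chan("myDAQ1/port0/line0")  # assumed line
        state = True                 # initial value fed into the shift register
        for i in range(2):           # run exactly twice: On, then Off
            do_task.write(state)     # drive the digital line
            state = not state        # the Not feeding the right-hand shift register
            time.sleep(HALF_PERIOD)  # stands in for the Timed Loop's dt wait
        # stopping after i == 1 corresponds to wiring (i == 1) to the Stop terminal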
Bob Schor
05-10-2015 09:58 AM
OK, the first thing is to lose the timed structure. To begin with, there is so much latency in the Windows OS that the "timing" is pointless. The other reason not to use the Timed Sequence is that on a PC you should never use sequence structures of any type, ever.
LabVIEW is a dataflow language. That means the order of operations is defined by when data becomes available -- not by some arbitrary and artificial structure.
Mike...
05-10-2015 04:53 PM
Thanks, guys.