
send bytes, acquire counts and show data in 10ms

Hi!

I have asked something like this before, but maybe I wasn't very clear, or maybe I was too confusing, so I decided to simplify my problem.

I have a PCI-6601 acquisition board, and I will use it along with a digital-to-analog converter (DAC) I made to accomplish this:

1. I want to send 24 bits, two pairs of 12 bits, to a DAC; those bits represent numbers from 3300 down to 0;

2. The process is like this: I send the number 3300 to the DAC, then I start collecting the counts from the counter for 10 ms; I then update an XY graph with the data (x: voltage; y: counts);

3. Next, the number is decreased by the "voltage step" (for example, 1 mV, which corresponds to 1 in bits), so I do 3300 - 1 = 3299;

4. Finally, I compare the current value to the final one (for example, 0) to know when to stop;

5. The process continues at step 1. (See the sketch after this list.)
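To make the flow concrete, here is the loop I have in mind as a rough Python sketch (the real thing will be a LabVIEW diagram; write_dac_bits and read_counter are hypothetical stand-ins, not real driver calls):

```python
import time

# Hypothetical placeholders -- NOT real driver calls -- only here to show the flow.
def write_dac_bits(value):
    """Send one 12-bit word to the external DAC (stub)."""

def read_counter():
    """Return the counts accumulated during the last window (stub)."""
    return 0

START, FINAL, STEP = 3300, 0, 1        # 1 in bits corresponds to 1 mV
results = []                           # (voltage code, counts) pairs for the file

value = START
while value >= FINAL:                  # step 4: compare to the final value
    write_dac_bits(value)              # step 1: send the value to the DAC
    time.sleep(0.010)                  # step 2: collect counts for 10 ms...
    results.append((value, read_counter()))  # ...and update the XY graph data
    value -= STEP                      # step 3: decrease by the voltage step
                                       # step 5: the loop repeats
```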

Here's a figure to show what I mean:



Now, I know how to do this process; it's indeed very simple. But the problem I found is related to TIME and SHIFT REGISTERS:

First, I don't even know if it's possible to do this with 10 ms precision.
Second, I don't know if shift registers are the best method for this. I want to update the XY graph in real time and have the option to save all the data (x and y) to a file at the end of the process.

Another problem is with the NI-DAQmx tasks. I think I have to stop and reinitialize the task in each loop iteration, and maybe that's why I can't get the timing precision I want.

Can anyone give me a hint on how to do this, or tell me whether I'm going the right way?

I don't know if I was clear, but anyway, thanks in advance.
Message 1 of 6

A few quick thoughts:

1. With your board you will have to rely on software timing to produce your 12-bit DO patterns.  Software timing isn't strictly reliable under Windows.  The best approach available would be to use a Timed Loop structure, which is automatically given a high execution priority.

2. It isn't clear to me what you expect to "count" with your counter task.  With an external DAC, I would kinda expect you'd want to perform an analog measurement.

3. You don't need to stop and reinit tasks inside your loop.  Config & Start them one time outside your loop, perform your Writes / Reads inside the loop, then Stop & Clear them once after your loop.  (See the sketch after this list.)

4. Updating an XY graph every 10 msec in "real time" will probably not be the best plan.  There's gonna be a conflict between your desire for precise 10 msec intervals and your desire to update a visible GUI object frequently.  You'll end up needing to trade off their relative importance.
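If it helps to see point 3 spelled out, this is roughly the lifecycle in the nidaqmx Python API (the same calls exist as DAQmx VIs in LabVIEW; the device and channel names here are made-up examples):

```python
import time
import nidaqmx
from nidaqmx.constants import LineGrouping

do_task = nidaqmx.Task()
ctr_task = nidaqmx.Task()
try:
    # Config & Start once, OUTSIDE the loop.
    do_task.do_channels.add_do_chan(
        "Dev1/port0/line0:11", line_grouping=LineGrouping.CHAN_FOR_ALL_LINES)
    ctr_task.ci_channels.add_ci_count_edges_chan("Dev1/ctr0")
    do_task.start()
    ctr_task.start()

    for code in range(3300, -1, -1):
        do_task.write(code)        # Write inside the loop...
        time.sleep(0.010)          # software-timed ~10 ms gate
        counts = ctr_task.read()   # ...and Read inside the loop.
finally:
    # Stop & Clear once, AFTER the loop.
    for t in (do_task, ctr_task):
        t.stop()
        t.close()
```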

Hope this helps as a starting point.  Note that a different piece of DAQ hardware (such as an M-series board capable of hw-timed digital output) could give you perfect timing intervals in hardware.  The time required to architect and implement software that can still only approximate that precision isn't likely worth it, unless you're in a student-like situation where your time is deemed "free" by the powers-that-be.

-Kevin P.

ALERT! LabVIEW's subscription-only policy came to an end (finally!). Unfortunately, pricing favors the captured and committed over new adopters -- so tread carefully.
Message 2 of 6



Thanks for your tips! :)

I'm getting a TTL signal from a spectrometer. I send the bits to the DAC so that I get an output voltage between A and B in steps of x. After I send each "package" of bits, I start counting the TTL pulses from the spectrometer for 10 ms, and then I show that value on a graph.

So:

1. Send the voltage value (in bits)
2. Start the counter and gather the counts for 10 ms
3. Get the value and show it on the graph
4. Decrease the voltage value by step = x
5. Repeat from step 1 until voltage = B

I want to get the counts at each step, but I want to reset the counter to 0 after I read them. I'm using a DAQmx Stop Task after I collect the counts, but I don't know if it's the best thing to do. Is it?
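In rough Python (nidaqmx API; the channel name is just an example), what I'm doing is essentially this: stop and restart the counter task around each 10 ms window so every reading starts from 0:

```python
import time
import nidaqmx

with nidaqmx.Task() as ctr:
    # TTL pulses from the spectrometer go into the counter's input.
    ctr.ci_channels.add_ci_count_edges_chan("Dev1/ctr0")
    for code in range(3300, -1, -1):
        # ... write `code` to the DAC here (separate DO task) ...
        ctr.start()           # count starts again from 0
        time.sleep(0.010)     # software-timed 10 ms window
        counts = ctr.read()   # edges accumulated during this window
        ctr.stop()            # restarting on the next pass resets the count
        # ... hand (code, counts) to the graph / the file here ...
```

(An alternative that avoids the per-iteration Stop/Start overhead would be to leave the task running and subtract the previous reading from the current one.)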

Again, thank you for your answer.
Message 3 of 6
I was thinking about the tip you gave me about the timed loop.

I made a simple VI using that loop, and I have doubts about its behaviour.

In the attached example, I have a 10 ms timed loop with two frames, so I think each frame will be done in 10 ms, right? That made me think...

If in a "normal" while loop, I couldn't get 10ms times, why can I get it in the timed loop? I know the timed loop has a higher execution priority but the question is: the loop is made in the time I want even if that implies that the code inside it it's not runed? Or in other words, labview ignores what's inside the loop to grant the execution time?

Another thing is: I want the first frame to be executed without delays. I mean, for the first frame I don't need time precision; I want it to run as quickly as possible, and then I want the second frame to take 10 ms.

Just another thing: how can I be sure that the times are actually achieved?

Thanks
Message 4 of 6
Oops, I forgot to attach the VI.

Here it is...
Message 5 of 6

I'm not near a LabVIEW PC right now, so I can't look at your example yet.  Maybe I can help a little in the meantime, though.

 

If in a "normal" while loop, I couldn't get 10ms times, why can I get it in the timed loop? I know the timed loop has a higher execution priority but the question is: the loop is made in the time I want even if that implies that the code inside it it's not runed? Or in other words, labview ignores what's inside the loop to grant the execution time?

   The timed loop suggestion needs to stay coupled with the producer/consumer suggestion, where you have 2 (or more) parallel loops.  If you only have a single loop, I'm not sure that making it a timed loop would matter much.  The idea is that data acq is done in a high-priority timed loop that should do quite well at waking up and executing once every 10 msec.  A regular OS like Windows won't be 100% reliable at this, but a timed loop will probably be more regular more often than a standard while loop with a delay.

   Your other question is very important, and the answer is that all the code in the loop WILL execute, even if you have too much code and it requires more than 10 msec to finish.  There are different settings to tell the timed loop when to start the next (late) iteration, depending on the needs of your app.

 

Another thing is: I want the first frame to be executed without delays. I mean, for the first frame I don't need time precision; I want it to run as quickly as possible, and then I want the second frame to take 10 ms.

It's possible to wire values into the timing terminals on the outside of a timed loop to define an initial delay of 0, and then wire a value to the right-hand-side timing terminal inside the loop to change the period to the normal 10 msec thereafter.

 

Just another thing: how can I be sure that the times are actually achieved?

Timed loops provide left-hand-side terminals you can use to verify your timing.
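Outside LabVIEW, the same check looks like this: a plain software-timed Python loop that records each iteration's actual period so you can inspect it afterwards (in LabVIEW, the timed loop's left-hand terminals give you this information directly):

```python
import time

PERIOD = 0.010                    # desired 10 ms loop period
deadline = time.perf_counter()
periods = []

for _ in range(100):
    start = time.perf_counter()
    # ... acquisition work for this iteration goes here ...
    deadline += PERIOD
    time.sleep(max(0.0, deadline - time.perf_counter()))
    periods.append(time.perf_counter() - start)

print(f"worst period: {max(periods) * 1e3:.2f} ms, "
      f"iterations over 11 ms: {sum(p > 0.011 for p in periods)}")
```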

 

--  Oops, on review, I realize I was mentally merging this thread with another one I got involved in.  So, as to the stuff above about producer/consumer?  Well, here's that other thread.  I would recommend the same approach for you, so you can separate the parts of the code that need to run as regularly as possible (data acq) from those that aren't so critical (GUI graph updates).
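For reference, the producer/consumer idea boils down to something like this (a rough Python analog of the two-loop LabVIEW pattern; the queue plays the role of the LabVIEW queue between the loops, and the counts here are dummy values):

```python
import queue
import threading
import time

data_q = queue.Queue()

def producer():
    """Time-critical loop: acquisition only, no GUI work."""
    for code in range(3300, -1, -1):
        time.sleep(0.010)          # 10 ms gate; the counter read goes here
        data_q.put((code, 0))      # hand off (voltage code, counts)
    data_q.put(None)               # sentinel: the sweep is finished

def consumer():
    """Non-critical loop: graph updates and file writes at their own pace."""
    while (item := data_q.get()) is not None:
        code, counts = item
        # update the XY graph / append to the data file here

threading.Thread(target=producer).start()
consumer()
```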

 

-Kevin P.

ALERT! LabVIEW's subscription-only policy came to an end (finally!). Unfortunately, pricing favors the captured and committed over new adopters -- so tread carefully.
Message 6 of 6