
polling input

Solved!

Hi,

I am a newbie here. I apply an analog voltage to an output and then wait for a digital input to change from false to true as a result. The maximum waiting time for the input to change state is 25 msec. After applying the output, I want to count the number of milliseconds it takes for the digital input to change state. I tried reading in a loop, and also tried multiple reads without a loop, but I am not getting a precise elapsed time. I put time counters everywhere, and they show that every read operation takes 5 msec, so there can be a difference of up to 5 msec between the actual elapsed time and the value I get.

Is there any way to get this time within a 1 msec tolerance?
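The quantization described above, where a ~5 msec cost per read limits the answer to 5 msec steps, can be sketched in a short simulation (Python here, since LabVIEW diagrams can't be shown inline; all the numbers are hypothetical):

```python
# Simulated demonstration: if each read of the digital input costs ~5 ms,
# the measured latency can only change in 5 ms steps, so the reported value
# is quantized to the polling period.

def measure_latency_by_polling(true_at_ms, read_cost_ms=5.0, timeout_ms=25.0):
    """Return the latency a polling loop would report, in ms (None = timeout)."""
    t = 0.0
    while t < timeout_ms:
        t += read_cost_ms            # one read of the input costs this much
        if t >= true_at_ms:          # input is True by the time we look
            return t                 # reported latency: a multiple of read cost
    return None                      # exceeded the 25 ms limit

print(measure_latency_by_polling(12.3))  # reports 15.0, not 12.3
```

Whatever the input's true latency, a polling loop can only ever report a multiple of its per-read cost, which is exactly the 5 msec uncertainty described above.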

Thanks in advance

Sam

Message 1 of 13

Post the code!  Descriptions like that help very little.

 

Also, as a general note, relying on Windows for exact time measurements is ultimately a bad idea.  You want to configure your hardware to take the time measurements, then read those values back afterwards.  Windows will arbitrarily take control away from LabVIEW to do system things at any time and add hiccups to your data.

Message 2 of 13

Thanks. Can you please give me some more details, or let me know where I can find details (or examples) on this? As I have only started using LabVIEW recently, I am a little bit confused about this. Also, as stated previously, a tolerance of 1 msec is OK for me, if that is what Windows is going to cause.

Message 3 of 13

Wait 1000.png

This should in theory produce a value of 1000.  100 wait times of 10 ms each.  (The first one doesn't count, since we're measuring the time only at the end of each FOR loop).

 

If you run this code, you're going to get a value of around 1030, but it might go up to 1042 on one run, down to 1021 on the next, and so on.  That's with LabVIEW assigned to do literally nothing besides counting times.
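A scaled-down version of the same experiment, sketched in Python for anyone without LabVIEW handy (the jitter comes from the OS scheduler, not the language):

```python
import time

N, WAIT_S = 20, 0.010   # scaled down from the 100 x 10 ms LabVIEW example
start = time.perf_counter()
for _ in range(N):
    time.sleep(WAIT_S)              # ask for exactly 10 ms...
elapsed_ms = (time.perf_counter() - start) * 1000.0
# ...but the OS only guarantees "at least 10 ms", so this prints > 200
print(f"nominal {N * WAIT_S * 1000:.0f} ms, measured {elapsed_ms:.1f} ms")
```

The measured value always comes out somewhat above nominal, and by a different amount on each run, which is the same wobble described above.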

 

This is part of why I'm asking you to post your code (which you still have not done).  The other part is that if you're using a remote instrument (like something attached to a COM port), then you're at the mercy of the communication delay, which not only goes through Windows but also includes the transmission lag plus the latency of the instrument itself.

 

However, if you're using something like DAQmx, there are hardware timing commands that you can use to get much more exact values.

 

There's also the chance that you've just made a rookie mistake in your dataflow somewhere.

 

So.... post your code!

Message 4 of 13

Sorry for the delay in sending the code; I was away from my computer earlier.

Here is the code. Basically, I start a timer when the DAQmx Start Task command for the output is sent. Then, using sequence structures, I read the status of the input. If the previous reading does not give me a True value, I read it again. When I get the True value, I store the timer value again.

By the time I have read it 3 or 4 times, I have already reached my 25 msec maximum time limit for the input to change value, so I just display a message saying "delay value exceeded".

Before this, I tried a do loop, but it was giving me even more deviation in the time delay value.

Message 5 of 13

You can use a DAQ Event, like below.

 

snip.png

 

You can drag this snippet into a block diagram.

 

This will automatically fire an event every time the digital line changes state, rising to falling or falling to rising; you can configure which transitions you want. No polling required. You will need to incorporate it into your VI.
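In ordinary-code terms, the difference between polling and this event approach looks roughly like the Python sketch below (`threading.Event` stands in for the DAQmx change-detection event, and the 12 ms delay is invented):

```python
import threading, time

line_changed = threading.Event()

def simulated_hardware(delay_s=0.012):
    time.sleep(delay_s)      # the input goes True ~12 ms after the stimulus
    line_changed.set()       # the driver fires the change-detection event

t0 = time.perf_counter()
threading.Thread(target=simulated_hardware).start()
fired = line_changed.wait(timeout=0.5)   # block here -- no polling loop
# (generous timeout for the sketch; the event normally fires in ~12 ms)
latency_ms = (time.perf_counter() - t0) * 1000.0
print(fired, f"{latency_ms:.1f} ms")
```

The waiting code burns no CPU reading the line over and over; it simply sleeps until notified, which is the "no polling required" point above.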

 

mcduff

Message 6 of 13

OK, since you're using DAQmx, you can get very precise timing.  That's the good news.

 

However, your current method is not the way to do it.  For starters, any time you use a sequence structure in LabVIEW, it's a red flag, and you're using a ton of them here.  Also, all of your "get time" nodes are floating in space, so they just run "whenever" during the sequence frame they are inside.

 

You've got a ton of nested case structures, each basically doing the same thing as the last one.  That sort of logic should be in a single loop that ends on success, error, or timeout.
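The structure being suggested, one loop with three exits, looks like this in rough Python pseudocode (`read_line` is a placeholder for your DAQmx digital read, not a real API):

```python
import time

def wait_for_true(read_line, timeout_ms=25.0):
    """Poll read_line() until it returns True, an error occurs, or time runs out."""
    start = time.perf_counter()
    while True:
        elapsed_ms = (time.perf_counter() - start) * 1000.0
        if elapsed_ms > timeout_ms:
            return ("timeout", elapsed_ms)        # exit 1: ran out of time
        try:
            if read_line():
                return ("success", elapsed_ms)    # exit 2: input went True
        except Exception as exc:
            return ("error", str(exc))            # exit 3: read failed

# Usage with a fake input that turns True after ~5 ms:
t0 = time.perf_counter()
status, value = wait_for_true(lambda: time.perf_counter() - t0 > 0.005)
print(status)
```

One loop replaces the whole pyramid of nested cases, and the timeout is checked on every iteration instead of being baked into a fixed number of reads.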

 

You have a subVI there ("Untitled 10") that wasn't included in your upload, but I can only assume it applies the analog voltage you mentioned in your first post.  It's not synced up at all to the code that runs the digital input check, so it'll just start "whenever".  You could sync it up a bit by using the error chains better, but that's still not the best way to go.

 

Setting up an event as McDuff suggests would be an improvement. 

 

All of that would get you closer to the accuracy you want, but you'd still be off by whatever random pauses Windows throws at your application.

 

However the proper way to do something like this is:

 

1. Set up the DAQmx task that starts the analog output, but don't run it.

2. Set up the DAQmx task that does the digital input, and set it to trigger on the analog output signal.

3. Set up an event like the one McDuff posted.

4. Start the digital input task (so that it waits for the trigger).

5. Run the task that sets the analog output, which will trigger the digital input task to start as well.

6. Wait for the event to fire.

7. Get all the data from the digital input and find the time offset of the first high signal from the first data point.

8. Stop all tasks and clean up.
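The payoff of step 7 can be shown with a few lines of Python on simulated data: once the samples are clocked by hardware, the timestamp of the first high sample is just its index divided by the sample rate, so pauses in the software no longer affect the answer (the rate and data below are invented):

```python
# Hardware-timed digital samples, simulated. At 10 kHz, each sample is
# 0.1 ms apart, regardless of when Windows let our program run.
SAMPLE_RATE_HZ = 10_000
samples = [False] * 123 + [True] * 50   # the line went high at sample 123

first_high = samples.index(True)
latency_ms = first_high / SAMPLE_RATE_HZ * 1000.0
print(f"{latency_ms:.1f} ms")           # 12.3 ms, resolved to 0.1 ms
```

The software can read this buffer whenever it gets around to it; the timing was already captured by the hardware clock, which is what gets you well inside a 1 msec tolerance.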

 

That's all possible to do in LabVIEW with DAQmx, but it's more than I'm willing to invest my spare time in creating at the moment.

Message 7 of 13

Thanks a lot, guys. I will try this and let you know.

Message 8 of 13

I tried dragging the snippet, but it gives me an error:

LabVIEW: File version is later than the current LabVIEW version.

An error occurred loading VI 'VI Snippet'. LabVIEW load error code 9: VI version 19.0 is newer than LabVIEW version 17.0.

Do I have to update my version, or is there another way?

Please let me know. In the meanwhile, I am trying to create one just like the one you sent me.

Message 9 of 13

LabVIEW 2017 version attached.

Message 10 of 13