LabVIEW


Analog I/O sync

Hey, I would like to know how to synchronize analog input with analog output on an NI myDAQ. My code does a DAQmx read and then writes the input signal back to the output, but when I measure both on my oscilloscope the waveforms do not line up in time. I hope someone can help with a code example.

Message 1 of 23

You can refer to the example DAQmx Multifunction Synchronization with Shared Sample Clock.

Just change DI to AO.
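If it helps to see the idea outside of G code, here is a rough sketch of that shared-sample-clock approach in Python using the nidaqmx bindings for the same NI-DAQmx driver LabVIEW uses. The device name "myDAQ1", the channel names, and the rate are assumptions, and whether the myDAQ's AO timing engine accepts the AI sample clock as its source is worth verifying on the actual hardware:

```python
"""Sketch: slave an AO task to the AI task's sample clock so both
subsystems sample in lock-step (assumed device name "myDAQ1")."""

def ai_sample_clock(device: str) -> str:
    # DAQmx exports each subsystem's sample clock as a terminal
    # named "/<device>/ai/SampleClock".
    return f"/{device}/ai/SampleClock"

def run_synced(device: str = "myDAQ1", rate: float = 1000.0,
               n: int = 100) -> None:
    import nidaqmx                               # NI-DAQmx Python bindings
    from nidaqmx.constants import AcquisitionType

    with nidaqmx.Task() as ai, nidaqmx.Task() as ao:
        ai.ai_channels.add_ai_voltage_chan(f"{device}/ai0")
        ao.ao_channels.add_ao_voltage_chan(f"{device}/ao0")

        # AI owns the clock; AO is driven from the same terminal.
        ai.timing.cfg_samp_clk_timing(rate,
                                      sample_mode=AcquisitionType.CONTINUOUS)
        ao.timing.cfg_samp_clk_timing(rate,
                                      source=ai_sample_clock(device),
                                      sample_mode=AcquisitionType.CONTINUOUS)

        ao.write([0.0] * n, auto_start=False)    # pre-fill the AO buffer
        ao.start()                               # AO arms and waits for edges...
        ai.start()                               # ...which begin when AI starts

        data = ai.read(number_of_samples_per_channel=n)
        ao.write(data, auto_start=False)         # echo the reading back out
```

Because AO is started first and only then AI, the first AO update and the first AI conversion share the very first clock edge, which is what removes the random start-time offset between the two tasks.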

 

-------------------------------------------------------
Applications Engineer | TME Systems
https://tmesystems.net/
Message 2 of 23

I changed the code but it didn't work; it seems it is only reading. My goal is to get the same signal on the output as on the input, with both synchronized.

Message 3 of 23

Post your code so that we can have a look.

Message 4 of 23

 

You can't.  At least, not the way you're approaching it now.

Other than certain fairly rare exceptions, setting up continuous sampling with a sample clock also sets up buffering.  And buffering *necessarily* brings with it latency.  And latency means that your scope traces will inevitably show some delay from your input to your output.

In general, latency is harder to control on the output side than on the input side. The most suitable compromise may well be to run your output task in software-timed on-demand mode (without a clock or buffer).

You can set your loop rate by choosing the number of samples to read from your input task. A smaller number gives lower latency, but there are still limits due to the overhead associated with each read. For example, you'd be quite unlikely to achieve a 10 kHz loop rate. With some devices, especially those on an indirect bus like USB or Ethernet, you'll probably be limited to hundreds of Hz or less.
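To put rough numbers on that tradeoff, here is a small back-of-the-envelope calculation in plain Python. The per-read overhead figure is an illustrative assumption, not a measured value for any particular device:

```python
def loop_period_s(samples_per_read: int, sample_rate_hz: float,
                  read_overhead_s: float = 0.0) -> float:
    """Time per iteration of a read-then-write loop: the wait for the
    requested samples plus a fixed per-call overhead (driver/bus)."""
    return samples_per_read / sample_rate_hz + read_overhead_s

# At 10 kS/s, reading 100 samples per loop costs at least 10 ms per pass:
assert loop_period_s(100, 10_000) == 0.01

# Even reading 1 sample per pass, an assumed ~2 ms of per-call overhead
# (plausible for a USB device) caps the loop rate well below 10 kHz:
rate = 1 / loop_period_s(1, 10_000, read_overhead_s=0.002)
assert 470 < rate < 480   # roughly 476 Hz
```

The fixed per-read cost is exactly why shrinking the read size stops helping past a certain point: below a few samples per read, the overhead term dominates the loop period.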

You could run both input and output as software-timed on-demand tasks for even lower latency, but your sample and update rates will both be limited in speed and more irregular.

In the end, it comes down to tradeoffs. Like usual.


-Kevin P

ALERT! LabVIEW's subscription-only policy came to an end (finally!). Unfortunately, pricing favors the captured and committed over new adopters -- so tread carefully.
Message 5 of 23

I tried that code, but the examples I usually see on the forum are made for DAQmx, and I'm using an NI myDAQ device.

My objective is to have the DAQ reproduce on its output the same signal I feed into its input; when I test, the two have random lag. Once they are synchronized I want to modulate the output signal with a function. I'm viewing both signals on an oscilloscope.

Message 6 of 23

I know there will be latency, but the lag in the waveform is random, not a constant synchronization delay. I would like to find code for the NI myDAQ, because on the NI site I only find code for DAQmx, and when I try changing the physical channels I get errors.

Message 7 of 23

Hi Chris,

 


@Chris9823Alb wrote:

my objective is to have the DAQ reproduce on its output the same signal I feed into its input; when I test, they have random lag


Currently you read your AI signal, then you display it in a chart and output that very same signal on the AO channel. This way the AO channel will always lag the AI reading!

 

You can minimize the lag by reducing the number of samples read from the AI channel, down to a minimum of 1 sample (which still leaves a lag on the order of your sample period)…
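As a sketch of that read-a-few-samples-then-write-immediately approach, here is the same pattern in Python with the nidaqmx bindings: hardware-timed AI paired with a software-timed, on-demand AO that echoes each read straight back out. The device and channel names ("myDAQ1/ai0", "myDAQ1/ao0") and the rate are assumptions:

```python
"""Sketch: hardware-timed AI, software-timed (on-demand) AO.
Each write goes out immediately instead of queueing in a buffer."""

def chan(device: str, subchannel: str) -> str:
    # DAQmx physical channel names look like "myDAQ1/ai0".
    return f"{device}/{subchannel}"

def echo(device: str = "myDAQ1", rate: float = 1000.0,
         samples_per_read: int = 1) -> None:
    import nidaqmx                               # NI-DAQmx Python bindings
    from nidaqmx.constants import AcquisitionType

    with nidaqmx.Task() as ai, nidaqmx.Task() as ao:
        ai.ai_channels.add_ai_voltage_chan(chan(device, "ai0"))
        ao.ao_channels.add_ao_voltage_chan(chan(device, "ao0"))

        # Only AI gets a sample clock; AO stays unclocked and unbuffered,
        # so each write updates the output "now".
        ai.timing.cfg_samp_clk_timing(rate,
                                      sample_mode=AcquisitionType.CONTINUOUS)
        while True:
            data = ai.read(number_of_samples_per_channel=samples_per_read)
            ao.write(data[-1])   # newest sample; lag ≈ samples_per_read / rate
```

With samples_per_read = 1, each loop pass waits one sample period for a fresh reading, so the lag is small and repeatable rather than random, at the cost of loop-rate limits from per-read overhead.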

 


@Chris9823Alb wrote:
on the NI site I just find code for DAQmx, and when I try to change the physical channels I get errors

Because DAQmx is the driver used to handle your myDAQ's AI/AO channels, those DAQmx examples do apply to the myDAQ.

Which errors do you get?

Best regards,
GerdW


using LV2016/2019/2021 on Win10/11+cRIO, TestStand2016/2019
Message 8 of 23

I'm new to LabVIEW and am trying to make a code for this. My system is Windows 11 with LabVIEW 2019 and an NI myDAQ device. I would like to know the right code for this task. I'm getting errors like the one attached, and I changed the buffer to the standard setting (100 samples / 1 kHz) but still get the error.

Message 9 of 23

Hi Chris,

 

Did you even read the (whole) error message?

It mentions several possible solutions to get rid of this error!

Best regards,
GerdW


Message 10 of 23