Real problem with synchronisation using a delay with attached code

Hi
 
I was hoping somebody out there might be able to check the DAQmx code I have written for synchronised analogue input/output in my application.
 
The code is supposed to run in the following manner:
 
The For Loop on the top left generates the array of data points to write to the AO Write subVI (the start voltage, stop voltage, and number of steps in the AO voltage are the parameters that determine the output of this loop).
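In text form, that ramp-generation loop amounts to something like the following. This is a minimal Python sketch of the logic only (the function name and signature are my own, not anything from the attached VI):

```python
def sweep_points(start_v, stop_v, n_steps):
    """Evenly spaced AO voltages from start_v to stop_v, inclusive,
    mirroring the For Loop that builds the sweep array."""
    if n_steps < 2:
        return [start_v]
    step = (stop_v - start_v) / (n_steps - 1)
    return [start_v + i * step for i in range(n_steps)]

print(sweep_points(0.0, 1.0, 5))  # [0.0, 0.25, 0.5, 0.75, 1.0]
```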
 
In the main For Loop the AO output is swept according to the settings generated above. On each step of the analogue output voltage there is a short delay, set by the DAQmx trigger, before data is sampled on the AI channel. The loop executes N times and the mean of the N sweeps is plotted to the graph (outside the loop).
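The step-delay-sample-average structure described above can be sketched like this. Here `read_ai(v)` is a hypothetical stand-in for the hardware write/read step, not a real DAQmx call, so the averaging logic can be seen on its own:

```python
import time

def sweep_and_average(points, n_sweeps, read_ai, settle_s=0.001):
    """Sweep the AO points n_sweeps times and return the per-point mean.

    read_ai(v) is a stand-in for the hardware step: set the AO to v,
    wait for the trigger delay, and return one AI sample."""
    sums = [0.0] * len(points)
    for _ in range(n_sweeps):
        for i, v in enumerate(points):
            time.sleep(settle_s)   # short settling delay before sampling
            sums[i] += read_ai(v)
    return [s / n_sweeps for s in sums]
```

For example, with a loopback that returns twice the output voltage, `sweep_and_average([0.0, 0.5, 1.0], 3, lambda v: 2 * v)` returns `[0.0, 1.0, 2.0]`.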
 
However, I don't think I have got the synchronisation right. It certainly looks, when I try to run my instrument hardware, like the data is out of synchronisation. Can anybody see any obvious errors in the code that would account for this, and does anybody have any suggestions for improvement?
 
Hope someone can help.
 
Many thanks
 
Ashley
 
 
Message 1 of 6
Post your code so we can look at it.  How can you tell the data is out of sync?  Do you have a trigger from the AO to the AI so that the AI will read after the AO has completed writing?
- tbob

Inventor of the WORM Global
Message 2 of 6
Apologies - I forgot to attach the code to the original message.
 
It is now attached.
 
The reason I believe there is a lack of synchronisation is that the hardware runs oddly. I can't see why from looking at the code, and was wondering if someone more experienced in DAQmx may be able to spot the problem.
 
Many thanks
 
Ashley
 
 
Message 3 of 6

Hi Ashley,

Thanks for posting on the NI forums.

I have taken a quick look at your code, and it seems OK to me, although it does seem strange that you are making the AO operation wait for the AI operation's trigger.

I have run your code 'as is' with my DAQ card here, with the signal simply shorted from the AO to the AI. The results I'm seeing indicate that the values output and the values acquired are the same, and seem fairly well synchronised, i.e. I get a linear curve back (see attached jpeg).

If you could elaborate on a) what you expect to see, and b) what you are currently seeing, then we'll see what we can do.

Thanks,

National Instruments | Northern California
Message 4 of 6
Hi Rob
 
I suspect that what I may actually have is a signal conditioning hardware problem as opposed to a software problem. I did the same test as you did with the code I posted on the message board, and it looked fine.
 
The reason I believed it was a software problem was that the signal (a Gaussian peak) was shifting and distorting, and the severity of the problem depended on the sampling rate and the width (voltage range) of the scan. Most noticeably, there seemed to be an inherent delay which was producing a severe tailing edge on the signal.
 
I will evaluate the hardware and let you know how I get on.
 
Best Regards.
 
Ashley.
Message 5 of 6

Hi again Ashley,

I suspect you may be onto something there. Depending on the hardware involved, it is common for signal conditioning hardware to multiplex the conditioned analogue input channels through to one 'ai' channel on your DAQ board. This could produce a delay similar to the one you are seeing.
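If the delay really does turn out to be a fixed lag of a whole number of samples, it can in principle be undone in post-processing by shifting the acquired trace. A toy sketch of that idea (illustration only; the proper fix is to configure the inter-channel or settling delay in the DAQmx task or conditioning hardware):

```python
def realign(samples, lag):
    """Undo a fixed acquisition lag of `lag` samples by shifting the trace.
    Trailing points with no data are padded with the last valid value."""
    if lag <= 0:
        return list(samples)
    shifted = list(samples[lag:])
    shifted += [shifted[-1]] * lag
    return shifted

# A ramp acquired 2 samples late: the first two readings are stale values.
measured = [0, 0, 1, 2, 3, 4]
print(realign(measured, 2))  # [1, 2, 3, 4, 4, 4]
```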

May I enquire as to what signal conditioning hardware you are using, and in what configuration?

Cheers,

National Instruments | Northern California
Message 6 of 6