Dynamic Signal Acquisition


Parallel continuous voltage output and continuous voltage acquisition, board model PCIe-6351

I want to run continuous voltage output and continuous voltage acquisition in parallel, so that I can change the output voltage in real time from outside the loop and pass the acquired data out without delay.
The program has two loops: one for continuous voltage output (sample rate 1 MS/s, 100 samples per block, signal frequency 10 kHz) and one for data acquisition (sample rate 1 MS/s, 100 samples per block).
First question: outside the loops there is a state machine that changes the output voltage through local variables and then passes the acquired data through a queue, where it is used to compute an evaluation function. How can I be sure that the data I collect was acquired after the voltage change? I tried to determine the delay in software by inserting a manually adjusted wait between the write and the read, but the PC clock is only accurate to about a millisecond, so I am not sure this is valid. With a wait of at least 7 ms, the data that comes back does reflect the changed voltage.
Second question: to minimize the DAC-to-ADC round-trip delay, I used a shared hardware sample clock to put the write and read in one loop. However, if the number of samples per block is too small, the program will not run; it needs at least 2000 samples. How can I reduce the block size?
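[Editor's note: a quick arithmetic sketch of the block timings described above; the 7 ms figure is the wait mentioned in the post.]

```python
# Timing arithmetic for the configuration described above.
sample_rate = 1_000_000      # 1 MS/s
block_size = 100             # samples per read/write block

block_period_s = block_size / sample_rate
print(block_period_s)        # 0.0001 s -> a new 100-sample block every 100 us

# A software wait of 7 ms therefore spans about 70 whole blocks,
# which is why the changed voltage is always visible after the wait.
wait_s = 0.007
print(round(wait_s / block_period_s))
```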

Message 1 of 5

You need to synchronize your AO and AI tasks. See this discussion forum post for reference: https://forums.ni.com/t5/LabVIEW/Sync-Daqmx-AO-with-AI/td-p/3809098. Once your tasks are synchronized, you will know (within a sample or two) that the acquired samples correspond exactly to the generated samples. Do NOT add a wait in your code.
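[Editor's note: a minimal sketch of that synchronization using the nidaqmx Python package, since LabVIEW diagrams cannot be shown in text. The device name "Dev1" and the channel numbers are placeholders; in LabVIEW the equivalent is wiring the AI task's sample clock source to /Dev1/ao/SampleClock. The DAQmx calls need real hardware, so they are kept inside main().]

```python
def main():
    import numpy as np
    import nidaqmx
    from nidaqmx.constants import AcquisitionType

    rate = 1_000_000          # 1 MS/s, as in the original post
    block = 100               # samples per block

    with nidaqmx.Task() as ao, nidaqmx.Task() as ai:
        ao.ao_channels.add_ao_voltage_chan("Dev1/ao0")
        ai.ai_channels.add_ai_voltage_chan("Dev1/ai0")

        ao.timing.cfg_samp_clk_timing(
            rate, sample_mode=AcquisitionType.CONTINUOUS)
        # Key step: clock the AI task from the AO sample clock, so every
        # acquired sample lines up with a generated sample.
        ai.timing.cfg_samp_clk_timing(
            rate, source="/Dev1/ao/SampleClock",
            sample_mode=AcquisitionType.CONTINUOUS)

        # Prime the AO buffer before starting, then start AI first:
        # it sits armed, waiting for the AO clock edges.
        ao.write(np.zeros(2 * block), auto_start=False)
        ai.start()
        ao.start()

        data = ai.read(number_of_samples_per_channel=block)

if __name__ == "__main__":
    main()
```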

 

You are attempting to run loops at 100/1M = 0.1 ms per iteration, which is a thousand times faster than the typical recommendation (loop period ~0.1 s). Since it looks like you are modifying the generation parameters manually, why is such a fast loop rate important? Also, be aware that your flat sequence structure will run exactly once: the current implementation will change the voltage amplitude only once, after the 2000 ms delay completes.

 

If you are trying to put AO and AI in the same loop, remember to prime the AO buffer before starting the synchronized tasks. Then, be aware that your AI Read lags the AO Write by the length of the signal you used to prime the buffer (typically twice the block size).
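[Editor's note: the AO-to-AI offset Doug describes can be put in numbers; the two-block priming figure here is the "typically twice the block size" from above.]

```python
# Pipeline delay introduced by priming the AO buffer before starting.
rate = 1_000_000           # 1 MS/s
block = 100                # samples per block
primed_blocks = 2          # typical: prime with twice the block size

offset_samples = primed_blocks * block
offset_s = offset_samples / rate
print(offset_samples, offset_s)   # AI Read lags AO Write by 200 samples = 200 us
```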

Post back once you have synchronized tasks.

Doug
Enthusiast for LabVIEW, DAQmx, and Sound and Vibration
Message 2 of 5

Hello,
I am running an experiment that implements a closed loop using the SPGD algorithm. The program must change the output voltage many times within a short period and acquire data to feed back to the algorithm. What I am unsure about is whether the data I acquire immediately after changing the voltage already reflects the new voltage. I need a fast response; a delay of 1 ms is acceptable.
Neither separating the two loops nor synchronizing them meets my requirements: with too many samples per block, an external voltage change shows up only after a delay, while with too few samples the program will not run. I really don't know what to do.
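[Editor's note: for readers unfamiliar with it, the SPGD (stochastic parallel gradient descent) update the poster refers to can be sketched in plain Python. The toy metric J below is a stand-in; in the real setup J would be the evaluation function computed from the acquired AI data, and v would be the AO voltages.]

```python
import random

random.seed(0)  # deterministic demo

def spgd_step(v, J, gamma=2.0, delta=0.05):
    """One SPGD iteration: perturb every control voltage by a random
    +/-delta, measure the resulting metric change, and step along the
    perturbation in proportion to that change."""
    dv = [delta if random.random() < 0.5 else -delta for _ in v]
    j_plus = J([vi + di for vi, di in zip(v, dv)])
    j_minus = J([vi - di for vi, di in zip(v, dv)])
    dj = j_plus - j_minus
    # Gradient ascent on J; use a negative gamma to minimize instead.
    return [vi + gamma * dj * di for vi, di in zip(v, dv)]

# Toy metric with its maximum at v = (1, -2).
J = lambda v: -(v[0] - 1) ** 2 - (v[1] + 2) ** 2
v = [0.0, 0.0]
for _ in range(2000):
    v = spgd_step(v, J)
print([round(x, 3) for x in v])   # should land close to [1.0, -2.0]
```

Each iteration needs two metric evaluations (two DAC-write / ADC-read round trips), which is why the poster's per-block latency matters so much.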

Message 3 of 5

About the sequence structure in my program: the 2-second wait at the start is there to let the other parts of the program start first. The program then outputs the voltage through local variables and collects data (100 samples) through a queue. Between the two I added a high-resolution wait control so I could manually adjust the wait time (500 µs, 1 ms, 5 ms, 10 ms) and determine how long it takes for the new waveform to appear. But this approach may not be entirely correct: PC timing is only accurate to about a millisecond, so this way of measuring the response time may be wrong.

Message 4 of 5

Ah, I was misreading your first post as "...I need to measure the response with minimal delay...". Now I understand that yours is a control application. To minimize latency, you want to use hardware-timed single-point sample mode.

For your control application, these examples can be cobbled together:
DAQmx>>Analog Input>>Voltage - HW-Timed Single Point Input.vi
DAQmx>>Analog Output>>Voltage - HW-Timed Single Point Output.vi

Control Design and System ID Module>>General PID Simulator.vi
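[Editor's note: a sketch of the hardware-timed single-point pattern those shipping examples implement, written with the nidaqmx Python package since LabVIEW diagrams cannot be shown here. Device/channel names, the 1 kHz loop rate, and the `controller` placeholder are assumptions; real hardware is required, so the DAQmx calls are kept inside main().]

```python
def controller(x):
    # Placeholder for the actual control law (SPGD, PID, ...).
    return 0.0

def main():
    import nidaqmx
    from nidaqmx.constants import AcquisitionType

    rate = 1000  # Hz; one sample in and one sample out per clock tick

    with nidaqmx.Task() as ao, nidaqmx.Task() as ai:
        ao.ao_channels.add_ao_voltage_chan("Dev1/ao0")
        ai.ai_channels.add_ai_voltage_chan("Dev1/ai0")
        ao.timing.cfg_samp_clk_timing(
            rate, sample_mode=AcquisitionType.HW_TIMED_SINGLE_POINT)
        ai.timing.cfg_samp_clk_timing(
            rate, source="/Dev1/ao/SampleClock",
            sample_mode=AcquisitionType.HW_TIMED_SINGLE_POINT)
        ai.start()
        ao.start()

        v = 0.0
        for _ in range(1000):   # 1 s of 1 kHz closed loop
            ao.write(v)         # paced by the hardware clock
            x = ai.read()       # one sample, aligned to the same clock
            v = controller(x)

if __name__ == "__main__":
    main()
```

The point of HW-timed single point is that there is no buffer between the loop and the converters, so the loop-rate-versus-block-size trade-off from the earlier posts disappears.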

 

Again, post back when your generation and acquisition are synchronized and up and running. Then you can integrate the control algorithm.

Doug
Enthusiast for LabVIEW, DAQmx, and Sound and Vibration
Message 5 of 5