Multifunction DAQ


Synchronize AO/AI buffered data graphs and measure data larger than buffer size

I am trying to measure the response time (around 1 ms) of the pressure drop indicated by AI channel 0 when AO channel 0 sends a single negative pulse to the unit under test (a valve). DAQ board: Keithley KPCI-3108; LabVIEW version: 6.1; OS: Windows 2000 Professional.
My problem is that I get differently timed graphs from the AI and AO channels every time I run my program; only the first run gives a real-time graph. I tried decreasing the buffer size below the board's maximum buffer size (2048 samples), but the AI channel still shows a non-real-time graph. My guess is that it is still reading old data from the buffer while AO writes the new buffer data. In my program, the AO and AI parts are separate: the AO Write buffer is inside a while loop, while the AI Read is not. Could that be the problem, or is it something else?
Also, I am trying to acquire much more data than the board's buffer size limit allows. Is it possible to make that measurement by modifying the program?
I really appreciate any of your help. Thank you very much!

Best,
Jenna
Message 1 of 3
Jenna,

You can modify the X-axis of a chart or graph in LabVIEW to display real time. I have included a link below to an example program that illustrates how to do this.

If you are doing a finite, buffered acquisition, make sure that you always read everything from the buffer on each run. In other words, if you set a buffer size of 5000, then make sure you read 5000 scans (set number of scans to read to 5000). This ensures you are reading new data every time you run your program. You could also put the AI Read VI inside a loop and read a smaller number of scans from the buffer until the buffer is empty (monitor the scan backlog output of the AI Read VI to see how many scans are left in the buffer).
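For anyone following this thread outside LabVIEW, here is a minimal sketch of that chunked-read loop using the modern nidaqmx Python API. The device name "Dev1", the channel, and the sample rate are placeholders, and the original program uses LabVIEW 6.1 Traditional NI-DAQ VIs, so treat this as conceptual only:

import nidaqmx
from nidaqmx.constants import AcquisitionType

BUFFER_SIZE = 5000   # finite acquisition: read back exactly this many scans
CHUNK = 500          # read smaller blocks inside a loop

with nidaqmx.Task() as ai_task:
    ai_task.ai_channels.add_ai_voltage_chan("Dev1/ai0")  # placeholder device/channel
    ai_task.timing.cfg_samp_clk_timing(
        rate=10000.0,                          # placeholder sample rate
        sample_mode=AcquisitionType.FINITE,
        samps_per_chan=BUFFER_SIZE,
    )
    ai_task.start()

    data = []
    remaining = BUFFER_SIZE
    while remaining > 0:
        # Analogue of calling AI Read in a loop: pull one chunk, then check
        # how many scans are still waiting (the "scan backlog").
        chunk = ai_task.read(number_of_samples_per_channel=min(CHUNK, remaining))
        data.extend(chunk)
        remaining -= len(chunk)
        print("read", len(chunk), "samples; backlog =",
              ai_task.in_stream.avail_samp_per_chan)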

You can set a buffer size larger than the FIFO buffer of the hardware. The buffer size you set in LabVIEW is actually a software buffer size within your computer's memory. The data is acquired with the hardware, stored temporarily within the on-board FIFO, transferred to the software buffer, and then read in LabVIEW.
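As a concrete illustration of that software-buffer point (same caveats as above: a nidaqmx Python sketch rather than LabVIEW 6.1, with a placeholder device name and rate), you can request a buffer far larger than the board's 2048-sample FIFO:

import nidaqmx
from nidaqmx.constants import AcquisitionType

with nidaqmx.Task() as ai_task:
    ai_task.ai_channels.add_ai_voltage_chan("Dev1/ai0")   # placeholder
    # samps_per_chan sizes the software buffer in PC memory; it can be
    # much larger than the hardware FIFO, which only stages the data.
    ai_task.timing.cfg_samp_clk_timing(
        rate=10000.0,
        sample_mode=AcquisitionType.CONTINUOUS,
        samps_per_chan=200000,        # ~100x a 2048-sample FIFO
    )
    ai_task.start()
    for _ in range(10):
        # Keep reading; the driver streams FIFO -> software buffer -> here,
        # so the total acquired can exceed any on-board limit.
        block = ai_task.read(number_of_samples_per_channel=1000)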

Are you trying to create a TTL square wave with the analog output of the DAQ device? If so, the DAQ device has counters that can generate a highly accurate digital pulse as well. Just a suggestion. LabVIEW has a variety of shipping examples that are geared toward using counters (find examples>>DAQ>>counters). I hope this helps.
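A sketch of the counter suggestion, again in nidaqmx Python with a placeholder device name; with no timing configured, starting a counter-output task emits a single pulse:

import nidaqmx
from nidaqmx.constants import Level

with nidaqmx.Task() as ctr_task:
    # One hardware-timed digital pulse on counter 0: idle high, drop low
    # for 1 ms (the pulse width mentioned at the top of the thread).
    ctr_task.co_channels.add_co_pulse_chan_time(
        "Dev1/ctr0",                 # placeholder counter
        idle_state=Level.HIGH,
        low_time=0.001,
        high_time=0.001,
    )
    ctr_task.start()                 # no timing configured -> single pulse
    ctr_task.wait_until_done()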

Real-Time Chart Example
http://venus.ni.com/stage/we/niepd_web_display.DISPLAY_EPD4?p_guid=B45EACE3E95556A4E034080020E74861&p_node=DZ52038&p_submitted=N&p_rank=&p_answer=&p_source=Internal

Regards,

Todd D.
National Instruments
Applications Engineer
Message 2 of 3
Todd,

Thank you so much for your answer. I made big progress: I can now get real time on every run by putting my AI Read inside a while loop, wiring all the error in/out terminals together, and adding the "AO Trigger Config" sub-VI. You really helped me a lot :)
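In case it helps anyone else searching later, the trigger part of that fix translates roughly like this (a sketch in the nidaqmx Python API with placeholder device name, rates, and pulse values; the real program is LabVIEW 6.1): the AI task is armed first and waits on the AO task's start trigger, so both buffers start on the same clock edge and the graphs line up run after run.

import nidaqmx
from nidaqmx.constants import AcquisitionType

with nidaqmx.Task() as ai_task, nidaqmx.Task() as ao_task:
    ai_task.ai_channels.add_ai_voltage_chan("Dev1/ai0")       # placeholder
    ai_task.timing.cfg_samp_clk_timing(
        10000.0, sample_mode=AcquisitionType.FINITE, samps_per_chan=2000)
    # The "AO Trigger Config" step: start AI on the AO start trigger.
    ai_task.triggers.start_trigger.cfg_dig_edge_start_trig(
        "/Dev1/ao/StartTrigger")

    ao_task.ao_channels.add_ao_voltage_chan("Dev1/ao0")       # placeholder
    ao_task.timing.cfg_samp_clk_timing(
        10000.0, sample_mode=AcquisitionType.FINITE, samps_per_chan=2000)
    # A single negative pulse: baseline, -5 V for 1 ms, baseline again.
    ao_task.write([0.0] * 500 + [-5.0] * 10 + [0.0] * 1490, auto_start=False)

    ai_task.start()   # armed, waiting for the trigger
    ao_task.start()   # emits the start trigger; AO and AI begin together
    data = ai_task.read(number_of_samples_per_channel=2000)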
But the link you attached doesn't work, I have tried several times. Could you please resend the link or attach the vi? I do need it!

Thanks again,
Jenna
Message 3 of 3