Hello,
I am using NI-SCOPE to configure my 5122 digitizer card for multi-record acquisition. My trigger signal is connected to the TRIG input of the card, and my data will be acquired on Channel 0.
I have an X stage that I move into position. Then I configure my NI-SCOPE software to begin acquiring data on an external trigger and initiate the acquisition. Next I begin moving the X stage; after it moves 20 microns, the first trigger is generated. After the initial trigger, subsequent triggers are generated every N microns, where N can vary between scans but is constant within a scan. After the acquisition is complete, I display the resulting waveforms.
Please refer to the attached VI (LabVIEW 8.0) and the JPG of the block diagram to see how I have the code set up. I've combined a few different sections of my code into this single VI.
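In case the attachment doesn't come through, here is a rough sketch of the same sequence written against the NI-SCOPE C API instead of LabVIEW. It is only meant to show the flow; the resource name, trigger level, vertical range, sample rate, record length, and record count are placeholders, not the values from my VI, and error checking is omitted.

/* Rough NI-SCOPE C-API sketch of the flow described above.
   All numeric values are placeholders, not the values from my VI. */
#include <stdio.h>
#include <stdlib.h>
#include "niScope.h"

int main(void)
{
    ViSession vi = VI_NULL;
    ViInt32   numRecords = 10;        /* placeholder: one record per stage trigger */
    ViInt32   numSamples = 1000;      /* placeholder: samples per record */
    ViReal64  minSampleRate = 10.0e6; /* placeholder: minimum sample rate (S/s) */

    /* Open a session to the 5122 (placeholder resource name). */
    niScope_init("PXI1Slot2", VI_TRUE, VI_FALSE, &vi);

    /* Channel 0 vertical settings (placeholder range, DC coupling). */
    niScope_ConfigureVertical(vi, "0", 10.0, 0.0, NISCOPE_VAL_DC, 1.0, VI_TRUE);

    /* Multi-record timing: one record per external trigger, 0% reference
       position so each record starts at its trigger. */
    niScope_ConfigureHorizontalTiming(vi, minSampleRate, numSamples, 0.0,
                                      numRecords, VI_TRUE);

    /* Edge trigger on the TRIG connector. The last parameter is the trigger
       delay in seconds -- the value I discuss below (0.0 here). */
    niScope_ConfigureTriggerEdge(vi, NISCOPE_VAL_EXTERNAL, 1.0,
                                 NISCOPE_VAL_POSITIVE, NISCOPE_VAL_DC,
                                 0.0, 0.0);

    /* Arm the digitizer, then start moving the X stage; each stage-generated
       trigger fills one record. */
    niScope_InitiateAcquisition(vi);

    /* After the scan, fetch all records in one call and display them. */
    niScope_SetAttributeViInt32(vi, "", NISCOPE_ATTR_FETCH_NUM_RECORDS,
                                numRecords);
    ViReal64 *wfm = malloc(sizeof(ViReal64) * numSamples * numRecords);
    struct niScope_wfmInfo *info =
        malloc(sizeof(struct niScope_wfmInfo) * numRecords);
    niScope_Fetch(vi, "0", 60.0, numSamples, wfm, info);

    printf("first sample of record 0: %g V\n", wfm[0]);

    free(info);
    free(wfm);
    niScope_close(vi);
    return 0;
}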
You'll see that I have a trigger delay - in this VI it is a constant, but in my application it is a control. I've run various tests, and for some reason my data is always offset by 1.04E-5 seconds, which is why I've had to put in that trigger delay. In other words, if I set the trigger delay to 0, the first 1.04E-5 seconds of my data is noise; with the trigger delay in place, my data is accurate beginning with the first data point.
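If it helps to see the workaround in text form, the delay I'm describing goes into the last parameter of niScope_ConfigureTriggerEdge (trigger delay, in seconds). This helper is just an illustration, not code from my VI; only the 1.04E-5 value is real, everything else is a placeholder like in the sketch above.

#include "niScope.h"

/* Illustrative helper (not from my VI): configure the external edge trigger
   with the workaround delay applied. All values except the delay are
   placeholders. */
static ViStatus ConfigureTriggerWithDelay(ViSession vi, ViReal64 delaySeconds)
{
    return niScope_ConfigureTriggerEdge(
        vi,
        NISCOPE_VAL_EXTERNAL,  /* trigger source: TRIG connector */
        1.0,                   /* placeholder trigger level (V) */
        NISCOPE_VAL_POSITIVE,  /* rising edge */
        NISCOPE_VAL_DC,        /* trigger coupling */
        0.0,                   /* holdoff (s) */
        delaySeconds);         /* trigger delay (s); I have to pass 1.04E-5 */
}

Calling this with a delay of 1.04E-5 instead of 0 is what makes my data line up with the first data point.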
Can anyone figure out where this trigger delay comes from? Thanks.
Steve