Will someone explain to me exactly what relativeInitialX is outputting? The help says "relativeInitialX is the time in seconds from the trigger to the first sample in the fetched waveform." I seem to be getting better accuracy than the scope's specs would suggest. I am assuming that the software is doing some sort of interpolation on the trigger signal to subtract out some of the jitter, but I would like a better explanation.

For example, I have a 5124 12-bit 200 MS/s board. I use the full 200 MS/s, so I am getting a point every 5 ns. I am measuring the peak of a wavelet, taking that time, and adding on the relativeInitialX. I seem to be getting sub-nanosecond time "accuracy". If I take the relativeInitialX out, I get the 5 ns accuracy I would expect based on the scope's specs.

Long story short, I just need to know how relativeInitialX is calculated.
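For concreteness, here is roughly the calculation I am doing (a minimal, self-contained C sketch; the wfm buffer, relativeInitialX, and xIncrement values are stand-ins for what comes back from the fetch, with xIncrement = 5 ns at 200 MS/s):

    #include <stdio.h>

    /* Time of the waveform peak, measured from the trigger.
     * relativeInitialX: seconds from the trigger to the first fetched sample
     * xIncrement:       sample period (5 ns at 200 MS/s)
     */
    static double peak_time(const double *wfm, int numSamples,
                            double relativeInitialX, double xIncrement)
    {
        int iPeak = 0;
        for (int i = 1; i < numSamples; i++)
            if (wfm[i] > wfm[iPeak])
                iPeak = i;

        /* iPeak * xIncrement can only move in whole 5 ns sample steps;
         * relativeInitialX adds the trigger-to-first-sample offset. */
        return relativeInitialX + (double)iPeak * xIncrement;
    }

    int main(void)
    {
        /* Stand-in data: a crude wavelet peaking at sample 3. */
        double wfm[] = { 0.0, 0.2, 0.7, 1.0, 0.6, 0.1 };
        double relativeInitialX = 1.23e-9;  /* made-up value for illustration */
        double xIncrement = 5e-9;           /* 200 MS/s -> 5 ns per point */

        printf("peak at %.3g s after the trigger\n",
               peak_time(wfm, 6, relativeInitialX, xIncrement));
        return 0;
    }

Since the iPeak * xIncrement term is quantized to the 5 ns sample clock, any sub-nanosecond resolution I am seeing must be coming from relativeInitialX itself, which is why I want to know how it is calculated.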
Mark Mutton
Electrical Engineer