10-05-2012 03:24 PM
I've got a VI here that takes points from an oscilloscope and plots them. I choose the length of the record, and the x-increment value is based on the record length and the number of points retrieved from the scope (I input -1 to get the maximum). However, the x-increment value inexplicably changes when it is plotted.
When probing the VI to find the problem, I see that within the Fetch Waveform.vi the values are calculated correctly, but once they are sent to the waveforms they suddenly change to new values.
Does anyone have any idea why these values change and why my plots do not match my indicated record length?
Thanks!
The VI is attached for reference.
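To spell out the intended arithmetic, here is a minimal Python sketch of how the x-increment should come out (the variable names are mine, stand-ins for the VI's inputs, not actual terminal names):

```python
# Intended relationship between record length, point count, and x-increment.
record_length = 1e-5   # requested time record in seconds (example value)
num_points = 10000     # points actually fetched; -1 on the fetch asks for the maximum

delta_t = record_length / num_points
print(delta_t)         # expected x-increment: 1e-09 s
```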
10-05-2012 03:36 PM
This appears to be a duplicate of your other post: http://forums.ni.com/t5/LabVIEW/How-to-fix-waveform-points-plots-read/m-p/2176958
10-05-2012 03:44 PM
Sorry, I wasn't able to find it, and I had since seemingly figured out where the problem was occurring; I wasn't sure how to update it.
10-05-2012 03:45 PM
Do I delete this one and update the last one?
10-05-2012 04:04 PM
Since this thread contains more recent information it's best to continue the discussion here.
FYI: Clicking on your username will show all posts you've made.
10-08-2012 06:07 PM
I've been at this over the weekend to see if I could fix it, but to no avail. Has anyone else got a bit of insight to share?
10-08-2012 06:28 PM
You need to provide more details:
Can you reduce your VI to a single graph, fed by diagram constants containing the data from Fetch Waveform?
10-08-2012 06:44 PM - edited 10-08-2012 06:57 PM
Sorry, I'll try to be a little more clear.
Of the four graphs plotted, the one with the incorrect axis is the one labeled Channel 1 and 2, which is the top left-most one. I've attached the Fetch Waveform subVI. It gathers the y-values (voltage in this case) and x-values from the oscilloscope and brings the data into LabVIEW. The three values the VI outputs are the y-values, the x-increment or my delta_t (which is calculated from the record length I specify and the number of points gathered, i.e. time / # of pts = delta_t), and the initial x-value, with x being my time axis.

Now, when I run the program I should be able to calculate the delta_t based on how long the time record is and how many points are gathered. However, once I run the program the Ch 1 and 2 plot's x-values change: the record length that is plotted is much longer than I have specified. I probed the wires going right into the Ch 1 + 2 plot, and they indicated that the delta_t value had changed to a longer time (it also varies according to the record length you indicate). However, if I probe the value of delta_t within the Fetch Waveform VI, the correct values are displayed.
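In case it helps, here is a rough Python analogue of what I mean by the three outputs getting bundled into a waveform (this is just a sketch of the idea, not the actual LabVIEW Build Waveform node):

```python
def build_waveform(y_values, delta_t, t0):
    """Hypothetical text analogue of pairing the Y array with its
    x-increment (dt) and initial x-value (t0)."""
    return {"Y": y_values, "dt": delta_t, "t0": t0}

record_length = 1e-4                        # requested time record, seconds
num_points = 50000                          # points gathered
expected_dt = record_length / num_points    # 2e-9 s

wf = build_waveform([0.0] * num_points, expected_dt, 0.0)
# If the graph's x-axis implies a different dt, something between the
# fetch and the graph terminal has rewritten it.
assert wf["dt"] == expected_dt
```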
10-08-2012 06:56 PM - edited 10-08-2012 07:24 PM
Also, I should mention that upon further inspection, within the Ch 1+2 graph it looks like only Channel 1's values are affected by this, and by extension so is the top-right plot, Ch 1 Auto-correlation (which makes sense). Channel 2 seems to be plotting okay.
10-08-2012 07:33 PM
I've also made a chart of how the values change:
Requested record (s) | # of points | Expected delta_t (s) | Actual delta_t (s) | New record (s)
1e-5                 | 10000       | 1e-9                 | 5e-9               | 5e-5
1e-4                 | 50000       | 2e-9                 | 50e-9              | 2.5e-3
2e-3                 | 1000000     | 2e-9                 | 1e-6               | 1
1e-3                 | 499980      | 2e-9                 | 500e-9             | 0.25
2e-2                 | 1000000     | 2e-8                 | 10e-6              | 10
1e-2                 | 1000000     | 1e-8                 | 5e-6               | 5
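To check the numbers above in one place, here is a quick Python pass over the same values (all copied from the chart, everything in seconds):

```python
rows = [
    # (requested record, # of points, expected dt, actual dt)
    (1e-5, 10_000,    1e-9,   5e-9),
    (1e-4, 50_000,    2e-9,  50e-9),
    (2e-3, 1_000_000, 2e-9,   1e-6),
    (1e-3, 499_980,   2e-9, 500e-9),
    (2e-2, 1_000_000, 2e-8,  10e-6),
    (1e-2, 1_000_000, 1e-8,   5e-6),
]

for requested, n, expected_dt, actual_dt in rows:
    # expected dt really is record length / points (to rounding)
    assert abs(requested / n - expected_dt) / expected_dt < 1e-3
    # the plotted record is just actual dt times the point count
    print(f"new time record = {actual_dt * n:.2e} s, "
          f"dt inflated {actual_dt / expected_dt:.0f}x")
```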