08-14-2012 09:08 AM
Hello, I'm trying to write a program that drives a stage with two signals: a linear ramp for the Y position and an oscillating signal for the X position.
The X signal starts at zero amplitude and is ramped up until it reaches the desired amplitude (300). Then the Y signal drives the stage up and down, and finally the X amplitude is ramped back down from 300 to 0.
The problem is that while the amplitude is increasing the change is continuous, but while it is decreasing there is a sudden jump. I don't see it in LabVIEW, but on the oscilloscope the signal gets stuck at a certain value for a moment and then jumps. I have checked the values being sent, and they are the same during the increase and the decrease (only reversed: from 0 to 300 on the way up, and from 300 to 0 on the way down).
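For reference, the logic I'm implementing is roughly this (a minimal Python/NumPy sketch, not my actual VI; the sample rate, X frequency, and ramp/hold durations are placeholders):

import numpy as np

FS = 10_000          # sample rate in Hz (assumed)
F_X = 100            # X oscillation frequency in Hz (assumed)
A_MAX = 300          # target X amplitude, as in my VI
RAMP_T = 1.0         # ramp-up / ramp-down duration in seconds (assumed)
HOLD_T = 2.0         # time during which Y sweeps the stage (assumed)

t_up = np.arange(0, RAMP_T, 1 / FS)
t_hold = np.arange(0, HOLD_T, 1 / FS)
t_down = np.arange(0, RAMP_T, 1 / FS)

# Amplitude envelope: ramp 0 -> 300, hold at 300, ramp 300 -> 0.
env = np.concatenate([
    A_MAX * t_up / RAMP_T,
    np.full_like(t_hold, A_MAX),
    A_MAX * (1 - t_down / RAMP_T),
])

# Continuous phase over the whole run, so the sine itself has no jumps.
t = np.arange(env.size) / FS
x_signal = env * np.sin(2 * np.pi * F_X * t)

# Y: linear up-and-down sweep during the hold phase only (placeholder shape).
y_signal = np.concatenate([
    np.zeros_like(t_up),
    np.interp(t_hold, [0, HOLD_T / 2, HOLD_T], [0, 1, 0]),
    np.zeros_like(t_down),
])

As the sketch shows, the down-ramp of the envelope is just the mirror of the up-ramp, which matches the values I see being sent; yet only the down-ramp shows the jump on the scope.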
I have no idea what I am doing wrong or what I could change to solve this; any idea is more than welcome.
Thank you very much in advance,
marc
08-22-2012 05:21 AM
Reposted here:
http://forums.ni.com/t5/LabVIEW/Avoiding-jitter/td-p/2127098