Hey,
I'm trying to compute the derivative of measured analog data on an FPGA using Newton's difference quotient. I have a loop that samples the data from the hardware at a fixed period. In that loop I keep the previous sample in a shift register, take the difference between the current and previous data points, and divide it by the sampling time. The problem is that when my input is a sine wave from a function generator, the derivative values come out too large by a factor of 2*pi*f (where f is the sine's frequency). It would be easy to divide the result by 2*pi*f, but the frequency (and the signal in general) can't be known beforehand. So what am I doing wrong? I suspect it's something to do with the sampling time, i.e. dt.
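To make the setup concrete, here is a minimal host-side sketch (not the FPGA code itself) of the loop I described: a one-sample shift register, the difference between the current and previous samples, divided by dt. The signal parameters (10 Hz sine, 10 kHz sampling) are just example values for illustration. Note that the analytic derivative of sin(2*pi*f*t) is 2*pi*f*cos(2*pi*f*t), so the quotient's peak really should be 2*pi*f times the input amplitude:

```python
import math

def derivative_stream(samples, dt):
    """Newton's quotient (x[n] - x[n-1]) / dt, mimicking the FPGA loop
    with a one-sample shift register holding the previous value."""
    prev = None
    out = []
    for x in samples:
        if prev is not None:
            out.append((x - prev) / dt)
        prev = x  # shift register update
    return out

# Example signal: unit-amplitude 10 Hz sine sampled at 10 kHz.
f = 10.0
fs = 10_000.0
dt = 1.0 / fs
samples = [math.sin(2 * math.pi * f * n * dt) for n in range(1000)]

deriv = derivative_stream(samples, dt)
# Peak magnitude is close to 2*pi*f (~62.8), not the input amplitude 1.0.
print(max(abs(d) for d in deriv))
```

If dt here is given in the wrong units (e.g. loop ticks instead of seconds), the output is scaled by a constant factor, which is why I suspect my dt handling.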