04-01-2006 04:59 AM
I have completed a data acquisition project (LV 7.1) for running latency tests on some older devices we have. The project works great, but some of the equipment is displaying a problem that I would like to create a workaround for. The latency test determines the time it takes the equipment to respond to a control input. There is indeed movement in the response signal, but determining when the movement starts is problematic because of the poor signal being captured from the device.
I have attached a BMP which shows part of a report plot for an actual return signal. What I would like LabVIEW to do is introduce an 'averaging line' through the curve (I have hand-drawn it on the attachment) when a switch is selected by the user. This would at least allow us to determine whether the timings themselves are within the test tolerance while we track down and correct the source of the bad signal.
I have tried all kinds of experiments with the 'best fit' VI, but I can't get the result I want. Maybe someone can give me a pointer to head me in the right direction.
Thanks
HR
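[Editor's note: the 'averaging line' described above is essentially a moving-average smoothing of the captured waveform. A minimal sketch of the idea in Python/NumPy follows; the array contents, window size, and function name are all illustrative, not taken from the original project.]

```python
import numpy as np

def moving_average(signal, window=25):
    """Smooth a noisy 1-D response signal with a simple moving average.
    `window` is an arbitrary illustrative choice; the real data would
    come from the DAQ read in the LabVIEW project."""
    kernel = np.ones(window) / window
    # mode="same" keeps the output the same length as the input
    return np.convolve(signal, kernel, mode="same")

# Example: a simulated noisy step response (step at sample 100)
rng = np.random.default_rng(0)
raw = np.concatenate([np.zeros(100), np.ones(100)]) + rng.normal(0, 0.3, 200)
smooth = moving_average(raw)
```

The smoothed trace follows the underlying step while suppressing the sample-to-sample noise, which is what the hand-drawn averaging line approximates.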
04-01-2006 07:37 AM
04-01-2006 08:20 AM
You could also try this thread.
http://forums.ni.com/ni/board/message?board.id=140&message.id=14560&query.id=315#M14560
04-04-2006 03:45 AM
Thanks, Uncle - I am now experimenting with median filters. Your signal-processing trend JPG shows exactly what I am looking for. I am using LV 7.1 instead of 7.0, and therefore I cannot fully open your VI. Do you know what the toolset you mention is referenced as in 7.1?
HR
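[Editor's note: the median-filter approach mentioned above can be sketched in plain Python/NumPy. This is an illustration of the technique only, not the LabVIEW toolset VI; the kernel size, threshold, and function names are made-up assumptions.]

```python
import numpy as np

def median_filter(x, kernel=15):
    """Running median of a 1-D signal (pure NumPy; kernel should be odd).
    A median filter rejects spiky noise better than a moving average."""
    pad = kernel // 2
    padded = np.pad(x, pad, mode="edge")  # replicate edges to keep length
    windows = np.lib.stride_tricks.sliding_window_view(padded, kernel)
    return np.median(windows, axis=1)

def latency_onset(signal, dt, threshold=0.5):
    """Time of the first filtered sample above `threshold` (a hypothetical
    onset criterion; the real tolerance would come from the test spec)."""
    clean = median_filter(signal)
    return np.argmax(clean > threshold) * dt

# Example: simulated step response at sample 100, 1 kHz sampling
rng = np.random.default_rng(1)
sig = np.concatenate([np.zeros(100), np.ones(150)]) + rng.normal(0, 0.1, 250)
onset = latency_onset(sig, dt=0.001)  # should land near 0.1 s
```

Because the median ignores outliers entirely, the filtered trace crosses the threshold close to the true movement start even when individual samples are badly corrupted.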
04-04-2006 05:02 AM - last edited on 03-07-2025 03:39 PM by Content Cleaner