12-18-2023 08:17 AM
Hi! This is probably a very naive question, but the setup is the following:
- I have one NI USB-6002 receiving digital signals on lines P0.0 - P0.4. The data are relayed to the PC over the USB cable and processed/recorded by a Simulink script, which generates a matrix file as output.
I understand that there is a delay between the moment the device detects a change on the input lines and the moment that change is transmitted and written in Simulink, but I don't have a clear idea of how long that takes (1 ms? 10 ms?).
Based on some empirical tests it seems to be under 10 ms, but is there any way I could measure the delay more accurately, or reduce it overall?
If more information is needed, please tell me. Apologies in advance for my ignorance; I'm not an engineer and I'm still learning about the hardware and its limitations.
12-18-2023 08:59 AM
It could be on the order of milliseconds, but it is non-deterministic due to the nature of the USB protocol.
12-19-2023 04:10 AM
Thank you for answering. I understand that it may not be possible to determine the precise latency at each sampling point, but do you have any recommendation as to how I could more accurately determine the range of this delay?
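For what it's worth, would a hardware loopback test (done outside Simulink) be a reasonable way to bound it? The idea would be to jumper one output line (e.g. P0.7) to one of the input lines (e.g. P0.0, with its usual signal source disconnected during the test), drive an edge in software, and time how long it takes to read the edge back. Below is a minimal sketch using NI's nidaqmx Python package; the device name "Dev1" and the specific lines are assumptions that would have to match whatever NI MAX shows for my setup.

import time
import nidaqmx
from nidaqmx.constants import LineGrouping

# Round-trip loopback timing sketch (assumes a jumper wire from P0.7 to P0.0
# on the USB-6002 and that the device is listed as "Dev1" in NI MAX).
with nidaqmx.Task() as out_task, nidaqmx.Task() as in_task:
    out_task.do_channels.add_do_chan(
        "Dev1/port0/line7", line_grouping=LineGrouping.CHAN_PER_LINE)
    in_task.di_channels.add_di_chan(
        "Dev1/port0/line0", line_grouping=LineGrouping.CHAN_PER_LINE)

    latencies_ms = []
    for _ in range(100):
        out_task.write(False)            # make sure the line starts low
        time.sleep(0.01)                 # let the low state settle
        t0 = time.perf_counter()
        out_task.write(True)             # drive the rising edge
        while not in_task.read():        # poll until the edge is read back
            pass
        latencies_ms.append((time.perf_counter() - t0) * 1000.0)

    latencies_ms.sort()
    print(f"min {latencies_ms[0]:.2f} ms, "
          f"median {latencies_ms[len(latencies_ms) // 2]:.2f} ms, "
          f"max {latencies_ms[-1]:.2f} ms")

My understanding is that this measures a round trip (one software-timed write plus at least one read), so it would overestimate the one-way input delay, but the spread between the min and max over many trials should at least show how variable the USB timing is. Does that sound sensible?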