12-02-2005 11:58 AM
I have hit a roadblock trying to accurately time stamp serial data. In my application the test is the serial communication to the device, so I need to accurately time stamp each byte as it is sent and received, with millisecond accuracy. I am using Windows XP, LabVIEW 7.1, an NI 232/8, and the latest version of NI's VISA driver, version 3.4.1.
I have tried several methods for time stamping the data, including using VISA events for characters at the serial port and high-speed polling, all with the same result. If the baud rate is 19,200 bps, then the byte time should be 0.572916 milliseconds per byte (5.2083e-5 seconds per bit * 11 bits). But when I review the time stamps I see 8 to 11 bytes received within a single millisecond, when there should be about 2 per millisecond. My conclusion is that I am time stamping the data as it is pulled from the serial FIFO memory and not time stamping the data as it comes into the serial port.
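For what it is worth, here is roughly what my "time stamp on read" loop is doing, written out as a text-language sketch (Python with pyserial standing in for the LabVIEW VI; the port name and framing are just placeholders). The deltas between stamps come out far smaller than the theoretical byte time, which is what points me at the FIFO:

import time
import serial   # pyserial; stands in for the VISA reads in the attached VI

BAUD = 19200
BITS_PER_CHAR = 11                              # start + 8 data + parity + stop
BYTE_TIME_MS = BITS_PER_CHAR / BAUD * 1000.0    # ~0.573 ms per byte

port = serial.Serial("COM2", BAUD, timeout=0)   # non-blocking reads; port name is a placeholder
stamps = []
while len(stamps) < 1000:
    b = port.read(1)
    if b:
        # This stamp is taken when the byte is pulled out of the driver buffer,
        # not when it actually arrived at the UART -- which is the whole problem.
        stamps.append((time.perf_counter(), b))

deltas_ms = sorted((b[0] - a[0]) * 1000.0 for a, b in zip(stamps, stamps[1:]))
print("expected ~%.3f ms/byte, measured median delta %.3f ms"
      % (BYTE_TIME_MS, deltas_ms[len(deltas_ms) // 2]))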
So the question is: is there a way the serial data can be accurately time stamped under Windows XP? Could it be done if I were to use a real-time OS? Or should I consider trying to do this in hardware, like a digital I/O board with a level shifter to handle the serial voltages?
Attached is the code I am currently using to time stamp the data. I have it set up so one COM port transmits the data and another COM port receives it. You select the COM ports used, the length of the data, and the baud rate. This requires a serial null-modem cable to connect the two COM ports together.
Thanks for any assistance that can be given on this issue,
Matt
12-03-2005 04:03 PM
It's not impossible...
but it sure ain't going to be easy.
Continue to use the serial port the way you are, but add an analog input device doing continuous double-buffered acquisition.
The acquired signal is time stamped and can be decoded for the data coming in. You know what should be there, because the serial port tells you that. Find the fragments of the analog input signal that match up with each character, and then all you have to do is decide whether the time stamp for any particular character is the time associated with the assertion of the start bit or the dropping of the last stop bit.
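To make that concrete, here is a rough sketch of the decode step (written in Python just to show the idea; the threshold, logic polarity, and 11-bit framing are assumptions on my part, not a tested implementation):

BAUD = 19200
FS = 10 * BAUD                   # 10x over-sampling of the line
SAMPLES_PER_BIT = FS // BAUD     # = 10

def decode_frames(samples, t0, threshold=0.0):
    # samples: the double-buffered analog record of the line
    # t0: time stamp of samples[0], in seconds
    # 1 = mark/idle, 0 = space (flip the comparison for your probe polarity)
    bits = [1 if v > threshold else 0 for v in samples]
    frames = []
    frame_len = 11 * SAMPLES_PER_BIT            # start + 8 data + parity + stop
    i = 1
    while i < len(bits) - frame_len:
        if bits[i - 1] == 1 and bits[i] == 0:   # falling edge = start bit asserted
            t_start = t0 + i / float(FS)        # per-character time stamp
            byte = 0
            for n in range(8):                  # sample each data bit at its cell centre
                centre = i + int((1.5 + n) * SAMPLES_PER_BIT)
                byte |= bits[centre] << n       # LSB first
            frames.append((t_start, byte))
            i += frame_len                      # skip to the end of this character
        else:
            i += 1
    return frames

The same loop gives you the stop-bit time essentially for free (t_start plus 11 bit times), so you can pick whichever edge you want to call "the" time stamp.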
If you asked me to do it, I would ask you to give me a week if there are clear-cut intervals that allow me to re-sync the two acquisition paths. Two weeks if there is no break in the action.
If duplicating the UART in software is not to your liking, a UART on a prototyping board may be easier.
That reminds me!
A couple of years ago I wrote a driver for an SDLC application. The board I used (Comm-Tech?) required that I explicitly handle the I/O operations. Nah, I think the chip had a 64-byte buffer.
So it looks like a UART in hardware or software is the way you will have to go if you really need to time stamp bytes.
Let me guess. Uncle Sam is involved in this requirement?
He seems to come up with the most extreme application req's.
12-03-2005 04:58 PM
Maybe this will help.
http://www.adontec.com/smon_e.htm
Does anyone make a GPS receiver board with serial ports on it? Maybe you could read GPS data in the same loop with the serial port info.
12-05-2005 03:37 AM
One small point that might help you to get closer (but still with jitter) is to go to the system settings and disable the buffer.
We use external hardware (a µC with RSxxx and USB that does the time stamping). And at the very least you can do it with a scope.
12-05-2005 10:56 AM
Thanks all for your suggestions. It looks like what I am getting from all of you is to use external hardware to time stamp the events accurately, like an analog I/O device or a scope. This requirement is not government driven; it comes from protocol testing related to gaming products. I fully understand that Windows is not a real-time environment and may not be a good choice for this application.
I need a solution that will run on XP or LV Real-Time, is CPU-utilization friendly because multiple instances may be run in the future, and can make reasonably good inter-character delay measurements. What I would like to know is whether someone has tried to do this with LV Real-Time, and what their results were. Can I get microsecond timing with Real-Time?
Thanks,
Matt
The suggestion to use Super Monitor will not work:
Windows NT/2000/XP
SuperMonitor doesn't support Windows NT/2000/XP because it's not possible to catch data and events in real-time on these systems. In Windows NT/2000/XP some events will be missed others will be reported in wrong order. In fact you get data that is already buffered. So, it's not possible to record the protocol data as received and apply exact time stamps or monitor every single change in the flow control (RTS/CTS, DTR/DSR, XON/XOFF). In a different situation Windows would be too busy handling interrupt events in real-time and multi tasking would suffer.
12-05-2005 12:33 PM - edited 12-05-2005 12:33 PM
Yes RT would do it.
The method I outlined earlier should work. The device I worked with (years ago and far far away) was an ESCC-PCI interface from CommTech.
Their technical manual was, for the most part, a detailed discussion of the chip they used. The discussion of the chip's operation included details of how it actually parses the data line.
In your case you do not need the full UART capabilities, just the "AR" (asynchronous receiver) part.
You could use the write-up for a comm chip of that type to guide you through designing your code.
Note: Watch your high-end throughput. Over-sampling requires sampling at 10X the baud rate.
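Back-of-the-envelope numbers (mine, assuming one AI channel per monitored line and 16-bit samples):

for baud in (9600, 19200, 57600, 115200):
    fs = 10 * baud                                   # 10x over-sampling
    print("%6d baud -> %7d S/s, about %.0f kB/s of raw data" % (baud, fs, fs * 2 / 1000.0))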
Ben
BTW:
Comm-Tech's support was top notch! You talked directly with the driver developer.
Message Edited by Ben on 12-05-2005 12:35 PM
12-05-2005 05:21 PM
Ben,
I sent Commtech technical support an email, and they said they had nothing to assist in making accurate RS-232 timing measurements. I assume the card you are referring to is the ESCC-PCI-335. I have looked over the manual and found some timing information, but I need to review it in detail if I am going to give it a try. I have put a call in to NI's Technical Support to see what they have to say on this whole timing issue, so I will wait to see what they suggest.
Thanks for your inputs,
Matt
12-05-2005 06:03 PM - edited 12-05-2005 06:03 PM
I was only pointing at Comm-Tech because of the details in their tech manual re: how the chip does its job. There are sufficient details in the chip spec that you could actually code the operation of the chip in software. I was mentioning that in regard to what you do with the data as it is read in from an AI device of some type. After all, the details have already been accounted for in the chip design, including over-sampling.
By over-sampling your input signal you not only get noise rejection, you can also get the timing of your line-state changes resolved to better than a bit-cell time interval (using LabVIEW RT, of course!).
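Rough numbers for what that buys you at 19,200 baud (my arithmetic; one bit cell = 1/19,200 s, about 52 µs):

BAUD = 19200
BIT_CELL_US = 1e6 / BAUD                    # ~52.1 us per bit cell
for oversample in (10, 20, 50):
    fs = oversample * BAUD
    print("%2dx: Fs = %7d S/s, edges resolved to ~%.1f us (bit cell %.1f us)"
          % (oversample, fs, 1e6 / fs, BIT_CELL_US))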
Message Edited by Ben on 12-05-2005 06:05 PM
12-06-2005 04:30 AM