Millisecond Time Stamping of Serial Data

I have hit a roadblock trying to accurately time stamp serial data.  In my application, the test itself is the serial communication to the device.  I need to accurately time stamp each byte as it is sent and received, with millisecond accuracy.  I am using Windows XP, LabVIEW 7.1, an NI 232/8, and the latest version of NI's VISA driver, version 3.4.1.

 

I have tried several methods for time stamping the data, including VISA events for characters at the serial port and high-speed polling, all with the same result.  If the baud rate is 19,200 bps, then the byte time would be 0.5729 milliseconds per byte (5.2083e-5 s per bit * 11 bits).  When I review the time stamps, I see 8 to 11 bytes received within a single millisecond, when there should be about 2 per millisecond.  My conclusion is that I am time stamping the data as it is pulled from the serial FIFO memory, not as it arrives at the serial port.
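For reference, the arithmetic behind those figures, written out as a quick Python check (the 11-bit frame is assumed to be 1 start + 8 data + 1 parity + 1 stop; adjust for your actual framing):

```python
# Quick check of the byte timing quoted above (plain Python).
# Frame assumed: 1 start + 8 data + 1 parity + 1 stop = 11 bits.
baud = 19200                          # bits per second
bits_per_byte = 11
bit_time = 1.0 / baud                 # ~5.2083e-5 s per bit
byte_time = bits_per_byte * bit_time  # ~5.729e-4 s, i.e. ~0.573 ms per byte

print(f"byte time    : {byte_time * 1e3:.4f} ms")   # ~0.5729 ms
print(f"bytes per ms : {1e-3 / byte_time:.2f}")     # ~1.75, so roughly 2 per millisecond
```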

 

So the question is: can the serial data be accurately time stamped under Windows XP?  Could it be done if I were to use a real-time OS?  Should I consider doing this in hardware, for example with digital I/O and a level shifter to handle the serial voltage levels?

 

Attached is the code I am currently using to time stamp the data.  It is set up so that one COM port transmits the data and another COM port receives it.  You select the COM ports used, the length of the data, and the baud rate.  It requires a null modem serial cable to connect the two COM ports together.
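For readers without LabVIEW, here is a rough text-language sketch of the same loopback test, for illustration only; it uses pyserial, and the COM port names and payload are placeholders rather than anything from the attached VI:

```python
# Rough text-language sketch of the attached loopback test, for illustration only.
# Assumes pyserial; "COM1"/"COM2" and the baud rate are placeholders.
import time
import serial

BAUD = 19200
tx = serial.Serial("COM1", BAUD, timeout=0)   # transmitting port
rx = serial.Serial("COM2", BAUD, timeout=0)   # receiving port (null modem cable to COM1)

payload = bytes(range(32))
tx.write(payload)

stamps = []                                   # (host timestamp, byte value)
while len(stamps) < len(payload):
    data = rx.read(rx.in_waiting or 1)        # returns whatever is already buffered
    t = time.perf_counter()                   # stamp applied when bytes are *read*, not received
    stamps.extend((t, b) for b in data)

# Bytes that sat together in the UART FIFO / driver buffer get nearly identical stamps,
# which is why 8 to 11 bytes can appear to land in the same millisecond.
for t, b in stamps:
    print(f"{(t - stamps[0][0]) * 1e3:8.3f} ms  0x{b:02X}")
```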

 

Thanks for any assistance that can be given on this issue,

 

Matt
Matthew Fitzsimons

Certified LabVIEW Architect
LabVIEW 6.1 ... 2013, LVOOP, GOOP, TestStand, DAQ, and Vision
Message 1 of 15
Just as a general comment, I would think that the low-millisecond domain is out of reach as a RULE on Windows platforms, since the operating system introduces "jitter" in the millisecond range when Windows "goes away" to service other threads/processes.

As far as I know, there is no standard built-in time-stamp mechanism in the UART chips on motherboards, so I don't think you can expect the serial port hardware to perform the time-stamping for you either.

...

Please be aware that my comments are not absolute truths, and I might be mistaken in one or more of my statements. 🙂
---------------------------------------------------

Project Engineer
LabVIEW 2009
Run LabVIEW on WinXP and Vista systems.
Used LabVIEW since May 2005

Certifications: CLD and CPI certified
Currently employed.
Message 2 of 15

It's not impossible...

but it sure ain't going to be easy.

Continue to use the serial port the way you are, but add an analog input device doing continuous double-buffered acquisition.

The acquired signal is time stamped and can be decoded to recover the data coming in. You know what should be there, because the serial port tells you that. Find the fragments of the analog input signal that match up with each character, and then all you have to do is decide whether the time stamp for any particular character is the time associated with the assertion of the start bit or the dropping of the last stop bit.

If you asked me to do it, I would ask you to give me a week if there are clear-cut intervals that allow me to re-sync the two acquisition paths. Two weeks if there is no break in the action. 😉

If duplicating the UART in software is not to your liking, a UART on a prototyping board may be easier.
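For illustration, here is a minimal sketch of the "UART in software" idea in Python/NumPy: it assumes an idle-high, 8-N-1 line already digitized by the analog input device, finds each start-bit falling edge, samples the bit centres, and derives a per-character timestamp from the sample index. The function name, framing, and threshold are assumptions, not anything from the original post.

```python
# Minimal software-UART decoder over an oversampled capture, for illustration only.
# Assumes an idle-high, 8-N-1 line captured as a logic-level numpy array.
import numpy as np

def decode_uart(samples, sample_rate, baud, threshold=1.5):
    """Return a list of (timestamp_s, byte) decoded from an oversampled RX trace."""
    bits = samples > threshold                    # True = mark (idle/high), False = space (low)
    spb = sample_rate / baud                      # samples per bit (>= 10 for margin)
    frames = []
    # falling edges (high -> low transitions) mark candidate start bits
    edges = np.flatnonzero(bits[:-1] & ~bits[1:]) + 1
    next_free = 0
    for edge in edges:
        if edge < next_free:                      # edge belongs to a data bit of the previous frame
            continue
        # sample the centre of each of the 8 data bits (the start bit occupies bit slot 0)
        centres = edge + ((np.arange(1, 9) + 0.5) * spb).astype(int)
        if centres[-1] >= len(bits):
            break
        value = 0
        for i, c in enumerate(centres):
            value |= int(bits[c]) << i            # LSB first, as on the wire
        timestamp = edge / sample_rate            # time of start-bit assertion
        frames.append((timestamp, value))
        next_free = int(edge + 10 * spb)          # skip past start + 8 data + stop
    return frames
```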

That reminds me!

A couple of years ago I wrote a driver for an SDLC application. The board I used (Comm-Tech?) required that I explicitly handle the I/O operations. Nah, I think the chip had a 64-byte buffer.

So it looks like a UART in hardware or in software is the way you will have to go if you really need to time stamp bytes.

Let me guess. Uncle Sam is involved in this requirement?

He seems to come up with the most extreme application requirements.

Ben
Retired Senior Automation Systems Architect with Data Science Automation LabVIEW Champion Knight of NI and Prepper LinkedIn Profile YouTube Channel
Message 3 of 15

Maybe this will help.

http://www.adontec.com/smon_e.htm

Does anyone make a GPS receiver board with serial ports on it? Maybe you could read the GPS data in the same loop as the serial port data.

Message 4 of 15

One small point that might help you get closer (but still with jitter): go to the system settings and disable the port's FIFO buffer.

We use external hardware (a µC with an RS-xxx interface and USB that does the timestamping).  And at the very least, you can do it with a scope 😮

 

 

Greetings from Germany
Henrik

LV since v3.1

“ground” is a convenient fantasy

'˙˙˙˙uıɐƃɐ lɐıp puɐ °06 ǝuoɥd ɹnoʎ uɹnʇ ǝsɐǝld 'ʎɹɐuıƃɐɯı sı pǝlɐıp ǝʌɐɥ noʎ ɹǝqɯnu ǝɥʇ'


Message 5 of 15

Thanks, everyone, for your suggestions.  It looks like what I am getting from all of you is to use external hardware to timestamp the events accurately, such as an analog input device or a scope.  This requirement is not government driven; it is driven by protocol testing related to gaming products.  I fully understand that Windows is not a real-time environment and may not be a good choice for this application.

I need a solution that will run on XP or LV Real-Time, is CPU-utilization friendly (because multiple instances may be run in the future), and can make reasonably good inter-character delay measurements.  What I would like to know is whether anyone has tried to do this with LV Real-Time and what their results were.  Can I get microsecond timing with Real-Time?

Thanks,

Matt

 

The suggestion to use Super Monitor will not work; from their site:

Windows NT/2000/XP
SuperMonitor doesn't support Windows NT/2000/XP because it's not possible to catch data and events in real-time on these systems. In Windows NT/2000/XP some events will be missed others will be reported in wrong order. In fact you get data that is already buffered. So, it's not possible to record the protocol data as received and apply exact time stamps or monitor every single change in the flow control (RTS/CTS, DTR/DSR, XON/XOFF). In a different situation Windows would be too busy handling interrupt events in real-time and multi tasking would suffer.

Matthew Fitzsimons

Certified LabVIEW Architect
LabVIEW 6.1 ... 2013, LVOOP, GOOP, TestStand, DAQ, and Vision
Message 6 of 15

Yes RT would do it.

The method I outlined earlier should work. The device I worked with (years ago and far, far away) was an ESCC-PCI interface from CommTech.

www.commtech-fastcom.com

Their technical manual was, for the most part, a detailed discussion of the chip they used. The discussion of the chip's operation included a detailed description of how it actually parses the data line.

In your case you do not need the full UART capabilities, just the "AR" part.

You could use the write-up for a comm chip of that type to guide you through designing your code.

Note: Watch your high-end throughput. Oversampling requires sampling at 10X the baud rate.
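To put rough numbers on that 10X rule of thumb (a quick illustrative calculation, not from the original post):

```python
# Rough numbers behind the 10X oversampling rule of thumb (illustrative only).
for baud in (9600, 19200, 57600, 115200):
    sample_rate = 10 * baud             # minimum analog-input rate per the guideline above
    resolution_us = 1e6 / sample_rate   # best-case edge-timing resolution, about one sample
    print(f"{baud:>6} baud -> >= {sample_rate / 1000:5.0f} kS/s, "
          f"edge resolution ~ {resolution_us:.1f} us")
```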

Ben

BTW:

Comm-Tech's support was top notch! You talked directly with the driver developer.


Retired Senior Automation Systems Architect with Data Science Automation LabVIEW Champion Knight of NI and Prepper LinkedIn Profile YouTube Channel
Message 7 of 15

Ben,

I sent Commtech technical support an email, and they said they had nothing to assist in making accurate RS-232 timing measurements.  I assume the card you are referring to is the ESCC-PCI-335.  I have looked over the manual and found some timing information, but I need to review it in detail if I am going to give it a try.  I have put in a call to NI's technical support to see what they have to say about this whole timing issue, so I will wait to see what they suggest.

Thanks for your inputs,

Matt

Matthew Fitzsimons

Certified LabVIEW Architect
LabVIEW 6.1 ... 2013, LVOOP, GOOP, TestStand, DAQ, and Vision
Message 8 of 15

I was only pointing at Comm Tech because of the details in the tech manual regarding how the chip does its job. There are sufficient details in the chip spec that you could actually code the operation of the chip in software. I mentioned it as a guide for what to do with the data as it is read in from an AI device of some type. After all, the details have already been accounted for in the chip design, including over-sampling.

By over-sampling your input signal you not only get the noise rejection, you can also resolve the timing of your line state changes to better than a bit-cell time interval (using LabVIEW RT, of course! 😉)

You may also want to look into the UART specs. I have not looked at these in 20 years, but I'd suspect there is a signal that changes state when a character is properly framed.
 
I said before that this is no small task. Hardware manufacturers have worked for years to isolate the character timing from the CPU, so you are facing a challenge.
 
The more I think about this, the higher my estimate gets.
 
Keep us posted on what support says.
 
Ben


Retired Senior Automation Systems Architect with Data Science Automation LabVIEW Champion Knight of NI and Prepper LinkedIn Profile YouTube Channel
Message 9 of 15
How about an FPGA approach?
I have no idea whether a (U)AR(T) has already been realized in LabVIEW FPGA, but I assume it has already been done in pure VHDL....
 
This company helped us with our serial protocol tasks:  http://www.vector-informatik.com/index.html
 
Greetings from Germany
Henrik

LV since v3.1

“ground” is a convenient fantasy

'˙˙˙˙uıɐƃɐ lɐıp puɐ °06 ǝuoɥd ɹnoʎ uɹnʇ ǝsɐǝld 'ʎɹɐuıƃɐɯı sı pǝlɐıp ǝʌɐɥ noʎ ɹǝqɯnu ǝɥʇ'


Message 10 of 15