
Hello Everyone,
Today I made a short program to interface with a GPS antenna over the serial port and parse the incoming strings. That part seems to work great. However, when I decode the GGA sentence to retrieve the UTC time (hhmmss.ss) and try to convert it to a timestamp, the value seems off.
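Here's the gist of how I'm pulling the time out (a simplified Python sketch, not my exact code — the sample sentence below is just an illustration):

```python
# Field 1 of a $GPGGA sentence is the UTC fix time as hhmmss.ss.

def parse_gga_time(sentence):
    """Extract (hours, minutes, seconds) from a $GPGGA sentence."""
    fields = sentence.split(",")
    utc = fields[1]                # e.g. "123519.00"
    hours = int(utc[0:2])
    minutes = int(utc[2:4])
    seconds = float(utc[4:])       # keeps the fractional part
    return hours, minutes, seconds

# Example sentence (made up for illustration):
print(parse_gga_time(
    "$GPGGA,123519.00,4807.038,N,01131.000,E,1,08,0.9,545.4,M,46.9,M,,*47"
))  # → (12, 35, 19.0)
```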
First, when I watch the UTC time count up, one second of displayed time is longer than one second of real time. Counting out loud isn't super accurate, I know, but almost 2 seconds of real time pass before the display advances by 1 second, and I'm not sure why. I'm thinking it has something to do with my loop timing? The antenna outputs sentences at 5 Hz. I have a wait function built into the program to time the loop, and with any value besides 200 ms the sentences don't read correctly at all. Is pacing the reads with a fixed wait the right way to get data, or should I be doing something else?
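One idea I'm considering: instead of pacing the loop with a fixed wait and reading one sentence per pass, process everything that has accumulated in the buffer each pass and display only the newest GGA sentence, so the display can't fall behind the data. A rough sketch of that idea (`buffered_lines` stands in for whatever my serial read actually returns each pass — it's a hypothetical input, not a real library call):

```python
def latest_gga(buffered_lines):
    """Return the newest $GPGGA sentence from a batch of buffered lines,
    or None if the batch contains no GGA sentence.

    Draining the whole batch means older sentences are skipped rather
    than queued up, so the displayed time can't lag real time."""
    newest = None
    for line in buffered_lines:
        if line.startswith("$GPGGA"):
            newest = line
    return newest

# If two fixes arrived since the last pass, only the newer one is shown:
batch = ["$GPGGA,120000.00,...", "$GPRMC,...", "$GPGGA,120000.20,..."]
print(latest_gga(batch))  # → $GPGGA,120000.20,...
```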
Also, the time just doesn't match my computer's time at all. When I first ran it, it was about 10 minutes fast, but after letting it run for a while it's now 1.5 hours fast. I'm not sure if it jumped to that value or worked its way up. Either way, if the seconds really do update slower than real time, I don't understand how the clock ends up running fast.
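One thing I want to rule out here: GGA time is UTC, not local time, so some fixed offset from my computer clock is expected anyway (a growing offset isn't). Here's roughly how I'd convert the GGA field to local time for an apples-to-apples comparison, using only the Python standard library (note the date handling is simplistic and would be wrong right around midnight UTC):

```python
from datetime import datetime, timezone

def gga_utc_to_local(utc_field):
    """Combine today's UTC date with a GGA hhmmss.ss field and
    convert the result to the computer's local time zone."""
    now = datetime.now(timezone.utc)
    h = int(utc_field[0:2])
    m = int(utc_field[2:4])
    s = float(utc_field[4:])
    utc_dt = now.replace(hour=h, minute=m, second=int(s),
                         microsecond=int((s % 1) * 1_000_000))
    return utc_dt.astimezone()   # no argument → local time zone
```

With this, the GGA time and the computer clock should differ only by however stale the fix is, not by a time-zone offset.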
I'm pretty confused at this point. This is my first time using GPS, so if anyone with some experience could help, I'd appreciate it.
Thanks,
Alex