I took a look at your code. There may be some error introduced by the way the time is being checked. Because the Tick Count VI has no dataflow dependency on the trigger or the read, LabVIEW is free to execute it at different points in the loop: on one iteration it might check the tick count and then wait for a trigger, and on the next iteration it might receive the trigger first and check the tick count afterward. I modified the code by placing the Tick Count VI inside a sequence structure that executes after a sample has been acquired. Now the tick count is only checked after a trigger is received and the data is read, which should give you more accurate readings.
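LabVIEW code is graphical, so the dataflow doesn't translate directly to text, but here is a rough Python analogy of the fix (wait_for_trigger and acquire_sample are hypothetical stand-ins for the trigger wait and the DAQ read, not anything in your VI):

import time

def wait_for_trigger():
    time.sleep(0.1)   # hypothetical stand-in for the hardware trigger wait

def acquire_sample():
    return 0.0        # hypothetical stand-in for the DAQ read

# Unsequenced, nothing forces the timestamp to wait for the acquisition, so in
# dataflow terms it may be taken before or after the trigger arrives.
# Sequenced, as in the modified VI, the timestamp is only taken after the
# trigger is received and the sample is read:
timestamps = []
for _ in range(10):
    wait_for_trigger()
    sample = acquire_sample()
    timestamps.append(time.perf_counter())  # the equivalent of Tick Count after the read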
One other thing to note is that the Tick Count VI returns an integer millisecond value, so its resolution is limited to 1 ms, and because of rounding you might see some data points off by a millisecond or two. When I ran this on my system the data was accurate to within +/-1 ms most of the time. But since this timing depends on software speed, it can vary slightly from computer to computer, and any other programs or processes that are running can slow the VI down just enough to introduce timing differences.
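To see why integer milliseconds alone produce that +/-1 ms spread, here is a small sketch (the event times are made-up numbers chosen only to illustrate the truncation):

# A real interval of 99.9 ms can report as either 99 or 100 ticks, and one of
# 100.1 ms as either 100 or 101, depending on where the events land relative
# to millisecond boundaries. Sub-millisecond jitter thus shows up as +/-1 ms.
examples = [(0.05, 99.95), (0.95, 100.85), (0.05, 100.15), (0.95, 101.05)]
for t0, t1 in examples:
    print(f"true delta {t1 - t0:.1f} ms -> reported {int(t1) - int(t0)} ms")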
If all you want to do is make sure you are triggering and acquiring data at very precise points, you could perform other tests that take software timing out of the loop entirely. For example, you could measure a sine wave at the exact same frequency as your square wave. If you are truly triggering at 10 Hz, every sample lands at the same phase of the sine wave, so you should always measure a constant voltage, and any trigger jitter shows up directly as variation in the measured voltage. This is just one suggestion; many other tests could be done as well.
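Here is a minimal sketch of that check, assuming a 10 Hz, 1 V amplitude sine (both values are just example parameters, not anything from your setup):

import math

f = 10.0   # Hz, matches the square-wave trigger rate
A = 1.0    # V, sine amplitude

def voltage(t):
    return A * math.sin(2 * math.pi * f * t)

# Perfect 10 Hz triggers sample the same phase every period, so the readings
# are identical; a trigger that arrives 0.5 ms late stands out immediately.
print(voltage(0.100), voltage(0.200))   # same value every period
print(voltage(0.3005))                  # jittered trigger reads differently
# Near a zero crossing the slope is 2*pi*f*A, about 62.8 V/s here, so a 10 mV
# spread in the readings corresponds to roughly 0.16 ms of trigger jitter.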
Have a great day,
Brian P.