OK, I see. The resolution of the time channel should not be the problem. I could actually reproduce this myself by testing with the file crash.dat in the DIAdem library folder: for the second of the crash channels, the 3 ms calculation yields an interval of 3.07 ms.
But if you take a closer look at the online help, you will find that this is perfectly all right. According to the description of the implemented algorithm (NHTSA_FORTRAN-Code), the Xms "computes the maximum value that the linear interpolation of a time series, ..., *meets or exceeds* for an interval of at least 3 ms, ..."
Thus, if you calculate a 3 ms value, the length of the resulting interval will be at least 3 ms, but it might well be longer.
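Just to illustrate that definition (this is not the actual NHTSA FORTRAN routine DIAdem runs, only a minimal Python sketch with hypothetical names, assuming the time channel is in milliseconds): the length of the longest stretch where the interpolated signal stays at or above a given level can only shrink as the level rises, so a bisection over candidate levels converges on the Xms value.

def longest_run_at_or_above(t, y, level):
    """Length (in units of t) of the longest contiguous interval on
    which the linear interpolation of (t, y) is >= level."""
    best = 0.0
    start = None  # time at which the current run began, or None
    for i in range(len(t) - 1):
        t0, t1, y0, y1 = t[i], t[i + 1], y[i], y[i + 1]
        if y0 >= level:
            if start is None:
                start = t0
            if y1 < level:
                # downward crossing inside this segment: interpolate it
                tc = t0 + (level - y0) / (y1 - y0) * (t1 - t0)
                best = max(best, tc - start)
                start = None
        elif y1 >= level:
            # upward crossing inside this segment: run starts here
            start = t0 + (level - y0) / (y1 - y0) * (t1 - t0)
    if start is not None:
        best = max(best, t[-1] - start)  # run extends to the last sample
    return best

def xms(t, y, x=3.0, iterations=60):
    """Largest level the interpolated signal meets or exceeds for an
    interval of at least x; None if the whole record is shorter."""
    lo, hi = float(min(y)), float(max(y))
    if longest_run_at_or_above(t, y, lo) < x:
        return None
    # Run length is non-increasing in the level, so bisect on it.
    for _ in range(iterations):
        mid = 0.5 * (lo + hi)
        if longest_run_at_or_above(t, y, mid) >= x:
            lo = mid
        else:
            hi = mid
    return lo

# Example (hypothetical channels): a3ms = xms(time_ms, accel, x=3.0)

Note that, exactly as the help describes, the run returned at the final level is at least x long and typically slightly longer, because the signal rarely crosses the level at exactly x apart.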
This is how the Xms calculation is defined within Crash Analysis. If you need the function to behave differently, I am afraid you will have to find another way of calculating it.