T0 is the time from the trigger position to the first point of the data. For example, if your data set is 1 ms long and the trigger position is at 50%, the first data point has time -0.5 ms. You can think of the trigger position as the zero point on a time axis; T0 is the position of the first data point on that axis. Note that on some digitizer models (e.g. the 5112), the trigger position is resolved more finely than the sample clock period, so you will usually get a T0 that is not an integral multiple of your sampling period.
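As a sketch of the arithmetic above (not a driver call; the record length, sample count, and variable names are assumptions for illustration), you can build the time axis of an acquisition from T0 and the sample interval:

```python
# Sketch: compute T0 and a time axis for a 1 ms record triggered at 50%.
# All values here are illustrative assumptions, not driver output.
record_length_s = 1e-3    # total acquisition time: 1 ms
num_samples = 1000        # assumed number of samples in the record
trigger_position = 0.5    # trigger at 50% of the record

dt = record_length_s / num_samples        # sample interval
t0 = -trigger_position * record_length_s  # time of the first data point

# The trigger is t = 0; each sample i sits at t0 + i * dt.
time_axis = [t0 + i * dt for i in range(num_samples)]

print(time_axis[0])  # -0.0005, i.e. the first sample is 0.5 ms before the trigger
```

On hardware where the trigger is resolved more finely than the sample clock, T0 would carry that fractional offset, so it would not land exactly on a multiple of `dt`.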