Does this model of DAQ have any sort of anti-aliasing filtering capability? When recording the DAQ's binary data (through C++), it sometimes looks as though the DAQ is filtering or smoothing the signal. I have a potentiometer device hooked up to it with a maximum output of 4.5 V and a minimum of 0.5 V. In the binary data, when the voltage transitions between the max and the min (in either direction), there is occasionally an intermediate value that appears to be an average of a few data points around 0.5 and 4.5. For example, the data should look something like this: Sample 1 = 0.5000001, Sample 2 = 4.49999, Sample 3 = 4.499998. But sometimes my data looks like this instead: Sample 1 = 0.5000001, Sample 2 = 2.78549, Sample 3 = 4.499998. Is some sort of smoothing algorithm being applied here?