01-16-2016 04:32 AM
Hello
As I reported before, in my experiment I use the Histogram from the Signal Express Time-Domain Measurements. The inputs to the histogram are integers, both positive and negative, representing multiples of a time interval of exactly 0.5 ns. I noticed from the resulting 1D array in the output cluster that only even integer values are recorded. To test this I built a simple test VI (see attachment), which shows the same result. I can record the odd values as well only if I increase the number of channels by 1, but then the step dt2 changes and it becomes more difficult to assign exact time values to the bins of the histogram.
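To illustrate the dt2 problem with a little arithmetic outside LabVIEW (Python here; the formula dt2 = (max - min) / number-of-bins is my assumption about how the VI derives the step, and the range is made up):

```python
# Assumption (not verified against the VI): bin width dt2 = (max - min) / n_bins.
t_min, t_max = -4.0, 4.0   # hypothetical input range, in ns

for n_bins in (16, 17):
    dt2 = (t_max - t_min) / n_bins
    print(n_bins, dt2)

# With 16 bins, dt2 = 0.5 ns, so the bin boundaries sit exactly on the
# 0.5 ns grid. With 17 bins, dt2 = 8/17 ns (about 0.4706), so exact time
# values can no longer be assigned to the bins.
```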
Am I doing something wrong, or is there something I do not understand?
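For reference, the binning I would expect, with every integer value (odd or even) counted in its own bin, can be sketched outside LabVIEW, e.g. with numpy; the data values here are made up:

```python
import numpy as np

data = np.array([-3, -1, -1, 0, 2, 2, 2, 5])   # hypothetical integer samples,
                                               # each a multiple of 0.5 ns

# One bin per integer value: edges at half-integers, so every integer
# (odd or even) falls in the centre of its own bin.
edges = np.arange(data.min() - 0.5, data.max() + 1.0, 1.0)
counts, _ = np.histogram(data, bins=edges)
centres = (edges[:-1] + edges[1:]) / 2         # -3, -2, ..., 5
times_ns = centres * 0.5                       # exact time value of each bin

print(counts.tolist())   # [1, 0, 2, 1, 0, 3, 0, 0, 1] -- no values skipped
```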
01-16-2016 10:32 AM
01-16-2016 02:25 PM
I think the General Histogram VI should be able to do what you are looking for. I'm not sure about the inner workings of the Histogram Express VI, but I have attached a snippet below using General Histogram.
01-18-2016 04:17 AM
I am afraid I did not express clearly what I meant.
To be more explicit: the integers at the input represent time differences between two events, in multiples of 0.5 ns. The histogram measures how many times each difference occurs, so the result shows the distribution of the time between events through the number of counts in the bins. I should not have written 'channels' where I meant 'bins'. In the 'test integers' VI you will find the configuration of the histogram by right-clicking it. The 1D array only shows the even-numbered bins. I agree that Kelle's suggestion to use the General Histogram from Probability & Statistics would have been more appropriate, but I thought the Signal Express Time Domain Measurement VI would be suitable and much simpler.
Thank you for your interest.
01-22-2016 01:08 PM
Hello
I have compared the Time Domain Histogram and the General Histogram in a test VI, see attachment. Although both are configured in exactly the same way, the results of the two histograms differ. Is it well known that the Time Domain Histogram, with dt=0.5 and dt2=1, adds the contents of two adjacent bins together (with the exception of the last two bins) and shows zeros in the bins in between? This would mean that the resolution with respect to the input values is reduced by a factor of 2 compared to the General Histogram.
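To make the difference concrete, here is a small numpy sketch (Python, with made-up data): the 'fine' counts are what I get from the General Histogram, and the 'pooled' array mimics the Time Domain Histogram output by summing each pair of adjacent bins and zeroing the bin in between. This only imitates the output I observe; it says nothing about the VI's actual internals.

```python
import numpy as np

# Hypothetical integer time differences, in units of 0.5 ns
data = np.array([0, 0, 1, 1, 1, 2, 3, 3])

# General Histogram behaviour: one bin per integer value
edges = np.arange(-0.5, data.max() + 1.0, 1.0)
fine, _ = np.histogram(data, bins=edges)     # [2, 3, 1, 2]

# Mimic of the Time Domain Histogram output I observe: each pair of
# adjacent bins summed into the first, zero left in the second.
# (The last-two-bins exception mentioned above is ignored in this sketch.)
pooled = fine.copy()
for k in range(0, len(fine) - 1, 2):
    pooled[k] = fine[k] + fine[k + 1]
    pooled[k + 1] = 0

print(fine.tolist())     # [2, 3, 1, 2]
print(pooled.tolist())   # [5, 0, 3, 0]  -- half the resolution
```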