LabVIEW


What is the relationship between sample rate and samples per channel?

Hi there, I am using a Data Translation DT9816, and my task is to save the data coming into the analog input channel (in V) to a CSV file. I have a question: what is the relationship between sample rate and samples per channel?

This is my problem:

 

When I set the sample rate to 1,000 Hz and samples per channel to 1,000, the data is correct, but the time only updates every 1,000 samples (at samples 1, 1001, 2001, ...), so the time increases by 1000 × 1 ms = 1 s for every 1,000 samples. Then,

 

when I set the sample rate to 10,000 Hz and samples per channel to 1,000, the data is also correct, but again the time only updates every 1,000 samples (at samples 1, 1001, 2001, ...); the step is 1000 × 0.1 ms = 100 ms for every 1,000 samples. And then,

 

And last, when I set the sample rate to 1,000 Hz and samples per channel to 1 (or anything below 1,000), the output is wrong. (I think maybe we cannot set samples per channel below 1,000.) The time for each sample should be 1/1000 s = 1 ms.
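For reference, the timing relationship described above can be sketched in plain Python (an illustrative snippet, not DT9816 driver code; the function name is my own): the time between samples is 1/SR, and one read of SC samples covers SC/SR seconds.

```python
def block_timing(sample_rate_hz, samples_per_channel):
    """Return (seconds between samples, seconds covered by one read block)."""
    dt = 1.0 / sample_rate_hz          # time between consecutive samples
    block = samples_per_channel * dt   # duration of one block of samples
    return dt, block

# 1000 Hz with 1000 samples/channel: 1 ms per sample, 1 s per block
# 10000 Hz with 1000 samples/channel: 0.1 ms per sample, 0.1 s per block
```

This matches the numbers in the posts above: the timestamp jumps by 1 s (or 100 ms) per update because it advances once per block of 1,000 samples, not once per sample.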

The other problem is that the time does not start from 0.

 

I am really confused about this sample rate and samples per channel problem, because I have been testing it for a week (and still have not solved it).

 

I have attached all the files needed, including my block diagram.

I hope someone can help me, because it is really stressing me out.

 

Note: SR = sample rate, SC = samples per channel

Message 1 of 4
(3,424 Views)

And this is the block diagram:

Message 2 of 4
(3,423 Views)

I don't have the sub-VIs that control your DAQ device, but it seems like you read 1,000 samples at once and then add a single timestamp to that whole set of samples.

You have to add a timestamp to each individual sample of the read data...
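A minimal sketch of that idea in Python (illustrative only; the names are mine, this is not LabVIEW code): spread the block's start time across the individual samples using the sample rate.

```python
def per_sample_times(block_start_s, sample_rate_hz, n_samples):
    """Assign each sample in a block its own timestamp.

    Sample i of the block was acquired i/sample_rate seconds after
    the block's start time, not all at the same instant.
    """
    return [block_start_s + i / sample_rate_hz for i in range(n_samples)]
```

At 1,000 Hz, the first three samples of a block starting at t = 0 land at 0 ms, 1 ms, and 2 ms, which is the per-sample time axis the CSV file needs.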

 

 

Take a look at the sub-VIs to get information on the relationship between sample rate and samples per channel, or ask the manufacturer!

 

Also take care with your self-built timestamp!

"The base reference time (millisecond zero) is undefined. That is, you cannot convert the millisecond timer value to a real-world time or date. Be careful when you use this function in comparisons, because the value of the millisecond timer wraps from (2^32)−1 to 0."

Maybe use the "Get Time" block to get your time value.
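If the millisecond tick counter is used anyway, the wrap at 2^32 can be handled explicitly. A small sketch (my own helper, not a LabVIEW primitive) of a wrap-safe difference between two 32-bit tick values:

```python
def elapsed_ms(start_ticks, now_ticks, modulus=2**32):
    """Wrap-safe elapsed time between two 32-bit millisecond tick values."""
    # Near the wrap point a naive subtraction would go negative;
    # the modulo keeps the elapsed time correct.
    return (now_ticks - start_ticks) % modulus
```

For example, `elapsed_ms(2**32 - 5, 10)` gives 15 ms even though the raw counter has wrapped; this still gives only relative time, not a real-world date, as the quote above warns.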

Message 3 of 4
(3,389 Views)

Hi Citras,

 

Why do you start a new thread for the same question as before?

Best regards,
GerdW


using LV2016/2019/2021 on Win10/11+cRIO, TestStand2016/2019
Message 4 of 4
(3,386 Views)