LabVIEW


Sampling rate conflict (specified vs actual)

Solved!

I am using an NI 9234 to acquire sensor data in LabVIEW 8.6. I have been using LabVIEW for only the past two weeks, so please bear with me; my knowledge is very basic. I am trying to read multiple channels over time. The problem appeared when I finished my VI: whenever I change the sampling rate in the code, the acquisition is still faster (a higher sampling rate) than whatever I specify.

 
I attached my VI, which uses DAQmx to read voltages and save the readings to an Excel file. I also added an equation to convert voltage to resistance, because my water level sensor is calibrated in terms of resistance (the equation was provided in the sensor's manual).
I checked the NI 9234 manual for the minimum sampling rate and found the following (by the way, my application does not require a high sampling rate, so 10 to at most 100 readings per second is fine):

Data rate range (fs) using internal master timebase
Minimum....................................1.652 kS/s
Maximum................................... 51.2 kS/s      

Data rate range (fs) using external master timebase
Minimum....................................0.391 kS/s
Maximum................................... 52.734 kS/s
 
I thought those specs might be the reason. I would like to save my data as voltage vs. time (or resistance vs. time) so I can see what timing I actually get during a test (I mean voltage in the first column and time in the second column of the Excel sheet).
So, I now need some help or guidance. First, how do I upgrade my code to save readings vs. time, so I know exactly what sampling rate I am getting? Do I need a sub-VI, or is it a simple change? I have looked through many examples but did not find an answer (maybe I was looking in the wrong direction).
Second: is it common to have such sampling-rate differences with a while loop? Should I change the while loop to a timed loop?
 
Thank you, 

 
Mo
Message 1 of 8

Thank you MIG for the reply.

Looking at your first link, I found that the allowable sampling rates for the NI 9234 range from 1.652 to 51.2 kS/s. Does that mean I cannot have a sampling rate below 1,652 samples per second, say 100 readings or fewer per second? I am just not sure about that; I have only a very basic knowledge of DAQ systems.

 

Message 3 of 8

Yes, it would appear so.

I am not familiar with the NI 9234, but according to the link there are only 31 allowable sampling rates, and when using the internal timebase, 1.652 kHz is the lowest.

Message 4 of 8

OK. I am just wondering: what is the difference between the internal and the external master timebase? How do you check which one you are using? Can I use the external master timebase, and if so, what should I change in the VI, or which properties?

 

Message 5 of 8
Solution
Accepted by alrowaimi

I've never used an external timebase with a DAQ card, so it's a little outside my area of knowledge. Perhaps you could refer to the manual to see if it's possible.

 

Personally, I wouldn't bother. If you want a lower sampling rate than 1.652 kHz you could always decimate down to a lower rate. For example, if you sample at 1.652 kHz and then take every 16th sample, you would then end up with an effective sample rate of 103.25 Hz.
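The decimation idea can be sketched in a few lines of Python (a hypothetical illustration, not the LabVIEW implementation; the simulated signal and variable names are my own assumptions):

```python
import numpy as np

FS_ACTUAL = 1652.0   # lowest internal-timebase rate of the NI 9234, in Hz
DECIMATION = 16      # keep every 16th sample

# Simulated one-second block of acquired samples (stand-in for a DAQmx read).
t = np.arange(1652) / FS_ACTUAL
samples = np.sin(2 * np.pi * 5 * t)   # a 5 Hz test tone

decimated = samples[::DECIMATION]     # keep every 16th sample
fs_effective = FS_ACTUAL / DECIMATION

print(fs_effective)     # 103.25 Hz effective rate
print(len(decimated))   # 104 samples kept from the 1652-sample block
```

One caveat: plain decimation without a low-pass filter can alias higher-frequency content into the result. For a slowly varying signal like a water level that is usually acceptable, and averaging each block of 16 samples instead is a cheap anti-aliasing alternative.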

 

If you want exactly 100 Hz you could do as they suggest in the link and use the "Resample Waveforms (continuous).vi" to resample your data.
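For reference, resampling to an exact rate can be sketched with linear interpolation. This Python snippet is only a stand-in for the general idea; I am not claiming it matches the algorithm inside the LabVIEW resampling VI, and the signal here is simulated:

```python
import numpy as np

fs_in, fs_out = 1652.0, 100.0
n_in = 1652                          # one second of input data
t_in = np.arange(n_in) / fs_in

x = np.sin(2 * np.pi * 2 * t_in)     # simulated 2 Hz input signal

# New time grid at exactly 100 Hz, covering the same one-second span.
t_out = np.arange(int(fs_out)) / fs_out
y = np.interp(t_out, t_in, x)        # linear interpolation onto the new grid

print(len(y))   # exactly 100 samples for the one-second block
```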

 

Message 6 of 8

Thank you MIG. I will try it that way to get a lower sampling rate. In fact, I am converting my complicated VI from DAQmx to the DAQ Assistant, since that is easier for a beginner to deal with, and I will see what happens with the new code. I used to get an error related to my number of samples to read; I think I got a timeout every time I ran the test because I misunderstood the default minimum sampling rate. Thanks.

 

Message 7 of 8

Although I agree with MIG that downsampling is the correct way to go, at speeds as low as 10-100 Hz you can also read single samples and time them in software; that is accurate enough at those rates. 🙂
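In LabVIEW this would be a while loop containing a single-sample DAQmx Read plus a wait; the Python sketch below only illustrates the timing structure, and `read_one_sample` is a made-up stand-in for the hardware read:

```python
import time

def read_one_sample():
    """Stand-in for a single-sample hardware read (hypothetical)."""
    return 0.0

def acquire(rate_hz=100.0, n_samples=5):
    """Software-timed loop: read one sample, then wait until the next tick."""
    period = 1.0 / rate_hz
    next_tick = time.monotonic()
    readings = []
    for _ in range(n_samples):
        readings.append((time.monotonic(), read_one_sample()))  # timestamp it
        next_tick += period                         # absolute schedule,
        time.sleep(max(0.0, next_tick - time.monotonic()))  # so jitter
    return readings                                 # does not accumulate

data = acquire()
print(len(data))   # 5 timestamped samples
```

Scheduling against an absolute `next_tick` rather than sleeping a fixed interval keeps the average rate close to the target even when individual iterations run a little late.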

/Y

G# - Award winning reference based OOP for LV, for free! - Qestit VIPM GitHub

Qestit Systems
Certified-LabVIEW-Developer
Message 8 of 8