10-05-2008 11:27 PM
I'm looking to measure a digital frequency of 1 MHz (in order to measure outputs in microseconds) using LabVIEW 8.2 with a USB-6008 under Windows. It has been noted that Windows only supports millisecond resolution, but some statements say it is possible with some modifications using the QueryPerformanceFrequency API in kernel32.dll. I'm a total beginner with this concept.
1. What should I do to measure with microsecond resolution under Windows?
2. What is the alternative to this approach?
3. Is the USB-6008 sufficient? If not, which one would be best?
4. What are API functions and how do I call them?
Waiting for guidance,
With regards,
Siddharth
10-06-2008 12:30 AM - edited 10-06-2008 12:33 AM
It is possible to measure 1 MS/s in Windows... but not with a USB-6008. It can only measure up to 10 kS/s.
If you have a look at the hardware selection page you can sort by sample rate... http://sine.ni.com/nifn/cds/view/main/p/sn/n1:7690,n24:USB/sb/-nigenso3/lang/en/nid/1036/ap/daq
Looks like a USB-5132 can do it, but it's about 5 times the price of a USB-6008. Considering it can sample 5000 times faster, that's not too bad.
Answers 1, 2, 3: Buy a USB-5132.
4: DAQmx.
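In case it helps to see what "DAQmx" means in practice, here is a bare-bones sketch of a DAQmx call sequence in C. The device name "Dev1/ai0", the voltage range and the 1 MS/s rate are only placeholders for illustration; also note the 5132/5133 digitizers ship with the NI-SCOPE driver rather than DAQmx, though the create/configure/start/read/clear pattern is similar.

```c
/* Minimal NI-DAQmx C sketch: finite analog acquisition at a fixed rate.
   "Dev1/ai0" and the 1 MS/s rate are placeholders -- substitute your own
   device, channel and a rate your hardware actually supports. */
#include <stdio.h>
#include <NIDAQmx.h>

int main(void)
{
    TaskHandle task = 0;
    float64    data[1000];
    int32      read = 0;

    DAQmxCreateTask("", &task);
    DAQmxCreateAIVoltageChan(task, "Dev1/ai0", "", DAQmx_Val_Cfg_Default,
                             -10.0, 10.0, DAQmx_Val_Volts, NULL);
    /* 1,000,000 S/s, 1000 samples: the hardware clock times the samples,
       so Windows' millisecond timer resolution does not matter here. */
    DAQmxCfgSampClkTiming(task, "", 1000000.0, DAQmx_Val_Rising,
                          DAQmx_Val_FiniteSamps, 1000);
    DAQmxStartTask(task);
    DAQmxReadAnalogF64(task, 1000, 10.0, DAQmx_Val_GroupByChannel,
                       data, 1000, &read, NULL);
    printf("Read %d samples\n", (int)read);
    DAQmxStopTask(task);
    DAQmxClearTask(task);
    return 0;
}
```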
10-06-2008 12:56 AM
Thanks Troy.
Is it possible to obtain output in microseconds using the device? I thought displaying in microseconds was OS dependent.
With regards,
Siddharth
10-06-2008 01:06 AM
In LabVIEW under Windows you can obtain/use the value of a software timer in milliseconds.
In LabVIEW under a real-time operating system you can obtain/use the value of a software timer in microseconds.
The data acquisition hardware can obtain data at whatever rate it's capable of and just passes it to the operating system.
The rate that the data has been acquired at is irrelevant. It is just a number representing the time between samples (delta T).
It could be 1000 samples 1 femtosecond apart.
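For what it's worth, the kernel32.dll route mentioned in the first post (QueryPerformanceFrequency / QueryPerformanceCounter) reads a high-resolution counter under Windows, and it can be called from LabVIEW through a Call Library Function Node. Here is a minimal C sketch of those calls; note that it only times software events, it does not time-stamp acquired samples.

```c
/* Minimal sketch of the kernel32.dll high-resolution counter mentioned in
   the original post.  It gives sub-microsecond timing of software events
   under Windows, but it cannot time-stamp individual hardware samples. */
#include <stdio.h>
#include <windows.h>

int main(void)
{
    LARGE_INTEGER freq, start, stop;

    QueryPerformanceFrequency(&freq);   /* counter ticks per second */
    QueryPerformanceCounter(&start);

    Sleep(5);                           /* something to time */

    QueryPerformanceCounter(&stop);
    printf("Elapsed: %.3f us\n",
           (stop.QuadPart - start.QuadPart) * 1e6 / (double)freq.QuadPart);
    return 0;
}
```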
10-06-2008 01:51 AM
10-06-2008 07:04 AM - edited 10-06-2008 07:07 AM
If you use a USB-5133 (100 MS/s version) you will get accuracy down to 10 ns. That is 0.01 microseconds.
The hardware uses its own clock. What the operating system can do is irrelevant.
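(Put differently, that figure is just the sample interval: 1 / 100,000,000 samples per second = 10 ns between samples, set by the hardware clock rather than by Windows.)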
10-06-2008 11:12 PM
Thanks Troy, I have placed an order for the 5133.
Can you help me with interfacing C code into LabVIEW modules?
With regards,
Siddharth
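The usual way to get C code into a LabVIEW diagram is to compile it into a DLL and call it with the Call Library Function Node. A rough sketch of what such an exported function can look like (the name scale_samples and its behaviour are only placeholders):

```c
/* Sketch of a C function compiled into a DLL so LabVIEW can call it through
   a Call Library Function Node.  The function name and behaviour are just
   placeholders -- the point is the exported, C-calling-convention signature
   that the Call Library Function Node expects. */
#ifdef _WIN32
#define EXPORT __declspec(dllexport)
#else
#define EXPORT
#endif

/* Scale an array of samples in place; LabVIEW passes the array pointer and
   its length from the diagram. */
EXPORT int scale_samples(double *samples, int length, double gain)
{
    int i;
    if (samples == NULL || length < 0)
        return -1;              /* error code LabVIEW can check */
    for (i = 0; i < length; i++)
        samples[i] *= gain;
    return 0;                   /* success */
}
```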
10-06-2008 11:52 PM - edited 10-06-2008 11:54 PM
10-07-2008 11:39 PM
Oh, by the way... I wasn't recommending what hardware you should be using. I was just giving an example of hardware that could measure much faster than you needed in Windows.
The USB-5133 takes analog measurements very quickly, but only at 8-bit resolution (not very good detail).
It may not be what you need.
If you wanted to only take digital measurements, another device might be better suited and/or cheaper.
NI has so many options in DAQ devices, you need to compare the device specs with your requirements.
I presume you have more requirements than "measure outputs in usecs".
Are you trying to measure a frequency? How accurately? +/- 1%?
Are you trying to capture digital data? Is there a clock signal available to synchronize with?
Maybe something with a timer/counter would be better?
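If a counter/timer route does turn out to fit, a DAQmx counter frequency measurement in C looks roughly like this. It is a generic sketch only: "Dev1/ctr0", the expected 0.9 to 1.1 MHz range and the one-counter method are assumptions, and a device with proper counter/timer hardware is assumed.

```c
/* Sketch: measuring a digital frequency with a DAQmx counter input instead
   of digitizing the waveform.  "Dev1/ctr0" and the expected-frequency range
   are placeholders. */
#include <stdio.h>
#include <NIDAQmx.h>

int main(void)
{
    TaskHandle task = 0;
    float64    freq = 0.0;

    DAQmxCreateTask("", &task);
    /* One-counter frequency measurement, expecting roughly 0.9 - 1.1 MHz. */
    DAQmxCreateCIFreqChan(task, "Dev1/ctr0", "", 900000.0, 1100000.0,
                          DAQmx_Val_Hz, DAQmx_Val_Rising,
                          DAQmx_Val_LowFreq1Ctr, 0.001, 4, NULL);
    DAQmxStartTask(task);
    DAQmxReadCounterScalarF64(task, 10.0, &freq, NULL);
    printf("Measured frequency: %.1f Hz\n", freq);
    DAQmxStopTask(task);
    DAQmxClearTask(task);
    return 0;
}
```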
10-16-2008 05:01 AM
Thanks for that info.
I trust +/- 1% will not affect much.
I'm setting a clock of 10 MHz.
What's going to be the problem with the DAQ 5133?
What else will be useful then?