LabVIEW


Measuring 1 MHz in Windows with USB-6008

I'm looking to measure a 1 MHz digital frequency (in order to measure outputs in microseconds) using LabVIEW 8.2 and a USB-6008 under Windows. It has been noted that Windows only supports millisecond timer resolution, but some posts state that it is possible with modifications involving the QueryPerformanceFrequency API and kernel32.dll. I'm a complete beginner with this concept.

 

1. What should I do to measure with microsecond resolution under Windows?

 

2. What is the alternative to this approach?

 

3. Is the USB-6008 sufficient? If not, which device would be best?

 

4. What are API functions, and how do I call them?

 

Waiting for guidance,

With regards,

Siddharth

Message 1 of 11

It is possible to measure 1 MS/s in Windows... but not with a USB-6008, which can only sample up to 10 kS/s.

If you have a look at the hardware selection page you can sort by sample rate: http://sine.ni.com/nifn/cds/view/main/p/sn/n1:7690,n24:USB/sb/-nigenso3/lang/en/nid/1036/ap/daq

Looks like a USB-5132 can do it, but it's about 5 times the price of a USB-6008. Considering it can sample 5000 times faster, that's not too bad.

 

Answers 1, 2, 3: buy a USB-5132.

4. DAQmx
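To give question 4 some concrete shape (sketched in text-based Python rather than LabVIEW, since LabVIEW code is graphical): an API function is just a routine someone else wrote that you call by name with arguments. On Windows, Python's `time.perf_counter_ns` is itself built on the kernel32 `QueryPerformanceCounter` API mentioned in the original post, and it returns timestamps with far better than millisecond resolution. The helper name below is made up for illustration:

```python
import time

def time_call_us(fn, *args):
    """Time one call to fn and return (result, elapsed time in microseconds)."""
    start = time.perf_counter_ns()   # high-resolution timer, not the 1 ms tick
    result = fn(*args)
    elapsed_us = (time.perf_counter_ns() - start) / 1000.0
    return result, elapsed_us

result, us = time_call_us(sum, range(100_000))
print(f"summing 100000 ints took {us:.1f} us")
```

Note that a software timer like this only tells you when your own code ran; timing a 1 MHz signal still needs hardware-clocked acquisition.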

Message Edited by Troy K on 10-06-2008 03:33 PM
Troy - CLD "If a hammer is the only tool you have, everything starts to look like a nail." ~ Maslow/Kaplan - Law of the instrument
Message 2 of 11

Thanks Troy.

Is it possible to obtain output in microseconds using that device? I thought displaying in microseconds was OS dependent.

With regards,

Siddharth

0 Kudos
Message 3 of 11
(4,032 Views)

In LabVIEW under Windows you can obtain/use the value of a software timer in milliseconds.

In LabVIEW under a real-time operating system you can obtain/use the value of a software timer in microseconds.

 

The data acquisition hardware can obtain data at whatever rate it's capable of and just passes it to the operating system.

The rate that the data has been acquired at is irrelevant. It is just a number representing the time between samples (delta T).

It could be 1000 samples 1 femtosecond apart.
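The delta-T idea above can be sketched numerically: given a record of samples plus the known sample interval set by the hardware clock, the signal frequency falls out of counting rising zero crossings. The numbers in this Python sketch (a 1 MHz sine digitized at 50 MS/s) are hypothetical, not tied to any particular device:

```python
import math

def estimate_frequency(samples, dt):
    """Estimate signal frequency from the spacing of rising zero crossings.

    dt is the sample interval (delta T) set by the acquisition hardware;
    the OS clock plays no part in this calculation.
    """
    crossings = [i + 1 for i, (a, b) in enumerate(zip(samples, samples[1:]))
                 if a < 0.0 <= b]
    if len(crossings) < 2:
        raise ValueError("need at least two rising zero crossings")
    span = (crossings[-1] - crossings[0]) * dt  # time covered by whole periods
    return (len(crossings) - 1) / span

# Simulate what a digitizer might hand back: a 1 MHz sine sampled at 50 MS/s.
rate = 50e6
samples = [math.sin(2 * math.pi * 1e6 * i / rate + 0.3) for i in range(5000)]
print(estimate_frequency(samples, 1.0 / rate))  # close to 1e6
```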

Message 4 of 11
So microsecond accuracy is not possible even with the USB-5132 in Windows... right?
Message 5 of 11

If you use a USB-5133 (the 100 MS/s version) you will get accuracy down to 10 ns. That is 0.01 microseconds.
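The arithmetic behind those figures, as a quick sanity check (the 100 MS/s rate comes from the post above; the rest is just division):

```python
sample_rate = 100e6              # USB-5133 at 100 MS/s
dt = 1.0 / sample_rate           # interval between samples: 10 ns
signal_period = 1.0 / 1e6        # one period of a 1 MHz signal: 1 us
samples_per_period = signal_period / dt
print(dt, samples_per_period)    # 10 ns per sample, ~100 samples per period
```

So an edge of a 1 MHz signal is located to within 10 ns, i.e. about 1% of a period.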

 

The hardware uses its own clock. What the operating system can do is irrelevant.

Message Edited by Troy K on 10-06-2008 10:07 PM
Message 6 of 11

Thanks Troy, I've placed an order for the 5133.

Can you help me with interfacing C code into LabVIEW modules?

With regards,

Siddharth

 

Message 7 of 11
I've called a few DLLs in LabVIEW before, but you'd probably be better off posting another specific question about calling external code if you have one.
Message Edited by Troy K on 10-07-2008 02:54 PM
Message 8 of 11

Oh, by the way... I wasn't recommending what hardware you should be using. I was just giving an example of hardware that could measure much faster than you needed in Windows.

 

The USB-5133 takes analog measurements very quickly, but only at 8-bit resolution (not very good detail).

It may not be what you need.

If you wanted to only take digital measurements, another device might be better suited and/or cheaper.

 

NI has so many options in DAQ devices that you need to compare the device specs with your requirements.

I presume you have more requirements than "measure outputs in usecs".

 

Are you trying to measure a frequency? How accurately? +/- 1%?

Are you trying to capture digital data? Is there a clock signal available to synchronize with?

Maybe something with a timer/counter would be better?

 

Message 9 of 11

Thanks for that info.

I trust +/- 1% will not affect much.

I'm setting a clock of 10 MHz.

What's going to be the problem with the DAQ 5133?

What else would be useful then?

 

Message 10 of 11