
BNC-2120 with wildly varying temperature

Hello all,
I'm attempting to take temperature readings from a BNC-2120 with a type K thermocouple. The average voltage is about where it should be (~0.0009 V), but the smallest step the reading ever takes is 0.0003 V, which corresponds to roughly 14 degrees F. As a result, when I graph the thermocouple voltage at room temperature for a few minutes, I only ever get three distinct voltage values, each identical to as many decimal places as I display. I doubt this is a hardware fault, so I'm curious what would cause this behavior. I'm using LabVIEW 7 on a Mac running OS 9. I can attach my VI if someone thinks it will help, but I'm skipping that for now because on my setup it's a lot of work.
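For anyone checking my arithmetic, here is the conversion as a rough sketch in C (the ~40 microvolts per degree C figure is the nominal type K sensitivity near room temperature, not an exact value):

```c
/* Rough sanity check: the temperature step implied by the observed
 * 0.0003 V quantization, using the nominal ~40 uV/degC type K
 * sensitivity (a linear approximation, fine near room temperature). */
#include <stdio.h>

int main(void)
{
    const double v_step    = 0.0003; /* smallest observed change, volts */
    const double seebeck_k = 40e-6;  /* type K, approx. volts per degC  */

    double step_c = v_step / seebeck_k;  /* ~7.5 degC  */
    double step_f = step_c * 9.0 / 5.0;  /* ~13.5 degF */

    printf("implied step: %.1f degC = %.1f degF\n", step_c, step_f);
    return 0;
}
```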

Thanks for any help
-Plowman
Message 1 of 9
The BNC-2120 is just a connector block. You haven't said which data acquisition board you're using; that board is what determines the minimum resolvable input voltage change.

Either (1) you need a different DAQ board or (2) you need to set it to a more sensitive range. The specs will tell you.
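To see why the range matters, here is the back-of-envelope resolution math for a 16-bit board (the ranges below are examples only; your board's spec sheet lists the real ones):

```c
/* Back-of-envelope: LSB size of a 16-bit ADC at a few example input
 * ranges, and the type K temperature resolution each implies.
 * Example ranges only -- consult the board's specs for actual values. */
#include <stdio.h>

int main(void)
{
    const double ranges[] = { 10.0, 1.0, 0.1, 0.05 }; /* +/- volts  */
    const double seebeck_k = 40e-6;                   /* V per degC */

    for (int i = 0; i < 4; i++) {
        double lsb = (2.0 * ranges[i]) / 65536.0;     /* 2^16 codes */
        printf("+/-%5.2f V: LSB = %6.1f uV -> ~%.2f degC per step\n",
               ranges[i], lsb * 1e6, lsb / seebeck_k);
    }
    return 0;
}
```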
Message 2 of 9
I have an NI PCI-6052E DAQ card, which should be more than sufficient in terms of resolution. Do you know where to change the range settings in LabVIEW 7? I poked around the DAQ Wizard, and I don't think the device is installed to LabVIEW's liking.

-Plowman
Message 3 of 9
Type K is about 40 microvolts per degree C, and the minimum voltage step on that board is 35 microvolts, if I read the specs correctly. So the board is marginal unless you are satisfied with about 1 degree C resolution.

The full-scale range is an optional input to DAQmx Create Channel (AI-Voltage-Basic).vi.
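For illustration, here is the same idea in the NI-DAQmx C API (a sketch only; DAQmx is not available on Mac OS 9, "Dev1/ai0" is a placeholder device/channel name, and error checking is omitted):

```c
/* Sketch: setting the input range via the min/max values in the
 * NI-DAQmx C API, to show where the range is chosen. The driver
 * picks the tightest hardware range that contains min..max. */
#include <stdio.h>
#include <NIDAQmx.h>

int main(void)
{
    TaskHandle task = 0;
    float64 volts = 0.0;

    DAQmxCreateTask("", &task);

    /* +/-0.01 V covers a type K signal near room temperature. */
    DAQmxCreateAIVoltageChan(task, "Dev1/ai0", "",
                             DAQmx_Val_Cfg_Default,
                             -0.01, 0.01, DAQmx_Val_Volts, NULL);

    DAQmxStartTask(task);
    DAQmxReadAnalogScalarF64(task, 10.0, &volts, NULL);
    printf("reading: %.6f V\n", volts);

    DAQmxStopTask(task);
    DAQmxClearTask(task);
    return 0;
}
```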
Message 4 of 9
+/-1 degree C is fine for what I'm doing, but I still can't figure out how to set the input range to its most sensitive setting.
Message 5 of 9
The range is set with DAQmx Create Channel, just like gwd said; it's the minimum and maximum value inputs. Based on what you provide, DAQmx will automatically set the range to accommodate your signal. There are also min and max values that you can set in the DAQ Assistant.
 
Message 6 of 9
Thanks for the help.

Unfortunately, I'm on Mac OS 9, so the newest DAQ software I can get is NI-DAQ 6.6.1, which I just installed fresh to be sure. The options for my DAQ card in that software are limited, and none of them seems to include a way to change the input range.

Interestingly, one of the applications included with that software is a utility that shows a simple graph of a given channel over time. That graph has exactly the resolution I want, so I know the card can take the reading; I just need to figure out how to get it in LabVIEW. I wish I could upgrade or move this card to a Windows machine, but in the meantime, does anyone have experience doing this sort of thing on older systems?

-Plowman
Message 7 of 9
In Traditional DAQ, you use AI Config, which also has a control called input limits. Turning on Context Help and reading the descriptions of the VI and its controls/indicators is recommended.
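If it helps, here is the same idea sketched in the Traditional NI-DAQ C API, from memory, so verify the names and signatures against the NI-DAQ Function Reference; the device number and gain below are assumptions for illustration:

```c
/* Sketch from memory of the Traditional NI-DAQ C API -- verify against
 * the NI-DAQ Function Reference before relying on it. On E-series
 * boards the gain argument selects the effective input range; e.g. a
 * gain of 100 on a +/-10 V board gives roughly +/-0.1 V full scale. */
#include <stdio.h>
#include <nidaq.h>

int main(void)
{
    i16 device = 1;   /* assumed device number from NI-DAQ config */
    i16 chan   = 0;   /* AI channel wired to the thermocouple     */
    i16 gain   = 100; /* higher gain -> more sensitive range      */
    f64 volts  = 0.0;

    i16 status = AI_VRead(device, chan, gain, &volts);
    if (status == 0)
        printf("reading: %.6f V\n", volts);
    else
        printf("AI_VRead failed, status %d\n", status);

    return 0;
}
```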
Message 8 of 9
I don't have any DAQ hardware, but I just confirmed that NI-DAQ 6.6.1 with LV7 on Mac OS 9 does have a limits input, as Dennis mentioned. Not every function is supported on every board, but you should be able to change the input range on yours. Look up the board's specs to find out what ranges are available.

Lynn
Message 9 of 9