12-01-2005 03:13 AM
You might be better off posting this question in the LabWindows/CVI forum (as this is the LabVIEW one).
In LabVIEW (Traditional NI-DAQ) you pass your expected maximum signal values to the "AI Config.vi" function, and the DAQ device's amplifier is set to the next available range above them - the procedure is probably much the same in CVI (?).
If you don't know the expected signal size, start by specifying a very large range, e.g. +/-10V, take a measurement, and then use the measured value (with, say, a 20% safety margin) to set the range for a further measurement - this should give the best resolution.
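Since the original question is about CVI, here is the same two-pass idea sketched in plain C. The list of available input ranges below is only an assumption (a typical set for an E/M-Series board); substitute whatever ranges your device actually supports.

#include <math.h>
#include <stdio.h>

/* Pick the smallest available input range that still covers the first
   measurement plus a ~20% safety margin. */
static double PickRange(double measuredVolts)
{
    const double ranges[] = { 0.05, 0.1, 0.25, 0.5, 1.0, 2.5, 5.0, 10.0 }; /* +/- V */
    const int numRanges = sizeof ranges / sizeof ranges[0];
    double needed = 1.2 * fabs(measuredVolts);  /* measured value + 20% margin */
    int i;

    for (i = 0; i < numRanges; i++)
        if (ranges[i] >= needed)
            return ranges[i];                   /* smallest range that fits */
    return ranges[numRanges - 1];               /* signal too large: use widest range */
}

int main(void)
{
    /* e.g. a first reading of 0.8 V on the +/-10 V range -> re-measure on +/-1 V */
    printf("Use the +/-%g V range\n", PickRange(0.8));
    return 0;
}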
Mark.
12-01-2005 04:38 PM
Hello milkb1,
To add to Mark's response, in DAQmx you set the gain of your amplifier by specifying the minimum and maximum of your signal when you create your channel with DAQmxCreateAIVoltageChan. If you need to change these limits after creating the channel, you can use DAQmxSetChanAttribute to set the maximum and minimum values. Be aware that you cannot change these values while your task is running, so you will need to stop the task, set the values, and then restart the task.
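Here is a minimal sketch of that sequence in C, assuming a channel name of "Dev1/ai0", a 10-second read timeout, and a 20% safety margin as Mark suggested (all placeholders - adjust for your hardware), with error checking left out for brevity.

#include <NIDAQmx.h>
#include <math.h>
#include <stdio.h>

int main(void)
{
    TaskHandle task = 0;
    float64 reading = 0.0;
    float64 limit;

    DAQmxCreateTask("", &task);

    /* The min/max you pass here determine the amplifier gain; start wide at +/-10 V. */
    DAQmxCreateAIVoltageChan(task, "Dev1/ai0", "", DAQmx_Val_Cfg_Default,
                             -10.0, 10.0, DAQmx_Val_Volts, NULL);
    DAQmxStartTask(task);
    DAQmxReadAnalogScalarF64(task, 10.0, &reading, NULL);
    printf("Coarse reading: %f V\n", reading);

    /* The limits cannot be changed while the task is running:
       stop the task, set the new max/min, then restart. */
    DAQmxStopTask(task);
    limit = 1.2 * fabs(reading);                  /* measured value + 20% margin */
    DAQmxSetChanAttribute(task, "Dev1/ai0", DAQmx_AI_Max,  limit);
    DAQmxSetChanAttribute(task, "Dev1/ai0", DAQmx_AI_Min, -limit);
    DAQmxStartTask(task);

    DAQmxReadAnalogScalarF64(task, 10.0, &reading, NULL);
    printf("Re-ranged reading: %f V\n", reading);

    DAQmxClearTask(task);
    return 0;
}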
Remember that the NI-DAQmx C Reference Help, installed at Start >> Programs >> National Instruments >> NI-DAQ, is searchable and covers all NI-DAQmx functions and properties.