LabVIEW

Cut-off of scaled signal

Solved!

Hello again.

Guided by one of the good Knights of NI, I used a DAQ Assistant VI to retrieve two signals from a USB-6210 and to scale them to my units. I did it programmatically before, but I still have the same problem. When I supply a voltage signal from my calibrating device in 1 V steps from 0 V to 10 V, everything works correctly (see attached screenshots) as long as custom scaling is switched off in the DAQ Assistant settings. But when I select my Temp scale, prepared as a Map Ranges scale from -40 °C to 125 °C, the result is cut off above roughly 50 °C, even though I supply exactly the same voltage pattern from the calibrator. The cut-off of the scaled signal appears above about 5 V at the input.

The settings seem pretty simple and I do not see much room for a mistake, but somehow it does not work. I tested with only one channel on the device selected, with the same result. I also tested another channel (my preset called Viscosity) and got the same cut-off above 5 V at the input. Both channels are set to differential configuration with an input range of 0-10 V. Any ideas what is going on?

Message 1 of 4
Solution
Accepted by Pikey

Hello, again.

 

     Despite urging from several of us to provide clear and complete information to help you with what should be a simple DAQmx task, you continue to provide minimal information, and rarely any useful LabVIEW code.

 

     Here is a suggestion that will enable us to teach you how to solve your DAQmx issues, but it will involve a little bit of work on your part.

 

  1. We know you are using a USB-6210 and are using it to get a reading of Temperature.  As I recall, the 6210 has A/D inputs that accept voltages in the range -10 V to +10 V.  To read "Temperature", you need to connect some other instrument (presumably a probe) that returns a voltage when exposed to temperatures over some range, say -45 °C to 125 °C.  Please provide us with the specifications of the temperature probe you are using (a copy of its Manual or Fact Sheet would be useful).  How are you wiring it to your 6210?  [Tell us the pins you are using.]
  2. Create a "LabVIEW Forum" project that takes 10 readings of Temperature at 0.1 Hz (temperature doesn't vary very quickly) and saves the readings in a variable "Temperature" (presumably an Array of something). A minimal text-based sketch of this acquisition appears after this list.
  3. Notice I used the word "Project" in the previous point -- you need to create a LabVIEW Project inside of which you have a VI called something like "Read Temperature".  This should have an Error Line running through all of the VIs you put in this Project -- there might only be one function, the Dreaded DAQ Assistant, set up to take 10 readings at 0.1 Hz and otherwise scaled and specified as you need for Temperature.  When you run the program, it should take 100 seconds (each sample takes 10 seconds, and there are 10 samples), so it would be helpful to have a cup of very hot water that you could put the temperature probe into when you start the Program so we can see that you get a changing set of readings.
  4. When you finish the "Read Temperature" routine and see a range of voltages, "preserve" those output values by going to the Edit menu and choosing "Make Current Values Default".  That "preserves" them for us to see.
  5. Here's the most important step -- you need to send the Project Folder (not just the one VI) to the Forum.  Note that the Project folder should contain the LabVIEW VI you just wrote, and a "Project File" (extension .lvproj).  Close the Project, close LabVIEW, and open File Explorer to find the Project Folder.  Right-click it and choose "Send to", and "Compressed (zipped) folder".  This will create a .zip file of the entire Project, which you should attach with your Reply.  Also attach the "Temperature" information mentioned in Point 1, above.
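
For anyone following along in text form, here is a minimal sketch of the acquisition in point 2, written against the nidaqmx Python package rather than the LabVIEW DAQ Assistant (my assumption; the device/channel name "Dev1/ai0" is a placeholder, not from the thread):

```python
# Minimal sketch of the "Read Temperature" loop using the nidaqmx Python
# API (an assumed text-based stand-in for the LabVIEW DAQ Assistant).
# "Dev1/ai0" is a placeholder; substitute your device/channel from NI MAX.
import time

import nidaqmx
from nidaqmx.constants import TerminalConfiguration

temperature = []  # the "Temperature" array from point 2
with nidaqmx.Task() as task:
    # Differential input over the 0-10 V range, as in the original question.
    task.ai_channels.add_ai_voltage_chan(
        "Dev1/ai0",
        terminal_config=TerminalConfiguration.DIFF,  # DIFFERENTIAL in older releases
        min_val=0.0,
        max_val=10.0,
    )
    # 10 software-timed readings at 0.1 Hz: one sample every 10 s, ~100 s total.
    for _ in range(10):
        temperature.append(task.read())
        time.sleep(10)

print(temperature)  # raw volts here; scaling is the subject of the next reply
```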

Bob Schor

Message 2 of 4

After configuring the custom scale, you should set the Scale Input Range.Max = 125 and Scale Input Range.Min = -40.

[attached screenshot: DAQ Assistant signal input range settings]

 

If you leave Max = 10 and Min = 0, those limits are interpreted in your scaled units (degrees Celsius, not volts), so the device will not use the full input range you are expecting: DAQmx picks the smallest hardware range that covers the prescaled equivalent of 0 to 10 °C, which on the 6210 is the ±5 V range, and the reading rails right at 5 V at the input. If you set the signal input range to the range your custom scale actually produces (-40 to 125), the readings will not rail unexpectedly.
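
Here is the same fix expressed with the nidaqmx Python API, as a sketch (my assumption; the original settings live in the DAQ Assistant dialog). The slope and intercept follow from the map in the first post: 0-10 V onto -40 to 125 °C gives 16.5 °C/V with a -40 °C intercept.

```python
# Sketch of the fix: once a custom scale is attached, min_val/max_val are
# expressed in SCALED units (degrees C here), not in volts.
import nidaqmx
from nidaqmx.constants import TerminalConfiguration, UnitsPreScaled, VoltageUnits
from nidaqmx.scale import Scale

# Linear map 0..10 V -> -40..125 degC: slope = (125 - (-40)) / 10 = 16.5
Scale.create_lin_scale(
    "TempScale",
    slope=16.5,
    y_intercept=-40.0,
    pre_scaled_units=UnitsPreScaled.VOLTS,
    scaled_units="degC",
)

with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan(
        "Dev1/ai0",  # placeholder channel name
        terminal_config=TerminalConfiguration.DIFF,
        min_val=-40.0,   # scaled units: prescales to 0 V
        max_val=125.0,   # scaled units: prescales to 10 V
        units=VoltageUnits.FROM_CUSTOM_SCALE,
        custom_scale_name="TempScale",
    )
    print(task.read())  # one reading, already in degrees C
```

With Min = 0 and Max = 10 left in place, the prescaled span is only about 2.4 to 3.0 V, so DAQmx would select the ±5 V range; that is the cut-off seen in the question.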

Doug
Enthusiast for LabVIEW, DAQmx, and Sound and Vibration
Message 3 of 4

I used some of your points, as they were very helpful indeed. Honestly, I'm building my VI for a special, one-time use, so I simply need a working application. I managed by abandoning the built-in scaling tool and simply adding a mathematical formula module to linearly convert the voltage to the units I need. It may not be elegant or correct, but it's fast and effective. I might return to this topic someday, but for now, thank you for the valuable tips.
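
For completeness, that manual conversion amounts to a one-line formula (a sketch; the coefficients assume the 0-10 V to -40-125 °C map from the first post):

```python
# The manual workaround as a plain formula: convert raw volts to degrees C
# in software instead of using a DAQmx custom scale.
def volts_to_degc(volts):
    """Linear map: 0 V -> -40 degC, 10 V -> 125 degC."""
    return 16.5 * volts - 40.0

assert volts_to_degc(0.0) == -40.0
assert volts_to_degc(5.0) == 42.5    # where the railed signal would sit
assert volts_to_degc(10.0) == 125.0
```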

Message 4 of 4