10-28-2015 01:40 PM - edited 10-28-2015 01:44 PM
I'm trying to understand how the MIN and MAX values within LabVIEW are calculated, and whether they are accessible (I am coding in .NET, but it works the same way). For instance, on my UI I am using a Linear Scale (custom scale) and currently have the user enter the Slope and Y-Intercept, then the MIN and MAX. I would prefer to compute the channel's MIN and MAX from the calculated linear scale, eliminating the need for the user to enter them manually on the UI.
For testing purposes I calculated the Slope and Y-Intercept for a 0-30 mm scale, shown in the attached JPEG. The correct MIN and MAX values would be 0 (MIN) and 30 (MAX). I purposely entered a very high MAX in order to throw a DAQ exception.
How are the MIN and MAX calculated within the hardware? Is there an easy way for me to get the values by invoking methods or accessing certain properties? I suppose it's not that big a deal for the user to enter the MIN and MAX manually, but I would like to automate that part if I can.
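For what it's worth, if you know the sensor's prescaled (electrical) output range, the scaled MIN and MAX fall out of the linear scale directly: just apply y = slope * x + y_intercept to both prescaled endpoints. Here is a minimal driver-agnostic sketch in Python (the 0-10 V prescaled range is an assumption for illustration; substitute your sensor's actual output range):

```python
def scaled_range(slope, y_intercept, prescaled_min, prescaled_max):
    """Apply a linear custom scale y = slope*x + y_intercept to the
    prescaled endpoints (e.g. volts) and return (min, max) in scaled units."""
    a = slope * prescaled_min + y_intercept
    b = slope * prescaled_max + y_intercept
    # min/max ordering matters when the slope is negative
    return (min(a, b), max(a, b))

# Hypothetical example: a 0-10 V sensor mapped to 0-30 mm
# gives slope = 3 mm/V, y-intercept = 0 mm
print(scaled_range(3.0, 0.0, 0.0, 10.0))  # (0.0, 30.0)
```

You could then pass the computed pair as the channel's minimum/maximum when you create the virtual channel, instead of asking the user for them.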
10-29-2015 07:35 AM
The MIN and MAX settings are used by the hardware to select its input range, so that you get the best resolution for your measurement. Yes, you can set those values programmatically: in LabVIEW through DAQmx Channel property nodes, or in the .NET API through the minimum/maximum arguments when you create the channel (or the channel's corresponding properties afterward).
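To illustrate what the driver does with those settings: it coerces your requested MIN/MAX to the smallest hardware input range (gain setting) that still covers them. A rough sketch of that coercion, using a hypothetical set of bipolar ranges (the actual ranges depend on your device; check its specifications):

```python
# Hypothetical bipolar input ranges in volts, as +/- values
# (assumed for illustration; real devices publish their own list)
DEVICE_RANGES = [0.2, 1.0, 5.0, 10.0]

def pick_input_range(req_min, req_max, ranges=DEVICE_RANGES):
    """Return the smallest +/-range covering the requested limits,
    mimicking how a driver coerces Min/Max to a gain setting."""
    need = max(abs(req_min), abs(req_max))
    for r in sorted(ranges):
        if need <= r:
            return r
    # analogous to the DAQ exception the poster saw with an oversized MAX
    raise ValueError("requested limits exceed the device's largest range")

print(pick_input_range(-0.5, 0.5))  # 1.0
```

This is why a tighter MIN/MAX gives better resolution: the full ADC code span is spread over a smaller voltage range, and why an oversized MAX raises an error once no hardware range can cover it.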
God Bless