12-28-2006 10:57 PM
12-29-2006 02:54 PM
12-29-2006 11:03 PM
12-30-2006 05:42 AM
01-01-2007 04:38 AM
Rajeevan,
In continuous mode, you have set Samples to Read to 10 with a scan rate of 1 kS/s.
At that setting your read loop cannot keep up with the acquisition, so acquired samples are overwritten in the buffer before they are read, hence the error.
So either increase Samples to Read (e.g., to 100 or 500) or decrease the scan rate to avoid the error you are currently getting.
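To see why those particular numbers cause trouble, here is the back-of-the-envelope arithmetic as a small sketch (the variable names are illustrative, not DAQmx API calls):

```python
# Numbers from the post: continuous acquisition at 1 kS/s,
# reading 10 samples per read call.
scan_rate_hz = 1000        # samples per second per channel
samples_per_read = 10      # "Samples to Read" in the DAQ Assistant

# Time for the buffer to accumulate one read's worth of data:
read_period_s = samples_per_read / scan_rate_hz
print(read_period_s)       # 0.01 -> the loop must execute every 10 ms or data is lost

# Increasing Samples to Read relaxes the required loop rate:
samples_per_read = 500
print(samples_per_read / scan_rate_hz)   # 0.5 -> the loop only needs to run every 500 ms
```

A loop that also updates a UI or writes to disk rarely sustains a reliable 10 ms period, which is why raising Samples to Read (or lowering the rate) makes the overwrite error go away.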
01-10-2007 08:23 AM
01-11-2007 12:43 PM
Hello sfjd.
I think part of the problem lies in your minimum and maximum value selection. They are currently configured for ±6 mV, but the value you expect from the 100 tonne measurement is also 6 mV, so any reading above that value will rail the input. I would recommend widening these limits and seeing whether that gives you more accurate readings. Secondly, I would recommend calibrating the load cells over the full range you wish to measure. This will correct errors in the readings and should help you take a more accurate measurement.
Good luck on your application!
Brian F
Applications Engineer
National Instruments
01-12-2007 07:44 AM
01-15-2007 05:46 PM
Hello Rajeevan.
You are doing the calibration via MAX correctly, and this should be enough to give you the performance you need.
The range set by configuring the Max and Min values determines the input range used for the measurement, so it should bracket the signal you expect.
It is important to make sure the 2 mV/V configuration is accurate, since this is your scaling factor. Even if the pre-scale values you read are completely valid, incorrect scaling will give you incorrect post-scale values.
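To make the role of the mV/V scaling factor concrete, here is a minimal sketch of the full-bridge arithmetic. The function name and the 3 V excitation value are assumptions for illustration; the 2 mV/V sensitivity and 100 tonne capacity come from the thread:

```python
def load_from_bridge_voltage(v_signal_mv, sensitivity_mv_per_v,
                             v_excitation_v, capacity_tonnes):
    """Convert a full-bridge output voltage to load.

    Full-bridge model: V_out = sensitivity * V_exc * (load / capacity).
    The excitation value used below is an assumption, not an NI or HBM spec.
    """
    # Bridge output at rated capacity, in mV:
    full_scale_mv = sensitivity_mv_per_v * v_excitation_v
    return v_signal_mv / full_scale_mv * capacity_tonnes

# 2 mV/V cell with an assumed 3 V excitation -> 6 mV at the 100 t rated capacity
print(load_from_bridge_voltage(6.0, 2.0, 3.0, 100.0))   # 100.0 tonnes
print(load_from_bridge_voltage(0.6, 2.0, 3.0, 100.0))   # ~10 tonnes
```

Note how an error in the sensitivity or excitation term scales every reading proportionally, which is why a wrong mV/V entry corrupts otherwise valid raw voltages.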
Auto-zero mode is a procedure for eliminating offsets generated by the amplifier stage. More information on how to use this feature can be found in the following two KnowledgeBase articles; they also have great information on the general process of setting up a transducer measurement in MAX.
Measuring a Full-Bridge Absolute Transducer with the SCXI-1520 or PXI-4220
http://digital.ni.com/public.nsf/allkb/fd9f22608fdc939a86256b5a0075705d
Removing Large Initial Offset for Load Cell or Strain Measurements with SCXI-1520 or PXI-4220
http://digital.ni.com/public.nsf/allkb/DA2ABCFA3D4C73FD862571080008B8B1
The filter frequency setting places a low-pass filter on the data to remove high-frequency content that is not expected to come from a transducer measurement.
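The SCXI-1520's filter is implemented in hardware; purely as an illustration of what low-pass filtering does to the data, a software single-pole equivalent could be sketched like this (function name and parameter values are hypothetical):

```python
import math

def single_pole_lowpass(samples, cutoff_hz, sample_rate_hz):
    """First-order IIR low-pass: y[n] = y[n-1] + alpha * (x[n] - y[n-1])."""
    dt = 1.0 / sample_rate_hz
    rc = 1.0 / (2.0 * math.pi * cutoff_hz)   # time constant for the cutoff
    alpha = dt / (rc + dt)
    out, y = [], samples[0]                  # seed with the first sample
    for x in samples:
        y = y + alpha * (x - y)
        out.append(y)
    return out

# A constant (DC) signal, like a static load, passes through unchanged:
print(single_pole_lowpass([1.0] * 5, cutoff_hz=10.0, sample_rate_hz=1000.0))
```

The point of the hardware filter is the same: slow transducer signals pass through, while high-frequency noise is attenuated before it reaches the readings.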
As for your LVDT question: how are you constraining the displacement to only 2 mm? If you extend the measurement over a larger distance, does your accuracy improve, or do you still see the 7.5% error? Please also refer to this KnowledgeBase article for some extra information on using the 1540 with an LVDT.
Using the SCXI-1540 with a Device that Requires External Excitation
http://digital.ni.com/public.nsf/websearch/5FB0106294CE483886256A02006091A5?OpenDocument
Finally, what type of chassis are you using? Is it a combo chassis? Are there any other SCXI modules in the chassis along with the 1520 and 1540s?
Have a great day and let us know how your application progresses.
Brian F
Applications Engineer
National Instruments
01-15-2007 08:39 PM
Dear Brian,
The 2 mV/V sensitivity is specified by the load cell manufacturer, HBM, in their data sheet (load cell type C6A). With Autozero mode set to Once and the low-pass filter frequency set to 1 kHz, I read about 850 kg(f) at no load. When the load is increased to 10 tonnes it shows 11.2 tonnes, for 20 tonnes 21.2 tonnes, and so on; but for 50 tonnes it shows 50.3 tonnes and for 100 tonnes 99.8 tonnes. So initially there is a difference of about 1.2 tonnes, yet at some intermediate value the error is negligible. How is this happening? My measurement requires applying only about 10 tonnes of load (I don't have a lower-range load cell, which is why I am using the 100 tonne one). Since the DAQ output shows such erroneous results, how can I depend on this measurement?
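One possible workaround, which does not address the root cause, is to turn the indicated/true pairs quoted above into a piecewise-linear correction table. The sketch below is illustrative only; the function name is made up, but the calibration pairs are the figures from the post:

```python
def correct_reading(indicated, cal_points):
    """Piecewise-linear correction from (indicated, true) calibration pairs.

    cal_points: list of (indicated, true) tuples sorted by indicated value.
    """
    xs = [p[0] for p in cal_points]
    ys = [p[1] for p in cal_points]
    if indicated <= xs[0]:
        return ys[0]
    # Interpolate between the two bracketing calibration points:
    for (x0, y0), (x1, y1) in zip(cal_points, cal_points[1:]):
        if indicated <= x1:
            t = (indicated - x0) / (x1 - x0)
            return y0 + t * (y1 - y0)
    return ys[-1]

# (indicated, true) pairs taken from the post, in tonnes:
cal = [(0.85, 0.0), (11.2, 10.0), (21.2, 20.0), (50.3, 50.0), (99.8, 100.0)]
print(correct_reading(11.2, cal))   # 10.0
print(correct_reading(0.85, cal))   # 0.0
```

Because the error here shrinks from ~1.2 t at low loads to near zero mid-range, a single offset or gain correction cannot fix it; a table like this can, but only if the calibration itself is repeatable, which is the open question in this thread.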
The chassis is a PXI-1052; the DAQ is a multifunction M Series 6251, with SCXI-1520 and SCXI-1314, SCXI-1540 and SCXI-1315.
The problem is that when I repeat the calibration I am not able to reproduce the value I got previously. I am quite frustrated with this measurement. What could be the problem? Is it a problem with the UTM (Universal Testing Machine) that applies the load, with the load cell, with the indicator used to read the load from the UTM (a digital display indicator from HBM), or with the NI hardware? I have not been able to sort it out yet.
Also, I have an experimental programme for which I wrote a VI in LabVIEW 8 that needs to run continuously for two hours. I have 19 channels to read, which I set up in the DAQ Assistant. Will there be any memory problem? Since my Excel file reaches 4 MB after running for almost one minute, I have no idea whether the program will crash or not. Any suggestions?
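As a rough back-of-the-envelope check (pure arithmetic, no NI API involved), the data volume for the run described above works out to:

```python
channels = 19
rate_hz = 1000              # assumed per-channel rate, per the earlier posts
duration_s = 2 * 60 * 60    # two hours

# Total samples acquired across all channels:
samples_total = channels * rate_hz * duration_s
print(samples_total)        # 136,800,000 samples

# Extrapolating the observed ~4 MB of spreadsheet text per minute:
mb_per_minute = 4
print(mb_per_minute * 120)  # ~480 MB over two hours
```

A text/spreadsheet file approaching half a gigabyte is likely to be unworkable (and exceeds what Excel of that era can open), which suggests logging to a binary format or decimating before writing.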
A signal splitter is used to read and display the individual signals. I would like to see the load versus LVDT readings in a graph-like display. One problem is that it shows only connected straight lines rather than a smooth curve. I tried single-acquisition mode as well as continuous mode, but to no avail. I also used a time delay of one second inside the block diagram, with a sampling rate of 1000 and a frequency of 1k. Any help?
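A graph always connects successive points with straight lines, so updating it only once per second will look segmented; plotting more points per update, or averaging blocks of samples before plotting, produces a smoother trace. A minimal block-averaging sketch (illustrative only, the function name is made up):

```python
def block_average(samples, block_size):
    """Average consecutive blocks so the plot shows one point per block.

    E.g. averaging 1000 samples down to one point removes sample-to-sample
    noise, at the cost of one plotted point per block.
    """
    return [sum(samples[i:i + block_size]) / block_size
            for i in range(0, len(samples) - block_size + 1, block_size)]

print(block_average([1, 2, 3, 4, 5, 6], 2))   # [1.5, 3.5, 5.5]
```

In LabVIEW the equivalent is to feed the full acquired waveform (or a Mean of each block) to the graph each iteration instead of a single scalar per second.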
Regards,
Rajeevan