LabVIEW


cRIO sensitivity

Hello, everyone.

I need help solving a problem with cRIO sensitivity scaling.

The hardware is a cRIO-9063 with an NI 9232 module.

The connected sensor has a sensitivity of 10 V/g.

When I apply a 1 g sine wave from the sensor calibrator, the stored data is a sine wave of roughly +/-3,000,000.

This was done in the FPGA VI.
However, when I save data using the DAQ Assistant in a Windows VI, the value +/-10 is stored as expected
(10 because the sensitivity is 10 V/g).

 

I am currently using the FPGA with RT, so I would like to apply the correct scaling.

 

What am I doing wrong?

SGL was used so that one FIFO could be shared with other data (to match the number of bits).

Note: when an NI 9234 module is used instead, the output is normal.

(Attachment: SOR.png)

 

 

Message 1 of 2

Reading analog inputs gives you an I16 value.  That has a range of -32768 to 32767.

 

So if that analog input has a range of +/-10 V, you need to divide the number by 65536 and multiply by 20 to get the input in volts.

 

Since the sensor is 10 V/g, you may then need to divide by 10 to get the value in g's.
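
Since the scaling is just arithmetic, here is a minimal Python sketch of the steps described above. The constants (an I16 raw value, a +/-10 V input span, 10 V/g sensitivity) are assumptions taken from this thread, not values read from your actual module, so adjust them to match your hardware.

RAW_FULL_SCALE_CODES = 65536     # total codes for an I16 (-32768..32767)
VOLTAGE_SPAN = 20.0              # +/-10 V input range -> 20 V span
SENSITIVITY_V_PER_G = 10.0       # sensor sensitivity (from the calibrator setup)

def raw_to_volts(raw_count):
    # Divide by the code range and multiply by the voltage span, as above.
    return raw_count / RAW_FULL_SCALE_CODES * VOLTAGE_SPAN

def volts_to_g(volts):
    # Divide by the sensor sensitivity to convert volts to acceleration in g.
    return volts / SENSITIVITY_V_PER_G

raw = 32767                      # example reading near positive full scale
volts = raw_to_volts(raw)
print(raw, "counts ->", round(volts, 3), "V ->", round(volts_to_g(volts), 4), "g")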

Message 2 of 2