Modeling the analog input of a NI PCI-6221 DAQ device

Hi, everyone!

 

I've been familiar with LabVIEW for a while. My current task is to model the analog input of an NI PCI-6221 DAQ device (later the plan is to simulate the whole device), but I don't have an idea how to get started.

 

The following is to be modeled:

  • operation in the different input ranges (±0.2 V, ±1 V, ±5 V, ±10 V)
  • operation with multiple channels
  • occurrence of inaccuracies (gain error, offset error, non-linearity, noise)
  • buffer operation
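Since LabVIEW block diagrams can't be posted as text, here is a rough Python sketch of how the inaccuracy terms above could be injected into an ideal signal. The function name is my own, the cubic stand-in for non-linearity is a simplification, and the default ppm/µV values are placeholders in the style of the spec table, not authoritative numbers for any particular range:

```python
import numpy as np

def apply_ai_errors(v_true, v_range=10.0,
                    gain_error_ppm=75.0, offset_error_ppm=20.0,
                    inl_ppm=76.0, noise_uv_rms=244.0,
                    rng=None):
    """Apply gain, offset, non-linearity, and noise errors to an ideal voltage.

    Placeholder defaults; substitute the spec values for your range.
    """
    rng = rng or np.random.default_rng()
    gain = 1.0 + gain_error_ppm * 1e-6           # gain error: ppm of reading
    offset = offset_error_ppm * 1e-6 * v_range   # offset error: ppm of range
    # Simple odd-order polynomial as a stand-in for integral non-linearity
    inl = inl_ppm * 1e-6 * v_range * (v_true / v_range) ** 3
    noise = rng.normal(0.0, noise_uv_rms * 1e-6, size=np.shape(v_true))
    return gain * v_true + offset + inl + noise
```

Each term can then be switched on or off independently to check its contribution against the absolute-accuracy formula in the spec.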

So far I have modeled the AI absolute accuracy according to the specification: https://www.ni.com/pdf/manuals/375303c.pdf
I am also attaching a draft of the program, in which quantization and sampling are emulated (using resampling). The Sine Waveform generator is static and behaves strangely.
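For the quantization part that the draft emulates, a minimal textual sketch of what a 16-bit bipolar ADC (as in the PCI-6221) does to each sample is snapping to the LSB grid and saturating at the range limits. The function name and the symmetric code grid are my simplifications, assuming an ideal converter:

```python
import numpy as np

def quantize(v, v_range=10.0, bits=16):
    """Quantize voltages to an ideal bipolar ADC code grid with saturation."""
    lsb = 2.0 * v_range / 2**bits                 # one code width in volts
    codes = np.round(v / lsb)                     # snap to nearest code
    codes = np.clip(codes, -(2**(bits - 1)), 2**(bits - 1) - 1)  # saturate
    return codes * lsb                            # back to volts
```

Comparing the output of this against the resampled waveform is one way to check whether the strange behavior comes from the quantization stage or from the Sine Waveform generator itself.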

 

I would be grateful for any help in creating and improving the VI.

 

Regards
