Hi, everyone!
I've been familiar with LabVIEW for a while. My current task is to model the analog input of an NI PCI-6221 DAQ device (later I plan to simulate the whole device), but I don't know how to get started.
The following needs to be modeled (a sketch of the error arithmetic follows the list):
- operation in the different input ranges (±0.2 V, ±1 V, ±5 V, ±10 V)
- multi-channel operation
- sources of inaccuracy (gain error, offset error, non-linearity, noise)
- buffered operation
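
Since G code can't be pasted as text, here is a minimal Python sketch of what I have in mind for one simulated read. The per-range error numbers and the cubic non-linearity shape are placeholders I invented, not values from the spec:

import numpy as np

# Per-range error terms. NOTE: these numbers are placeholders for the
# sketch, NOT values from the PCI-6221 spec; the real ones come from
# the accuracy tables in the specifications PDF.
RANGES = {  # range (V): (gain error, offset error (V), noise rms (V))
    0.2:  (100e-6, 20e-6,  10e-6),
    1.0:  (100e-6, 100e-6, 30e-6),
    5.0:  (100e-6, 500e-6, 150e-6),
    10.0: (100e-6, 1e-3,   280e-6),
}

ADC_BITS = 16  # the PCI-6221 has a 16-bit ADC

def simulate_ai_read(v_true, v_range, rng=np.random.default_rng()):
    gain_err, offset_v, noise_rms = RANGES[v_range]
    # Gain and offset error
    v = v_true * (1.0 + gain_err) + offset_v
    # Toy non-linearity: a small cubic bow (shape is an assumption)
    v = v + 1e-5 * v_range * (v_true / v_range) ** 3
    # Additive Gaussian noise
    v = v + rng.normal(0.0, noise_rms, size=np.shape(v_true))
    # 16-bit quantization over the selected range
    lsb = 2.0 * v_range / 2 ** ADC_BITS
    codes = np.clip(np.round(v / lsb),
                    -(2 ** (ADC_BITS - 1)), 2 ** (ADC_BITS - 1) - 1)
    return codes * lsb

# Multiple channels: one call per channel, each with its own range
readings = {ch: simulate_ai_read(np.linspace(-1.0, 1.0, 5), 1.0) for ch in range(4)}

For the buffered operation I am thinking of a fixed-size FIFO between the simulated ADC loop and the reader (in LabVIEW, a queue or a lossy circular buffer), similar to the DAQmx input buffer.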
So far I have modeled the AI Absolute Accuracy according to the device specifications: https://www.ni.com/pdf/manuals/375303c.pdf
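
If I read the specifications correctly, the absolute accuracy combines a gain error term, an offset error term, and a noise term. This is my reading of the formula, written out in Python (the per-range coefficients come from the table in the linked PDF, so all arguments here are just named parameters, not spec values):

import math

def absolute_accuracy_v(reading_v, range_v,
                        residual_gain_ppm, gain_tempco_ppm_per_c,
                        reference_tempco_ppm_per_c,
                        residual_offset_ppm, offset_tempco_ppm_per_c,
                        inl_ppm, random_noise_vrms,
                        dt_internal_c, dt_external_c, n_avg=100):
    # Error terms are in ppm; the temperature deltas are the change
    # since the last internal/external calibration, in degrees C.
    gain_error_ppm = (residual_gain_ppm
                      + gain_tempco_ppm_per_c * dt_internal_c
                      + reference_tempco_ppm_per_c * dt_external_c)
    offset_error_ppm = (residual_offset_ppm
                        + offset_tempco_ppm_per_c * dt_internal_c
                        + inl_ppm)
    # Noise uncertainty: 3 sigma of the mean of n_avg samples
    noise_v = random_noise_vrms * 3.0 / math.sqrt(n_avg)
    return (reading_v * gain_error_ppm * 1e-6
            + range_v * offset_error_ppm * 1e-6
            + noise_v)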
I am also attaching a draft of the program in which quantization and sampling are emulated (via resampling). The Sine Waveform generator in it is static and behaves strangely.
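
For reference, the draft's sampling and quantization do essentially this (Python sketch; the rates, test frequency, and range are arbitrary values I chose for illustration):

import numpy as np

FS_FINE = 1_000_000   # dense time base standing in for the continuous signal
FS_ADC = 10_000       # simulated ADC sample rate
ADC_BITS = 16
V_RANGE = 10.0

t = np.arange(0.0, 0.01, 1.0 / FS_FINE)
analog = 5.0 * np.sin(2.0 * np.pi * 100.0 * t)   # 100 Hz test sine

# Resampling: keep every (FS_FINE // FS_ADC)-th point, i.e. an ideal
# sample-and-hold ADC with no aperture effects.
sampled = analog[::FS_FINE // FS_ADC]

# Quantization: snap each sample to the nearest 16-bit code in the range.
lsb = 2.0 * V_RANGE / 2 ** ADC_BITS
quantized = np.clip(np.round(sampled / lsb),
                    -(2 ** (ADC_BITS - 1)), 2 ** (ADC_BITS - 1) - 1) * lsb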
I would be grateful for any help with building and improving the VI.
Regards