Unpacking I32 into 10bit format , then from 10bits to 16bits Integers using MSB


Hi Dave,

 

Thank you so much. I think that's what I'm expecting to see from the data, and your VI works perfectly. Would you mind briefly explaining how you converted that into the waveform graph? (I'm having a difficult time understanding the logic behind converting the packed data to 10 bits and then 16 bits just by reading your VI.)

 

Also, those regular artifacts are actually related to the number of samples recorded and the timing of the triggers in the settings (it isn't missing data).

 

Marcus

Message 11 of 12

Hi Marcus,

 


@hobmarcus wrote:

I'm having a difficult time to understand the logic behind converting the packed data to 10bits then 16bits just by reading your vi


  1. The data is read as I32.
  2. Those I32 values are converted into Boolean arrays and concatenated into one large 1D Boolean array.
  3. That large array is reshaped into a 2D array of N rows and 10 columns.
  4. Each row (= 1D array of 10 columns/elements) is converted into an I16 numeric value.
  5. So you get your array of I16 values.
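The steps above can be sketched in text form like this (Python, since the VI itself is graphical). This is a minimal illustration, not the actual VI: it assumes the bits are packed MSB-first within each I32, which may differ from your hardware's bit order.

```python
# Sketch of the unpacking steps: I32 words -> bit stream -> rows of 10 -> I16.
# Assumption: bits are packed MSB-first within each 32-bit word.

def unpack_10bit(packed_i32):
    # Steps 1+2: turn each I32 into its 32 bits and concatenate them.
    bits = []
    for word in packed_i32:
        bits.extend((word >> shift) & 1 for shift in range(31, -1, -1))
    # Step 3: reshape into rows of 10 bits (leftover bits are dropped).
    n_rows = len(bits) // 10
    # Steps 4+5: convert each 10-bit row into one integer sample.
    samples = []
    for r in range(n_rows):
        value = 0
        for b in bits[r * 10:(r + 1) * 10]:
            value = (value << 1) | b
        samples.append(value)
    return samples

# Example: top 10 bits of the word set -> first sample is 1023, rest 0.
print(unpack_10bit([0xFFC00000]))  # -> [1023, 0, 0]
```

If your device packs LSB-first, reverse the bit order in both loops; the reshape logic stays the same.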

 

(Maybe you need to do some bit shifting on the I16 values to move those 10 bits into the MSB part of the 16 bits. Right now those bits only occupy the LSB part of each I16 value…)
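That bit shift amounts to a left shift by 6, since 10 + 6 = 16. A hedged one-liner sketch (shown here on unsigned 16-bit values; note that LabVIEW's I16 is signed, so values with the top bit set would read as negative):

```python
# Move a 10-bit sample into the MSB part of a 16-bit word: shift left by 6.
def to_msb_16(sample_10bit):
    return (sample_10bit << 6) & 0xFFFF  # mask keeps it within 16 bits

print(to_msb_16(1023))  # full-scale 10-bit value -> 0xFFC0 = 65472
```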

Best regards,
GerdW


using LV2016/2019/2021 on Win10/11+cRIO, TestStand2016/2019
Message 12 of 12