How to Convert 1D Array of Digital Waveform to Decimal Numbers in LabVIEW?

Hello NI Community,

I am working on a LabVIEW application where I acquire digital signals using DAQ Assistant. My goal is to convert a 1D array of digital waveforms into two separate decimal (U32) numbers.

Current Approach:

  • I am reading 16 digital lines (0-15) as a 1D array of digital waveforms.
  • I split the array into two parts:
    • First 8 lines (0-7)
    • Second 8 lines (8-15)
  • I need to convert each 1D subarray into a U32 decimal value.
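Since LabVIEW is graphical, here is a hedged Python sketch of the intended logic only (all names here are hypothetical, not LabVIEW functions): each sample is 16 booleans, one per digital line, with line 0 as the least significant bit; the lines are split into two bytes and each is converted to an unsigned integer.

```python
# Hedged sketch of the split-and-convert step; hypothetical names, not LabVIEW API.

def bits_to_u32(bits):
    """Convert a list of booleans (element 0 = LSB) to an unsigned integer."""
    value = 0
    for i, bit in enumerate(bits):
        if bit:
            value |= 1 << i
    return value

def split_sample(sample16):
    """Split one 16-line sample into two values: lines 0-7 and lines 8-15."""
    low = bits_to_u32(sample16[:8])     # lines 0-7
    high = bits_to_u32(sample16[8:16])  # lines 8-15
    return low, high

sample = [True, False, True, False, False, False, False, False,   # lines 0-7
          False, True, False, False, False, False, False, False]  # lines 8-15
low, high = split_sample(sample)
# low = 0b00000101 = 5, high = 0b00000010 = 2
```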

Issue:

  • The "Digital Waveform to Binary U32" function requires a single digital waveform, not an array.
  • I am unable to directly connect the 1D array of waveforms to the function.

Question:

How can I correctly convert each subarray of digital waveforms into a U32 number?
Are there any LabVIEW functions or workarounds I should use?

Any suggestions or example VIs would be greatly appreciated!
Thanks in advance.

Message 1 of 3

I cannot see your VI unless you do a "save for previous version" before attaching (LabVIEW 2020 or earlier).

 

"Decimal" is a formatting convention, not a datatype.

 

If you have an array and a VI that operates on a single element of the array, just wrap a FOR loop around the operation and let the array auto-index.
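The FOR-loop advice above, translated into a hedged Python sketch: LabVIEW's auto-indexed FOR loop corresponds to iterating over the array and collecting one result per element. `convert_one` here is a hypothetical stand-in for the per-waveform conversion function, not an actual LabVIEW API.

```python
# Hedged sketch: apply a per-element conversion to every element of an array,
# the text equivalent of an auto-indexed FOR loop around the operation.

def convert_one(waveform_bits):
    # Placeholder for the single-waveform conversion (element 0 = LSB).
    return sum(1 << i for i, b in enumerate(waveform_bits) if b)

def convert_all(waveform_array):
    results = []
    for wf in waveform_array:  # one loop iteration per array element
        results.append(convert_one(wf))
    return results

values = convert_all([[True, False], [False, True]])
# values == [1, 2]
```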

Message 2 of 3

You get a chain of digital signals and want to convert that to an integer?

There's a Digital to Boolean Array function. Combine that with Boolean Array to Number.
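In text form, a hedged Python sketch of that chain (digital data → boolean array → number). The bit order is the detail to watch: to my understanding, Boolean Array to Number treats element 0 of the array as the least significant bit, which this sketch assumes.

```python
# Hedged sketch of Boolean Array to Number's behavior (assumption: element 0 = LSB).

def boolean_array_to_number(bools):
    """Build an integer from a boolean array, element 0 as the LSB."""
    return sum(1 << i for i, b in enumerate(bools) if b)

# One sample across 8 digital lines, line 0 first:
sample = [False, True, True, False, False, False, False, False]
value = boolean_array_to_number(sample)  # bits 1 and 2 set -> 6
```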

G# - Award winning reference based OOP for LV, for free! - Qestit VIPM GitHub

Qestit Systems
Certified-LabVIEW-Developer
Message 3 of 3