
receive bitfield

Hi GerdW!

 

The prototype of the function I am calling is like this:

 

void ExampleFunction(int inputA, int inputB, myType output);

 

 

The type "myType" is like this:

 

 

typedef struct
{
    int a : 4;
    int b : 8;
    int c : 4;
} myType;

 

This means the type "myType" is 16 bits long (because of the bitfields).

 

If I define it without the bitfields, it looks like the struct below:

 

typedef struct
{
    int a;
    int b;
    int c;
} myType;

 

In this case the type "myType" is 48 bits long, assuming int = 16 bits.

 

When I use the type "myType" without bitfields, LabVIEW can build the shared library call function.

 

When I use the struct with bitfields, LabVIEW can't build the shared library call function.

 

So it is clear that LabVIEW is not able to handle the bitfields.

 

So I am trying to find a way to make LabVIEW handle "myType" with bitfields, because this is the output of the DLL.

 

Message 11 of 21

Hi kito,

 

it seems you know more about C than me ;) Today I learned how to define a bitfield...

 

OK, seriously: how does C store those bitfields?

If it packs all the bits together, you can use my approach from message 4 - you just receive a U16. If it uses individual ints with a limited number of bits, you can still access them separately in LabVIEW.

 

All you need to know is how your C compiler stores the bitfields in memory. Then mimic that memory layout with a LabVIEW cluster (or a single int, if possible) and you're done...

Message Edited by GerdW on 11-02-2009 04:29 PM
Best regards,
GerdW


using LV2016/2019/2021 on Win10/11+cRIO, TestStand2016/2019
Message 12 of 21

Hi GerdW,

 

LabVIEW cannot create a cluster like that. Do you have an alternative to the cluster?

 

Regards

 

Kito

Message 13 of 21

Hi kito,

 

"It means the type "myType" is 16 bit long (because of the bitfields)."

 

Use a U16! And the code from message 4...

Best regards,
GerdW


Message 14 of 21

Hi GerdW, I tried that but LabVIEW cannot recognize the bitfields... and as I told you, the code can't be changed.

In my opinion LabVIEW can't handle data at the bit level. Otherwise the wizard would have succeeded in doing that.

Anyway... thanks for your help.

 

Regards

 

Kito

Message 15 of 21

I'll step in and ask ...

 

What if you wrap the call to this DLL with another DLL that reconstructs the inputs and outputs into data types that LV can handle?

Message 16 of 21

That's what I will do... but I really wish LabVIEW could handle bitfields...

Again, thanks for your help.

 

Regards

 

Kito

Message 17 of 21

Hi kito,

 

Have you only tried the wizard, or have you really tried to receive a U16? No, LabVIEW doesn't recognize the bitfields - it's up to you to extract the needed bits from the U16 (as shown in message 4).

Otherwise try RavensFan's suggestion!

Message Edited by GerdW on 11-02-2009 10:05 PM
Best regards,
GerdW


Message 18 of 21

Hi GerdW,

 

I tried both and had no success.

 

By the way, where is RavensFan's approach?

 

Regards

 

Kito

Message 19 of 21

Hi kito,

 

Read message 16.

Make a wrapper that splits your bitfield into 3 separate outputs...

 

Btw, what does "no success" mean? LV crashing? Unexpected output? Have you examined the output in binary display?

Message Edited by GerdW on 11-02-2009 10:13 PM
Best regards,
GerdW


Message 20 of 21