11-02-2009 09:21 AM
Hi GerdW!
The prototype of the function I am calling is like this:
void ExampleFunction(int inputA, int inputB, myType output);
The type "myType" is like this:
typedef struct
{
int a :4;
int b :8;
int c :4;
} myType;
It means the type "myType" is 16 bits long (because of the bitfields).
If I define it without the bitfields, it looks like the struct below:
typedef struct
{
int a;
int b;
int c;
} myType;
In this case the type "myType" is 48 bits long, assuming int = 16 bits.
When I use the type "myType" without bitfields, LabView can build the shared library call function.
When I use the struct with bitfields, LabView can't build the shared library call function.
So it is clear that LabView is not able to handle the bitfields.
That is why I am trying to find a way to make LabView handle "myType" with bitfields, because this is the output of the DLL.
11-02-2009 09:27 AM - edited 11-02-2009 09:29 AM
Hi kito,
it seems you know more about C than me
Today I learnt how to define a bitfield...
Ok, seriously: how does C store those bitfields?
When it packs all the bits together, you can use my approach from message 4 - you just receive a U16. When it uses single ints with a limited number of bits, you can still access them separately in LabView.
All you need to know is how your C compiler stores the bitfields in memory. Then mimic that memory layout with a LabView cluster (or a single int, if possible) and you're done...
11-02-2009 09:38 AM
Hi GerdW,
LabView cannot create a cluster like that. Do you have any alternative to the cluster?
Regards
Kito
11-02-2009 09:40 AM
11-02-2009 02:41 PM
Hi GerdW, I tried that, but LabView cannot recognize the bitfields... and as I told you, the code can't be changed.
In my opinion LabView can't handle data at the low (bit) level. Otherwise the wizard would have succeeded in doing that.
Anyway... thanks for your help
Regards
Kito
11-02-2009 02:57 PM
I'll step in and ask ...
What if you wrap the call to this dll with another dll that reconstructs the inputs and outputs into datatypes that LV can call?
11-02-2009 03:00 PM
that's the way I will do it... But I really wish LabView could handle bitfields...
Again, thanks for your help
Regards
Kito
11-02-2009 03:01 PM - edited 11-02-2009 03:05 PM
Hi kito,
have you only tried the wizard, or did you really try to receive a U16? No, LabView doesn't recognize the bitfields - it's your turn to get the needed bits from the U16 (as shown in message 4).
Otherwise, try RavensFan's suggestion!
11-02-2009 03:07 PM
Hi GerdW,
I tried both and had no success with either.
By the way, where is RavensFan's approach?
Regards
Kito
11-02-2009 03:11 PM - edited 11-02-2009 03:13 PM
Hi kito,
read message 16.
Make a wrapper that splits your bitfield into 3 separate outputs...
Btw. what does "no success" mean? LV crashing? Unexpected output? Have you examined the output in binary display?