How does LabVIEW convert string to char* when passing them to a Call Library Node?

I have a program which calls C++-based libraries which themselves use external libraries (Qt). I pass strings to my library through the Call Library Node and was wondering how the formatting works. I have to interpret the char* as Latin-1, UTF-8, UTF-16, ... to convert it to a string which the Qt libraries understand. I need to use char* instead of LabVIEW strings, because the library is independent of LabVIEW.

It seems that interpreting the char* as Latin-1 (the default) does not work on Korean systems (for one of our customers), which is understandable when you know that the Latin character set has no Korean characters. Does anyone know how the char* should be interpreted then? In other words, for non-Latin languages, what exactly is passed to the DLL?
Message 1 of 4
I don't think that we reinterpret your string in any way; we just reformat the data so that it can be passed to your DLL.
 
So assuming you are getting text from, say, keyboard editing, the text should be in the ANSI codepage that the system is running under. That is, if you are running Windows with an English locale, it will be codepage 1252 (Windows Western, Latin-1); if you are running Windows with a Korean locale, IIRC it will be codepage 949, Unified Hangul Code.
 
If you are filling the string with data that you get from an instrument or some other fashion, it could really be anything.
 
Here is a list of the codepages that Windows knows about
 
 
 
I do have some experience with Qt as well, and I think that the function you are looking for to create, say, a QString from this is:
 
QString::fromLocal8Bit
 
but I am not 100% certain about this as I am not a Qt expert.
 
Jeff Peters
LabVIEW R & D


Message Edited by jpeters on 04-02-2008 12:38 PM
Message 2 of 4
I already thought about using QString::fromLocal8Bit(), but then I have to change the code in quite a lot of places (I construct my objects directly from a char*). In fact, the string is a file path, and Korean characters in the path cause the problem.

I was thinking about creating a QTextCodec object based on the type (Latin-1, UTF-8, EUC-KR, ...) and installing it with the static method QTextCodec::setCodecForCStrings(). If I do this, QString(const char*) should use the right character set. I'll wait for some feedback from the customer; otherwise I might need to go with the fromLocal8Bit option.
Message 3 of 4
fromLocal8Bit seems to do the trick, and the changes to the code were not so big. Thanks for the help.
Message 4 of 4