12-07-2009 06:54 AM
Hello,
I have compiled the code below into a .dll using CVI. It converts a hex input string to a floating-point number. Calling this .dll from TestStand 3.5 returns the decimal number 15, while the output in the CVI I/O window is 15.380562. The input is "417616C8". In TestStand, when theFloat is declared as an unsigned 32-bit integer it returns "15"; if assigned to a 32-bit Real Number (float) it returns "IND". Any ideas why I cannot return the whole 15.380562?
Thanks in advance,
Steve
long __declspec(dllexport) HexToFloat(char HexInput[])
{
    float theFloat;

    /* Parse the hex digits directly into the float's storage */
    sscanf (HexInput, "%x", (int *) &theFloat);
    //scanf("%x", (int *) &theFloat);
    printf("Hex input 0x%08X, %f\n\n", *(int *)&theFloat, theFloat);
    return theFloat;
}
12-07-2009 07:56 AM
You have defined the function as type long, so the fractional part is truncated before TestStand ever sees the value. Try using float (or double) instead:
double __declspec(dllexport) HexToFloat(char HexInput[])
JR
12-07-2009 08:20 AM