Sven Bone wrote:
> The DLL reads some parameters from an INI file and converts the read
> strings with the "atof" function to a double value which is used
> internally in the DLL. That works perfectly in the C environment. But
> when I use the same DLL in LabView, the conversion from a string to a
> double parameter fails. All values are rounded (e.g. 0.85 --> 0 or
> 1.34 --> 1).
Seems like a decimal point problem to me. Are you sure the string passed
to the DLL is in the correct format? If that is the case, your locale
settings are probably the issue. Not sure what ANSI says, but the MSVC
runtime libraries certainly use the localized number format settings.
You can change those in Control Panel->International Settings. If this
difference is happening on the same computer, you are in for some
serious head scratching as to why that DLL uses a different atof
implementation when tested from C than from LabVIEW.
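For what it's worth, here is a minimal C sketch (not from your DLL, just
an illustration) of how atof's result depends on the LC_NUMERIC locale.
The locale name "German" is an MSVC-style name and an assumption on my
part; any comma-decimal locale shows the same effect:

#include <locale.h>
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    /* In the default "C" locale the period is the decimal point. */
    setlocale(LC_NUMERIC, "C");
    printf("\"C\" locale:    atof(\"0.85\") = %g\n", atof("0.85"));   /* 0.85 */

    /* In a comma-decimal locale the period stops the parse right after
       the integer part, so only the leading 0 is converted. */
    if (setlocale(LC_NUMERIC, "German") != NULL)  /* MSVC locale name, assumed */
        printf("German locale: atof(\"0.85\") = %g\n", atof("0.85")); /* 0 */

    return 0;
}

If that is indeed the cause, calling setlocale(LC_NUMERIC, "C") in the
DLL before parsing (or using a parse routine that ignores the locale)
should make the conversion behave the same in both environments.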
> I have no idea what's going wrong. Another problem is that I can't
> debug the DLL in the LabView environment, which makes it very
> difficult to locate the bug.
Well, you can debug in LabVIEW 6.0.x. After that, setting the VI to
non-debug is supposed to generate normal exceptions instead of catching
them in LabVIEW's own exception handler, but somehow it does not seem to
work that way for me.
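Apart from that, you can usually attach the MSVC debugger to the running
LabVIEW.exe process and set breakpoints in the DLL source, provided the
DLL was built with debug information. A hard-coded breakpoint also
works; this is just a sketch, and the exported function name
ReadIniValue is my invention, not something from your DLL:

#include <windows.h>
#include <stdlib.h>

/* Hypothetical export; substitute your real DLL function. */
__declspec(dllexport) double __cdecl ReadIniValue(const char *text)
{
    DebugBreak();       /* Win32 call: traps into any attached debugger */
    return atof(text);  /* the conversion under suspicion */
}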
>
> Thanks for your help in advance.
Rolf Kalbermatter
DEMO, Electronic and Mechanical Support department, room 36.LB00.390