05-28-2018 09:31 AM
wiebe@CARYA wrote:
@Blokk wrote:
Yep, based on the OP's name, he is from Germany, and the German decimal separator (,) often ruins things. Over the years, my German colleagues have learned that if they want to get help from me, they should use English Windows and English LabVIEW 😄
(I know you can change the decimal separator in the Region settings, or work around it in LabVIEW)...
Programming LabVIEW in Europe 101:
1) Use %.; in Format Into String and Scan From String,
2) set "use system decimal point" to false.
That applies when dealing with files, databases, etc. When you output a string to the user, you should use what the user chose, i.e. the system decimal point. Very confusing in the first few programs you make.
There is an option in LabVIEW to globally ignore the system settings (always use "."), but really everyone should just use 1) and 2). Everyone, not just everyone in Europe.
Yes, IF you are the programmer, then you can easily avoid such issues. But it was not just a single case where I got burned by an "official driver" from a manufacturer that used the wrong decimal point handling, so the driver threw errors under a German OS...
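To make the decimal-separator problem concrete outside LabVIEW, here is a minimal sketch in plain C (the locale name is just an example and has to be installed on the system for the first parse to misbehave):

#include <locale.h>
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    const char *reply = "-50.6531263";    /* instrument replies use '.' */

    /* Parse under a comma-decimal locale: strtod stops at the '.' */
    setlocale(LC_NUMERIC, "de_DE.UTF-8");
    double broken = strtod(reply, NULL);  /* -50.0, the fraction is lost */

    /* Parse locale-independently by forcing the "C" locale first */
    setlocale(LC_NUMERIC, "C");
    double ok = strtod(reply, NULL);      /* -50.6531263 */

    printf("comma locale: %g   C locale: %g\n", broken, ok);
    return 0;
}

That is exactly what the %.; prefix (or disabling the system decimal point) protects you from on the LabVIEW side.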
05-28-2018 09:34 AM
All the fetch commands say:
"Reads the current return loss value. It does not provide its own triggering and so must be used with either continuous software triggering (see ':INITiate[n]:[CHANnel[m]]:CONTinuous?' on page 89) or a directly preceding immediate software trigger (see ':INITiate[n]:[CHANnel[m]][:IMMediate]' on page 88)."
That doesn't explain reading -1.123456 instead of -15.123456, but I guess that was just an artifact of your testing (i.e. a typo)?
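If it helps to see the trigger-then-fetch pattern from that manual text outside LabVIEW, here is a minimal plain-C/VISA sketch of "a directly preceding immediate software trigger" followed by a fetch. The resource string and the exact command spellings are assumptions based on the syntax quoted above, so check them against the manual:

#include <stdio.h>
#include <string.h>
#include <visa.h>   /* NI-VISA / Keysight IO Libraries */

int main(void)
{
    ViSession rm, instr;
    ViUInt32 cnt;
    char reply[256] = {0};
    const char *trig  = ":INIT1:CHAN1:IMM\n";    /* immediate software trigger */
    const char *fetch = ":FETC1:CHAN1:POW?\n";   /* fetch the triggered result */

    if (viOpenDefaultRM(&rm) < VI_SUCCESS) return 1;
    if (viOpen(rm, (ViRsrc)"GPIB0::20::INSTR", VI_NULL, VI_NULL, &instr) < VI_SUCCESS) return 1;

    viWrite(instr, (ViBuf)trig,  (ViUInt32)strlen(trig),  &cnt);
    viWrite(instr, (ViBuf)fetch, (ViUInt32)strlen(fetch), &cnt);
    viRead(instr, (ViBuf)reply, (ViUInt32)(sizeof reply - 1), &cnt);
    reply[cnt] = '\0';

    printf("raw reply: %s\n", reply);   /* compare this to what LabVIEW parses */

    viClose(instr);
    viClose(rm);
    return 0;
}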
05-28-2018 09:40 AM - edited 05-28-2018 09:40 AM
Yes, it was an example, but a real one. I'm actually running the VI: the value displayed on the power meter screen is -50.6531263 dBm, but the indicator shows -5.6531263!
05-28-2018 09:40 AM
@Blokk wrote:
Yes, IF you are the programmer, then you can easily avoid such issues. But it was not just a single case where I got burned by an "official driver" from a manufacturer that used the wrong decimal point handling, so the driver threw errors under a German OS...
Those are often made by interns with 0 experience with LabVIEW, programming, the device, or most likely all of them.
05-28-2018 09:42 AM
@J.Badr wrote:
Yes, it was an example, but a real one. I'm actually running the VI: the value displayed on the power meter screen is -50.6531263 dBm, but the indicator shows -5.6531263!
What does the string indicator say?
05-28-2018 09:48 AM
wiebe@CARYA wrote:
@J.Badr wrote:
Yes, it was an example, but a real one. I'm actually running the VI: the value displayed on the power meter screen is -50.6531263 dBm, but the indicator shows -5.6531263!
What does the string indicator say?
I've never seen a device that removes digits from the middle of values.
Could there be some auto attenuation going on here? Maybe the display shows the absolute power, while the value you read is the attenuated value (e.g. the attenuation would be -45 dB)?
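Just to spell out the arithmetic behind that guess: -5.6531263 dBm - (-50.6531263 dBm) = 45 dB, so a fixed 45 dB offset (or a -45 dB attenuation/correction setting) somewhere in the chain would produce exactly the difference you see.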
05-28-2018 09:54 AM - edited 05-28-2018 10:06 AM
No, I don't think there is an attenuation mode, because I'm only using this device (Agilent 8163B) to acquire power from the power sensor (channel 1). Attached is a capture of the indicator's display format.
05-28-2018 10:11 AM
We need to establish whether the error happens between the string coming in and the numeric value being displayed.
So what is the string data coming in at the time it is converted to a number? It was on the front panel of the VI you sent earlier, but this screenshot doesn't show it.
Are you sure the 2nd digit is missing, or is the displayed value simply a completely different parameter?
I don't see how a missing 2nd digit would ever be useful to anyone, and I find it hard to believe you're the first to notice it.
Are you sure you need to use fetch and pow, and not read/sense or mon/ret?
I'd read the manual over and over; I'm sure the answer is in there somewhere.
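And to answer the "what string is coming in" question definitively, here is a minimal sketch (plain C; the reply string in it is purely hypothetical, not your data) that dumps the raw reply byte by byte, so hidden characters or an unexpected number format become visible:

#include <stdio.h>
#include <string.h>

/* Print every byte of the reply: index, hex value, and the character
   (non-printable bytes shown as '.') */
static void dump_reply(const char *reply)
{
    size_t len = strlen(reply);
    for (size_t i = 0; i < len; i++) {
        unsigned char c = (unsigned char)reply[i];
        printf("%2zu: 0x%02X  %c\n", i, c, (c >= 32 && c < 127) ? c : '.');
    }
}

int main(void)
{
    dump_reply("-50.6531263\n");   /* hypothetical raw reply, not measured data */
    return 0;
}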
05-28-2018 10:27 AM
Yes, the 2nd digit is missing! I'm reading the manual; the answer should be there.
Thank you very much for your interest.
05-28-2018 10:39 AM
Also read the manual from page 195 onward; there is a simple C example there, which also shows the difference between the "Fetch" and "Read" commands. Maybe it can help you implement what you want properly in LabVIEW...
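For what it's worth, the usual SCPI convention (and what the description quoted earlier suggests) is that FETCh only returns the result of the last completed measurement, so it needs a preceding or continuous INITiate, while READ triggers a new measurement and returns its result in one go. In the C/VISA sketch a few posts up, replacing the two writes with a single query along the lines of ":READ1:CHAN1:POW?" (check the exact spelling in the manual) would be the "Read" variant.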