Instrument Control (GPIB, Serial, VISA, IVI)

Error with Thor Labs piezo controller in LabVIEW

We recently purchased a piezo controller from Thor Labs (model MDT693A, using an RS-232 port).  While attempting to configure it to work with LabVIEW 8.0, something strange happened.  The controller was working fine with a LabVIEW VI that only had basic communication options (Basic Serial Write and Read.vi), so we chose to try a slightly more advanced one (Advanced Serial Write and Read.vi) to see if it would give us more flexibility.  Upon sending the appropriate command to set all 3 voltage channels to 20 volts [AV20\r], all three channels jumped to 158 V.  The controller has 3 settings for the maximum output voltage (controlled by a switch on the back), and this was at the 150 V max setting.
 
Clearly, this was unexpected.  We tried shutting off both the piezo controller and the computer and restarting them several times.  This helped reduce the voltage a little bit--at the 150 V max setting, the X channel is 20 V while the other two are both around 75 V.  We tried different maximum voltage settings, and found interesting results.  At the 100 V max setting, all three are around 50 V.  At the 75 V max setting, all three channels read about 38 V.  This is with all of the front control knobs at their minimums and no external controls.
 
Furthermore, the controller no longer responds to software control, either from within the previously functioning LabVIEW VI (Basic Serial Write and Read.vi) or from the software controller included with the MDT693A.  The error message we receive from the included software is "Error in detecting V limits.  Check connections or Port Settings.  Cycle AC power to console."  We've tried those suggestions--what can we do to correct this problem?  (We are running Windows 2000 Professional and have NI-VISA 3.6 installed, along with LabVIEW 8.0.)
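For anyone trying to reproduce this, the command framing is worth pinning down first. A minimal Python sketch (not LabVIEW; the helper name and range check are illustrative, and the "AV" command with bare carriage-return termination is taken from the post above):

```python
# Sketch of building the MDT693A "set all channels" command as raw bytes.
# The "AV<volts>\r" framing comes from the post above; the helper name
# and the range check are illustrative, not from Thorlabs documentation.

def build_all_voltage_command(volts, v_limit=150.0):
    """Return the serial bytes to set all three channels to `volts`."""
    if not 0 <= volts <= v_limit:
        raise ValueError(f"{volts} V is outside 0..{v_limit} V")
    # Terminate with a bare carriage return, no line feed.
    return f"AV{volts:g}\r".encode("ascii")

print(build_all_voltage_command(20))  # b'AV20\r'
```

Sending exactly the terminator the firmware expects (CR here, not CR/LF) is one of the first things to check when a device misbehaves under a new VI, since the Advanced example VI exposes termination-character settings the Basic one does not.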
 
Appreciatively,
Michael Johnson
Arkansas State University
0 Kudos
Message 1 of 12
(8,480 Views)

I also have problems working with this piezo controller and LabVIEW. If I write something, it doesn't give an error, but when I try to read, it reads back the same characters I have written. If I write the character "I" the first time after switching on the piezo controller, it returns the product information, but only that first time and only with that character.

If you could send me information about how it works, I would be very grateful.

Thanks

0 Kudos
Message 2 of 12
(8,268 Views)

This only happens with LabVIEW. We have never experienced it using LabWindows/CVI or Measurement Studio (Visual Studio C++). There is something in the serial protocol as LabVIEW drives it that corrupts the uController in the MDT693A or the MDT694A. We have tried to reproduce this in our lab and, no matter what we do, have never been able to corrupt the firmware. It happens very rarely, but it does happen, and the uController then needs to be reprogrammed (or, more precisely, recalibrated), which can be done in the field or by sending the unit back.

 

0 Kudos
Message 3 of 12
(7,921 Views)

This is an issue that only happens with a LabVIEW application. We have never seen it happen with an application written in a different programming language. In the application, over the serial protocol, something corrupts the MDT693A uController, and the calibration of the voltage output of one or all channels becomes corrupt. We have tried many times here at the labs to duplicate this error and have never been able to capture the problem. We have made many changes to protect the uController (Atmel 128) from being corrupted. It rarely happens, but once it does, the MDT693A needs to be re-calibrated. This can be done in the field (a DVM is required), or the unit can be sent back to Thorlabs for the calibration process. This may also happen to the MDT694A.

0 Kudos
Message 4 of 12
(7,671 Views)

 

 

Message Edited by leichner@thorlabs.com on 11-02-2009 07:33 AM
0 Kudos
Message 5 of 12
(7,326 Views)

Using serial (RS-232) communication in LabVIEW may, on rare occasions, corrupt the MDT uController. We have literally thousands of these controllers in the field running under LabVIEW, but a few develop this problem. It is something LabVIEW sends, beyond the general read and write protocol, that gets into the uController and corrupts the calibration file. If the HV output at power-up ramps up to half of the Vlimit max, it is an indication that the calibration data is corrupt. The unit can be calibrated in the field, but the best suggestion at this point is to get in touch with Thorlabs and ask for the calibration procedure, or send it back.
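The half-of-Vlimit symptom matches the first post exactly (about 75 V at the 150 V setting, 50 V at 100 V, 38 V at 75 V). As a quick sanity check, that rule of thumb can be sketched in Python (the function name and the 10% tolerance are illustrative choices, not from Thorlabs):

```python
# Sketch: flag the corrupt-calibration symptom described above -- idle
# output sitting near half the voltage limit at power-up, knobs at
# minimum. The tolerance and the function name are illustrative.

def calibration_suspect(measured_v, v_limit, tolerance=0.10):
    """True if the idle output is within tolerance of v_limit / 2."""
    return abs(measured_v - v_limit / 2) <= tolerance * v_limit

print(calibration_suspect(75.0, 150.0))  # True: the reported symptom
print(calibration_suspect(0.5, 150.0))   # False: normal knobs-at-minimum output
```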

0 Kudos
Message 6 of 12
(6,751 Views)

Hello. I have the same problem. How can I solve it?

0 Kudos
Message 7 of 12
(5,892 Views)

Attach the code you are having problems with and a snip of the error.  What version of LabVIEW are you using? What version of VISA is installed?


"Should be" isn't "Is" -Jay
0 Kudos
Message 8 of 12
(5,852 Views)

Two things come to mind.

 

1. It was recently pointed out on info-labview that VISA does something weird with bytes of value 0xFF:

<http://info-labview.org/ILVMessages/2012/10/Info-LabVIEW_2012-10_0057.html>

 

2. Timing.  I have had many headaches with serial port receivers that lose data or overwrite data that comes in too fast.  This is true even with flow control on.  I have a stepper controller for which I had to set arbitrary, large delays to make sure it works most of the time.  Long strings, or many short strings without delays, can be problematic.  Unfortunately, an intermittent error like this is very hard to track down; in one case I reduced the lockup rate of the device by a couple of orders of magnitude but never eliminated it.

 

3. Timing, part deux.  This comes up because I have recently been testing USB-serial devices in volume.  The system response for these is all over the map; the system may delay reads by many milliseconds.  It is unclear whether the system delay then signals a timeout (falsely, I might add) with the characters partially read or not.  For example, when sending 16 M characters and checking for a response, almost all responses come within 13 ms (with 99% of them within 2 ms, which is the latency setting of the USB device).  But there is one at 17 ms, one at 18 ms, and three at 20 ms.  This kind of jitter can be infrequent, but bad if one has not planned for it.
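The 0xFF quirk in point 1 is easy to demonstrate outside VISA. A small Python illustration (not LabVIEW code) of the general class of bug: a byte of value 0xFF survives a latin-1 text layer unchanged but is silently widened to two bytes by a UTF-8 one.

```python
# Sketch: why a 0xFF byte is fragile when binary data passes through a
# text-encoding layer. Latin-1 round-trips the byte cleanly; re-encoding
# the same character as UTF-8 silently becomes two bytes on the wire.

raw = b"\xff"
as_text = raw.decode("latin-1")      # the character 'ÿ'
print(as_text.encode("latin-1"))     # b'\xff'      -- round-trips
print(as_text.encode("utf-8"))       # b'\xc3\xbf'  -- corrupted on the wire
```

Whether this is exactly what VISA does internally is not established here; the linked info-labview post has the details.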

 

I can understand that a non-RT system can have a lot of jitter depending on other system tasks.  But it should not mark the transaction as a timeout when the characters have actually arrived.  The question is: when the VISA read gets around to doing the read, does it check the timeout first and then check for characters, or the reverse?  This was a bug back in LV XX in the old "Serial Port with Timeout" VI that took several versions to get fixed.
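The check order questioned above (characters first, clock second) can be sketched in Python against a fake port. The function, the FakePort class, and all timing values are illustrative; a real pyserial Serial object exposes the same read()/in_waiting interface.

```python
# Sketch: a read that checks for pending characters before declaring a
# timeout, rather than failing on the clock alone. FakePort simulates a
# USB-serial device that delivers its data only after a latency delay.

import time

def read_checking_chars_first(port, nbytes, timeout_s=0.1, poll_s=0.005):
    """Collect nbytes, timing out only if nothing is pending at the deadline."""
    deadline = time.monotonic() + timeout_s
    data = b""
    while len(data) < nbytes:
        data += port.read(nbytes - len(data))
        if time.monotonic() >= deadline:
            if port.in_waiting:          # late arrivals are not a timeout
                data += port.read(port.in_waiting)
            break
        time.sleep(poll_s)
    return data

class FakePort:  # delivers its payload only after `arrive_at` seconds
    def __init__(self, payload, arrive_at):
        self.payload, self.arrive_at = payload, arrive_at
        self._t0 = time.monotonic()
    @property
    def in_waiting(self):
        late_enough = time.monotonic() - self._t0 >= self.arrive_at
        return len(self.payload) if late_enough else 0
    def read(self, n):
        if not self.in_waiting:
            return b""
        out, self.payload = self.payload[:n], self.payload[n:]
        return out

port = FakePort(b"OK\r", arrive_at=0.02)
print(read_checking_chars_first(port, 3, timeout_s=0.05))  # b'OK\r'
```

With the opposite order (clock first), the same 20 ms arrival inside a 50 ms window could be reported as a timeout even though the characters were sitting in the buffer.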

LabVIEW Champion

0 Kudos
Message 9 of 12
(5,833 Views)

3. Timing, part deux.  This comes up because I have recently been testing USB-serial devices in volume.  The system response for these is all over the map; the system may delay reads by many milliseconds.  It is unclear whether the system delay then signals a timeout (falsely, I might add) with the characters partially read or not.  For example, when sending 16 M characters and checking for a response, almost all responses come within 13 ms (with 99% of them within 2 ms, which is the latency setting of the USB device).  But there is one at 17 ms, one at 18 ms, and three at 20 ms.  This kind of jitter can be infrequent, but bad if one has not planned for it.

 


See here for links to get out of that type of hell.  Those latency timers and buffer sizes can be tweaked.


"Should be" isn't "Is" -Jay
0 Kudos
Message 10 of 12
(5,819 Views)