Real-Time Measurement and Control


FPGA PWM motor control "disabled by drive fault"

Hello Ian,

 

We contacted the motor vendor and found that the start-up current can reach 30 A! I believe Sam is now making sure that the power supply cannot deliver that kind of spike for an equivalent requested torque. I see your comment as being along the same lines, which helps significantly.

 

Sam can update us on where we are.

 

Kind Regards,

Michael S.
Applications Engineer
NI UK & Ireland

Message 11 of 17

Hi Michael,

 

I may have misunderstood your post, but I don't agree that the drive fault is caused by the power supply sourcing too much current; in fact, the opposite.

 

Relying on the power supply to limit the motor current won't work with the 9505 module. The correct way to limit current is with closed-loop control of the PWM duty cycle, based on the drive's current feedback measurement. There are built-in examples in LabVIEW of how to do exactly that.
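The closed-loop approach Ian describes can be sketched in pseudocode terms as a PI loop that adjusts the duty cycle from the measured current. This is only an illustrative Python sketch of the idea, not the LabVIEW example itself; the function name, gains, and loop period are all assumptions.

```python
# Illustrative sketch of closed-loop current limiting: a PI controller
# drives the PWM duty cycle from the drive's measured current.
# All names, gains (kp, ki) and the loop period dt are placeholders.

def current_loop_step(i_setpoint, i_measured, state, kp=0.05, ki=0.5, dt=0.0005):
    """One iteration of a PI current loop; returns a duty cycle in [0, 1]."""
    error = i_setpoint - i_measured
    state["integral"] += error * dt
    duty = kp * error + ki * state["integral"]
    # Saturate the duty cycle and apply simple anti-windup:
    # if we hit a limit, undo this step's integral accumulation.
    if duty > 1.0:
        duty = 1.0
        state["integral"] -= error * dt
    elif duty < 0.0:
        duty = 0.0
        state["integral"] -= error * dt
    return duty

state = {"integral": 0.0}
duty = current_loop_step(i_setpoint=5.0, i_measured=4.2, state=state)
```

Because the duty cycle is computed from measured current every iteration, the loop naturally refuses to command a ratio that would push the current past the setpoint, which is exactly what relying on the supply cannot guarantee.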

 

Regards,

Ian

Message 12 of 17

Hello Ian,

 

I agree, and I do believe that Sam is using the Current Loop example as a starting point, hence looking at other possible sources of the problem.

Let's wait for Sam's feedback.

 

Thank you for your response, it is much appreciated.

 

Kind Regards,

Michael S.
Applications Engineer
NI UK & Ireland

Message 13 of 17

Hi everyone,

 

I've had a lot of thoughts today and a lot is happening. Firstly, I have an idea that my control simulation may be able to limit the voltage. The simulation produces a five-element state vector with the fifth element being the current. If this current rises too high it can be coerced down to, say, under 12 A, and that signal can then be fed into the linear quadratic regulator (LQR) to create a voltage, which in turn creates a PWM signal that will not draw enough current to disable the drive.
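The coercion step Sam describes can be sketched as clamping the fifth state element before the LQR gain computes the voltage command. This is a minimal Python illustration; the gain matrix `K`, the state values, and the 12 A limit are assumptions for demonstration, not numbers from the project.

```python
# Sketch of coercing the current element of the state vector before the
# LQR computes the voltage command, u = -K x.
# The 1x5 gain matrix K and the state values are purely illustrative.
import numpy as np

K = np.array([[1.2, 0.8, 0.5, 0.3, 0.1]])  # hypothetical LQR gain
I_LIMIT = 12.0                              # amps, the limit Sam suggests

def lqr_voltage(x):
    x = x.copy()
    x[4] = np.clip(x[4], -I_LIMIT, I_LIMIT)  # coerce the current element
    return float(-(K @ x))

# A 30 A spike in the state is clamped to 12 A before the gain sees it.
v = lqr_voltage(np.array([0.1, 0.0, 0.05, 0.0, 30.0]))
```

Note the caveat that surfaces later in the thread: clamping a value inside the controller's state limits the voltage it *commands*, but cannot by itself stop the physical motor from drawing a spike faster than the loop can react.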

 

The voltage is not too low: from what I have seen on the power supply, every time the drive has faulted the voltage has been over 6 V. You are right, though, that if I start it or turn it down below 6 V then the drive also disables itself.

 

The thinking at the office is that it may be a good bet to move to current control instead of voltage (velocity) control. This means the dynamic model will have to be changed to exclude the voltage as an input and include the current instead, but it shouldn't be too hard. An inner PID control loop can then be used to limit the current to a set value, and this should work.
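The cascade structure described above can be sketched as an outer balance controller that outputs a current demand, a hard clamp on that demand, and an inner PI loop tracking it. Again a hedged Python illustration: the gains, the 12 A limit, and the loop period are all placeholder assumptions.

```python
# Illustrative cascade: the outer controller demands a current instead
# of a voltage, the demand is hard-limited, and an inner PI loop tracks
# it by adjusting the PWM duty cycle. Gains and limits are placeholders.

def cascaded_step(outer_demand_amps, i_measured, pid_state,
                  i_max=12.0, kp=0.05, ki=0.5, dt=0.0005):
    # Hard current limit on the outer loop's demand.
    i_ref = max(-i_max, min(i_max, outer_demand_amps))
    # Inner PI loop tracks the limited current reference.
    err = i_ref - i_measured
    pid_state["int"] += err * dt
    duty = kp * err + ki * pid_state["int"]
    return max(0.0, min(1.0, duty)), i_ref

# A 30 A demand from the outer loop is clamped to 12 A before tracking.
duty, i_ref = cascaded_step(30.0, 0.0, {"int": 0.0})
```

The advantage over pure voltage control is that the current limit lives in the inner loop's reference, so no outer-loop excursion can command more current than the drive is allowed to carry.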

 

I also have a 200 µH inductor to place in series between the 9505 drive and the motor, if need be, to reduce the "spike" effect of the motor.
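A quick back-of-envelope check shows why a series inductor helps: it limits di/dt to V/L, so a current spike takes a finite time to build, giving the control loop a chance to react. The 24 V bus voltage below is an assumption for illustration only.

```python
# Rough estimate of how the 200 uH series inductor slows a current
# spike: di/dt <= V / L, so ramping from 0 to 30 A takes at least
# L * dI / V. The 24 V bus voltage is an assumed figure.
L = 200e-6   # series inductance, henries
V = 24.0     # assumed voltage across the inductor (full bus)
dI = 30.0    # the start-up spike reported by the motor vendor, amps
t_ramp = L * dI / V  # minimum time for the current to reach 30 A
```

If that ramp time is longer than the current loop's iteration period, the loop can pull the duty cycle back before the spike ever reaches the drive's trip level.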

 

I have also just ordered a more powerful drive. It can receive an analogue signal and generate a PWM output from it. This will be used if other workarounds at lower current cannot be found; the link to the drive is below. The program would then output the desired control voltage through an analogue NI module into this drive.

 

http://www.dimensionengineering.com/Sabertooth2X25.htm

 

What I am working on at the moment is testing the current setup with the current limit in the simulation. However, the simulation, which outputs a duty cycle percentage, has stopped working and just won't produce an output even when fed with values. I have attached my zipped work below, containing the main project, ballbot_fpga. The simulation that won't work is labelled simulation_simple_notworking; I don't know why it fails, so if you could have a look that would be very much appreciated.

The original simulation, called gggggggggg, has a system box and a sensor box which act together to create two sensor values. In the simple simulation the two sensor values are just slider controls that replace these. I used to have the controls outside the simulation loop and run the loop for a short time inside a continuous while loop; this produced a PWM output, but the current seemed inaccurate.

The main simulation, ballbot_control_sensor_simulation, also does not work. When it is turned on there is no motor voltage, and then when a small control input is applied the output from the simulation ramps up until it continuously outputs the maximum duty cycle. The ramp only lasts a second or so. Even when the controls are turned back to zero (i.e. no sensor input, the robot is balanced) the voltage stays high when it should return to zero. It appears to be getting stuck on the value.

 

Sorry for the ultra long update! Any help anyone can give on this problem or the design as a whole is greatly appreciated. Thanks for all your help so far.

 

Many Regards,

 

Sam Jackson

 

Message 14 of 17

Forgot to attach the files!

Message 15 of 17

OK, I've just realised why the simulation "doesn't work". When I put the two sensor values into the control algorithm, it creates a voltage to correct them. On the next loop, the sensor values I'm inputting through the front-panel sliders are still the same, so the system still thinks it is unbalanced and applies more voltage, and so on until it reaches its maximum. In reality the sensor values would decrease and change quickly, so the state space would receive different values representative of what actually happens. When the controls are set manually, rather than changing naturally, the state-space representation of the system is not correct: the corrective action is not helping, so the controller just tries to apply more power!
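The behaviour just described can be reproduced in a few lines: with the "sensor" error held constant by the sliders, any accumulating (integral-like) term in the controller keeps growing, so the commanded voltage ramps to its maximum. The gain, time step, and voltage limit below are illustrative assumptions, not values from the project.

```python
# Minimal reproduction of the ramp-up: a held (constant) error feeds an
# accumulating controller term, so the output climbs to its maximum
# instead of settling. Gain, time step, and limits are placeholders.

def run(held_error, ki=10.0, dt=0.01, v_max=12.0, steps=200):
    integ, v = 0.0, 0.0
    for _ in range(steps):
        integ += held_error * dt   # the error never shrinks: sliders fixed
        v = min(v_max, ki * integ) # output saturates at the maximum
    return v

v_stuck = run(held_error=1.0)  # constant error: output saturates
v_reset = run(held_error=0.0)  # zero error: output stays at zero
```

With a real plant in the loop, the corrective voltage would reduce the error each iteration, the accumulation would stop, and the output would settle instead of saturating.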

 

Having the control sliders outside the loop, and running the simulation quickly inside a continuous while loop, means the state-space control algorithm just gets a value each time, outputs a correcting voltage, and is then "reset" on the next loop, so changing the controls simply changes the desired output.

 

On the real system the first simulation above will work, because the corrective action will reduce or change the sensor inputs so they do not remain constant. The second approach could be used, but it would involve more computing, and it is better to have the simulation run continuously, always producing values, rather than stopping, starting, and resetting.

 

Glad I got that one sorted!

Message 16 of 17

@Ian C wrote:

Hi Michael,

 

I may have misunderstood your post, but I don't agree that the drive fault is caused by the power supply sourcing too much current; in fact, the opposite.

 

Relying on the power supply to limit the motor current won't work with the 9505 module.  The correct way to limit current is with closed-loop control of the PWM ratio (based on the drive's current feedback measurement).  There are built-in examples in LabVIEW of how to do exactly that.

 

Regards,

Ian



Just to update the forums with our progress,

The 9505 will indeed limit the current within a certain range. However, if a current spike of the magnitude of 30 A hits the module between loop iterations, the first thing it worries about is not frying itself, so it will cut off and be unable to regulate the current. The current loop works within reasonable current constraints, and it should not be relied upon to limit the extremely high currents that servo motors tend to draw.

 

For the case of a ballbot (if anyone else is thinking of building one!), stepper motors would probably be better; otherwise, make sure your drive can handle the currents that correspond to the torque demands of the motors.

 

Sam and I are in the process of testing the setup with AO and DIO modules and a different drive instead of the 9505.

 

Hope this helps,

Kind Regards,

Michael S.
Applications Engineer
NI UK & Ireland

Message 17 of 17