
Reading RPM through FPGA and control from RT system

I have the following setup.

One front motor provides the input RPM, and one servo motor controls the speed ratio between the front motor and the output shaft (rear RPM). I am now using the servo to control the ECVT so that the output RPM at the rear shaft stays fixed. My program only activates the logic that corrects the output RPM when the input RPM is within the ECVT's speed ratio range (0.9 to 2.6).

I have one hall effect sensor to read the RPM of the output shaft, with one pulse per revolution.

I am using an NI 9401 to read the sensor (the rear RPM). Through the FPGA I have been able to read the RPM with pretty good accuracy at low speed (i.e. when the RPM is not changing too quickly), but when the RPM changes quickly the reading fluctuates a lot (sometimes by an error of 300 RPM).

Again, I am using logic to control the servo that sets the speed ratio between the front and rear shafts, but the response time and error correction of this logic depend heavily on the precision of the rear RPM measurement. So a stable RPM reading is crucial to the whole control. I have tried using the Mean PtByPt VI with a sample size of 100 to average the RPM, but I still have not been able to solve the measurement issue. (You can refer to the pictures below; I ran some tests and captured the graphs for analysis, but the results are not convincing: sometimes the control works fine, other times it just behaves randomly.)

My logic also does not appear to respond well to the RPM changes.

The other issue is with taking the MEAN of the output RPM: with a small sample size the response time is acceptable but the readings are not very precise, while with a larger sample size the readings are more stable (though still not very precise), and the response time and effectiveness of the ECVT control become noticeably slower and less effective.
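To make that tradeoff concrete, here is a rough stand-in for what the averaging does (plain Python, not my actual LabVIEW code; the noise level, step size, and window lengths are invented numbers for the demo, not measurements from my rig):

```python
# Toy model of the averaging tradeoff: a running mean (a stand-in for Mean PtByPt)
# applied to a noisy RPM signal with a step change. All numbers are made up.
import random
from collections import deque
from statistics import pstdev

def running_mean(samples, window):
    """Moving average over the last 'window' samples."""
    buf, out = deque(maxlen=window), []
    for s in samples:
        buf.append(s)
        out.append(sum(buf) / len(buf))
    return out

random.seed(0)
true_rpm = [1500] * 500 + [2100] * 500                 # 600 rpm step halfway through
noisy = [r + random.gauss(0, 80) for r in true_rpm]    # invented measurement noise

for window in (10, 100):
    filt = running_mean(noisy, window)
    ripple = pstdev(filt[300:500])                     # leftover noise before the step
    settle = next(i for i, v in enumerate(filt[500:]) if v >= 1500 + 0.9 * 600)
    print(f"window={window:3d}: ripple ~{ripple:4.0f} rpm, "
          f"{settle} samples to reach 90% of the step")
```

A small window follows the step quickly but keeps most of the noise; a large window smooths the noise but lags the step by roughly its own length, which matches what I see on the rig.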

What I am trying to ask is: is it possible that my code is not responding well because the RPM reading is imprecise, or should I move on to some other logic such as PI, PID, or fuzzy control? Could anyone please provide some advice on this issue?

1. How can I get a better RPM reading (i.e. a more stable reading without losing too much precision)?

2. Does the control logic need to change, or is the problem really just the bad RPM reading?
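For reference, the PI option I mention above would look roughly like this. This is only a sketch in Python, not LabVIEW code; the gains, the use of the 0.9 to 2.6 ratio limits as the controller output, and the sign of the correction are placeholders that would need tuning (and possibly inverting) on the real ECVT.

```python
# Minimal PI sketch for commanding the ECVT speed ratio from the rear-RPM error.
# Gains, limits, and the RPM-to-ratio mapping are placeholders, not tuned values.
class PIController:
    def __init__(self, kp, ki, out_min, out_max):
        self.kp, self.ki = kp, ki
        self.out_min, self.out_max = out_min, out_max
        self.integral = 0.0

    def update(self, setpoint_rpm, measured_rpm, dt_s):
        error = setpoint_rpm - measured_rpm
        self.integral += error * dt_s
        out = self.kp * error + self.ki * self.integral
        if out > self.out_max or out < self.out_min:
            self.integral -= error * dt_s          # crude anti-windup when saturated
            out = min(max(out, self.out_min), self.out_max)
        return out

# Hypothetical use: hold the rear shaft at 1800 rpm, update every 10 ms.
pi = PIController(kp=0.002, ki=0.001, out_min=0.9, out_max=2.6)
ratio_cmd = pi.update(setpoint_rpm=1800, measured_rpm=1650, dt_s=0.01)
print(f"commanded speed ratio: {ratio_cmd:.3f}")
```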


Hi kumar,

 

So a stable RPM reading is crucial to the whole control. I have tried using the Mean PtByPt VI with a sample size of 100 to average the RPM, but I still have not been able to solve the measurement issue.

You calculate your RPM value by analyzing a Boolean signal from your hall sensor.

This is done in a loop running at 1 kHz, so Nyquist says you can detect at most 500 pulses per second!

Additionally you apply a running average over 100 samples, so you will have a delay of ~100 ms in your mean value!

Why don't you apply the servo control logic in the FPGA? It seems pretty easy to set some boolean flags to control your servo…
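As a rough illustration of the boolean-flag idea (written as Python pseudocode only for readability; on the FPGA this is just two comparison nodes and two Boolean wires, and the 50 rpm deadband is an example value):

```python
# Example of simple flag logic with a deadband so the servo does not chatter.
def servo_flags(measured_rpm, target_rpm, deadband_rpm=50):
    speed_up  = measured_rpm < target_rpm - deadband_rpm   # rear shaft too slow
    slow_down = measured_rpm > target_rpm + deadband_rpm   # rear shaft too fast
    return speed_up, slow_down

print(servo_flags(1700, 1800))   # (True, False)  -> step the servo one way
print(servo_flags(1810, 1800))   # (False, False) -> inside the deadband, hold
```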

 

1. How can I get a better RPM reading (i.e. a more stable reading without losing too much precision)?

Well, it would help to get more pulses per revolution from your hall sensor!

For simple applications I like to have 24 ppr; other applications even use encoders with 4096 ppr!
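To put rough numbers on it: if the RPM is estimated by counting pulses over a fixed window (one common approach; I don't know your exact FPGA code, and the 100 ms window here is just an example), then one pulse more or less in the window corresponds to 60 / (ppr × window) RPM:

```python
# One miscounted pulse in a fixed counting window corresponds to this many RPM.
# The 100 ms window is only an example figure, not taken from the original VI.
WINDOW_S = 0.1

def rpm_per_count(ppr, window_s=WINDOW_S):
    return 60.0 / (ppr * window_s)

for ppr in (1, 24, 4096):
    print(f"{ppr:4d} ppr: one pulse in a {WINDOW_S*1000:.0f} ms window "
          f"is worth ~{rpm_per_count(ppr):7.3f} rpm")
```

With one pulse per revolution a single missed or extra pulse already shifts the reading by hundreds of RPM, which is about the size of the jumps you describe; with 24 ppr the same mistake is only a few RPM.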

 

- Why are there two controls named "Front RPM"? Why not use just a slider with its numeric display set to visible???

- Why are there indicators without a label?

Best regards,
GerdW


using LV2016/2019/2021 on Win10/11+cRIO, TestStand2016/2019

I was able to get rid of the unwanted jumps in the reading by adding a rule-based filter. It is probably not the best approach, but with the limited resources I have, it works pretty well.
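For anyone reading this later: a filter of this kind basically rejects readings that jump implausibly far from the last accepted value. The sketch below (Python for readability, not my LabVIEW code) shows the general idea; the 200 rpm limit and the three-sample escape are example values, not exactly what I used:

```python
# Rule-based spike rejection: drop readings that jump too far from the last
# accepted value, but give in after a few consecutive rejections so a genuine
# fast change is not blocked forever. Threshold values are examples only.
def rule_filter(readings, max_jump_rpm=200, max_rejects=3):
    accepted, last, rejects = [], None, 0
    for rpm in readings:
        if last is None or abs(rpm - last) <= max_jump_rpm or rejects >= max_rejects:
            last, rejects = rpm, 0   # plausible reading (or persistent change): accept
        else:
            rejects += 1             # implausible jump: hold the previous value
        accepted.append(last)
    return accepted

# A single spike to 1830 is held off, but a sustained change eventually passes:
print(rule_filter([1500, 1520, 1830, 1540, 1900, 1910, 1905, 1915]))
```

The downside is that a genuine fast change is delayed by a few samples before the filter accepts it, so the thresholds have to be chosen with the control loop's response time in mind.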

Thank you for your suggestions. In the future I will keep in mind to use an encoder with higher resolution.
