LabVIEW


FPGA 7833R: Monitoring data coming from DMA FIFO in the Host VI

Hello to everyone:

I'm running a PID loop on an NI-7833R FPGA and monitoring it with a waveform chart in the host VI. The monitoring shows odd behaviour: there are three signals to plot (error signal, control signal, and setpoint), and the colors continuously swap among the three traces, except for certain values of the "Number of points" control.

Could someone here tell me how to sort this out?

I attach both block diagrams: the host VI and the FPGA VI.

Thank you, and sorry for the watermark spam from the screen-capture software. :)




Message 1 of 7
One more comment:

The ad sitting in the middle of the FPGA VI screenshot hides the digital PID VI, which was taken from the LabVIEW 8.0 examples.

Thank you again.
Message 2 of 7
Hi Zermelo-
 
The problem you are seeing is due to the FIFO overflowing on the FPGA side.  The short story is that when the FIFO fills up, the FIFO write times out and a data point is lost on the FPGA side.  The host side, however, does not know this and does not compensate for the missing point when decimating the array, so the interleaved channels shift.
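To see why a single lost point makes the trace colors swap, here is a minimal Python sketch (the `decimate` helper and the channel labels are hypothetical, standing in for Decimate 1D Array on the host):

```python
def decimate(flat, n_channels=3):
    """Split a flat interleaved stream into per-channel lists,
    the way Decimate 1D Array does on the host side."""
    return [flat[i::n_channels] for i in range(n_channels)]

# A clean stream: error, control, setpoint interleaved per iteration.
clean = ["E0", "C0", "S0", "E1", "C1", "S1", "E2", "C2", "S2"]
print(decimate(clean))
# -> [['E0', 'E1', 'E2'], ['C0', 'C1', 'C2'], ['S0', 'S1', 'S2']]

# Now drop one point (a timed-out FIFO write on the FPGA): C1 is lost.
lossy = ["E0", "C0", "S0", "E1", "S1", "E2", "C2", "S2"]
print(decimate(lossy))
# -> [['E0', 'E1', 'C2'], ['C0', 'S1', 'S2'], ['S0', 'E2']]
```

After the lost point, every later sample lands in the wrong channel, which is exactly the color-swapping effect on the chart.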
 
For a more detailed explanation see this post:
 
 
As stated in that post, the easiest way to fix the problem is to set a timeout of -1 on the FIFO write on the FPGA. This guarantees that no data is lost; however, the write will then block whenever the FIFO is full, so your timing on the FPGA will be affected.
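As an analogy (in Python, with `queue.Queue` standing in for the DMA FIFO; this is not the LabVIEW API), the difference between a timed and a blocking write looks like this:

```python
import queue

fifo = queue.Queue(maxsize=4)   # stand-in for the FPGA-side DMA FIFO
dropped = []

# Timed write: when the FIFO is full the write gives up, and the sample
# is silently lost -- the host never learns there is a gap in the stream.
for sample in range(6):
    try:
        fifo.put_nowait(sample)   # like a FIFO write with a zero timeout
    except queue.Full:
        dropped.append(sample)    # lost point; host decimation now shifts

print(dropped)  # -> [4, 5]

# A blocking write (timeout = -1 in LabVIEW) would instead be
# fifo.put(sample): it waits until the consumer frees space, so nothing
# is lost, but the producer loop stalls -- that is the timing impact.
```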
 
If you still have questions after reading that post, let me know-
 
Dustin
Message 3 of 7
Hello again:

I've read the discussion, and I must say that I had already set both FIFO timeouts (read and write) to -1. The "Number of elements" is set to the maximum FIFO size, which is 32767; the "elements remaining" indicator shows values close to zero, and the "full" indicator is always false.

I cannot afford to slow down the FPGA, since I need the PID running at maximum speed, and the alternative solution proposed in the attached example may stall the loop, which is not acceptable in my case, since I need deterministic behaviour.

Is there any way to verify that the problem really is an overflow on the FPGA side?


Message 4 of 7

Zermelo,

What is the acquisition and PID loop rate you are running on the FPGA? If that rate is very high, you could edit your FPGA code so that only every Nth set of data is passed to the DMA FIFO, reducing the amount of data passed to the host. Even if you reduce it by a factor of 10 you will still have enough data on the host to see what is going on in your control loop.
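A hypothetical sketch of that decimation scheme in Python (the `produce`, `pid_step`, and `fifo_write` names are made up; on the FPGA this would be a counter and a case structure around the FIFO writes):

```python
# N-fold decimation on the producer side: the PID still executes every
# iteration, but only every Nth set of points goes to the FIFO.
N = 10

def produce(iterations, pid_step, fifo_write, n=N):
    for i in range(iterations):
        error, control, setpoint = pid_step(i)  # PID runs at full rate
        if i % n == 0:                          # forward 1 iteration in n
            fifo_write(error)
            fifo_write(control)
            fifo_write(setpoint)

sent = []
produce(100, lambda i: (i, 2 * i, 42), sent.append)
print(len(sent))  # -> 30: a tenth of the iterations, 3 points each
```

Note that the three points of one set are still written together, so the interleaving the host relies on is preserved.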

Secondly, you should try to reduce the amount of processing on the host side, so that the host can keep up with reading and processing the DMA data and avoid DMA buffer overflows.

1. Move the code that calculates the PID gains and output range outside of the While loop.

2. To calculate the setpoint and PID offset, use a simple Multiply node (multiply by 3276.8) instead of the Expression Node.

3. Modify the code that processes the DMA data: use the Reshape Array function to convert the 1D array directly into a 2D array, then Transpose the array and use a single Divide node to scale the data (instead of the Expression Node). See the following example.
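Since the attached image is not reproduced here, a hypothetical NumPy sketch of step 3 (the 3276.8 scale factor is taken from point 2; the array values are made up):

```python
import numpy as np

# Hypothetical values; the real data comes from the DMA FIFO read.
raw = np.array([0, 16384, 32767, 1, 16385, 32760], dtype=np.int32)

SCALE = 3276.8  # counts per unit, as in point 2 above

# Reshape the flat interleaved read into (points, channels), transpose
# to (channels, points), and scale everything with one Divide.
channels = raw.reshape(-1, 3).T / SCALE
print(channels.shape)  # -> (3, 2): three channels, two points each
```

The reshape and transpose replace a per-element loop with two cheap whole-array operations, and the single divide replaces a per-element Expression Node call.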

 

Message Edited by Christian L on 02-21-2007 10:22 AM

authored by
Christian L, CLA
Systems Engineering Manager - Automotive and Transportation
NI - Austin, TX


  
Message 5 of 7
Hello Christian.

The PID loop period is 6 µs.
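For reference, a quick back-of-the-envelope calculation of the data rate implied by a 6 µs loop period (assuming all three points are written to the FIFO every iteration, as in the original VI):

```python
loop_period_s = 6e-6          # 6 us per PID iteration
points_per_iter = 3           # error, control, setpoint

points_per_s = points_per_iter / loop_period_s
print(points_per_s)           # -> 500000.0 points per second

# A 32767-element FIFO therefore buffers only about 65 ms of data,
# so the host read loop has very little slack before an overflow.
buffer_time_s = 32767 / points_per_s
print(round(buffer_time_s * 1000, 1))  # -> 65.5 ms
```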

I agree with points 2 and 3 of your explanation, but concerning the first one: I have to modify the gains during the tuning process, so I don't think I can move the gain-calculation code outside the while loop.

But still, do you think the problem I have is an FPGA-side overflow?

Let me thank you both for the discussion.


Message 6 of 7
Hi again:

I was looking at the code of the attached example, and I don't understand why you advise me to convert the 1D array to 2D and then transpose it. How does this save processing on the host?

Forgive me, but I'm quite new to LabVIEW as well.

Thanks!
Message 7 of 7