Real-Time Measurement and Control


Why is FPGA so slow?


Hi, I'm pretty new to Real-Time and FPGA programming. I've been reading a lot about speed issues, such as certain modules slowing down the rest unless they get their own loops, and using FIFOs to pass data faster.

 

But for now, I'm just doing simple reads and a simple LED command. If you look at the code I've attached, you'll see the reads in one loop and the LED command in its own loop. The LED should just turn on/off every half second (the Tick Counts are set to mSec, 8-bit). But when the code runs on my FPGA, the LED only flashes every 12.5 seconds. That's 25 times slower than it should be.

 

I also have a tick counter sending its value to my real-time code, which has its own tick counter, and both are getting logged. When I look at the two tick counters' data side by side, the real-time tick counter's time is perfect, but the FPGA's tick counter time is way slower. It ends up being 25 times slower.

 

Why is the FPGA reporting that it's running so slowly? Is it actually going slow? Or are the tick counters not working correctly? Am I misunderstanding how this works? Even if slower I/O modules were dragging the FPGA down, the tick counters should still report the exact time, right?

 

Does any of this explain why I'm getting this behavior?

 

thanks,

-Jes


Hi Jes,

 

You'll need to configure your Tick Counts to be at least 16 bits to avoid overflowing the tick count (a U8 wraps at 255) before you reach 500 msec. I don't understand, though, how you can ever get into the LED-toggling case with this code, since a U8 can never be greater than 500. Is this the exact code you ran the test with?
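A quick sketch of the wraparound arithmetic (plain Python here purely to illustrate the U8 behavior; the actual code is a LabVIEW FPGA diagram):

```python
# An unsigned 8-bit (U8) counter wraps modulo 256, so it can never reach 500.
def u8(value):
    """Simulate an unsigned 8-bit tick counter."""
    return value % 256

elapsed_ms = 500                 # the intended half-second target
print(u8(elapsed_ms))            # 244, not 500 -- the count has wrapped
print(u8(elapsed_ms) > 500)      # False: the LED-toggle condition never fires

# A 16-bit (U16) counter tops out at 65535, comfortably above 500:
print(elapsed_ms % 65536 > 499)  # True once 500 ms have elapsed
```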

 

Jim


Jim, sorry about that. Back when I ran the tests, I had those Tick Counts set to Ticks with 32-bit selected, and I had the 500 wait time set to 500000000. I just tested that code now, and it's not even blinking at all. I'll try setting them back to 16 bit for now and see what happens. But with the new info I just mentioned, should it still have been running 25 times slower?

 

thanks,

-Jes


Jim, 

 

I just ran the code, switching only the Tick Counts to 16 bit, and the LED is flashing perfectly! Thanks!

 

But back when I had Ticks selected, 32-bit chosen, and the wait time at 500000000, why would it not have worked correctly? Maybe there is something about Tick Count that I don't understand.

 

thanks,

-Jes

Solution (accepted by topic author Jesse_Larsen)

Hi Jes,

 

Given those numbers, I think what you were seeing is correct. Assuming a 40 MHz clock, the calculation is:

 

500 Mticks / 40 Mticks/sec = 12.5 sec.

 

The 32-bit setting doesn't affect the timing behavior; it just controls how high the tick counter can go before rolling over.
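The arithmetic can be checked quickly (plain Python used just for the numbers; 40 MHz is the default FPGA clock rate assumed above):

```python
# Wait value interpreted in Ticks, running against a 40 MHz FPGA clock
wait_ticks = 500_000_000
clock_hz = 40_000_000            # 40 MHz default FPGA clock

half_period_s = wait_ticks / clock_hz
print(half_period_s)             # 12.5 seconds per LED toggle

# Compared with the intended 0.5 s half-period:
intended_s = 0.5
print(half_period_s / intended_s)  # 25.0 -- the observed slowdown factor
```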

 

Jim

 


Jim,

Yes! That is what was confusing me! I was not considering 40 Mticks per second. Now my calculations all make sense. Thank you!

-Jes
