08-02-2022 10:32 AM
Hi,
I'm programming an FPGA in execution mode = Simulated (I don't have the hardware yet). This is my first time working with a simulated FPGA; I typically use real hardware.
The final goal is to have the FPGA perform some calculations and send the results to the real-time software (CompactRIO) through a DMA FIFO.
At the moment, my FPGA code consists of a single loop, timed with the Loop Timer VI (see figure below). My problem is that the loop timing seems to be ignored.
I'm using milliseconds as the unit of measurement just for testing.
The measured loop time is correct, in the sense that the loop time setpoint and the measured loop time are equal. However, the Loop Timer seems to be ignored and the loop actually runs much faster than expected (I can see this by watching the "counter" indicator).
I know I'm working with simulated hardware, but as a result the PC's CPU usage is very high and the loop timing is "out of control".
I don't need precise loop timing at this stage of development and testing, but I would at least need to slow the loop down.
Am I missing something?
Thanks in advance
08-03-2022 02:12 AM - edited 08-03-2022 02:15 AM
From the LabVIEW Help ( https://www.ni.com/docs/en-US/bundle/labview-2020-fpga-module/page/lvfpgahelp/running_fpga_vi_on_emu... ):
If you use certain FPGA resources and you execute the FPGA VI in simulation mode using simulated I/O, the resource uses simulated time instead of real time. Simulated time might be faster than real time depending on the number of events that occur during the simulation. For example, if you add a Wait (Simulated Time) VI to the block diagram and set the delay to 1000 ms, LabVIEW does not attempt to delay one second of real time. Instead, LabVIEW delays as long as necessary before executing the next scheduled action in the simulation.
The following resources use simulated time on the host:
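In other words, the simulator is event-driven rather than wall-clock-driven: a 1000 ms wait does not sleep for one real second, it just advances a virtual clock to the timestamp of the next scheduled event. Purely as an illustration of that idea (this is not NI's implementation; the SimClock class below is made up for the example), a minimal Python sketch could look like this:

import heapq
import itertools

class SimClock:
    # Toy discrete-event clock: time is a number we jump forward, not something we wait for.
    def __init__(self):
        self.now_ms = 0                  # virtual (simulated) time, in ms
        self._events = []                # min-heap of (due_ms, seq, action)
        self._seq = itertools.count()    # tie-breaker so actions are never compared

    def schedule(self, delay_ms, action):
        heapq.heappush(self._events, (self.now_ms + delay_ms, next(self._seq), action))

    def run(self, until_ms):
        # Jump straight from one scheduled event to the next -- no real-time
        # sleeping, which is why a simulated loop can run far faster than wall time.
        while self._events and self._events[0][0] <= until_ms:
            due_ms, _, action = heapq.heappop(self._events)
            self.now_ms = due_ms
            action()

# A "loop" that re-schedules itself every 1000 simulated ms,
# roughly analogous to a loop paced by a Wait (Simulated Time).
clock = SimClock()
counter = 0

def loop_iteration():
    global counter
    counter += 1
    clock.schedule(1000, loop_iteration)

clock.schedule(0, loop_iteration)
clock.run(until_ms=10_000)    # 10 s of simulated time...
print(counter)                # ...finishes almost instantly in real time

If the Loop Timer VI is one of the resources that uses simulated time, that would explain the behavior described above: the measured loop time matches the setpoint in simulated time, while in real time the loop runs as fast as the host CPU allows.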