LabVIEW FPGA Idea Exchange


Hi, how about adding the ability to import and export I/O labels in an FPGA/Real-Time project, as shown in the image below, instead of manually renaming each I/O?

 

[Image: IO Label.jpg]

I've searched but can't see anything similar - please add a method for setting the timeout for FPGA nodes, including the Open FPGA VI Reference and FPGA IO nodes.

 

If you disconnect a cRIO FPGA (e.g. an NI 9148) from the network, it takes 20-30 s for an IO node or Open FPGA Reference to execute. This is really bad for the user experience: if the user tries to exit the application during that window, it may take half a minute for the application to actually close. It also means that, in most use cases, you may have to wait that long just to realise your FPGA has disconnected (you can obviously run an external watchdog loop to check that the node is executing in a timely manner).

 

Please allow me to configure the timeouts for these nodes in the same way as the TCP/UDP or VISA nodes. Those nodes operate very similarly to the FPGA nodes (i.e. a hardware device driver that is susceptible to disconnects!), so I don't understand why the FPGA nodes have been left out.

 

I wouldn't mind having to set the timeout as part of opening the FPGA reference, with the other IO nodes then internally reusing the same timeout, as follows:

[Image: 2014-09-29_17-16-29.jpg]
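A rough sketch of the idea in Python pseudo-API form, since the proposal is about a graphical node (all names here are hypothetical, not an actual NI API): the timeout is supplied once when the reference is opened, and every IO call inherits it instead of blocking for 20-30 s on a disconnected target.

```python
import concurrent.futures

class FpgaTimeoutError(Exception):
    """Raised when the target does not respond within the timeout."""

class FpgaRef:
    """Hypothetical reference: the timeout set at open time is
    inherited by every subsequent IO call, as with TCP/VISA nodes."""

    def __init__(self, resource, timeout_ms=500):
        self.resource = resource
        self.timeout_ms = timeout_ms
        self._pool = concurrent.futures.ThreadPoolExecutor(max_workers=1)

    def read_io(self, blocking_read):
        # Run the (possibly hanging) driver call in a worker thread and
        # give up after timeout_ms instead of hanging for 20-30 s.
        future = self._pool.submit(blocking_read)
        try:
            return future.result(timeout=self.timeout_ms / 1000.0)
        except concurrent.futures.TimeoutError:
            raise FpgaTimeoutError(
                f"{self.resource}: no response within {self.timeout_ms} ms")

# Usage (hypothetical driver call):
# ref = FpgaRef("rio://192.168.1.10/RIO0", timeout_ms=500)
# value = ref.read_io(lambda: driver_read("Mod1/AI0"))
```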

 

 

While attempting to debug NI1483 issues, I found it necessary to modify the NI1483 CLIP. In LabVIEW 2014 and earlier, it's not possible to maintain your own IO Module CLIP directory. One must maintain all IO Modules within the IO Module search path (the <National Instruments>\Shared\FlexRIO\IO Modules folder). This can be done by copying an existing IO Module to a new path within that folder and then editing the *.tbc file to rename the "model" key. The main issues with this approach are the potential lack of administrator permissions and the difficulty of maintaining source control in a system directory unrelated to the project.

 

The suggestion is thus:

 

1. Give the user an option to select the path of the IO Module under the IO Module Properties General category (when Enable IO Module is selected).

 

That's it!

 

I often work with the FPGA in hybrid mode because the Scan Interface covers most of the project requirements 90% of the time. When NI added support for the SGL datatype to the FPGA module in 2012 (?), they overlooked user-defined variables. There is currently no built-in support for typecasting an SGL to a U32, so passing SGL data back to the host requires front-panel controls or custom typecasting solutions (see SGL typecast) on both the FPGA and host layers.

 

Please add SGL as an option for user-defined variables.

 

 

What I would like to see when transferring FPGA projects from one target machine to another is not being forced to recompile the VI code when testing the FPGA code and passing parameters via the front panel on the new target. Workarounds like creating a small host front panel to allow parameter changes rather defeat the object of FPGA front-panel execution.

We need a way to simply reinterpret the bits in our FPGAs. I currently have a situation where I need to change my SGL values into U32s for the sake of sending data up to the host. Currently, the only way is to make an IP node. That is just silly. We should be able to use the Type Cast function simply to reinterpret the bits.
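For reference, the requested behaviour is a pure bit-for-bit reinterpretation with no numeric conversion. A minimal Python sketch of the host-side equivalent (values are just for illustration):

```python
import struct

def sgl_to_u32(x):
    # Reinterpret the 32 bits of a single-precision float as a U32.
    return struct.unpack('<I', struct.pack('<f', x))[0]

def u32_to_sgl(u):
    # Inverse reinterpretation for unpacking on the host.
    return struct.unpack('<f', struct.pack('<I', u))[0]

raw = sgl_to_u32(3.14)                 # 0x4048F5C3
assert abs(u32_to_sgl(raw) - 3.14) < 1e-6
```

On the FPGA side the same operation is just a rewiring of the 32 bits, which is why needing an IP node for it feels so heavyweight.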

Hello all.

 

It is time-consuming to have to recompile all of the LabVIEW FPGA code even when there is only a tiny change to it.

I understand that the sampling probe, the Desktop Execution Node, and the simulation tools exist to reduce this time.

 

Our customer in Japan would like to have an incremental compile function in LabVIEW as well (please see below).

I agree with his opinion.

 

http://www.youtube.com/watch?v=9v50uCVdW3o

 

What do you think?

 

Eisuke Ono

Application Engineer at National Instruments Japan.

 

 

 

How amazing would it be to have the ability to visualise resource usage on an FPGA target using a view similar to the one shown above (courtesy of WinDirStat)?

 

I only recently shaved a significant portion off my FPGA usage by discovering that a massively oversized FIFO had been sitting in my code for almost a year without my noticing. I feel that this kind of visualisation (with mouse-over showing what is actually occupying the space), with differentiation between registers, LUTs, BRAM, DSPs and so on, would greatly aid those of us trying to squeeze as much as possible out of our FPGA designs.

 

I think providing this information based on the "estimated resource utilisation", i.e. before Xilinx optimises things away, would be OK. I'm not sure whether the final resource utilisation can be mapped as accurately in this way.

 

It would also be nice to see CLIP utilisation and NI-internal utilisation at a glance, as this apparently differs hugely between targets.

 

Shane.

The Xilinx log window should use a fixed-width font.

 

Which of these two string indicators with identical content is easier to read?

 

[Image: FPGA Xilinx Log font.png]

 

Spoiler: the one on the left is Courier; the one on the right is the default Application font.

When you are using the same code on different boards, it would be a big help if you could set the "FPGA VI Reference" indicator to "Adapt to source". When I use a different DMA on a different target, the wire breaks every time I change targets:

 

My FGV for the FPGA reference looks like this:

 

[Image: FPGA ref Adapt to source.png]

The LabVIEW FPGA module has supported static dispatch of LabVIEW Class types since 2009. This essentially means all class wires must be analyzable and statically determinable at compile-time to a single type of class. However, this class can be a derived class of the original wire type which means, for instance, invoking a dynamic dispatch method can be supported since the compiler knows exactly which function will always be called.

 

http://zone.ni.com/reference/en-XX/help/371599H-01/lvfpgaconcepts/fpgaclassesinvis/

 

This is not sufficient for many applications. Implementations that require message passing or other more event oriented programming models tend to use enums and flattened bit vectors to pass different pieces of data around on the same wire. All of this packing and unpacking can automatically be handled by the compiler if we can use run-time dynamic dispatch to describe the application.
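To make the pain concrete, here is a minimal sketch (in Python, with made-up field widths, since the layout is whatever each project invents) of the kind of hand-rolled packing and unpacking that run-time dynamic dispatch could absorb:

```python
# An enum tag in the top bits of a U32 selects the "class" of message;
# the payload occupies the remaining bits. Every producer and consumer
# must agree on this layout by hand today.
MSG_TYPE_BITS = 4                      # up to 16 message kinds
PAYLOAD_BITS = 32 - MSG_TYPE_BITS
PAYLOAD_MASK = (1 << PAYLOAD_BITS) - 1

def pack(msg_type, payload):
    assert 0 <= msg_type < (1 << MSG_TYPE_BITS)
    assert 0 <= payload <= PAYLOAD_MASK
    return (msg_type << PAYLOAD_BITS) | payload

def unpack(word):
    return word >> PAYLOAD_BITS, word & PAYLOAD_MASK

word = pack(0x3, 12345)
assert unpack(word) == (0x3, 12345)
```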

 

We call for the LabVIEW FPGA module to add support for true run-time dynamic dispatch to take care of this tedious, annoying, and downright boring job of figuring out how to pack and unpack bits everywhere. Who's with me?

When accessing an FPGA control/indicator from the host, a property node is used, and the developer has to pick the right control/indicator from a list.

 

When there are lots of items on the list, it can be a pain to scan it for the right one.

 

At the moment they are listed in the order in which they were created on the FPGA.

 

Could they instead be listed in alphabetical order? Or give the developer the choice?

 

 

I have recently started placing wire labels in my code to keep track of the datatypes of the wires flowing around it. This is very useful for understanding what is going on on the FPGA, since not all grey wires (FXP) are equal and slight mismatches can mean big trouble in the code.

 

As such I tend to label wires like "Phase FXP+25,0" and so on.

 

What I'd love to be able to do is place a format specifier in a wire label to keep such labels up to date. At the moment they are probably more error-prone than anything else, because code changes and bugfixes leave some wires and labels out of sync.

 

If I could set a wire label to "Phase %s" or similar to place the ACTUAL datatype in the label, this would be amazing.

I find myself again and again having to memorise bit-field indices and other details in order to debug FPGA code efficiently. What I would really like to be able to do is create an XControl with a compatible datatype (say, U64) and have it display, and accept input as, human-readable information.

 

The data being transported is simply a U64, and the FPGA code doesn't need to know anything about the XControl itself. Just allow a host visualisation based on an XControl to ease things a bit.
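As an illustration (field names and widths are made up for the example), the human-readable decode such an XControl could present for a packed U64 might look like:

```python
# LSB-first field layout; widths sum to 64 bits.
FIELDS = [
    ("channel", 8),
    ("status", 4),
    ("counts", 24),
    ("timestamp", 28),
]

def decode(u64):
    out, shift = {}, 0
    for name, width in FIELDS:
        out[name] = (u64 >> shift) & ((1 << width) - 1)
        shift += width
    return out

# e.g. decode(0x0123456789ABCDEF) ->
# {'channel': 239, 'status': 13, 'counts': 7903932, 'timestamp': 1193046}
```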

 

I've already started using LVOOP on FPGA and I think this could be another big improvement in the debugging experience.  Having an input XControl (or a set of XControls) for certain LVOOP modules on the FPGA just gets me all excited.

Many data streams contain information for multiple channels or multiple samples. Today one must pack this data into larger integer types or manually interleave it across multiple writes to the DMA FIFO API. It would be much simpler if DMA natively supported cluster and array data types. The local FIFO, Memory, and Register APIs already support this; extend it to DMA.
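To illustrate what "interleave the data manually" costs today, here is a Python sketch of two channels flattened into a single stream of writes; native cluster/array support would carry this structure for us (the channel names and interleave convention are illustrative):

```python
# Two channels of samples flattened into one stream. The reader must
# know the (A, B, A, B, ...) convention out of band.
ch_a = [1, 2, 3]
ch_b = [10, 20, 30]

stream = [s for pair in zip(ch_a, ch_b) for s in pair]
# -> [1, 10, 2, 20, 3, 30], written element by element to the DMA FIFO

# The host un-interleaves by the same hand-maintained convention.
de_a = stream[0::2]
de_b = stream[1::2]
assert (de_a, de_b) == (ch_a, ch_b)
```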

The FPGA compilation results should be copied to a file in the folder with the bitfile. This is needed to track the history of compilation results, which is especially useful when using source code control. Right now they get overwritten with each recompile.

 

Adding a post-build action VI to the FPGA build spec would also enable something like this.
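A minimal sketch of what such a post-build action could do (the paths and report name are hypothetical): copy the compile report next to the bitfile with a timestamp so each build's results survive instead of being overwritten.

```python
import shutil, time
from pathlib import Path

def archive_compile_report(report_path, bitfile_dir):
    # Timestamped copy: the history of compilation results accumulates
    # alongside the bitfile and can be committed to source control.
    stamp = time.strftime("%Y%m%d_%H%M%S")
    dest = Path(bitfile_dir) / f"compile_report_{stamp}.txt"
    shutil.copy2(report_path, dest)
    return dest

# archive_compile_report("builds/xilinx_report.txt", "FPGA Bitfiles")
```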

Hi,

 

I find Conditional Disable Symbols very useful in FPGA code, especially in an R&D environment where needs and code change back and forth rapidly. However, I also find it hard to keep track of all these changes. I propose adding support for reading FPGA Conditional Disable Symbols from the host, to enable VIs like "Is_FPGA_Function_A_Enabled.vi" that would allow the host program to know the state (a hardware revision of sorts) of the FPGA bitfile and adapt.

 

I'll give an example to illustrate this proposal.

 

Function A is implemented in FPGA bitfile v1. Function B is implemented on the host and is based on multiple calls to Function A. Your boss now wants Function B implemented in the FPGA for performance reasons, with a means to disable the code if required. For this you define a Conditional Disable Symbol "FPGA_WITH_FUNC_B" in the FPGA project and write the FPGA code for bitfile v2. Switching between v1 and v2 is easy enough from the project manager, but on the host side there's no way of knowing whether Function B is implemented directly in the FPGA or should be "emulated" via Function A as before. If you could do a check like "if FPGA_WITH_FUNC_B == TRUE", you could easily make the host aware of this.
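A sketch of what the host side could look like, assuming a hypothetical API for reading the symbols out of a bitfile (nothing like this exists today; that is the proposal):

```python
def read_fpga_symbols(bitfile_path):
    # Proposed helper: parse the Conditional Disable Symbols baked into
    # the bitfile's metadata, e.g. {"FPGA_WITH_FUNC_B": "TRUE"}.
    raise NotImplementedError("proposed feature, not a real API")

def function_b(fpga, symbols):
    if symbols.get("FPGA_WITH_FUNC_B") == "TRUE":
        return fpga.function_b()                 # v2 bitfile: native
    # v1 bitfile: emulate Function B via repeated calls to Function A
    return [fpga.function_a() for _ in range(4)]
```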

 

Regards,

solarsd

I am currently running into an issue where I have some constant data I'm trying to write to some DO lines. I want this data to be a constant array on my block diagram, so I create the array programmatically under the "My Computer" target, change the populated indicator to a constant, and move it to the FPGA. When I right-click and set the array to a fixed size, it replaces all my data with zeros. I propose that the data should not be cleared when the fixed size I set equals the size the array already is.

In LabVIEW FPGA 2011, only the base clocks enumerated in the project and clocks derived from the base clock(s) are available in the FPGA Clock Control. I’d like LabVIEW to show the top level clock in this control as well.

 

Consider designs with nested components that both CAN and CANNOT be optimized with the single-cycle timed loop. If the domain of the SCTL does not match the top-level clock domain that contains it, you seem to pay a heavy performance penalty. I presume it’s due to the clock-crossing logic under the hood. Thank you, by the way, for dealing with this for me!  For example, consider this VI:

 

[Image: 2013-03-04_164826.png]

 

The While Loop will take more ticks to execute (a few hundred more in cases I’ve seen) than if the Clock Control constant were set to 200MHz (assuming you could compile). So, just set the TLC and the clock control to be the same, right? Sure, except when you change the top-level clock and, a few hours later when the compile is finished, realize you forgot (gasp) to change a clock constant and the code doesn't meet its timing requirement anymore.

 

Project Clocks:

[Image: 2013-03-04_163653.png]

 

LabVIEW 2011 Behavior:

[Image: 2013-03-04_163818.png]

 

Desired Behavior:

[Image: 2013-03-04_163818b.png]

 

Thanks!

 

-Steve K

This might have already been asked, but I couldn't find any posts.

 

X-Nodes are huge compared to a subVI or almost anything else on the block diagram, so let's shrink them down.

 

Can we remove the Read/Write box?  We already have the little triangle to tell us the function/direction.

Can we use the node name instead of the generic term Data/Element?  It's already there.

 

From there we can model it after a property node, using references instead of error lines, or we can model it after the IO node, which is a little cramped but gets the job done. Both options retain a purple/pink bar to help identify its X-node-iness.

 

[Image: X-nodeShrink.png]