LabVIEW Idea Exchange


There is analytic software that adds connections to Excel.  JMP comes to mind: it allows two-way data transfer, and it lets Excel run some of JMP's analysis routines.  Those packaged routines are vetted by PhDs and can't be quickly damaged by a tired user. 

 

This details how JPMorgan lost a few billion dollars because they were using Excel for hard math, and someone had an error in an equation.

 

Excel is likely more widely used around the world than either JavaScript or SQL. 

 

It could be very powerful to be able to import an Excel spreadsheet, specify inputs, specify outputs, and have it translated to G.  This could allow it to run thousands of times faster, and the visual language would allow a technical vetting that is very hard for humans to do on hundreds and hundreds of pages of tables of numbers and formulas. 
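To make the idea concrete, the first step would be harvesting every formula and its cell address from the workbook, which is exactly the raw material a translator to G would need. A rough sketch, using Python and openpyxl only as a stand-in (the file name "model.xlsx" is made up, and nothing here is LabVIEW-specific):

```python
# Walk a workbook and collect every formula with its cell address -- the raw
# material that a hypothetical Excel-to-G translator would start from.
from openpyxl import load_workbook

wb = load_workbook("model.xlsx", data_only=False)   # data_only=False keeps formulas
formulas = {}
for ws in wb.worksheets:
    for row in ws.iter_rows():
        for cell in row:
            if cell.data_type == "f":                # 'f' marks a formula cell
                formulas[f"{ws.title}!{cell.coordinate}"] = cell.value

for address, formula in sorted(formulas.items()):
    print(address, formula)
```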

 

This would be a motive for banks like JPMorgan (or their peers) and their multi-billion-dollar business units to use LabVIEW.  Stunning speed, optimization, and scale.  Stunning accessibility for error correction.  Packages that, once wrapped, stay wrapped, but can plug in where they need to.

 

Other add-ins or connectors for Excel:

The proposed connector is unlike the TDM importer in that it sends data to LabVIEW, the data is transformed or processed there, and a numeric, text, or image result is returned to Excel.
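As a rough illustration of that round trip (Python with the xlwings add-in standing in for the LabVIEW side; the function name remote_mean is made up): Excel hands a range to the external engine, the engine processes it, and a single numeric result lands back in the calling cell.

```python
import numpy as np
import xlwings as xw

@xw.func
@xw.arg("data", np.array, ndim=2)    # the selected Excel range arrives as a 2D array
def remote_mean(data):
    # Process the data outside Excel and return a numeric result to the cell.
    return float(np.nanmean(data))
```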

When you have a probe to see the data in an array, you see all elements in one row (in the Value column) and just one element in the Probe Display area. If you have a large array, it is difficult to get a good overview.

 

My suggestion is that in the part of the probe window called Probe Display, you should be able to drag the bottom right corner and see several elements simultaneously.

 

Probe 4.png

 

An extra feature would be to also show the array size somewhere in the probe window.

The Report Generation Toolkit includes Excel Easy Table, which allows either text (2D string arrays) or numeric (2D DBL arrays) data to be written to Excel.  The function is written as a polymorphic VI to handle the two types of input.  However, when processing numeric input, an inner function called "Append Numeric Table to Report (wrap)" converts the numeric data to a string array using the format string %.3f.  This is, in fact, a control input of that function, but its caller does not wire the input, forcing the numeric data to be truncated to three decimal places.

 

I suggest that the default either be changed to %.15f (or something similar) to preserve the precision of the input data, or that the format string be "brought out" to the user (though there are no free connector-pane slots) to allow the user to control the precision.
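For anyone who has not hit this, the same format codes behave the same way in any text-based language; a quick Python illustration of what the hidden %.3f default throws away:

```python
x = 3.141592653589793
print("%.3f" % x)     # 3.142             -- everything past three decimal places is gone
print("%.15f" % x)    # 3.141592653589793 -- the precision of the double is preserved
```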

It would be useful if the VI Analyzer had a test, under the Style section, that checked for overlapping objects on the block diagram. This would help check readability and catch some mistakes. For example, it is possible to copy and paste While Loops on top of each other. VI Analyzer should be able to tell you that there are two While Loops overlapping, to help with style and debugging. 

Hey there. I'm working on a transparency VI which overlays two images (U32 RGB and U8 grayscale with a user palette). After I found out about the resample functionality, I thought the transparency issue, or at least parts of it, would be easy. But this was not the case. I'm missing the following functionality:

Merging two images or ROIs with a transparency factor (the blend math is sketched below)
Extracting the color image from a grayscale single image with a user palette (my solution attached)
Multiplying a color image by a floating-point constant (my solution for the U32 RGB array attached); integer values are not suited for small numbers
Subtracting a color image FROM a constant without creating an image from the constant / image inversion (does not work with my U32 RGB images)

I've got my own solution, but I'm convinced it is slow.
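For the first item, the blend itself is just a weighted sum of the two images; a rough sketch of that math (NumPy here, not an IMAQ call, and the function name is mine):

```python
import numpy as np

def blend(img_a, img_b, alpha):
    """Merge two color images (H x W x 3, uint8) with a transparency factor 0.0..1.0."""
    out = alpha * img_a.astype(np.float64) + (1.0 - alpha) * img_b.astype(np.float64)
    return np.clip(out, 0, 255).astype(np.uint8)
```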


I only mean that this should apply to the subVIs that ship with LabVIEW. I was putting together a VI that is execution-time sensitive. I had a choice between IMAQ Histogram and IMAQ Histograph. I could get the result I needed from either one, but I was forced to try each, run it a few times, and clock each one. There are many such "which of these two similar options is fastest" choices we make for every program, and knowing the answer up front would be very helpful.
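Until that information is published, the only option is the ad-hoc benchmark described above; a sketch of that pattern (Python, with two stand-in routines rather than the IMAQ VIs):

```python
import timeit
import numpy as np

test_image = np.random.randint(0, 256, (480, 640), dtype=np.uint8)

def option_a(img):                      # stand-in for the first similar routine
    return np.bincount(img.ravel(), minlength=256)

def option_b(img):                      # stand-in for the second similar routine
    return np.histogram(img, bins=256, range=(0, 256))[0]

for name, fn in [("option A", option_a), ("option B", option_b)]:
    best = min(timeit.repeat(lambda: fn(test_image), number=100, repeat=5))
    print(f"{name}: {best / 100 * 1e3:.3f} ms per call")
```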

As the title says.

 

Double-clicking a *.rsl file should open the VI Analyzer results window.

 

We have a CI server that runs VI Analyzer and posts the .rsl files as artifacts. Downloading an .rsl file to my code base works great for finding and fixing errors. The only thing missing is the double-click.

The current error case structure only allows two states when the error cluster is wired to it: "No Error" or "Error".

 

My suggestion is to allow any number of cases that depend on manually defined error codes (see attached picture). The error case structure must be enhanced so that error codes can be treated separately in individual cases.

 

Previously, to handle a specific error code, the code first had to be read from the error cluster and then wired to the case selector. This step would be eliminated.
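In text-language terms the proposal is simply branching on the error code directly instead of unbundling it first; a rough analogue (Python 3.10+, and the specific codes are only placeholders):

```python
def handle(error):
    # error mirrors the LabVIEW error cluster: (status, code, source)
    status, code, source = error
    if not status:
        return "No Error case"
    match code:                      # one case per manually defined error code
        case 7:
            return "file-not-found handling: " + source
        case 1 | 8:
            return "invalid-input / file-permission handling"
        case _:
            return f"default Error case for code {code}"
```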

 

Optimized_Error_Case.png

I very much like the formula parse and evaluate VIs. For me, writing a formula is easier, and I make fewer mistakes writing formulas than wiring numeric nodes, especially when the formula is taken from the literature.
Unfortunately, the parsed formula is much slower than using standard numeric nodes. Browsing through the formula VIs, I noticed that the formulas are parsed down to the same standard numeric nodes (add, subtract, etc.). Still, the formula parsing method is much slower because of the many case structures that have to be executed before arriving at the level of the numeric building blocks.
I think, from where the formula parsing VIs are now, it would be feasible to have them generate VIs using only numeric nodes, so the parsed formulas would have the same performance as standard LabVIEW mathematics. The best solution would be to include this in the building/compiling of the code.
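The same idea exists in text-based tools: parse the formula once, then emit a plain compiled function instead of re-walking the parse tree on every call. A small sketch of that pattern using Python/SymPy as a stand-in (the formula is just an example):

```python
import numpy as np
from sympy import symbols, sympify, lambdify

a, b, x = symbols("a b x")
expr = sympify("a*sin(x) + b*x**2")      # parse the formula text once

f = lambdify((a, b, x), expr, "numpy")   # emit a plain numeric function, no tree-walking

xs = np.linspace(0.0, 10.0, 1_000_000)
y = f(2.0, 0.5, xs)                      # runs at ordinary NumPy speed
```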

 

Arjan

 

 

The idea is to change the Equal? function so that it is configurable and can act, with one input, as the "Equal To 0?" function. Sometimes you need to evaluate the number of loop iterations in a While Loop (or elsewhere), and when you drop the standard Equal? function, some of the wires will not be aligned in a straight line (either the one connected to the index output or the one connected to the loop condition), and you need to move one of the terminals up or down.

You can see it in the attached picture.

Idea.PNG

I'm currently trying to simulate figure two in the paper 'An Electronically Controllable Capacitance Multiplier with Temperature Compensation'.  Any assistance would be much appreciated!

 

 

It would help if the Diagram Disable Structure had an optional "Selector Terminal" so that it could be controlled programmatically. This could be an optional selection in the right-click context menu.

 

Connecting an Enable/Disable constant or control to a Case Structure would not serve the same purpose as the Diagram Disable Structure's ability to enable or disable individual subdiagrams.

 

The Diagram Disable Structure has unique behaviour; if this selector terminal were available as an option, it would add even more value.

 

As a programmer, I have felt the need for this to be controllable: while developing big code bases we use a lot of Diagram Disable Structures, and when one needs to be changed to Enabled or Disabled, I have to search around my code for it. A controllable option would help me work more efficiently, especially while testing code. 

yrfj.png

 

Can NI give LabVIEW developers an option to use a straight line for the plot legend rather than zig-zag lines? See the attached illustration. It would also be great if the legend customization could include separating the legend lines and expanding them, so that plot labels can be relocated on top of those legend color lines.

 

Anthony Lukindo

 

 

 

Hi!

 

I am not sure whether anyone has posted this requirement before, but I would like to have zoom-in/zoom-out functionality in LabVIEW.

 

Sometimes, when code becomes too crowded, it's difficult to analyse which wire is going where.

It would be nice if we could write more information to the channels saved in a UFF file. Some information that is read in from a TDMS file is lost when using the VIs to write out a UFF file (Universal File Format, ASCII or binary).

For instance, I process an image and get overlay results (even without knowing exactly which overlay objects were drawn), and I want to copy this image with its overlay into a bigger image at a specified offset...

IMAQ ImageToImage 2 does not seem to do the job...
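For the pixel data alone, the copy is just an offset assignment; a minimal NumPy sketch of what I am after (the function name is mine, and it ignores the overlay, which would first have to be flattened into the image):

```python
import numpy as np

def paste(dst, src, top, left):
    """Copy src (H x W x 3) into dst at the given offset, clipping at the borders."""
    h = min(src.shape[0], dst.shape[0] - top)
    w = min(src.shape[1], dst.shape[1] - left)
    dst[top:top + h, left:left + w] = src[:h, :w]
    return dst
```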

Wouldn't it be nice to just check or uncheck existing subVIs of parent classes in the Inheritance dialog box, since all of the subVIs in child classes have to have the same name anyway, instead of manually adding them one by one?

Even with only a few VIs and a few child classes, the work saved would be huge.

 

PS: I did not find a similar idea here, so bear with me if it has already been put up. The AES guy at the UGM did not know of anything easier than copy and paste at the level of Windows Explorer and later adding the files to the project.

It would be very useful if the same QuickDrop plugin, with the same shortcut, could act depending on the object selection we have made on the block diagram or the front panel.

 

For example:

- Imagine a "Ctrl+C" shortcut; this would be useful for lots of QuickDrop plugins that come to mind:

  • Copying the text of a Bundle By Name node to the clipboard.
  • Copying the text of an Unbundle By Name node to the clipboard.
  • Copying a selected case to the clipboard.
  • etc.

 

It would be useful to have a configurable tool for generating swept values, mainly for RFSA/RFSG.

 

This is actually present in SignalExpress: you can insert and even nest the "Sweep" step there, but as far as I can see, SigEx doesn't have support for RFSA/RFSG.

So the next thing that came to mind was to export the configured sweep as LabVIEW code from SigEx. Unfortunately, the resulting Express VI cannot be converted into a regular subVI.

 

It would be very useful for debugging and prototyping to get this feature!
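For now I generate the swept values by hand; a minimal sketch of what such a tool would produce (NumPy as a stand-in, the start/stop/points values are made up, and the loop body is where the RFSG/RFSA calls would go):

```python
import numpy as np

start_hz, stop_hz, points = 1.0e6, 2.0e9, 201   # hypothetical sweep configuration

linear_sweep = np.linspace(start_hz, stop_hz, points)                    # even steps
log_sweep = np.logspace(np.log10(start_hz), np.log10(stop_hz), points)   # log-spaced steps

for f in linear_sweep:
    pass  # configure the generator/analyzer at frequency f, measure, collect the result
```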

 

kind regards

 

Marco Brauner AES NIG Munich

Why doesn't LabVIEW provide any simulation for real-time hardware? What I mean to say is that, just as when we build hardware circuits using a microcontroller, we can simulate the values in an almost-real environment using simulation software such as Proteus. I came up with this when I tried generating PWM signals, as the digital I/O card I had ordered took time for shipping...