LabVIEW Idea Exchange

Two recommendations:

 

1) Add an OSI Read VI for N wavelength spectra that returns each wavelength spectrum as a waveform (i.e. an N x number-of-channels array of waveforms). This will provide the time stamp of each wavelength spectrum, and will also be more efficient than reading one set of channel wavelength spectra at a time.

 

2) Add a VI that applies the wavelength scaling to wavelengths. This will allow the user to apply their own power-spectrum peak-search algorithm to wavelength spectra, whilst still applying the current wavelength scaling algorithm, and will free the user from having to use the OSI peak-search algorithm.

 

We have found that the peak-search algorithm used by NI-OSI performs poorly when the peaks in the power spectrum are of poor quality, and that a much simpler algorithm yielded far better performance. Our algorithm simply reports the maximum-power wavelength within the sensor wavelength range, provided it is above a minimum threshold. This is in significant contrast to the NI-OSI algorithm, which searches for peaks across the entire power spectrum using four algorithm tuning parameters, identifies all of the detected peaks that fall within the sensor wavelength range, and then reports the one with the maximum wavelength.
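For illustration, here is a minimal sketch of the simpler search we used, written in Python rather than G (the array names and threshold are placeholders, not part of NI-OSI):

```python
import numpy as np

def simple_peak_search(wavelengths, powers, range_min, range_max, threshold):
    """Report the maximum-power wavelength inside the sensor's wavelength
    range, or NaN if no point in that range exceeds the threshold."""
    in_range = (wavelengths >= range_min) & (wavelengths <= range_max)
    if not np.any(in_range):
        return float("nan")
    idx = np.argmax(np.where(in_range, powers, -np.inf))   # best point inside the range
    return wavelengths[idx] if powers[idx] >= threshold else float("nan")
```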

 

 

For reference, we have a PXIe-4844, and are using NI-OSI 16.0.

When performing a single-point read on an XNet session, you will receive the value of the signal that was last received, or the default value as defined by the database if it has never been received.

 

This type of functionality is sometimes useful, but more often I'm interested in knowing what the last reading was, if the reading is relatively recent.  The problem with the NI implementation is that you have no way of knowing (with this one session type) whether the device is still broadcasting data or not.  I think a useful property would be a way of specifying how long a signal's value remains valid.  If I haven't seen an update to a signal in 2 seconds, for example, I can likely assume my device is no longer communicating, and the XNet read should return NaN.

 

I had a small discussion on this topic and posted a solution using an XY session type here, which demonstrates the functionality I am talking about.  I'd like this to be built into the API.
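Conceptually, the behaviour I'm asking for amounts to a wrapper like the sketch below (Python pseudocode; the read function and its timestamp output are stand-ins for whatever the XY-session workaround provides, and the 2-second window would be a user-settable property):

```python
import time

STALE_AFTER_S = 2.0   # proposed, user-settable validity window

def read_with_staleness(read_value_and_timestamp, signal):
    """Return the signal value only if it was updated recently, else NaN.
    read_value_and_timestamp is a stand-in for an XNet read that also
    reports when the value was last received (e.g. via an XY session)."""
    value, last_rx_time = read_value_and_timestamp(signal)
    if time.time() - last_rx_time > STALE_AFTER_S:
        return float("nan")   # no fresh data: report NaN instead of a stale value
    return value
```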

What I propose is to have functionality built into the XNet API that is similar to the DAQmx API.  I'd want a separate subpalette where functions like Start Logging and Stop Logging can be invoked, which will log the raw data from an XNet interface into a TDMS file.  Maybe even some other functions, like forcing a new file, or starting a new file once the current file reaches a certain age or size.  On buses that are heavily loaded, reading every frame and then logging it can use a non-trivial amount of resources, and having this built into the API would likely be more efficient.

 

XNet already has a standard for how to read and write raw frames into a TDMS file that is compatible with DIAdem, and has several shipping examples with LabVIEW to log into this format.
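To make the shape of the proposal concrete, usage might look roughly like this (the function names are hypothetical stubs and do not exist in the current NI-XNET API):

```python
# Hypothetical logging sub-API, mirroring DAQmx-style start/stop calls; these
# stubs only sketch the proposal, they are not real NI-XNET functions.
def xnet_start_logging(interface, tdms_path): ...
def xnet_set_file_rollover(log, max_age_s=None, max_size_mb=None): ...
def xnet_stop_logging(log): ...

log = xnet_start_logging("CAN1", "bus_capture.tdms")          # begin raw-frame logging to TDMS
xnet_set_file_rollover(log, max_age_s=3600, max_size_mb=512)  # optional: new file by age or size
# ... run the test ...
xnet_stop_logging(log)                                        # flush and close the TDMS file
```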

Generate a VI or set of VIs with a general driver to get low-end FPGA boards working with LabVIEW FPGA. The parameters would come only from the user, to keep it dynamic: the total count of I/Os, the FPGA type, the location of I/O items (e.g. buttons) on the FPGA board, etc., written in, say, an XML file. It would be a bit of work, but it would also pay off in the end. Done well, this is no more than an extension of LabVIEW, and it would be a very powerful tool for researchers. It would also generate more sales, by bringing LabVIEW FPGA to more researchers, university students, and engineers who want to test a few things without a full initial commitment to NI tools.
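As a rough sketch of what such a user-supplied board description could contain (the element names and the example board below are invented purely for illustration), the XML file could be generated or hand-written along these lines:

```python
import xml.etree.ElementTree as ET

# Hypothetical board description; element and attribute names are illustrative only.
board = ET.Element("fpga_board", name="GenericDevBoard", fpga="Spartan-6 LX9")
io = ET.SubElement(board, "io")
ET.SubElement(io, "pin", name="BTN0", direction="in",  pin="P38")    # push button
ET.SubElement(io, "pin", name="LED0", direction="out", pin="P134")   # status LED
ET.SubElement(io, "pin", name="CLK",  direction="in",  pin="P55", role="clock", freq_hz="50000000")

ET.ElementTree(board).write("board_description.xml", encoding="utf-8", xml_declaration=True)
```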

 

 

The measurement range is not committed to the SMU instrument while the output is in the off state. The niDCPower Commit VI should change the range of the device even when the output is off.

 

In cases where the UUT has an external voltage applied to it, the SMU will give an overvoltage error even if the UUT voltage is within the SMU's voltage range. This occurs because the range settings are not committed while the output is turned off.
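For reference, the intended sequence is roughly the following, shown here with the Python nidcpower API purely as an illustration (resource name and range value are placeholders; today the requested range is not actually applied to the hardware until the output is enabled):

```python
import nidcpower

with nidcpower.Session(resource_name="PXI1Slot2") as session:   # placeholder resource name
    session.output_enabled = False        # output stays off while the UUT is externally energised
    session.voltage_level_range = 20.0    # request a range large enough for the external voltage
    session.commit()                      # the idea: this should push the range to hardware even now
    # ... measure, or enable the output later ...
```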

Hi

 

Is it possible for the contents of instr.lib to default to read-only every time LabVIEW launches? VIs that are dragged and dropped from the palette onto the block diagram are currently modifiable and may suffer unintentional code modifications, especially due to the 'save all' function and hasty or improper shutdowns. The extent of the damage may be inherited over time.

 

I also propose that modified instr.lib VIs be saved by default to the active project folder instead of to instr.lib.

 

Hope to see these features in future versions.

In MAX if you create a virtual channel, you can select the channel and then go to the connection diagram tab to see a preview of how to hook up a sensor to a particular channel.  It doesn't work for all hardware and channels but it seems to support more and more hardware with each release of DAQmx.

 

This is super useful for troubleshooting, and a great way to just see if something is hooked up right.  Another use case I can see is, instead of selecting an existing virtual channel to preview, picking a DAQ card (by model) and then presenting a listbox of physical channels.  Selecting a physical channel would then change the image showing how to hook up to the input or output.  Here is the existing diagram in MAX.

 

Basic Connection Diagram.png

 

Additionally this can be improved by being able to select multiple physical or virtual channels and see them at the same time, which can give a feel for what pins are being used, and what channels are available.

 

But what I REALLY want is the ability to generate these images programmatically, for my own use.  Either for documentation, or in the application.  If a technician is troubleshooting a tester, they could click on something in my application which shows the connection diagram and opens a test panel.  Or if I have a calibration routine, I could show step-by-step instructions on how to calibrate the I/O, with diagrams made on the fly for the I/O they are calibrating.  These might be for virtual channels, or for tasks that aren't saved in MAX.  I'd really envision an API where I give a piece of hardware by model number, provide a terminal block if applicable, and then give a physical channel on that hardware, and out would come the connection diagram image.  It would be even better if I could provide an array of physical channels so one image could be generated which shows all the connections to that piece of hardware.
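Sketched as a call signature, the API I have in mind might look something like this (entirely hypothetical; no such call exists in DAQmx today, and all names and parameters are mine):

```python
# Hypothetical connection-diagram API; illustrative only.
def get_connection_diagram(device_model, physical_channels, terminal_block=None):
    """Return a rendered wiring-diagram image (e.g. PNG bytes) for the given
    device model, optional terminal block, and one or more physical channels."""
    ...  # would be implemented by DAQmx / MAX, not by user code

# Intended use: one image showing every connection involved in a calibration step.
# png = get_connection_diagram("PXIe-4330", ["ai0", "ai1"], terminal_block="TB-4330")
```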

 

NI clearly put effort into making the user experience in MAX and the DAQ Assistant easy to use; please just allow us to make user experiences that are also easy to use for our users.

 

EDIT:  Oh it seems that there is some support for multiple physical channels on one image, as shown by the document here.

Recently, user cprince inquired why all of the possible Mouse Button presses were not available in LabVIEW.  In the original post, it was not clear whether this referred to "normal" mouse functions (as captured, say, by the Event structure) or whether this referred to the Mouse functions on the Input Device Control Palette.  The latter has a Query Device, which shows (for my mouse on my PC) that the Mouse has 8 buttons, but the Acquire Input Data polymorphic function for Mouse input returns a "cluster button info" with only four buttons.

 

The "magic" seems to happen inside lvinput.dll.  If, as seems to be the case, the DLL can "see" and recognize 8 Mouse Buttons, it would be nice if all 8 could be returned to the user as a Cluster of 8 (instead of 4) Booleans.

 

Bob Schor

There are currently two NI toolkits which add a software layer on top of the automotive CAN bus.  

 

The Automotive Diagnostic Command Set (ADCS) adds a couple of protocol standards like KWP2000 (ISO 14230), Diagnostics on CAN (ISO 15765, OBD-II), and Diagnostics over IP (ISO 13400).  This is a pretty handy API when all you need is one of these protocols.  But often, when you need to communicate with an ECU, you also want a normal Frame or Signal/Channel API where you get raw frame data, or engineering units on signals.

 

The ECU Measurement and Calibration (ECU M&C) toolkit adds XCP and CCP capabilities on top of CAN, allowing you to read and write parameters based on predefined A2L files.  This again is super handy, if the A2L you provide can be parsed, and if all you need to do is talk XCP or CCP to some hardware.  But often you also need to talk over a Frame or Signal/Channel API.  And what if you need to talk over some other protocol as well?

 

Every time I've had to use one of these toolkits, it has failed me in real-world use, because you generally don't want just one API, you want several.  And to get that kind of capability you often have to close one API session type and reopen another, then close that and reopen the first.  This gets even more difficult when different hardware types, either NI-CAN or NI-XNET, are used.  The application ends up being a tightly coupled mess.

 

This idea is to rewrite these two toolkits to be as pure a G implementation as possible.  The reason I think this would be a good idea is that you could then debug issues with file parsing, which at the moment is a DLL call that you hope works, and the toolkits could be used independently of the hardware.  Just give them some raw frames and read data back.  NI-XNET already has some of this type of functionality in the form of its Frame/Signal Conversion mode, which can convert from frames to signals without needing hardware.  If these toolkits supported a similar raw mode, it would allow for more flexible and robust applications, and potentially allow the toolkits to be used on other hardware types, or with simulated hardware.
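As an aside, hardware-independent frame/signal conversion is conceptually just database-driven scaling of raw frame bytes; a minimal generic sketch (not NI's implementation, and the example signal definition is invented):

```python
def decode_signal(frame_data: bytes, start_bit: int, length: int,
                  scale: float, offset: float) -> float:
    """Extract a little-endian (Intel byte order) unsigned signal from a raw
    CAN frame payload and convert it to engineering units."""
    raw = int.from_bytes(frame_data, "little")          # whole payload as one integer
    value = (raw >> start_bit) & ((1 << length) - 1)    # pick out the signal's bits
    return value * scale + offset                       # apply database scaling

# Example: a 16-bit engine-speed signal starting at bit 8, 0.25 rpm per bit.
rpm = decode_signal(b"\x00\x10\x27\x00\x00\x00\x00\x00",
                    start_bit=8, length=16, scale=0.25, offset=0.0)
# 0x2710 = 10000 raw counts, so rpm == 2500.0
```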

The current Bluetooth VIs (as of LV 2014) don't support communication with the new Bluetooth 4.0 protocol, referred to as Bluetooth Low Energy (or Bluetooth Smart).

 

New VIs dedicated to BLE, or support added to the current VIs, are needed by all developers working with this new Bluetooth stack.

 

I think it would be useful to have programmatic access to the NI-VISA Driver Wizard, to search for all connected devices and to get as output a list of all connected devices with their PID and VID.

And, if you want, to be able to create a driver for a specific peripheral from this list.
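Part of this, enumerating connected USB devices and their VID/PID, can already be approximated from the resource strings VISA reports; a small sketch using the Python pyvisa package (the driver-creation step has no programmatic equivalent today):

```python
import pyvisa

rm = pyvisa.ResourceManager()
for res in rm.list_resources("USB?*"):    # e.g. 'USB0::0x3923::0x7514::01234567::RAW'
    parts = res.split("::")
    vid, pid = parts[1], parts[2]         # vendor and product IDs from the resource string
    print(f"{res}  VID={vid}  PID={pid}")
```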

 

The VI "VISA Lock async.vi" should be reentrant to allow locking a connection while waiting for another connection.

 

Problem example:

Imagine COM1 is locked, and a caller is waiting for its lock with "VISA Lock async.vi". While it is waiting, COM2 should be locked (from somewhere else, also with "VISA Lock async.vi"). Because the lock VI is not reentrant, this second call is blocked until the COM1 call finishes (whether through success or failure). This happens regardless of whether COM2 is locked or not.

 

LV Versions with this behaviour: from at least 8.6 to 14

Logical shift in LabVIEW discards the shifted-out bit. Logical shift instructions in assembly language shift that bit into a carry/overflow flag, allowing the bit in question to be tested and program flow to be altered.

 

This kind of function is useful when dealing with low level protocols at the bit level, or dealing with digital devices that have a parallel interface.

 

I suggest the logical shift function have an additional output that contains the boolean value of the bit just shifted out of the number.

 

Logical shift should be able to output either a single Boolean or an array of Booleans. For the single case, the value would be the last bit shifted out, with the other bits being lost. For the array output, all bits would be captured, with the first bit shifted out being the 0th index of the array.
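To pin down the proposed semantics, here is a small sketch of the behaviour in Python, for an unsigned left shift on a fixed-width integer (the function name and width parameter are mine):

```python
def shift_left_capture(x: int, n: int, width: int = 32):
    """Shift x left by n bits and also return the bits shifted out.
    The first bit shifted out is index 0 of the returned list."""
    shifted_out = []
    for _ in range(n):
        shifted_out.append(bool((x >> (width - 1)) & 1))   # capture the bit about to fall off
        x = (x << 1) & ((1 << width) - 1)                  # shift, keeping the fixed width
    last_bit = shifted_out[-1] if shifted_out else False   # the single-Boolean variant
    return x, last_bit, shifted_out

# Example: shifting 0xC0000000 left by 2 pushes out a 1 bit, then another 1 bit.
value, last, all_bits = shift_left_capture(0xC0000000, 2)  # value == 0, all_bits == [True, True]
```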

It would be really nice if the 8452 supported JTAG.  Besides I2C and SPI, the only other communication protocols that are widely used are RS232 and JTAG.  With the addition of these, one could test the peripherals of most DSPs and embedded controllers, as well as create one's own programmers and debug tools.

Imagine going into a customer site after a 2-hour drive. Or worse, a 2-hour drive after a 6-hour flight.

And being able to connect your Android smartphone to the instrument in question.

Run a test program or calibration program from a special GPIB controller that plugs into your phone's USB/charger port.

 

How much easier would that make field service?

LabVIEW users should be able to deploy programs to the Intel Edison Module 

 

The Intel Edison is a very small single board computer with a dual-core Atom processor, 1GB RAM and built-in WiFi/Bluetooth LE.

Functionality can be added by connecting breakout boards, so-called blocks. Many of these blocks are already available, such as ADC, GPIO, Arduino, PWM...

 

In my opinion the Intel Edison is very well suited for things like embedded control, robotics, and the Internet of Things.

That's why I posted this idea to convince NI to support it in LabVIEW!

 

 

VDB

Right now the LabVIEW Interface for Arduino is available, but it only lets us communicate with the board's I/O pins through LabVIEW. It would be better if we could write code in LabVIEW and run that code on the Arduino board itself. That way we would be able to process faster compared to running the code on the PC, and people not familiar with text-based coding could also play along with this stuff.

There are a plethora of timestamp formats used by various operating systems and applications. The 1588 internet time protocol itself lists several. Windows is different from various flavors of Linux, and Excel is very different from about all of them. Then there are the details of daylight saving time, leap years, etc. LabVIEW contains all the tools to convert from one of these formats to another, but getting it right can be difficult. I propose a simple primitive to do this conversion. It would need to be polymorphic to handle the different data types that timestamps can take. It should only handle numeric data types, such as the numeric Excel timestamp (a double) or a Linux timestamp (an integer); text-based timestamps are already handled fairly well. Inputs would be the timestamp, input format type, output format type, and error. Outputs would be the converted timestamp and error.
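As an illustration of the arithmetic such a primitive would wrap, using well-known epoch offsets (a sketch only; it ignores time zones, daylight saving time, and leap seconds):

```python
EXCEL_EPOCH_OFFSET_DAYS = 25569        # days from 1899-12-30 (Excel day 0) to 1970-01-01
LABVIEW_EPOCH_OFFSET_S  = 2082844800   # seconds from 1904-01-01 (LabVIEW epoch) to 1970-01-01
SECONDS_PER_DAY = 86400

def excel_to_unix(excel_days: float) -> float:
    """Excel serial date (days since 1899-12-30) -> Unix seconds since 1970-01-01."""
    return (excel_days - EXCEL_EPOCH_OFFSET_DAYS) * SECONDS_PER_DAY

def unix_to_labview(unix_seconds: float) -> float:
    """Unix seconds -> LabVIEW timestamp (seconds since 1904-01-01 UTC)."""
    return unix_seconds + LABVIEW_EPOCH_OFFSET_S

# Example: Excel value 43831.0 is 2020-01-01 00:00 UTC, i.e. Unix 1577836800.
assert excel_to_unix(43831.0) == 1577836800.0
```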

Wouldn't it be great if National Instruments could support AUTOSAR SWC development in LabVIEW? The AUTOSAR standard, with its module-based approach, fits LabVIEW perfectly. I am convinced that your company could do a great job of implementing an easy-to-use environment for this emerging standard. I have worked with the tools from Vector and, in my opinion, everything there is very messy and illogical.

 

Best regards

Mats Olsson

It would be very useful if we could have the same QuickDrop plugin, with the same shortcut, behave according to the object we have selected in the Block Diagram or the Front Panel.

 

For example:

- Imagine "Ctrl+C" short cut, this would be useful for lots of QuickDrops that comes to my mind.

  • Copying to clipboard the text of a bundle by name.
  • Copying to clipboard the text of an unbundle by name.
  • Copying to clipboard a selected case.
  • etc....