LabVIEW Idea Exchange


I am a big fan of the built-in logging added to the DAQmx library, and would like to see NI build upon this model of adding common DAQ programming tasks without the need to do anything more than flip a few properties on a task.

 

The set of properties I would like to see is a section on task.read called "processing" that would apply signal processing to the incoming signal prior to reading.  Some examples would be filtering (low-pass, high-pass, band-pass, selectable windows...), FFT, thresholding, and edge detection (returning booleans instead of an analog signal).  If this were implemented in a slick way, the driver could possibly offload the processing to an FPGA, following the cRIO model.  Since this is a property added to a task, the task could decide whether hardware resources are available or only software processing.

 

The programmer could simply select the processing for the task type and have a DAQ task with built-in signal processing, simplifying the code development cycle (IMHO).
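As a rough sketch of what such a "processing" section might do behind the scenes, here are two of the suggested steps in plain Python (the function names and data are made up for illustration; this is not a real DAQmx API):

```python
def moving_average(samples, window=5):
    """A simple low-pass-style filter applied to the raw samples."""
    out, acc = [], 0.0
    for i, s in enumerate(samples):
        acc += s
        if i >= window:
            acc -= samples[i - window]   # drop the sample leaving the window
        out.append(acc / min(i + 1, window))
    return out

def threshold_to_bool(samples, level):
    """Thresholding: return booleans instead of the analog signal."""
    return [s >= level for s in samples]

# Simulated raw analog samples as a task might read them:
raw = [0.0, 0.1, 2.0, 2.1, 2.2, 0.1, 0.0]
smooth = moving_average(raw, window=3)
edges = threshold_to_bool(smooth, level=1.0)
```

On hardware with an FPGA available, the same property settings could transparently move these steps into the fabric; software-only tasks would run them on the host.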

 

I would like to see the ability to select which network interface UDP writes are written to. Presently the UDP functions in LabVIEW allow you to select the interface for reads, but writes automatically go out the default interface. A system consisting of a wired card, a wireless card, and a virtual interface is bringing about this need. I can manually turn off the virtual interface in the network control panel, but there should be a way to output only to the device one wants.
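For reference, this is how the choice of outgoing interface is typically exposed in socket APIs: binding the sending socket to a local interface address makes the OS route the datagram out through that interface. A self-contained Python sketch, using loopback as a stand-in for a real NIC address:

```python
import socket

# Receiver bound to loopback so the example runs anywhere.
rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
rx.bind(("127.0.0.1", 0))        # port 0 = let the OS pick one
rx.settimeout(2.0)
dest = rx.getsockname()

# Sender: binding to a specific local IP selects the outgoing interface,
# which is exactly the control UDP Write currently lacks.
tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
tx.bind(("127.0.0.1", 0))        # replace with, e.g., the wired card's IP
tx.sendto(b"hello", dest)

msg, addr = rx.recvfrom(1024)
tx.close()
rx.close()
```

Exposing the equivalent of that bind address on the UDP Open/Write functions would cover the use case without touching the network control panel.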

DAQmx allows you to change between active-drive and open-collector modes per line on cards that only support per-port changes.  I suggest that LabVIEW or DAQmx give a warning when you try to do this, telling you that the device only supports per-port changes.

 

Thanks!

Kira T

I'd like to have a way to give the FPGA on my PCIe card direct access to a block of the host PC's RAM.  At the moment, the FPGA is limited to its internal RAM and whatever might be on the PCIe card. With my PCIe-7841, I have about 1 MB available to the FPGA.  If I need more, I have to use DMA FIFO transfers: the FPGA can use one FIFO to ask the host for some data, and the host can send it to the FPGA in another FIFO.  This is a lot of overhead compared with simply using a memory method node to access the FPGA RAM.

 

So how about a method to allocate a block of memory in the host's RAM that the FPGA can access directly over the PCIe bus with minimal involvement by the host?  For simplicity, it would probably need to be limited to a contiguous block so that there are no gaps in the addresses: the FPGA would only need to know the start address and the number of bytes in the block.  Ideally, safeguards should be established to ensure the FPGA doesn't access memory outside the allocated block, but leaving that to the LabVIEW programmer would be fine.
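A host-side sketch of that bounds-check safeguard, using Python shared memory purely as a stand-in for the proposed PCIe-visible block (the block size and function names are illustrative, not any real FPGA API):

```python
from multiprocessing import shared_memory

# One contiguous block; the "FPGA" needs only a start offset and a length.
BLOCK_BYTES = 4096
shm = shared_memory.SharedMemory(create=True, size=BLOCK_BYTES)

def fpga_read(offset, nbytes):
    """Bounds-checked access into the allocated block."""
    if offset < 0 or offset + nbytes > BLOCK_BYTES:
        raise IndexError("access outside the allocated block")
    return bytes(shm.buf[offset:offset + nbytes])

shm.buf[0:4] = b"\x01\x02\x03\x04"   # host writes into the shared block
data = fpga_read(0, 4)               # "FPGA" reads it back directly

try:                                 # accesses past the end are rejected
    fpga_read(BLOCK_BYTES - 2, 4)
    out_of_bounds_caught = False
except IndexError:
    out_of_bounds_caught = True

shm.close()
shm.unlink()
```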

It would be nice if LabVIEW RT could give more output during the startup process of a LabVIEW real-time executable: for example, whether the rtexe started at all, whether any dependencies were missing during the loading process, or other useful status information.

 

In my case I had problems with the German code page. The build process completed without failure, but the rtexe didn't start at all. I had used German special characters like “öäü” in some VI names, and the rtexe couldn't be loaded for that reason, but I didn't get any message.

 

So please improve the debug output for LabVIEW RT.

Well, I think the title says it all...

 

There are many threads on the NI website about this particular topic, but none of them really shows how to deal with this "problem" correctly!

 

It is a very common task to synchronize an AI signal (let's say 0-10 V from a torque sensor) with a counter signal (e.g. the angular position of a drive which causes the torque). How do I correctly display the torque over the drive angle in an X-Y graph?
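Assuming both tasks share a sample clock (so sample i of the torque buffer lines up with sample i of the angle buffer), the display itself reduces to pairing the samples. A plain-Python sketch with simulated data:

```python
import math

n = 1000
# Simulated, already-synchronized buffers: one shaft revolution.
angle = [360.0 * i / (n - 1) for i in range(n)]            # degrees
torque = [5.0 + math.sin(math.radians(a)) for a in angle]  # simulated N*m

# An X-Y graph is just the paired points (angle[i], torque[i]).
xy = list(zip(angle, torque))
```

If the two tasks run at different rates, one signal first has to be resampled onto the other's timebase before pairing; that is exactly the part a good reference example should demonstrate.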

 

It would be great if NI offered a reference example in the LV Example Finder showing how to solve such a task elegantly and efficiently.

 

I'm not sure if this is the appropriate place for this suggestion, but anyway... I would love to see this in the LV Example Finder!

 

Regards

A:T:R

We have quite a few LabVIEW users here, but not many of us have the Application Builder or the experience to use it, so I get many requests to build an executable and installer for others.  Each time, I have to take their DAQmx tasks (in the form of a *.nce file from their machine), import them into my MAX, and then essentially re-create the same file when creating the installer.  Could an option be added to the Hardware Configuration tab to allow you to select an NCE file instead of creating one?

 

Thanks,

-Brian

 

This isn't a LabVIEW-specific idea, but there isn't a category for hardware, so I'll post it here and let the moderators decide what to do with it.

 

I'd really like to see a motion controller that works with laptops.  I use a laptop as my main dev computer, and it's a real hassle to have to load LabVIEW on the target computer and transport source code back and forth so I can do dev testing with the real hardware.  Ideally it would be functionally identical to the current controller cards, but I'd be okay if it lost a few features or didn't perform quite as fast.

 

(I do use a simulated controller when I can, but for many situations it's just not helpful.) 

I distribute a lot of code, and sometimes it's difficult to tell my users what they need to install in order to run that code.  It would be nice if I (or a user) could run a built-in LabVIEW utility that tells me what a given VI needs to run.

 

For example, do I need DAQmx, Mathscript, Robotics?

Hello everybody!

 

There is a simple way to trigger line-scan camera acquisition controlled by an encoder, as shown in the LL Trigger Each Line From Encoder.vi example.

 

However, it is not possible to do it for frame cameras in a similar fashion.

 

Therefore I suggest adding a way to configure "IMAQ Configure Trigger3.vi" to acquire a frame from the camera every N ticks of the encoder.

 

[idea thanks to a customer's request]

 

What do you think?

Have a great day!

Zenon

In short:

If the code tries to initialize and use port 10 as a serial port, it should work without any issues as long as the hardware indeed supports this.

 

Background:

By default, VISA associates ASRL10 with the parallel port, and you have to manually modify the visaconf.ini file to be able to use it as a serial port instead. This is very cumbersome and not intuitive for users: if a serial port is installed and grabs port 10 (the fact that it can and will do this is alone a reason not to have the current behaviour), they do not understand why it does not work.

 

Serial ports are still quite common compared to parallel ports; if VISA has to dedicate an alias to the parallel port like this, it makes more sense to choose a number less likely to conflict with serial ports, such as COM99 or higher.

 

 

Over the years I’ve created a lot of LabVIEW DLLs for various projects and use DLLs for different reasons. It took me a while to figure out why some of my applications were jerky and sluggish: the DLLs were running in the UI thread. After changing the thread setting to “Run in any thread”, my problems went away. I found in the literature that when a DLL runs in any thread, it runs in the calling VI’s thread. Is there a reason the default thread is the UI? I would like to see the default changed to “Run in any thread”.

 

Call Library Function Thread.PNG

It would be great to be able to determine the properties of a disk drive, e.g. its type and size. A number of my applications would have benefited from knowing the difference between a local drive and a network drive. Drive types are shown below.

Drive Properties Fig 1.GIF

 

It is simple in LabVIEW to get a list of the disk drives on a given computer, as shown:

Drive Properties Fig 2.GIF

There are roundabout ways to get drive information, such as running command-prompt tools through System Exec.vi or using the registry VIs, but these methods require a lot of overhead and programming. There is currently no simple function in LabVIEW that I can find that will return the properties of a disk drive.

 

This idea is a request for National Instruments to include a new VI that gets the properties of a disk drive. This new VI should be similar to the existing Get File Info VIs.
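For a feel of the scope, here is what the size half of such a VI looks like elsewhere: a Python sketch using only the standard library (on Windows, the drive type, i.e. fixed, removable, network, or CD-ROM, would additionally come from the Win32 GetDriveType call):

```python
import os
import shutil

def drive_properties(path):
    """Return size information for the drive containing *path*."""
    usage = shutil.disk_usage(path)
    return {
        "total_bytes": usage.total,
        "used_bytes": usage.used,
        "free_bytes": usage.free,
    }

props = drive_properties(os.path.abspath(os.sep))   # the root drive
```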

Currently the "Bytes at Port" property only applies to serial ports. If you are trying to write a generic driver or communications package that supports multiple connection types, you cannot use "Bytes at Port", since it will not work for anything but a serial connection. A related idea here proposes that "Bytes at Port" be added to the event structure, and also suggests that it be expanded to other connection types. My idea suggests that, at a minimum, the VISA property node be expanded.

The Problem


Doing a long finite acquisition in DAQmx in the manner shown below results in all the data from the acquisition residing in a 2D array of waveforms that the user must rearrange before working with it.  Since a 2D array of waveforms is not really useful with any of the processing functions in LabVIEW, why not come up with an automatic way to get the data type you want: a 1D array of waveforms with the new samples appended to the Y array of the appropriate channel?

regular tunnel.png


2Dwaveform.png

 

The Solution


Give the user the ability to create a "waveform autoindexing tunnel" via a context menu option.  This tunnel would automatically output the appended waveforms, one per channel.  This could be done behind the scenes in the most memory-efficient way possible, saving users the headache of modifying the 2D array they currently get.

waveformTunnel.png

 

1Dwaveform.png

 

 

Zoomed in Images


regularZoomed.png

 

 

waveformZoomed.png
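To pin down exactly what the tunnel would compute, here is a plain-Python model of the channel-wise append (the data is made up; in LabVIEW this would happen behind the tunnel, ideally with preallocation for memory efficiency):

```python
def append_chunks(chunks_per_iteration):
    """chunks_per_iteration: per loop iteration, one [channel][sample] chunk
    (the 2D array of waveforms). Returns one appended record per channel."""
    n_channels = len(chunks_per_iteration[0])
    channels = [[] for _ in range(n_channels)]
    for chunk in chunks_per_iteration:
        for ch, samples in enumerate(chunk):
            channels[ch].extend(samples)   # append to that channel's Y array
    return channels

# Two loop iterations, two channels, three samples each:
iterations = [
    [[0.0, 0.1, 0.2], [1.0, 1.1, 1.2]],
    [[0.3, 0.4, 0.5], [1.3, 1.4, 1.5]],
]
waves = append_chunks(iterations)
# waves[0] is channel 0's full record: [0.0, 0.1, 0.2, 0.3, 0.4, 0.5]
```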

 


It seems like it shouldn't be too much to ask for a proper Smith chart (with markers and everything!).  Seems like it's already hiding somewhere...

This is something a few power users have asked me about. There's no Instrument Driver or VIPM Idea Exchange, so I thought I would post it here.


What if VIPM could manage Instrument Drivers from IDNet?
There are a few key benefits this would offer us...

  • download IDNet drivers directly from VIPM 
  • track which version of a driver you are using for different projects and revert when necessary 
  • wrap up ID dependencies in a VIPC file for use at a customer site
Install Other Version.png
Get Info.png 

There are Express VIs for analog, digital, and counter channels. How about Express VIs for configuring and acquiring data from an RS-232 port?

Packed project libraries are new in LV 2010 and seem to be a great way to share code.  One idea to make them more user-friendly for the end user of a library would be to give the library developer the ability to specify driver dependencies for their library.

 

If the end user of the library did not have the right drivers installed, they would receive a warning, or maybe a broken run arrow if they tried to use it.  The warning could be very descriptive, telling them exactly which drivers they are missing.  This seems like a better solution than just getting arbitrary errors because LabVIEW can't find the subVIs called by the packed project library.

 

Here's a mockup of what the window for this might look like in the packed project library build specifications (borrowed from the additional installers window).

 

packedlibrary.png