LabVIEW Idea Exchange


Hi!

 

We are trying to work with an OPC server with some array tags defined. We can access the array tags perfectly.

 

If you request a single value from that OPC server (through an OPC client like Wonderware, Ifix, Control Maestro (Wizcon), WinCC, etc.), the OPC server sends a timestamp with the time at which that value was acquired. I have confirmed that there is no way to request single values from an array tag (neither with DSC nor with DataSocket).

 

I don't know how hard this feature would be to develop, but I think it would be great.

 

Regards!!

Between program executions, without resetting the cards, PXI cards retain their previous settings. For example, if I set ao0 to 5 V, stop my program, and restart it (without a reset), the card still generates 5 V. The same goes for digital cards, and on one particular series of cards (the 6255 is an example) the AO channels can be set as internal channels and then measured with respect to ground. Surely it should be possible to interrogate any analog PXI card for its current value (min, max, etc. are already available).
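On devices that expose them, that readback trick already works today through NI-DAQmx internal channels. A minimal sketch in Python with the nidaqmx package, assuming a device named "Dev1" that exposes the "_ao0_vs_aognd" internal channel (which internal channels exist varies by device family, so check in MAX):

import nidaqmx

# Read back whatever ao0 is currently generating via the card's internal
# channel. "Dev1" and "_ao0_vs_aognd" are assumptions; substitute the
# internal channels your device actually exposes.
with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan("Dev1/_ao0_vs_aognd")
    value = task.read()
    print("ao0 is currently generating %.3f V" % value)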

 

 

I have some VME hardware that uses A16 addressing only, so I can communicate with it using VXI, but it does not support VISA at all.  After some time spent conversing with NI support, it appears that VXI has been abandoned, and all low-level VME register access through the NI VME-MXI-2 controller card must now be done through VISA.  I have been able to add VXIin.vi and VXIout.vi (from the old VXI libraries in NI-VXI 3.6) to the latest version of NI-VXI in LabVIEW 32-bit to get the hardware working through VXI communication.  However, these VXI VIs were 32-bit only and were never updated to 64-bit, so I am stuck running 32-bit LabVIEW on my 64-bit OS.  Updating these VIs to 64-bit would be greatly appreciated.

 

Thanks,

Rich

I am trying to do data acquisition with an Arduino Mega board, but I don't know exactly how to do it in LabVIEW, because I must not use the Arduino toolkit; instead, I have to take the text data from the Arduino board, convert it to numbers, and then separate each of the signals, since I am reading 6 of the Arduino inputs. I would appreciate your comments and help. Thanks.
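The parsing itself is straightforward once the Arduino prints one line of text per sample. A minimal sketch in Python with pyserial, assuming the Arduino sketch prints the six readings as a comma-separated line (e.g. "512,487,23,999,120,300") and that "COM3" and 9600 baud match your board:

import serial

port = serial.Serial("COM3", 9600, timeout=1.0)  # assumed port and baud rate

while True:
    line = port.readline().decode("ascii", errors="ignore").strip()
    fields = line.split(",")
    if len(fields) != 6:
        continue  # skip empty or partial lines (e.g. the first read)
    readings = [int(f) for f in fields]  # text -> one number per channel
    print(readings)

One common way to map the same logic to LabVIEW: VISA Read with a line-feed termination character, Spreadsheet String To Array (or Scan From String) to convert the text to numbers, and Index Array to separate the six signals.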

The current TCP/IP testing API is problematic for testing Ethernet connections.  The behavior of the function set does not allow you to bind to an IP address unless the Ethernet connection is active in Windows.  This means that a unit under test must be powered on before you can bind to the connection as a host.  Additionally, powering off the unit forces the host connection to completely cycle its startup (re-bind the listener and discard all the old connections, forcibly by polling, because you cannot bind to the port when no connection is present).

 

This behavior can be achieved in a C program, but cannot be called from LabVIEW through a DLL, because all DLL calls execute in the same thread.  The Windows TCP function calls rely on the calling thread to identify the connection, which means that if you need multiple connections, it simply cannot work with DLL calls.

 

Suggestion:

 

Implement the TCP/IP function calls so that a listener connection can be bound to an IP address without requiring an active Ethernet port, and so that it is persistent across plugging/unplugging.
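For contrast, here is a minimal sketch of the desired behavior in Python, where binding to the wildcard address succeeds whether or not a link is up, and the listener survives the unit being unplugged and replugged (port 5025 is an arbitrary example):

import socket

listener = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
listener.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
listener.bind(("0.0.0.0", 5025))    # wildcard bind: no active interface needed
listener.listen(1)

while True:
    conn, addr = listener.accept()  # blocks until the UUT (re)connects
    try:
        while True:
            data = conn.recv(4096)
            if not data:            # UUT powered off or cable unplugged
                break
    finally:
        conn.close()                # only the connection dies; the listener
                                    # stays bound for the next power cycle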

 

 

I've posted an idea, but didn't add any picture to explain it:
http://forums.ni.com/t5/ideas/v2/ideapage/blog-id/labviewideas/article-id/16975/tab/rich

 

 

I've attached an image with this idea. Sorry for the duplicate posts.


Best


 

There are a lot of applications that require the use of Booleans, and when the Boolean expression gets complicated or large, it becomes a headache for the programmer.

LabVIEW should try having a De Morgan's law "function", or at least a way to simplify complicated Boolean expressions. Although it won't be used that often, it would be a handy tool for any electronics/programming course.

For instance, partial fraction expansion isn't used regularly, but when it is used, it helps save time.

 

These are simple examples, but they can get more complicated. For example, USAGE 2 can have more inputs (e.g. 4), which gives 16 possible combinations, each with its own simplified code.

 

USAGE 1: The user wires some inputs on the block diagram; when the Simplify button is used, it simplifies the complicated code to a simple one (noting that this is not the unique solution, as shown in USAGE 3).

USAGE 2: If, for the application, the user changes the values of the inputs and gets different outputs, this table can be simplified, or a "best fit" can be found for the data present.
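Not LabVIEW, but as an illustration of the kind of minimization both usages describe, here is a sketch using Python's sympy package (the expression and the truth table are made-up examples):

from sympy import symbols
from sympy.logic import SOPform, simplify_logic

a, b, c = symbols("a b c")

# USAGE 1 style: simplify an already-wired expression.
print(simplify_logic((a & b) | (a & ~b)))   # -> a

# USAGE 2 style: a "best fit" from a truth table. The minterms are the
# input combinations for which the output is TRUE.
minterms = [[0, 0, 1], [0, 1, 1], [1, 0, 1], [1, 1, 1]]
print(SOPform([a, b, c], minterms))         # -> c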

 

In http://forums.ni.com/t5/ideas/v2/ideapage/blog-id/labviewideas/article-id/16975/tab/rich, I wanted to make the point that the Partial Fraction Expansion VI is not widely used except in certain applications, yet it is important and beneficial.

Same goes for this idea.

 

Simplifying Boolean.png

Might have missed an obvious method when researching this but here goes.

 

I do a lot of Time of Flight experiments, and so had to design a system to calculate the time at which I get a signal relative to a trigger, using 2 channels on a counter card. As multiple events may occur after one trigger, it has to measure the time taken (on the order of nanoseconds to microseconds) for each signal event relative to a trigger. If you have only one event occurring after each trigger you can use two-edge separation, but for multiple events it becomes harder.

 

I have a solution, but it involves treating the onboard clock as the signal for each channel and treating the actual signal as a clock, finding the 'absolute' time of each pulse and then subtracting, whilst accounting for the 32-bit clock restarting, etc. Having a multiple-edge-separation function would make this easier.
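A minimal sketch of the subtract-and-unwrap bookkeeping that solution involves, assuming an 80 MHz timebase (12.5 ns ticks) and a time-ordered stream of raw 32-bit timestamps from both channels:

TICK = 12.5e-9        # seconds per tick, assuming an 80 MHz timebase
WRAP = 1 << 32        # the 32-bit counter rolls over here

def relative_times(events):
    """events: time-ordered list of ("trig" or "sig", raw 32-bit ticks)."""
    last_trig = None
    out = []
    for kind, ticks in events:
        if kind == "trig":
            last_trig = ticks
        elif last_trig is not None:
            # modulo arithmetic absorbs a counter rollover between the
            # trigger and the event (valid while the true separation is
            # under 2**32 ticks, about 53 s at 80 MHz)
            out.append(((ticks - last_trig) % WRAP) * TICK)
    return out

print(relative_times([("trig", 4294967000), ("sig", 200)]))  # rollover case, ~6.2e-6 s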

I am a big fan of the built-in logging added to the DAQmx library, and would like to see NI build upon this model of adding common DAQ programming tasks without the need to do anything more than flip a few properties on a task.

 

The set of properties I would like to see is a section on task.read, called Processing, that would apply signal processing to the incoming signal prior to reading.  Some examples would be filtering (low-pass, high-pass, band-pass, with selectable windows...), FFT, thresholding, and edge detection (returning Booleans instead of an analog signal).  If this were implemented in a slick way, the driver could possibly offload the processing to an FPGA, following the cRIO model.  Since this is a property added to a task, the task could decide whether hardware resources are available or whether to fall back to software-only processing.

 

The programmer can simply select the processing for the task type and then have a DAQ task with built-in signal processing, simplifying the code development cycle (IMHO).
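Until something like that exists, the processing has to be stitched on after the read. A sketch of the manual equivalent in Python with the nidaqmx package and SciPy ("Dev1/ai0", the 10 kHz rate, and the 1 kHz cutoff are example values):

import nidaqmx
from scipy import signal

# Acquire 1000 samples at 10 kHz, then do by hand what the proposed
# Processing properties would do inside the driver.
with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan("Dev1/ai0")
    task.timing.cfg_samp_clk_timing(rate=10000, samps_per_chan=1000)
    raw = task.read(number_of_samples_per_channel=1000)

sos = signal.butter(4, 1000, btype="low", fs=10000, output="sos")
filtered = signal.sosfilt(sos, raw)           # low-pass filtering
booleans = [x > 2.5 for x in filtered]        # simple thresholding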

 

This isn't a LabVIEW-specific idea, but there isn't a category for hardware, so I'll post it here and let the moderators decide what to do with it.

 

I'd really like to see a motion controller that works with laptops.  I use a laptop as my main development computer, and it's a real hassle to have to load LabVIEW on the target computer and transport source code back and forth so I can do development testing with the real hardware.  Ideally it would be functionally identical to the current controller cards, but I'd be okay if it lost a few features or didn't perform quite as fast.

 

(I do use a simulated controller when I can, but for many situations it's just not helpful.) 

Hello everybody!

 

There is a simple way to trigger line-scan camera acquisition controlled by an encoder, as shown in the LL Trigger Each Line From Encoder.vi example.

 

However, it is not possible to do it for frame cameras in a similar fashion.

 

Therefore, I suggest adding a way to configure "IMAQ Configure Trigger3.vi" so that a frame is acquired from the camera every N ticks of the encoder.

 

[idea thanks to a customer's request]

 

What do you think?

Have a great day!

Zenon

Over the years I've created a lot of LabVIEW DLLs for various projects, and I use DLLs for different reasons. It took me a while to figure out why some of my applications were jerky and sluggish. The reason was that the DLLs were running in the UI thread. After changing the thread setting to run in any thread, my problems went away. I found in the literature that when a DLL runs in any thread, it runs in the calling VI's thread. Is there a reason the default thread is the UI thread? I would like to see the default changed to "Run in any thread".

 

Call Library Function Thread.PNG

The Vision Acquisition Software seems to send to the camera all the GigE attributes that have been set in the camera file. There should be an option to select which attributes are sent to the camera when connecting. This would allow settings configured through vendor-specific third-party tools to persist when acquiring in MAX or LabVIEW.

 

This could possibly also alleviate the "Attribute value is out of range" error, if MAX only sends the attributes you specify.