LabVIEW Idea Exchange


The LM3S9D96 Development Kit is a popular 32-bit ARM-based microcontroller board with several plug-in boards available. It would be great if we could program it in LabVIEW. This product could leverage the already available LabVIEW Embedded for ARM and the LabVIEW Microcontroller SDK.

 

The LM3S9D96 Development Kit costs $425 and is open hardware. The LM3S9D96 is an ARM Cortex M3 running at 80 MHz resulting in 96 MIPS of performance. By way of comparison, the current LabVIEW Embedded for ARM Tier 1 (out-of-the-box experience) boards have only 60 MIPS of processing power.

 

The LM3S9B96 Development Kit brochure (http://www.ti.com/lit/ml/spmt158e/spmt158e.pdf) already states, “The LM3S9B96 development board is also a useful development vehicle for systems programmed using tools such as Microsoft’s .NET Micro Framework and Embedded LabView from National Instruments”. So, the brochure already claims that the board can be programmed using LabVIEW. Unfortunately, this is not so, not without a few months' work. No one has done the Tier 2 to Tier 1 port, and it would make the most sense for National Instruments to do this once for the benefit of all. It is relatively little work to enable this interesting development board, and the marketing is already done!

 

Wouldn’t it be great to program the LM3S9D96 Development Kit in LabVIEW?

With the advent of the IoT, the growing need to synchronize automation applications, and other TSN (Time Sensitive Networking) use cases, UTC (Coordinated Universal Time) is becoming more and more problematic. Currently, there are 37 seconds not accounted for in the timestamp, which is stored in UTC. The I64 portion of the timestamp datatype, which holds the number of seconds elapsed since 00:00:00 January 1, 1904 UTC, simply ignores leap seconds. This is consistent with most modern programming languages and is not a flaw of LabVIEW per se, but it isn't quite good enough for where the future is heading. In fact, there is a joint IERS/IEEE working group on TSN.

 

Enter TAI, or International Atomic Time: TAI has the advantage of being contiguous and is based on the SI second, making it ideal for industrial automation applications. Unfortunately, a LabVIEW timestamp cannot be formatted in TAI. Entering a time of 23:59:60 31 Dec 2016, a real second that did occur, is not allowed. Currently, IERS Bulletin C is published to give the current UTC-TAI offset, but implementing the lookup requires extensive code, and the result still won't display properly in a %<>T or %^<>T (local absolute time and UTC absolute time) container. We need a %#<>T TAI time container format specifier. (Or soon will!)
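As a rough illustration of the arithmetic a %#<>T specifier would hide, here is a minimal C sketch with the current IERS Bulletin C offset (37 s, in effect since January 2017) simply hard-coded. A real implementation would need the full leap-second table to handle timestamps before the latest leap second; the function name and example value here are purely illustrative.

#include <stdio.h>
#include <stdint.h>

/* TAI - UTC offset in seconds, per IERS Bulletin C; 37 s since 1 Jan 2017.
   Only valid for timestamps after the most recent leap second. */
#define TAI_MINUS_UTC_SECONDS 37

int64_t utc_to_tai_seconds(int64_t utc_seconds)
{
    /* TAI runs ahead of UTC by the accumulated leap seconds. */
    return utc_seconds + TAI_MINUS_UTC_SECONDS;
}

int main(void)
{
    int64_t utc = 3700000000LL;  /* example timestamp, in seconds */
    printf("TAI seconds: %lld\n", (long long)utc_to_tai_seconds(utc));
    return 0;
}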

Stellaris Launchpad.jpg

 

I’ve already put up ideas, about 7 weeks ago, for four development boards that could be LabVIEW targets:

1)  LabVIEW for Raspberry Pi (current kudos 139)

2)  LabVIEW for Arduino Due (current kudos 74)

3)  LabVIEW for BeagleBoard (current kudos 49)

4)  LabVIEW for LM3S9D96 Development Kit (current kudos 15)

 

I wanted to leave it at that to gauge LabVIEW community/user interest; however, an exciting new board has just been introduced which is too good to leave out. It’s the Texas Instruments Stellaris Launchpad.

 

It’s very attractive for three main reasons:

1)    It is very easy to get LabVIEW Embedded for ARM to target this board (a Tier 1 port)

2)    The microcontroller is powerful with many useful on-chip peripherals

3)    The price is extraordinarily low.

 

The Stellaris Launchpad features are:

  • ARM Cortex M4 with floating point unit running at 80 MHz (100 MIPS)
  • 256 KB flash
  • 32 KB SRAM
  • USB device port (separate from the USB for programming/debugging)
  • 8 UARTs
  • 4 I2C
  • 4 SSI/SPI
  • 12 × 12-bit A/D channels
  • 2 analog comparators
  • 16 digital comparators
  • Up to 49 GPIO
  • and more

 

The most interesting feature is that it costs $4.99 including postage. Yep, just under five dollars!  Including postage!  I’ve already ordered two!

 

The Texas Instruments Stellaris Launchpad can be programmed using the free Code Composer Studio in C/C++ or the free Arduino IDE using Energia from GitHub. Both are great ways to program. It just needs LabVIEW as the third exciting programming option.

 

Wouldn’t it be great to program the Stellaris Launchpad in LabVIEW?

The current Bluetooth VIs (as of LabVIEW 2014) don't support communication with the newer Bluetooth 4.0 protocol, referred to as Bluetooth Low Energy (or Bluetooth Smart).

 

New VIs dedicated to BLE, or added BLE support in the current VIs, are needed for all developers working with this new Bluetooth stack.

 

 

Text files (.txt) are among the most commonly used, but when you want to export data from a graph to a .txt file, you cannot do it directly. It would be interesting to have this option.

 

 

EXPORT.png

TCP/IP, UDP/IP, IrDA
LabVIEW doesn’t have any VI or function to retrieve connection information/properties such as the IP address, local port or service name, remote port, etc.
To get connection information, you have to build a functional global VI that stores the information by connection ID so it can be retrieved later.

A native function or VI would be a great help.
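For reference, this is roughly what such a native primitive could report under the hood. The sketch below uses the standard BSD socket calls getsockname() and getpeername(); the function name and the POSIX/IPv4 assumptions are mine, not part of any existing LabVIEW API (Windows would use the equivalent Winsock calls).

#include <stdio.h>
#include <arpa/inet.h>
#include <netinet/in.h>
#include <sys/socket.h>

void print_connection_info(int sock)
{
    struct sockaddr_in local, remote;
    socklen_t len = sizeof(local);
    char addr[INET_ADDRSTRLEN];

    /* Local side of the connection: IP address and port. */
    if (getsockname(sock, (struct sockaddr *)&local, &len) == 0) {
        inet_ntop(AF_INET, &local.sin_addr, addr, sizeof(addr));
        printf("local  %s:%u\n", addr, ntohs(local.sin_port));
    }

    /* Remote side of the connection: IP address and port. */
    len = sizeof(remote);
    if (getpeername(sock, (struct sockaddr *)&remote, &len) == 0) {
        inet_ntop(AF_INET, &remote.sin_addr, addr, sizeof(addr));
        printf("remote %s:%u\n", addr, ntohs(remote.sin_port));
    }
}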

The VISA test panel is a very valuable tool for troubleshooting instrument connectivity issues.

 

This used to be included with the VISA runtime, or at least with any installer that also included the VISA runtime.

 

Now I have to separately download and install the FULL VISA just to get this valuable tool. 

 

That makes installing a LabVIEW executable a multistep process as now I have to run two different installers. 

 

NI-MAX and the VISA test panel should ALWAYS BE included in any installer that includes the VISA runtime.

According to this document, only 14 ideas from the Idea Exchange were implemented in LabVIEW 2010. This is a fantastic start.

 

There are at least 100 more great ideas on the Idea Exchange that should be implemented in the next version of LabVIEW. Keep listening to the users. Keep improving LabVIEW in every way.

 


THIS IS A REPOST FROM THE MAX IDEA EXCHANGE BECAUSE WE DON'T ACTUALLY HAVE A DRIVER IDEA EXCHANGE AND I DON'T KNOW IF THE MAX IDEA EXCHANGE IS CORRECT

 

Before I start, I want to make clear that I am fully aware that my suggestion is probably linked to some crazy amount of work....  That being out of the way:

 

I often have to switch between LV versions and have on more than one occasion run into the problem that different versions of LV work with MUTUALLY EXCLUSIVE sets of drivers. This means that I cannot (for example) have LabVIEW 7.1 and 2011 on the same machine if I need to be coding GPIB functionality over VISA, because there is no single VISA version which supports both 7.1 and 2011 (image below).

 

 

Of course, these days we just fire up a VM with the appropriate drivers, but for much hardware (like PCI, serial, or GPIB) this doesn't work out too well.

 

Why can't we have some version selection ability for hardware drivers? Why can't I have VISA 4.0 and 5.1.1 installed in parallel and then select which version to use in my project definition? I know that these drivers probably share some files at the OS level, so it clearly won't work for existing driver packages, but for future development it would be utterly magnificent to be able to define which version of a hardware driver (or even an LV toolkit like Vision) should be used in a project.

 

Shane.

Recently, user cprince inquired why all of the possible Mouse Button presses were not available in LabVIEW.  In the original post, it was not clear whether this referred to "normal" mouse functions (as captured, say, by the Event structure) or whether this referred to the Mouse functions on the Input Device Control Palette.  The latter has a Query Device, which shows (for my mouse on my PC) that the Mouse has 8 buttons, but the Acquire Input Data polymorphic function for Mouse input returns a "cluster button info" with only four buttons.

 

The "magic" seems to happen inside lvinput.dll.  If, as seems to be the case, the DLL can "see" and recognize 8 Mouse Buttons, it would be nice if all 8 could be returned to the user as a Cluster of 8 (instead of 4) Booleans.

 

Bob Schor

LabVIEW users should be able to deploy programs to the Intel Edison Module 

 

The Intel Edison is a very small single-board computer with a dual-core Atom processor, 1 GB RAM, and built-in WiFi/Bluetooth LE.

Functionality can be added by connecting breakout boards, so-called blocks. Many of these blocks are already available, such as ADC, GPIO, Arduino, PWM...

 

In my opinion, the Intel Edison is very well suited for things like embedded control, robotics, and the Internet of Things.

That's why I posted this idea to convince NI to support it in LabVIEW!

 

 

VDB

Hello,

 

The current functionality doesn't allow you to asynchronously call a method that has any dynamically dispatched inputs. This forces you to create a statically dispatched wrapper around the dynamic method, which can then be called.

 

This is a source of frustration for me because it forces you to write code that is less readable, and there doesn't seem to be any reason for this limitation. Since you already need to have the class loaded in memory to provide it as an input to the asynchronously called VI, why not just allow dynamic dispatch there (the dynamic method is already in memory)?

 

How it is right now:

DynamicDispatchAsynchCall0.png

DynamicDispatchAsynchCall1.png

 

Solution: allow asynchronous calls on methods with dynamic dispatch inputs.

The current VISA Read and Write primitives do not have the ability to abort early. Under many circumstances, if the timeout values are short this is not an issue, but it can be a problem when a long timeout is required. The current workaround is to use a short timeout value and loop continually, ignoring the individual timeouts until a threshold has passed and then passing the timeout error out. This approach requires extra code to monitor the progress of the communication. It also requires shift registers and associated logic to maintain the data. It would be desirable to simply set the timeout to the desired value and have a separate VISA property that can be set to cause the current operation to abort.
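For comparison, here is a hedged C sketch of that workaround written against the VISA C API: a short VI_ATTR_TMO_VALUE is set and viRead() is retried until data arrives, an overall deadline passes, or the caller raises an abort flag. The function name, the 100 ms polling interval, and the abort flag are illustrative choices, not an existing VISA feature, which is exactly why a native abort property would be cleaner.

#include <visa.h>
#include <time.h>

ViStatus read_with_abort(ViSession instr, ViByte *buf, ViUInt32 size,
                         ViUInt32 *total, long deadline_ms,
                         volatile int *abort_flag)
{
    ViStatus status = VI_SUCCESS;
    time_t start = time(NULL);
    *total = 0;

    /* Use a short per-call timeout so the loop can react quickly. */
    viSetAttribute(instr, VI_ATTR_TMO_VALUE, 100);

    while (*total < size) {
        ViUInt32 got = 0;
        status = viRead(instr, buf + *total, size - *total, &got);
        *total += got;

        if (status != VI_ERROR_TMO)   /* data complete, termination, or real error */
            break;
        if (*abort_flag)              /* caller requested an early abort */
            break;
        if ((time(NULL) - start) * 1000 >= deadline_ms)
            break;                    /* overall timeout reached */
    }
    return status;
}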

There are currently two NI toolkits which add a software layer on top of the automotive CAN bus.  

 

The Automotive Diagnostic Command Set (ADCS) adds a couple of protocol standards like KWP2000 (ISO 14230), Diagnostics on CAN (ISO 15765, OBD-II), and Diagnostics over IP (ISO 13400). This is a pretty handy API when all you need is one of these protocols. But often, when you need to communicate with an ECU, you also want a normal Frame or Signal/Channel API where you get raw frame data, or engineering units on signals.

 

The ECU Measurement and Calibration (ECU M&C) toolkit adds XCP and CCP capabilities on top of CAN, allowing you to read and write parameters based on predefined A2L files. This again is super handy if the A2L you provide can be parsed, and if all you need to do is talk XCP or CCP to some hardware. But often you also need to talk over a Frame or Signal/Channel API. And what if you need to talk over some other protocol as well?

 

Every time I've had to use one of these toolkits, it has failed me in real-world use, because you generally don't want just one API, you want several. To get that kind of capability, you often have to close one API session type and reopen another, then close that and reopen the first. This gets even more difficult when different hardware types, either NI-CAN or NI-XNET, are used. The application ends up being a tightly coupled mess.

 

This idea is to rewrite these two toolkits to be as pure a G implementation as possible. I think this would be a good idea because it would let you debug issues with file parsing, which at the moment is a DLL call that you hope works, and it would allow the toolkits to be used independently of the hardware. Just give them some raw frames and read data back. NI-XNET already has some of this type of functionality in the form of its Frame/Signal Conversion session, which can convert from frames to signals without needing hardware. If these toolkits supported a similar raw mode, it would allow for more flexible and robust applications, and potentially allow these toolkits to be used on other hardware types, or on simulated hardware.
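To make the hardware-independent "raw frames in, engineering units out" step concrete, here is a minimal C sketch of the usual signal decoding arithmetic. The struct layout, the little-endian assumption, and the example signal values are invented for illustration; a real implementation would take them from a parsed A2L or DBC file.

#include <stdint.h>
#include <stdio.h>

typedef struct {
    unsigned start_bit;    /* little-endian (Intel) byte order assumed */
    unsigned bit_length;
    double   factor;       /* physical = raw * factor + offset */
    double   offset;
} SignalDef;

double decode_signal(const uint8_t frame[8], const SignalDef *sig)
{
    uint64_t raw = 0;

    /* Assemble the 8-byte payload as one 64-bit little-endian word. */
    for (int i = 7; i >= 0; i--)
        raw = (raw << 8) | frame[i];

    /* Extract the signal's bits and apply the scaling from the database. */
    raw >>= sig->start_bit;
    if (sig->bit_length < 64)
        raw &= (1ULL << sig->bit_length) - 1;

    return (double)raw * sig->factor + sig->offset;
}

int main(void)
{
    uint8_t frame[8] = {0x10, 0x27, 0, 0, 0, 0, 0, 0};  /* raw value 10000 */
    SignalDef engine_speed = {0, 16, 0.125, 0.0};        /* hypothetical signal */
    printf("Engine speed: %.1f rpm\n", decode_signal(frame, &engine_speed));
    return 0;
}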

I think it would be extremely useful to add the possibility of having a Windows PC as a target in a project.

This would help keep applications that run on more than one PC well organized, and would make debugging easier.

 


This idea was submitted by a user who requested I post this for them.

 

As of now, the VISA resource controls only allow you to select resource names without their full Windows description. You can select individual COM ports (COM3, COM5, etc.), or pick from a list of alias names if you've defined aliases for your COM ports. But it might be nice to give the user a configurable option which provides the additional descriptive information that you can find in Windows Device Manager. This would allow novice users to select the desired COM port based on the actual physical layer needed for the application. Again, I'm pretty sure you can work around this by reviewing the different COM ports in Measurement & Automation Explorer, or even by creating your own aliases to surface the additional information. But if I'm creating an executable to be used on different systems by novice users, then I may not want them to have to go into MAX to properly identify their desired port.

 

So, instead of asking the user to select a COM port from a list of items looking like this...

travisferguson_0-1641925866067.png

 

Maybe give an option in a property page for the VISA Resource Control that might look like this (this is a mark-up)...

 

travisferguson_1-1641925926951.png

so that an operator can pick from a more descriptive list like what you see in the Windows Device Manager...

travisferguson_2-1641925994204.png

 

Thank you,

 

If you are using TCP to communicate with a different code environment, you may want to set some of the socket options. For example, for responsive control, you will want to disable Nagle's algorithm. There is currently no obvious or easy way to do this. TCP Get Raw Net Object.vi in <vi.lib>\utility\tcp.llb will provide the raw socket ID, but you then need to call setsockopt() on your particular platform using the Call Library Function Node. You can do this with the code provided here. A much better way would be a property node on the TCP reference that allowed you to set and query the options directly.
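For anyone curious what that Call Library Function Node call boils down to, here is a minimal sketch (BSD/POSIX sockets assumed; the Winsock call on Windows is analogous apart from the handle type). The socket descriptor would be the one returned by TCP Get Raw Net Object.vi; the wrapper function name is mine.

#include <netinet/in.h>
#include <netinet/tcp.h>
#include <sys/socket.h>

int disable_nagle(int raw_socket)
{
    int flag = 1;  /* 1 = disable Nagle's algorithm, send small packets immediately */
    return setsockopt(raw_socket, IPPROTO_TCP, TCP_NODELAY,
                      &flag, sizeof(flag));
}

A native TCP property node would replace both this snippet and the platform-specific Call Library Function Node configuration around it.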

See Also

Simulated Devices in TestStand Workspaces

Project and Workspace Context in MAX

 

Link to those ideas in next post

 

We can already create tasks, channels, and scales in a LabVIEW project, but we cannot then use MAX to run those tasks, and we must use MAX to create a simulated device on a development machine. After a few projects, the MAX configuration becomes cluttered. Deployment and importing of the hardware configuration can get problematic, to say the least!

 

On the other hand, if the hardware, Data Neighborhood, and IVI session setups could be added to the project, deployment would be a snap! Just import the *.nce from the project, without having to create one from MAX and exclude the items that don't concern the project we are installing.

 

For integration and station troubleshooting, the sessions, aliases, tasks, et al. would be organized by application or project in MAX, and fault identification would have all the "tools" any repair tech could want to isolate a failure.

 

 

We have quite a few LabVIEW users here, but not many of us have the Application Builder or the experience to use it. So I get many requests to build an executable and installer for others. Each time, I have to take their DAQmx tasks (in the form of a *.nce file from their machine), import them into my MAX, and then essentially re-create the same file when creating the installer. Could an option be added to the Hardware Configuration tab to allow you to select an NCE file instead of creating one?

 

Thanks,

-Brian

 

It would be great to be able to determine the properties of a disk drive, i.e. its type and size. A number of my applications would benefit from knowing the difference between a local drive and a network drive. Drive types are shown below.

Drive Properties Fig 1.GIF

 

It is simple in LabVIEW to get a list of disk drives on a given computer as shown:

Drive Properties Fig 2.GIF

There are roundabout ways to get drive information, such as the command prompt via System Exec.vi, or the registry VIs. These methods require a lot of overhead and programming. There is currently no simple function in LabVIEW that I can find that will return the properties of a disk drive.

 

This idea is a request for National Instruments to include a new VI that gets the properties of a disk drive. This new VI should be similar to the existing Get File Info VIs.
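For illustration only, here is the kind of OS call such a VI could wrap on Windows. GetDriveTypeA and GetDiskFreeSpaceExA are documented Win32 APIs; the function around them is just a sketch, and other platforms would need their own equivalents.

#include <windows.h>
#include <stdio.h>

void print_drive_info(const char *root)   /* e.g. "C:\\" */
{
    const char *types[] = {"Unknown", "No root directory", "Removable",
                           "Fixed", "Network", "CD-ROM", "RAM disk"};
    UINT type = GetDriveTypeA(root);
    ULARGE_INTEGER free_bytes, total_bytes, total_free;

    /* Drive type: fixed, removable, network, and so on. */
    printf("%s type: %s\n", root, type <= DRIVE_RAMDISK ? types[type] : "?");

    /* Total and free size in bytes. */
    if (GetDiskFreeSpaceExA(root, &free_bytes, &total_bytes, &total_free))
        printf("%s size: %llu bytes, free: %llu bytes\n", root,
               (unsigned long long)total_bytes.QuadPart,
               (unsigned long long)free_bytes.QuadPart);
}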