
Tell us about your large LabVIEW applications!

If anyone doubts that you can build a professional, Windows-looking application with LabVIEW, then take a look at GOOP Developer.

www.sciware.com.au


GOOP Developer has a Windows Explorer-style user interface, complete with popup menus, that makes creating and managing your GOOP-based project very simple and intuitive. The class framework used supports inheritance, virtual functions and the ability to bind a process. You can develop with GOOP Developer in any version of LabVIEW from 6.1 up.

GOOP Developer is about 700 VIs and was written completely in LabVIEW. GOOP Developer and its associated class framework took approximately three years to research and develop to their final state.

I work for ICON Technologies as a Systems Developer, and we use GOOP Developer and its associated class framework to develop our systems. The benefits are:

1) Our development time is significantly reduced through ease of use.

2) Code re-usability. For example, we can quickly extend the capabilities of an existing class through inheritance.

3) Our code becomes simpler through the use of inheritance and the ability to bind a process to a class.
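
For readers who think in text-based languages, here is a rough Python analogy of what inheritance, virtual functions and a process bound to a class can look like. The class and method names below are invented purely for illustration; they are not GOOP Developer's actual API.

```python
import threading
import time


class Instrument:
    """Base class: a background 'process' is bound to each object."""

    def __init__(self, name):
        self.name = name
        self._running = False
        self._thread = None

    def bind_process(self):
        """Start the process that is bound to this instance."""
        self._running = True
        self._thread = threading.Thread(target=self._process, daemon=True)
        self._thread.start()

    def stop_process(self):
        self._running = False
        if self._thread is not None:
            self._thread.join()

    def _process(self):
        while self._running:
            self.poll()          # virtual method: subclasses override it
            time.sleep(0.5)

    def poll(self):
        print(f"{self.name}: base-class poll")


class TemperatureLogger(Instrument):
    """Extends the existing class through inheritance (point 2 above)."""

    def poll(self):
        print(f"{self.name}: reading temperature channel")


if __name__ == "__main__":
    dev = TemperatureLogger("TC-01")
    dev.bind_process()
    time.sleep(2)
    dev.stop_process()
```

The subclass only overrides the virtual method, which is essentially the kind of extension described in point 2, while the bound process keeps running against whichever concrete class the object happens to be.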

With LabVIEW and GOOP Developer we can design and develop complex systems. Our current project is to develop a system for controlling a sample-handling robot that will manage the handling and analysis of ore samples.

I have also come across the false argument from text-based programmers that LabVIEW is not a general-purpose programming language; I strongly disagree. The misconception seems to arise from the fact that you don't need to be a formally trained programmer to develop your solution. Also, unfortunately, a lot of the LabVIEW code developed by self-taught LabVIEW programmers that gets reviewed by the IT Manager is not what would be defined as good, structured code.

Message Edited by SciWare on 04-30-2005 09:53 AM

Kurt Friday
www.sciware.com.au
Message 11 of 105
(6,144 Views)
[unfortunately, a lot of the LabVIEW code developed by self-taught LabVIEW programmers that gets reviewed by the IT Manager is not what would be defined as good, structured code.]

Kurt, you are right about some LV code not being well-structured code. Don't we see a lot of that here in this forum? However, I have seen the same with VB, C, C++, Fortran, you name it. I have seen some horrendous VB code: no structure, lots of gotos, improper indentation, and such. It has nothing to do with the language used; it's all about the programmer. But since LV is so easy to learn, you will naturally find more unstructured and bad code in LV than in other languages. On the other hand, I've seen great structure and object orientation in LV that compares to the best of C++ and VB.
- tbob

Inventor of the WORM Global
Message 12 of 105
(6,048 Views)
Hi Mike,

During the last few years, we have been continuously developing the software for an optical surface analysis system. This includes instrument control, data handling, visualization and analysis, image processing and external device control, and consists of about 1000 VIs and roughly 100 MB of LabVIEW source code.
Some key features:
- instrument control via TCP/IP to an embedded system running up to 16 motors, 3 cameras and a bunch of A/D and DIO ports.
- video streaming over TCP/IP
- data generated by the instrument and displayed include numeric values, graphs, images and data maps, handled via log files, charts, 3D graphs and so on
- image display and processing is done with IMAQ Vision
- automation of the measurement routines may be done with a scripting language coded completely in LabVIEW
- external devices are integrated via DataSocket, RS232, ModBus, USB, or DAQ

The GUI is split into several VIs that run independently, communicating mainly via queues, LV2-style globals, etc. (and some standard globals here and there, but we are working on that 😉). We put some effort into making the front panels NOT look too LabVIEW-like, which partly has to do with the prejudices you mention.
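
For anyone who prefers a text-language picture of that pattern, a minimal sketch of independent loops talking over a queue, plus an LV2-style (functional) global, might look roughly like the Python below. All names and values are made up for illustration and are not taken from our actual VIs.

```python
import queue
import threading
import time

# Message queue standing in for a LabVIEW queue between two independent loops.
ui_events = queue.Queue()


def lv2_global(value=None, _store={"data": 0.0}):
    """Rough analogy of an LV2-style (functional) global:
    call with a value to write, call with no argument to read."""
    if value is not None:
        _store["data"] = value
    return _store["data"]


def acquisition_loop():
    """Stands in for an independently running acquisition VI."""
    for i in range(5):
        reading = i * 1.1
        lv2_global(reading)                      # write the shared value
        ui_events.put(("new_reading", reading))  # notify the UI loop
        time.sleep(0.2)
    ui_events.put(("stop", None))


def ui_loop():
    """Stands in for the user-interface VI consuming events."""
    while True:
        event, payload = ui_events.get()
        if event == "stop":
            break
        print(f"UI update: {payload:.1f} (global now {lv2_global():.1f})")


if __name__ == "__main__":
    t = threading.Thread(target=acquisition_loop)
    t.start()
    ui_loop()
    t.join()
```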
As the project has evolved over several years, the quality of the code varies a lot, depending on our improving skills, but also on the LV version that was originally used when a particular module was written. Sometimes you spend time developing something which is then included in the next version of LabVIEW, so there is a tendency to start all over again. Regarding CPU load, memory usage or execution time, a lot of time was needed to evaluate different implementations. OK, there are some general guidelines, but they work best with the simple examples, not with complex real-world applications.

For LV8, I am hoping NOT to see too many new functions. For me, what is really needed are tools to get the project organized in a better way. With that, we will see more large applications.

Dirk
Message 13 of 105
(5,579 Views)
Hi All !

Hey! You guys are talking in terms of VIs. I always thought that code size was better measured by the number of nodes.
Message 14 of 105
(5,907 Views)
Hi J.A.C,

You might be right, but in terms of managing a project, the number and organization of the VIs are also important. If you put 200 subVIs into one, you would not need any project tools, but you would need a 10-by-10-foot display...
Additionally, especially with many dynamically loaded VIs, it seems very tedious to get the "real" number of nodes, as you would also have to count the nodes inside the subVIs, but count each subVI that is used frequently across the entire platform only once.
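
To make that bookkeeping concrete, here is a small Python sketch of counting nodes over a VI hierarchy while counting shared subVIs only once. The VI names and node counts are entirely made up.

```python
# Hypothetical VI hierarchy: each VI maps to (own node count, list of subVIs it calls).
hierarchy = {
    "Main.vi":     (120, ["Acquire.vi", "Display.vi"]),
    "Acquire.vi":  (80,  ["ErrorLog.vi"]),
    "Display.vi":  (60,  ["ErrorLog.vi"]),   # shares a subVI with Acquire.vi
    "ErrorLog.vi": (25,  []),
}


def total_nodes(top, hierarchy):
    """Sum node counts over the whole hierarchy, counting each shared subVI once."""
    visited = set()

    def visit(vi):
        if vi in visited:        # already counted: shared subVI
            return 0
        visited.add(vi)
        own, subvis = hierarchy[vi]
        return own + sum(visit(s) for s in subvis)

    return visit(top)


print(total_nodes("Main.vi", hierarchy))  # 285, not 310: ErrorLog.vi is counted once
```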

Dirk
Message 15 of 105
(5,911 Views)
Thanks Dirk,

Actually, at present I am working on one such piece of code where I really need a 10 x 10 display! :)

I am trying to develop a good LV programming style, and with respect to that I need to know: what is the recommended number of subVIs that a VI should have?

Also, is the project tool you are referring to the VI Metrics or the VI Profile tool?
Message 16 of 105
(5,907 Views)
Hi J.A.C.

I guess there is no general rule for the number of subVIs. The overhead in terms of memory usage or execution speed induced by using a subVI should mostly be negligible (if their front panels are closed, or removed for the .exe). Try to keep a good separation of the different layers (user interface, functional layers, hardware drivers, etc.), i.e. a low-level driver VI should not be called directly from the main user interface VI. A few weeks ago I followed a discussion in a LV workshop about the (non-)usability of the VI Hierarchy window for large applications, and a good deal of the problem seems to arise from the fact that there are often too many dependencies ("wires") extending over several layers, making the hierarchy look chaotic.
Another thing that helps a lot is to have an error in/out cluster on ALL VIs, even if the error is not handled internally. By wiring every error input and output you get transparent flow control and timing without needing too many sequence structures.
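
In text-language terms, the pass-through behaviour of such an error chain might be sketched roughly as below (Python, with hypothetical function names; in LabVIEW the error cluster is of course simply wired from VI to VI on the diagram).

```python
def read_sensor(error_in):
    """Each 'VI' takes an error in and returns (result, error out)."""
    if error_in is not None:       # upstream error: do no work, just pass it on
        return None, error_in
    return 42.0, None              # pretend this was a hardware read


def log_value(value, error_in):
    if error_in is not None:
        return error_in
    print(f"logging {value}")
    return None


def close_session(error_in):
    # Runs last in the chain; still sees any upstream error.
    print("closing session")
    return error_in


if __name__ == "__main__":
    err = None
    value, err = read_sensor(err)
    err = log_value(value, err)
    err = close_session(err)
    if err:
        print("chain finished with error:", err)
```

Chaining the error value like this fixes the execution order without a single sequence structure, which is the point of wiring error in/out everywhere.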
It is often a matter of discipline, not of ability, to produce good code. Hard enough, though.
With "project tools" I meant the entire collection available today, including VI metrics and Buffer allocation. But what I am really waiting for is something that helps me not to get lost with 20 VIs and diagramms opened at the same time...

Dirk
Message 17 of 105
(5,885 Views)
If any of you guys want to see prime examples of monstrous-sized bad code written by a self-taught programmer, you just let me know!!! 😉
********************************************
Amateur programmer for over 10 years!
********************************************
Message 18 of 105
(5,517 Views)
The largest LV code base I have worked on was probably on my last contract.

Unfortunately, due to an NDA, I cannot describe many aspects of the work, but I can provide a brief overview.

There were three "Test Software" Engineers working on automating the final stages of production for a wireless system. I worked on it for over a year. I believe we had over 750 VIs/subVIs when I left. It was truly a piece of art.

Basically, the system comprised a number of cards, each having its own dedicated CPU running Linux.
There was a central processor card with six RF tx/rx cards.

All cards had to be fully tested and RF-tuned individually, and as part of the final assembly the system was automatically configured and RF-tested as an assembled unit.

Due to the number of cards, efficiency (speed) was highly important. Cards were tested in parallel, using TestStand as a sequencer.

The test software programmed various FLASH memory devices and FPGAs using a number of programming devices (such as a Xilinx parallel cable, VisionProbe, an I2C programmer, and Ethernet).


Basically, there were many instances where I thought I'd reached the limits of LV, and to my surprise, solutions were implemented rapidly. Many thanks to this forum. I still prefer LV over C/C++.

-JLV-
Message 19 of 105
(5,512 Views)
Spaceman: For some strange reason, I don't think you are kidding! 😉 You should see some of my earlier code from when I was a self-taught beginner. I had sequences nested umpteen levels deep with a few cases in between. Now that I look back on it, what a nightmare!
- tbob

Inventor of the WORM Global
Message 20 of 105
(5,507 Views)