Do you still have fun using LabVIEW?


@Ficare wrote:

Hello LabVIEW users,

Do you still have fun programming with LabVIEW?

I attended several presentations, and had the feeling it has become sooooo serious 😅

Maybe LabVIEW programmers had a complex, and wanted to show, and over-prove, that we can do the same as other languages?...

 

What are your feelings about that ?

 


Remember when NI Week had all sorts of cool demos that AEs used to build? When practice and learning came in the form of fun projects? I've been very happy to see Norm returning to these kinds of antics because LabVIEW very much needs it. The bottom-liners and KPIers sucked the fun out of everything.

~ Helping pave the path to long-term living and thriving in space. ~
Message 11 of 30

@aeastet wrote:

I have been using LabVIEW for 25 years now. I have enjoyed using and developing with it. I have a unique perspective on this as I develop with LabVIEW everyday and I also mentor for FIRST robotics. LabVIEW used to be a powerhouse in FIRST robotics. Thousands of robotics teams with tens of thousands of students developing robots and code. Now, teams are made fun of for using LabVIEW. There is no support from NI. The code base is 10 years old. Teams do not want to use it because the code they need to be competitive does not exist unless some other team made it and happen to share it. NI has not added anything new for FIRST for many years. This probably began when they start looking at the money and stock holders over the product. Many think that after next year NI will not be part of FIRST any more. LabVIEW was on over 60% of the computer when I started. Now they are on 6% and going down every year.


I was just talking about this with someone.  I last mentored in 2012, but figured this would be the case.  Sad to see it confirmed. 

--
Tim Elsey
Certified LabVIEW Architect
Message 12 of 30

I'm finding "fun" is becoming inversely proportional to the size of the LabVIEW program.

 

NI makes some things very 'unfun'.  Example:  It's hard to convince yourself that calling a class object by (data value) reference using in-place structures is fun.   Holy clunky syntax Batman!   I mean, look at how you have to do error handling, if nothing else.

 

joshualguthrie2civ_0-1743438226467.png

 

Then, to get you good and mad, you realize that NI doesn't do it this way. They call their class objects by reference in a much more sane way!

joshualguthrie2civ_1-1743438350653.png

 

 

But, oh no.. race conditions...  LOL

 

LabVIEW really can punish you if you can't do something using its normal "workflows"... and it crashes a lot. Way too much, really. 30+ minute compile times get really unfun when you realize that LabVIEW is only using one processor core, or when you realize that you have to do some crazy and exotic thing to prevent some memory object you created from getting nuked by the memory manager.

 

In all seriousness, this latest program I'm currently working on may be the plant's last big hurrah in LabVIEW (I've been using it almost 25 years now). The reality is, the only reason LabVIEW was chosen this last time is that all the EEs here have pretty extensive LabVIEW training. But we are all openly questioning whether it is still the correct tool to be using in capital test-and-measurement applications.

 

Like aeastet said earlier, it feels like you are watching something die. Worse still, watching something die that was once on the verge of greatness and then wasn't. It had potential.

 

But for small, quick and dirty, disposable applications, it's a blast and remains fun.

Message 13 of 30

@joshua.l.guthrie2.civ wrote:

   Holy clunky syntax Batman! 


That's me every time I look at an advanced object-oriented architecture.

😄

 

Message 14 of 30

@joshua.l.guthrie2.civ wrote:

I'm finding "fun" is becoming inversely proportional to the size of the LabVIEW program.

NI makes some things very 'unfun'.  Example:  It's hard to convince yourself that calling a class object by (data value) reference using in-place structures is fun.

[...]


Here's a VIM (malleable VI) that can reduce the boilerplate needed for doing something inside a DVR.

avogadro5_1-1743460572118.png

 

 

Message 15 of 30

@joshua.l.guthrie2.civ wrote:

I'm finding "fun" is becoming inversely proportional to the size of the LabVIEW program.

 

NI makes some things very 'unfun'.  Example:  It's hard to convince yourself that calling a class object by (data value) reference using in-place structures is fun.   Holy clunky syntax Batman!   I mean, look at how you have to do error handling, if nothing else.

 

joshualguthrie2civ_0-1743438226467.png

 


According to https://forums.ni.com/t5/LabVIEW-Idea-Exchange/Can-we-have-an-error-in-terminal-on-the-in-place-elem... you are making it more clunky than it has to be.

[...]
Also, just FYI, the errors of the two border nodes are the same result; the only errors generated are by the read, but we duplicate the error for you on the read and the write so you don't have to wire through the structure; there's no need to merge both of them as you did.
[...]

 

Message 16 of 30

@joshua.l.guthrie2.civ wrote:

I'm finding "fun" is becoming inversely proportional to the size of the LabVIEW program.


I find that's true for any programming language I've worked with. The difference is mainly that most other languages start from a much higher baseline of "unfunness", and the slope of increase is lower as the application grows larger. So the pain experienced over time often feels smaller in traditional languages, but in absolute terms you need to get into some fairly big applications to hit that crossover point.

Rolf Kalbermatter
My Blog
Message 17 of 30

@joshua.l.guthrie2.civ wrote:

I'm finding "fun" is becoming inversely proportional to the size of the LabVIEW program.

 

NI makes some things very 'unfun'.  Example:  It's hard to convince yourself that calling a class object by (data value) reference using in-place structures is fun.   Holy clunky syntax Batman!   I mean, look at how you have to do error handling, if nothing else.

 

joshualguthrie2civ_0-1743438226467.png

 


That looks like you're trying to make it into a reference-based class, basically. Maybe you should check out G#. 🙂

 

G# - Award winning reference based OOP for LV, for free! - Qestit VIPM GitHub

Qestit Systems
Certified-LabVIEW-Developer
Message 18 of 30

@UliB wrote:

 


According to https://forums.ni.com/t5/LabVIEW-Idea-Exchange/Can-we-have-an-error-in-terminal-on-the-in-place-elem... you are making it more clunky than it has to be.

[...]
Also, just FYI, the errors of the two border nodes are the same result; the only errors generated are by the read, but we duplicate the error for you on the read and the write so you don't have to wire through the structure; there's no need to merge both of them as you did.
[...]

 


 

Granted.

 

But you still have to merge two errors. And wouldn't it be nice if the In Place Element structure had an error input (like the linked suggestion proposes)? It's a lot of block diagram real estate to do something simple. To me, it screams: NI doesn't want you to do this, but you can if you want.

 

Regardless, it's frustrating that it doesn't look like this (I get why, but something closer would be nice):

 

joshualguthrie2civ_0-1743509240263.png

 

In other languages, it's super easy.
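For comparison, here is roughly what the by-reference pattern that a DVR plus In Place Element structure expresses looks like in a textual language. This is just an analogy sketch in Python, not LabVIEW's actual semantics; the `DataValueRef` class and its `modify` method are made-up names for illustration:

```python
import threading

class DataValueRef:
    """Rough analogy to a LabVIEW DVR: a shared value plus a lock
    that serializes in-place read-modify-write access."""

    def __init__(self, value):
        self._value = value
        self._lock = threading.Lock()

    def modify(self, func):
        # Analogous to the In Place Element structure: acquire the lock,
        # read the value, apply the modification, write it back, release.
        with self._lock:
            self._value = func(self._value)

    def get(self):
        with self._lock:
            return self._value

ref = DataValueRef([1, 2, 3])
ref.modify(lambda xs: xs + [4])   # serialized in-place update
print(ref.get())                  # [1, 2, 3, 4]
```

The whole lock/read/modify/write dance collapses into one `with` block, which is why the equivalent LabVIEW diagram feels like a lot of real estate by comparison.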

 

I fear I took the thread off topic with my rant.  Sorry! 🙂

Message 19 of 30

@Ficare wrote:

do you still have fun programming with LabVIEW?


It's at least fun enough that I try my hand at Advent of Code with LabVIEW every year...

Message 20 of 30