02-25-2015 04:23 AM
I think 90% of people are comparing apples with oranges.
LabVIEW can't be put in the same category as VS/C# or Qt/C++.
LabVIEW is a platform/language/environment specialized more toward:
1) hardware
2) FPGA
3) RealTime equipment
You can say what you want, but LabVIEW was BORN to solve these problems, and it's better at them.
LabVIEW falls behind when one moves toward more "software-specific" problems, like:
1) OOP is convoluted, definitely not a "first-class citizen", and limited.
2) Large applications are more complicated to manage; there is no MVC architecture, and the general lack of OOP hurts a LOT here.
3) Installers, updaters, and the user interface are limited and practically impossible to customize (beyond trivial color changes or kid-level graphics); there is a LACK OF UNICODE in strings (strings in LV are ASCII; shame on NI, wake up, it's 2015); and the language violates its own patterns ("go dataflow, but hey, there are also DVRs, or local variables, or shared variables, or globals...").
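To make the Unicode complaint concrete, here is a minimal sketch in Python (used only because LabVIEW diagrams can't be shown as text; the sample string is made up) of what a byte-only string type loses compared to real Unicode strings:

```python
# Unicode-aware string: length counts characters, regardless of encoding.
text = "Müller 温度"          # an operator name plus a CJK front-panel label
assert len(text) == 9

# A byte-only string type (like LabVIEW's 8-bit strings) can only hold some
# encoding of that text. The byte count no longer matches the character
# count, and slicing bytes can cut a character in half.
raw = text.encode("utf-8")
assert len(raw) == 14         # "Müller " is 8 bytes (ü = 2), each CJK char is 3

# Decoding those bytes with the wrong single-byte charset gives mojibake:
assert raw.decode("latin-1") != text
```

The practical point is that with byte-only strings, every internationalized label or file path forces the programmer to track encodings by hand.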
LabVIEW still has a place because it has an ecosystem around it.
You have hardware, you have drivers, you have MANY add-ons already built on top of LabVIEW, and you have tons of functions and libraries that you simply don't have in C#/C++.
My personal feeling is that NI should drop LabVIEW and migrate to a new "software platform for the future".
I don't know if .NET is usable at "all levels", from UI down to low-level operations (I guess not). I don't think VISA drivers are even available as native .NET, so you have to P/Invoke unmanaged code, which is a pain in the neck and something only 5% of programmers really understand.
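For readers who haven't hit this: Python's `ctypes` plays the same role as .NET's P/Invoke, so the pain is easy to demonstrate. This is a hedged sketch that calls libc's `strlen` purely as a stand-in for a real driver DLL such as VISA (not a real VISA binding), assuming a POSIX system:

```python
import ctypes

# Load the symbols of the current process; on POSIX this includes the C
# runtime -- the same "unmanaged" layer that P/Invoke reaches into.
libc = ctypes.CDLL(None)

# The caller must declare the foreign function's signature by hand.
# Get argtypes/restype wrong and you get memory corruption at runtime,
# not a compile-time error -- this is why interop intimidates people.
libc.strlen.argtypes = [ctypes.c_char_p]
libc.strlen.restype = ctypes.c_size_t

assert libc.strlen(b"VISA") == 4
```

Every driver entry point needs this kind of manual declaration, which is the "pain in the neck" the post refers to.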
So NI should "invent" a new language/platform that has the power of C in terms of being "close to the hardware" and the expressive power of modern languages like C# or Apple's Swift, and slowly port the whole ecosystem there.
Basically, it's what Apple is doing with Swift: it's no longer convenient to keep patching Objective-C, and after 20+ years it's just time to restart on a new platform.
It's what Apple already did twice, when it transitioned from PowerPC to Intel and from the classic Mac OS to Mac OS X.
What should we expect from LabVIEW 2015? The platform has already reached its limits.
02-25-2015 07:03 AM
I don't see an issue with the apple/orange comparison in general. I just don't believe it was applied very well here. I used one example of text-based languages where the difference was very distinct. We could look at any two text-based languages and find applications where one works better than the other. That was the point.
I'm not trying to claim, nor have I ever, that LabVIEW is the best solution for every application. Arguing against that point simply doesn't make sense Alessandro. It'd be like me trying to argue against the idea that LabVIEW is the worst solution for every application. You haven't made that claim. Why would I try to negate it?
OOP was definitely an afterthought. Honestly, I avoid using it and work with other solutions.
I don't have any issue managing larger applications. MVC tends to be just as much of a hassle to keep organized.
I don't really use unicode for anything. But, I could see where it might bother others.
The last piece is a typical nonsense point. Locals/Globals/etc aren't a problem in themselves. They're a problem when used poorly by a programmer. This is no different than claiming C is problematic because a programmer creates globals when they should be passing by reference, etc. If we universally blamed the language for bad programmers, all languages would be "falling behind." Let's avoid making that point as it simply doesn't make sense.
How far from hardware do you honestly see LabVIEW going, given that you list its hardware specialization as a benefit? Are you just hoping it becomes more OOP-oriented, and using that synonymously with "expressive"?
02-25-2015 07:20 AM
@natasftw wrote:
I don't see an issue with the apple/orange comparison in general. I just don't believe it was applied very well here. I used one example of text-based languages where the difference was very distinct. We could look at any two text-based languages and find applications where one works better than the other. That was the point.
Jesus christ dude... you're splitting hairs.
LabVIEW = apples
Text based = oranges
90% of programmers would've got that oranges does not equal BASIC, Forth, Prolog, Lisp, Java, Smalltalk, etc, etc, etc, or any other obscure or speciality language.
I assumed that people would get, and I'm sure most did, that oranges = C/C++/C#, Ada, etc.
02-25-2015 10:06 AM
I think we should clarify three things here:
We can't talk about expressivity, abstraction, and other technical language features when the languages follow different styles (i.e., paradigms): expressivity in Java is totally different from expressivity in Haskell. Comparing the *features* of LabVIEW (which is a kind of "impure" dataflow language) with the *features* of another mainstream programming language is simply nonsense.
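A toy sketch (in Python, purely illustrative, with made-up node names) of the dataflow execution model alluded to above: a node fires only once all of its inputs have produced values, which is how a LabVIEW diagram schedules its nodes.

```python
# Toy dataflow scheduler: each node lists its input nodes, and a node may
# run only once every input has a value (like wires on a block diagram).
nodes = {
    "a":   ([],         lambda: 2),
    "b":   ([],         lambda: 3),
    "sum": (["a", "b"], lambda a, b: a + b),
    "sq":  (["sum"],    lambda s: s * s),
}

values = {}
pending = set(nodes)
while pending:
    for name in list(pending):
        inputs, fn = nodes[name]
        if all(i in values for i in inputs):      # all input wires have data
            values[name] = fn(*(values[i] for i in inputs))
            pending.discard(name)

assert values["sum"] == 5
assert values["sq"] == 25
```

Note there is no statement ordering here: "a" and "b" are free to fire in any order, which is why comparing this model feature-for-feature against an imperative language misses the point.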
LabVIEW is oriented toward a specific field (engineering and science), *but* you CAN use it (which doesn't mean you should) in other areas. Personally, I like LabVIEW because I can focus on the problem instead of losing time working around language pitfalls to express algorithms. Simplicity is the keyword, and LV shines here. I feel most programming languages are losing this (they aren't enforcing a single coding style and standards anymore), throwing in more primitives, constructs, and paradigms... creating an unmaintainable mess and overcomplicating something that should be simple: look at C++, C#, Scala... (Some people consider this an advantage...)
LabVIEW is constantly growing in terms of "software engineering" (LabVIEW 8.2~2014), and community contribution of APIs has increased over the years.
Personally, I think LabVIEW should invest more in user interface possibilities and open hardware to expand its field of influence.
02-25-2015 10:59 AM - last edited on 02-26-2015 10:28 AM by dcarva
The problem with LabVIEW is that when you try to go a bit outside its "sandbox", it falls apart pretty quickly.
And there are many aspects that are needed today, besides the "UI".
I don't see any improvement in this area since... the introduction of the event handler structure (LabVIEW 7, maybe?).
I have every single LV version since 6.1, up to 2014.
In 2014 the lack of improvements, and the signs that they are at a dead end, are evident: they bundled in some previously separate add-on packages (like the "PID and Fuzzy" functions, or deployment to touch panels) to compensate for the lack of new stuff... They implement stupid ideas taken from the forum, lol.
There is ZERO sign of strategy here. Where are they going?
LabVIEW is not simple; it's a growing mess like other mature languages. It sticks around only because of the ecosystem (hardware, drivers, FPGA, the CompactRIO stuff, and the other software and add-ons in the suite).
I don't see the point of LabVIEW; National Instruments could achieve the same things on another platform/language (as it has already done in part with Measurement Studio) or on a new one.
Finally, please remove this 32x32 limit on icons... it's just ridiculous. Why am I supposed to draw a picture for every function? The space is clogged, and I can't even type a properly spelled name...
02-25-2015 11:29 AM - edited 02-25-2015 11:48 AM
@natasftw wrote:
I don't see an issue with the apple/orange comparison in general. I just don't believe it was applied very well here. I used one example of text-based languages where the difference was very distinct. We could look at any two text-based languages and find applications where one works better than the other. That was the point.
I'm not trying to claim, nor have I ever, that LabVIEW is the best solution for every application. Arguing against that point simply doesn't make sense Alessandro. It'd be like me trying to argue against the idea that LabVIEW is the worst solution for every application. You haven't made that claim. Why would I try to negate it?
OOP was definitely an afterthought. Honestly, I avoid using it and work with other solutions.
I don't have any issue managing larger applications. MVC tends to be just as much of a hassle to keep organized.
I don't really use unicode for anything. But, I could see where it might bother others.
The last piece is a typical nonsense point. Locals/Globals/etc aren't a problem in themselves. They're a problem when used poorly by a programmer. This is no different than claiming C is problematic because a programmer creates globals when they should be passing by reference, etc. If we universally blamed the language for bad programmers, all languages would be "falling behind." Let's avoid making that point as it simply doesn't make sense.
Ada is the only language I know of where OOP was not an afterthought; in all other general-purpose languages OOP was an afterthought, and LV is no different. Since when did C use references? Hey, if you're lucky enough to have the choice, use Visual Studio instead. I've worked with folks on both sides, and the graphical developers ran circles around the text-based C and C++ developers. But hey, this may not be the case for all, just sayin'.
Correction: not even Ada was originally OOP. From Wikipedia: "Ada 95 added support for object-oriented programming, including dynamic dispatch."
02-25-2015 01:01 PM - last edited on 02-26-2015 10:28 AM by dcarva
@Alessandro__ wrote:
The problem with LabVIEW is that when you try to go a bit outside its "sandbox", it falls apart pretty quickly.
And there are many aspects that are needed today, besides the "UI".
I don't see any improvement in this area since... the introduction of the event handler structure (LabVIEW 7, maybe?).
I have every single LV version since 6.1, up to 2014.
In 2014 the lack of improvements, and the signs that they are at a dead end, are evident: they bundled in some previously separate add-on packages (like the "PID and Fuzzy" functions, or deployment to touch panels) to compensate for the lack of new stuff... They implement stupid ideas taken from the forum, lol.
There is ZERO sign of strategy here. Where are they going?
LabVIEW is not simple; it's a growing mess like other mature languages. It sticks around only because of the ecosystem (hardware, drivers, FPGA, the CompactRIO stuff, and the other software and add-ons in the suite).
I don't see the point of LabVIEW; National Instruments could achieve the same things on another platform/language (as it has already done in part with Measurement Studio) or on a new one.
Finally, please remove this 32x32 limit on icons... it's just ridiculous. Why am I supposed to draw a picture for every function? The space is clogged, and I can't even type a properly spelled name...
Hey, what do you mean, nothing significant in LV 2014! We have a new icon! 😄
Seriously, though - at least LabVIEW is still growing. Maybe more slowly as of late, but it's still growing and improving. What was the last innovation in C++?
02-25-2015 01:02 PM
Alessandro__ wrote:
There is ZERO sign of strategy here. Where are they going?
You should check out the LabVIEW News blog. More specifically, check out the entries with the Point of VIEW tag. They are very high level, but they do give a glimpse of where NI is going with LabVIEW.
02-25-2015 01:53 PM - edited 02-25-2015 01:58 PM
If you want to look at languages where OOP was there from the start, you should look more at Python, Lua, or Java. Java doesn't support fancier things like multiple inheritance (Python actually does), but multiple inheritance is anyhow considered too complicated for even most properly educated IT people, and a pretty good guarantee of a software mess when used.
That said, I do find the lack of a formal interface declaration in LVOOP a limiting factor. Not to the extent that it makes LVOOP unusable, but it is definitely something I miss after having worked in Java.
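To illustrate what a formal interface declaration buys, here is a sketch using Python's `abc` module as a stand-in (Java's `interface` keyword gives the same guarantee at compile time; the `Instrument` example is made up):

```python
from abc import ABC, abstractmethod

# A formal interface: implementers MUST provide these methods, and a
# missing one is reported at instantiation, not deep inside a running
# system when the method is finally called.
class Instrument(ABC):
    @abstractmethod
    def read(self) -> float: ...

class Thermometer(Instrument):
    def read(self) -> float:
        return 21.5

assert Thermometer().read() == 21.5

# Forgetting to implement read() makes the class uninstantiable:
class Broken(Instrument):
    pass

try:
    Broken()
except TypeError:
    pass  # exactly the early failure an interface declaration provides
else:
    raise AssertionError("abstract class should not instantiate")
```

Without such a declaration, a class hierarchy only documents its contract by convention, which is the limitation described above.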
02-25-2015 10:38 PM
@WayneS1324 wrote:
Jesus christ dude... you're splitting hairs.
LabVIEW = apples
Text based = oranges
90% of programmers would've got that oranges does not equal BASIC, Forth, Prolog, Lisp, Java, Smalltalk, etc, etc, etc, or any other obscure or speciality language.
I assumed that people would get, and I'm sure most did, that oranges = C/C++/C#, Ada, etc.
Java is obscure? Yikes. Now I see why you're grouping all text-based languages into one. You've created such a narrow definition of "text-based" that you've essentially left only a handful.
@richjoh wrote:
Ada is the only language I know of where OOP was not an afterthought; in all other general-purpose languages OOP was an afterthought, and LV is no different. Since when did C use references? Hey, if you're lucky enough to have the choice, use Visual Studio instead. I've worked with folks on both sides, and the graphical developers ran circles around the text-based C and C++ developers. But hey, this may not be the case for all, just sayin'.
Correction: not even Ada was originally OOP. From Wikipedia: "Ada 95 added support for object-oriented programming, including dynamic dispatch."
Since when didn't C use references? Bell Labs had pointers in B. Did you seriously program in C without ever passing by reference rather than by value? Or did I just miss the /sarcasm tag?