06-23-2010 06:25 PM
Has anyone used a static analysis tool such as Klocwork with LabWindows?
I'm trying to configure Klocwork with the compile/build environment but haven't been able to figure it out yet.
Thanks.
06-23-2010 09:05 PM
My understanding when I looked into this before was that you can't do it - some static analyzers will run on source code alone, but most want to hook into the compiler to build the AST.
When I asked about using it with CVI's native LCC compiler they kind of held their nose.
But CVI can use VCPP as a release compiler, and these analyzers all work with VCPP, so maybe there is a way to do it.
I didn't like the people at Coverity - way too pushy/arrogant; the Klocwork people seemed more civilized.
As I recall there's a Finnish company that has a tool that will work on source code directly; I got an eval version from them at one point.
And I like running Campwood Software's free metrics tool on CVI source code - it's hilarious to prove to some of the less talented developers I work with that it's not just my opinion that their code's no good.
campwoodsw.com
It's not a static analyzer but it does pump out some good metrics on complexity, comment coverage, etc.
Menchar
06-29-2010 10:33 AM
Hi there,
Just saw your post via a Google alert. I'm not an expert in the tool you mention, but my experience is that pretty much any build environment can be configured to work with our tools given enough time and effort 🙂 If you'd like to drop me an e-mail at gwyn-at-klocwork.com describing the difficulties you're seeing, I'll be glad to get you the help you need.
In partial response to the other poster, commercial static analysis tools tend to work from source directly, not from the output of some other compiler (unlike open-source solutions like Dehydra et al.). At Klocwork, for example, we have our own compilers for C/C++, C# and Java. I know others in this space are using the EDG compiler front-end to generate their C/C++ ASTs, but it's all wrapped in their own technology, so you're not dependent on configuring some 3rd-party compiler extension, at least.
Regards,
Gwyn Fisher
CTO and VP R&D
Klocwork, Inc.
06-29-2010 12:34 PM
How do you accommodate any compiler or library-specific behavior? CVI uses a modified version of the Princeton LCC compiler as I understand it, with extensive proprietary libraries that are unique to CVI.
For example, NI implemented partial C99 coverage; would the tool work with that?
Then again, from an analysis viewpoint you could argue that if it isn't standard C89 or C99, you shouldn't be writing it anyway. And the CVI C code is standard enough that VCPP is OK with it (but not with NI's C99 additions). I've used CVI with the Intel compiler, but that uses VCPP under the hood anyway.
I do seem to recall that when I spoke to Klocwork 18 months or so ago about using your tool with CVI, it was a no-go unless NI decided to chip in; as you say, time and effort (i.e. $) will solve most things. Maybe I'm confusing Klocwork with Coverity.
So what is the answer: can you procure a COTS Klocwork analysis tool, run it on CVI C89/C99 sources, and get meaningful results?
06-29-2010 12:39 PM
I know that Coverity static analyzer tools will not work with CVI, or the underlying lcc compiler upon which it is based. Coverity likes to have its hooks into the specifics of the compiler.
Our local expert tells us that projects based on Tornado and gcc come through Coverity cleaner than those based on MS Visual C. I would like to try using one of these tools (probably the Eclipse IDE with gcc) to compile a release executable from a CVI project, then submit the project to Coverity and see what it finds. Does anyone have suggestions on how to adapt a CVI project to Eclipse/gcc?
Perhaps that technique might also be useful for those folks who have Klocwork, without reconfiguring it specifically for the CVI environment.
- Jim
06-29-2010 01:45 PM
I've often thought about using a 3rd-party IDE for CVI code, but more likely NetBeans than Eclipse. Supposedly NetBeans can be hooked up to GCC. Eclipse just seems too "meta", as if it's trying to be the IDE to end all IDEs.
Your comments make me think it was Coverity that said their tool flat out wouldn't work with CVI / LCC without internal hookup.
My employer has Klocwork in house at various sites, but we don't have the critical mass at this site to get a local license, so there's no way for me to try it out. I think the site up the street was using Klocwork with code built in Visual Studio, but it might also have been an embedded IDE they were using it with. I imagine the most bang for the buck in analysis comes from embedded applications, which are more likely to be installed in a critical device that benefits the most from proven efficiency and correctness.
06-29-2010 04:27 PM
The short answer with regard to any custom compiler situation is "it depends." Compilers that support a ton of custom grammar are a bit of a pain for any SCA vendor to support, of course, as we have to understand what you're expressing in order to be able to analyze its correctness. If the compilers this thread refers to are highly custom, then the effort involved is greater than if they stay within a hair's breadth of some relevant, or at least prevalent, standard. That is, if you can drive your custom compiler in such a way as to enforce/require, for example, C99 grammar (perhaps with a few carefully contained extensions), then life is good and customization of an SCA solution tends to be field-installable. If not, then lab support is required and the job becomes more involved. Once the grammar issue is sorted out, the SCA compiler needs to be educated on how the native compiler driver works: which switches mean what, which includes are implicit, which defines are implicit, and so on. But that is very much a field-installable customization for all commercial solutions that I know of.
So, in summary, I suspect that whether it was Coverity or us (or any other participant in the commercial SCA market) faced with the question "do you support mycc," the answer given by well-behaved field staff tends to be "here's our supported compiler matrix; pick from the list, otherwise it's a no." The more nuanced answer revolves around how much you can make your compiler work like a gcc-type compiler, or a cl-type compiler, or an icc-type compiler, or whatever other archetypes of dialects the tool understands. If you can live within the language restrictions that naturally implies, then the solution tends to be simple. If not, then I'm afraid the answer might well be "not right now."
And in terms of the question of how much an SCA solution needs to know about the dialect in use, the answer again is "it depends." Incorrectly parsing a function may or may not lead to false positives or false negatives. It may lead to a failure to compile the function at all. Or it may have no effect whatsoever; it really depends on the scope of the grammatical oddity and the ability of the SCA compiler in question to fall back (i.e. subexpression vs. statement vs. basic-block error recovery). All of which is a bit variable, unfortunately, and doesn't support giving an answer of "well, this much."