LabWindows/CVI


Why does every compile produce a different binary file?

Hi,

 

We are using a new configuration management tool, and as part of introducing it we had to prove that we could take projects back out and reproduce the original binary files. Every time we tried this the new binary file matched the original in size, but the actual data had differences.

 

After a bit of experimenting we noticed that if you force CVI to recompile a project you can never get exactly the same binary file, even from a recompile done straight after the first one. Does anybody know why this is the case, or why we are seeing this?

 

We are creating custom step DLLs to be used in TestStand 4.0 with CVI 8.0.1.

 

Thanks in advance

Message 1 of 10

Probably because of the date/time stamps embedded in the binaries.

 

JR

Message 2 of 10

Maybe because the build date is embedded into the binary, especially for debug builds, or simply because the file creation date is not the same. I also seem to remember (but I am not sure) that the path to the debug database is stored in the executable for a debug build; that alone would keep you from reproducing a binary in the debug configuration. Then you add the problem of external libraries: if you don't link against exactly the same libraries, the resulting binary may be different.
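
For illustration only, here is a minimal C sketch of how a build timestamp can end up inside a binary through the standard __DATE__ and __TIME__ macros. This is a generic mechanism, not a statement about what the CVI tool chain itself embeds, but it shows why two otherwise identical builds made at different moments contain different bytes:

/* datestamp.c - generic illustration (not CVI internals): the standard
 * __DATE__ and __TIME__ macros expand to string literals captured at
 * compile time, so rebuilding the same source at a different moment
 * changes the bytes stored in the executable's string data.
 */
#include <stdio.h>

static const char buildStamp[] = "Built on " __DATE__ " at " __TIME__;

int main(void)
{
    printf("%s\n", buildStamp);
    return 0;
}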

 

Although I do understand the need for retrieving the same binary in the context of configuration management, I am not sure that you have to be able to _rebuild_ the same binary. Can't you just store the binary along with your source code in the configuration management tool? Personally, that's what I do with my version control tool: the source for each product lives in its own branch, and when I release something to a customer I create a new branch (a "tag", to be precise) with the sources AND the binaries. This way, if anything happens at my customer's site, I can restore their configuration to the exact same state, and it does not require that I keep a computer with all the software installed to rebuild it.

Message Edited by dummy_decoy on 03-20-2009 06:20 AM
Message 3 of 10

That was our initial take too, that it would just be timestamp information. That is, until we used a comparison tool and found differences throughout the files, sometimes in blocks that were simply too big to be timestamp information alone. We also discovered that builds made literally seconds apart showed differences throughout.
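
For reference, the kind of comparison we did can be reproduced with a small program along these lines. It is only a sketch (the file names are placeholders, not our actual deliverables) that reports the byte ranges where two files differ:

/* bindiff.c - sketch: list the byte ranges where two files differ.
 * The file names passed on the command line are placeholders.
 */
#include <stdio.h>

int main(int argc, char *argv[])
{
    FILE *fa, *fb;
    long offset = 0, runStart = -1;
    int ca, cb;

    if (argc != 3) {
        fprintf(stderr, "usage: bindiff file1 file2\n");
        return 1;
    }
    fa = fopen(argv[1], "rb");
    fb = fopen(argv[2], "rb");
    if (fa == NULL || fb == NULL) {
        fprintf(stderr, "cannot open input files\n");
        return 1;
    }
    for (;;) {
        ca = fgetc(fa);
        cb = fgetc(fb);
        if (ca == EOF && cb == EOF)
            break;
        if (ca != cb) {
            if (runStart < 0)
                runStart = offset;              /* start of a differing block */
        } else if (runStart >= 0) {
            printf("differs: bytes %ld to %ld\n", runStart, offset - 1);
            runStart = -1;                      /* end of the differing block */
        }
        offset++;
    }
    if (runStart >= 0)
        printf("differs: bytes %ld to %ld\n", runStart, offset - 1);
    fclose(fa);
    fclose(fb);
    return 0;
}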

 

Additionally, we have checked that our licences are up to date and that we are building release builds.

 

This problem has come about because we are trying to prove the usage of a new configuration management tool, and one of the obvious (or so we thought) ways to prove it would be to take the entire project back out and check that we could remake the original binary file.

 

Now we are stumped as to how to prove the usage of the config tool if every binary file created is going to be different, other than going through the entire testing phase of the original delivery again.

Message 4 of 10

Hello Dave,

 

You're correct. The differences in the binaries aren't attributable only to timestamps.

 

Unfortunately, the CVI compiler and linker weren't developed with the goal of producing identical binary images from the same source code. To give you an example, they use memory pointer values internally to index hash tables, etc. Small variations in memory allocation (e.g. opening and closing a UIR file) completely rearrange symbols during the next compile, which in turn changes the binary image significantly.
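
To make that concrete, here is a small illustration (not actual CVI code, just the general effect) of a table hashed on pointer values: because the addresses returned by malloc can vary between runs, the bucket each symbol lands in, and therefore the order in which the table is later walked, is not guaranteed to be the same from one compile to the next:

/* ptrhash.c - illustration only (not CVI internals): a table hashed on
 * pointer values. The addresses malloc returns can differ between runs,
 * so the bucket order, and any output derived from traversal order, is
 * not reproducible in general.
 */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <stdint.h>

#define NUM_BUCKETS 8

static const char *names[] = { "alpha", "beta", "gamma", "delta" };

int main(void)
{
    const char *buckets[NUM_BUCKETS][4] = { { NULL } };
    int counts[NUM_BUCKETS] = { 0 };
    int i, b;

    for (i = 0; i < 4; i++) {
        char *copy = malloc(strlen(names[i]) + 1);        /* address may vary per run */
        if (copy == NULL)
            return 1;
        strcpy(copy, names[i]);
        b = (int)(((uintptr_t)copy >> 4) % NUM_BUCKETS);  /* pointer used as hash key */
        buckets[b][counts[b]++] = copy;
    }

    /* Walking the table yields an order that depends on the addresses
       malloc happened to return, not on the source order of the symbols. */
    for (b = 0; b < NUM_BUCKETS; b++)
        for (i = 0; i < counts[b]; i++)
            printf("bucket %d: %s\n", b, buckets[b][i]);

    return 0;
}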

 

You're not the first person to ask for the ability to generate identical binaries, and we have spent a fair amount of time investigating the issue. Our conclusion was that it would require a significant rewrite of major portions of the compiler, which obviously is pretty risky and also time consuming. We haven't completely ruled it out, but given that this is not a very common request, it is not something we are planning to do in the short term. But if more and more of our users have no choice but to use these types of image validation processes, we'll definitely revisit this issue.

 

I'm sorry that I don't have better news...

 

Luis

NI

Message 5 of 10

Luis, I'm not interested in this validation problem myself, but I can understand that other developers and organizations are, so I can imagine that at some point you will go down the road of rewriting the compiler.

Nevertheless, since this is effectively a risky operation, as you have already noted, I would prefer that you develop it as an alternative compiler, with an option so that the developer can choose which one to use; the current compiler is quite good and known to work well, and I wouldn't be happy to be forced onto a new and risky path if it isn't necessary.



Proud to use LW/CVI from 3.1 on.

Message 6 of 10

Thanks for the feedback, Roberto. We'll definitely consider that possibility if and when it comes to that.

 

Luis

Message 7 of 10

What happens when you use an external compiler such as VC++ or the Intel compiler? Are the binaries identical then? It may well be that compilers don't commonly create identical binaries on recompilation of identical source.

 

It does rather beg the question: why are you recompiling/rebuilding at all if you're trying to create an identical binary? What am I missing here? If you want an identical binary, why not just copy the one you have? Perhaps the case is that you haven't kept the previously built executable and are trying to re-create it.

 

Doesn't most version control rely on differences in source code?

 

Menchar

Message 8 of 10

As our business supports and maintains delivered projects for up to 25 years, configuration control is obviously a big concern for us. As you can imagine, upgrades will be required during a product's life cycle, so we need the ability to confidently take a project out of "storage" and recreate its development environment before any upgrade work can begin.

After recreating this environment, the next stage is to prove that you can recompile successfully and that the new binary file performs like the original.

That testing can be both lengthy and expensive.

Alternatively, if the compiler had a build option that produced a stable, reproducible binary file, all that would be required is a quick comparison between the binary files.

You would then know instantly that you had recreated the original binary file, and could cut out a whole pile of regression testing. I am sure we are not the only company with this issue.

Message 9 of 10

Sure, I've dealt with the same thing on long-lived systems.  Yup, testing is expensive.

 

What we sometimes do (but often do not) is capture the full development environment in an archive, so that you can roll it back out for future support.  Sometimes we save the development system PC, or a disk image, or sometimes the hard drive itself.

 

There can be other dependencies that affect an "identical" build - OS versions, driver versions, interface versions, etc. and I've found that even if you capture the tool stack for a product you can get nipped by one of these other factors.

 

We've found that proving the validity of configured sources is key - I usually do a full rebuild from archive candidate files on a clean development system prior to deployment just to prove I haven't missed anything.

 

I know developers and SW managers who don't even try to do this: they think it can't be done, or that it takes too much time.  I have encountered executable images that are "golden" in that they simply have to work as they are, perfectly, forever, because we can't ever recreate them 😉

Message 10 of 10