Sunday 28 February 2010

Backing Up Your Development Environment

Back in the good old bad old days, backing up an embedded project was simple.  You put the source files and makefile on a floppy disk.  And for good measure, you could put a copy of the compiler, assembler, linker, and any other tools on the floppy too.  The complete toolchain needed to recreate the shipped code, neatly wrapped up in its own little time capsule.

And that toolchain is an important detail.  The source code on its own is not enough to recreate a build; you have to factor in the tools too.  The output from a compiler can vary by a surprising amount between versions.

These days things are not so straightforward.  The compiler, assembler, and linker still exist, but they generally lurk behind the comforting facade of a glossy integrated development environment (IDE).  You don't need to figure out the arcane invocations needed to convince the command line tools to do your bidding; you just tick some checkboxes in an options dialog.  And I for one don't have a problem with this: I'm perfectly happy for such details to be sorted out behind the scenes.

However, this ease of use can be a double-edged sword.  That sparkly shiny IDE is likely a significant chunk of code, and it will be squirreling away information in all sorts of nooks and crannies on your PC - in the registry, in initialisation files, in configuration files, and down in the depths of your Documents and Settings folder.

Add in some vendor patch releases and user modifications, and suddenly keeping a snapshot of the state of your tools starts getting tricky.  Not only do you need to know that the code was built with v1.2.3.4 of the Acme Super Awesome IDE, but also that at that point you'd applied three vendor patches, and hand-modified a chip configuration file that you'd found a bug in.  You'd reported it to Acme MegaCorp technical support, but there hadn't been a bug fix release yet. 

Now what do you do?

My current solution is to store a simple text file called 'build instructions.txt' or similar with each project.  It lists the IDE version, any applied patches, and any funnies such as the aforementioned modified chip configuration file.  This approach is not exactly high-tech or complicated, but it works fine.
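
For illustration, an entry in such a file might look something like this (the project name and details are invented, purely to show the idea):

    Project:       Frobulator Mk2 firmware, release 2.1
    IDE:           Acme Super Awesome IDE v1.2.3.4
    Patches:       vendor patches 1, 2 and 3 applied over the base install
    Local changes: chip configuration file hand-modified to work around a
                   bug reported to Acme MegaCorp (no vendor fix released yet)
    Source:        tag REL_2_1 in version control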

I can't help thinking that there must be a better way though - this is still a bit handraulic and error-prone.  I suggested to our IT guy that we could create a virtual machine (VM) image containing a bare-bones XP install, plus the IDE, plus any 'non-standard' files (patches and the like).  That way we could simply specify which VM image to use, with the source and project files hauled out of version control.  But that made him go a bit green and start shaking.  He was mumbling about licensing issues with that many Windows images.

So - if anybody knows of a better way of doing this, do please let me know in the comments.

4 comments:

Unknown said...

The virtual machine idea is a valid one. I have used it in the past. Fortunately for me, I have developed on the Linux operating system with Eclipse for many years, and this has limited the license exposure I have to worry about.

Of course, archiving your VM image isn't enough either. You also need to archive the VM software itself and make sure that you have a machine around that can run that version. I have encountered times when VMware has significantly changed the internal structure of their VM images, and I have therefore had to jump through many hoops to migrate an older image into something the currently distributed version will accept.

I know this is not common practice, but having a centralized build system that one or more developers rely on to compile their code from the repository can alleviate part of this problem. Then you have removed the IDE from the equation. Of course, you can still have your hooks in the IDE to parse any errors and submit jobs to the centralized build system.

Alan Bowens said...

Thanks for that - some interesting comments there. I'd assumed that once you had your VM image you were home and dry, but it sounds like versionitis can creep in downstream too.

We've kicked around the idea of having a build machine for some of our PC code in the past, but never really ran with it - the reality is that people are much more comfortable working locally. Ironically, although we're basically a microcontroller shop, it's the PC code that can be harder to build on different machines, as different engineers install different components in slightly different places and/or ways. As a result Engineer A's project may not build on Engineer B's machine, even if they have identical add-ons installed. It may be time for me to revisit this idea.

Anders said...

I don't think you have to choose between local and centralized builds. We're working with both approaches. We have a (Hudson-based) build server that builds and tests everything in the repositories every night. It even builds the complete installation delivery. What's more, we have it set up to always build the trunk plus all active feature and release branches.

The build system (minus Hudson) is easy to check out on a developer's machine, and our process mandates that check-ins be preceded by running the official build and at least a subset of the tests, to make sure the checked-in files are at least possible to build and the resulting executables not completely impossible to run.

If the developer prefers to do day-to-day development in, for example, the Visual Studio IDE, he is free to do so as long as the official build system is used before check-in.
(We're actually using the VS compiler, but the build system does not use VS project files for keeping track of files and options.)
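
To give a flavour of what that looks like in practice, here is a minimal sketch of an IDE-independent build script (not our actual build system - the file names and compiler options are just placeholders, and it assumes cl.exe is on the PATH, e.g. from a Visual Studio command prompt):

    # build.py - minimal sketch of an IDE-independent build script.
    # The source list and compiler options live here, under version control,
    # rather than inside an IDE project file.
    import subprocess
    import sys

    SOURCES = ["main.c", "widget.c"]    # placeholder source files
    CFLAGS = ["/nologo", "/W4", "/O2"]  # options recorded explicitly

    def build():
        # Invoke the Visual Studio command-line compiler directly.
        cmd = ["cl"] + CFLAGS + SOURCES + ["/Feapp.exe"]
        return subprocess.call(cmd)

    if __name__ == "__main__":
        sys.exit(build())

Hudson (or a developer at a command prompt) just runs the script, so the same build is produced whether or not the IDE is involved.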

Alan Bowens said...

Hi Anders - sounds like a pretty robust build system. I'll have a look at Hudson; I've seen it mentioned in a couple of places recently.