The current CD building process has some limitations which slow down development by lengthening the CD test cycle. This is especially problematic as we approach release deadlines. This spec attempts to detail areas for improvement and how we can approach that work.


Without improving the process significantly we will have increasing problems in the dapper cycle as we increase the number of CDs we build and their variants.


This specification covers the CD building process before any switch to Launchpad. It does not attempt to cover CD building using the Launchpad infrastructure; that work is left to a future, as yet unwritten, specification.

CD Building process improvements

CD build parallelisation

In breezy, CD builds were entirely serialised, which left the CD build process taking on the order of eight hours for a full set. Shortly after the breezy release, ColinWatson changed the CD image infrastructure to allow builds of CDs for different flavours to be parallelised; for example, it is now possible to run Ubuntu, Kubuntu, and Edubuntu builds simultaneously without them stepping on each other's toes.

Further work along these lines is possible: we will experiment with further changes to the CD image build process so that temporary files are stored in directories specific to the image type rather than merely specific to the flavour, so that we can build install CDs, live CDs, and DVDs all in parallel. It remains to be seen whether CPU and I/O contention will render high parallelisation of CD builds difficult due to cache thrashing. (Debian's large CD sets are produced serially.)
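The isolation described above can be sketched as follows. This is a minimal illustration, not the real build scripts: the flavour and image-type lists, the `scratch_dir` layout, and the `build_image` placeholder are all hypothetical, and only the keying of temporary directories on both flavour and image type matters.

```python
import concurrent.futures
import os
import tempfile

FLAVOURS = ["ubuntu", "kubuntu", "edubuntu"]   # hypothetical flavour list
IMAGE_TYPES = ["install", "live", "dvd"]       # hypothetical image types

def scratch_dir(root, flavour, image_type):
    """Scratch space keyed on both flavour and image type, so that
    parallel builds never share temporary files."""
    path = os.path.join(root, flavour, image_type, "scratch")
    os.makedirs(path, exist_ok=True)
    return path

def build_image(root, flavour, image_type):
    # Placeholder for the real image build; only the isolation of
    # temporary files matters in this sketch.
    workdir = scratch_dir(root, flavour, image_type)
    return (flavour, image_type, workdir)

with tempfile.TemporaryDirectory() as root:
    with concurrent.futures.ThreadPoolExecutor() as pool:
        jobs = [pool.submit(build_image, root, f, t)
                for f in FLAVOURS for t in IMAGE_TYPES]
        results = [j.result() for j in jobs]
```

Because every (flavour, image type) pair gets its own scratch directory, all nine builds above could in principle run concurrently, subject to the CPU and I/O contention caveat noted in the text.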

Rebuilding images for single architectures

It is already possible to rebuild CDs for a subset of the usual set of architectures. However, our publish-daily script is not intelligent enough to know how to re-publish the old images, and this operation has to be done by hand. This can be done by creating a hardlinked copy of the previous published tree (for non-custom builds) at the start of publish-daily, and then publishing the images as usual, taking care to unlink previous images before copying in new ones.
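The hardlink-then-replace scheme can be sketched as below. The function names and tree layout are hypothetical; the point is that the copy of the published tree is cheap (hardlinks, not data copies) and that an image must be unlinked before the new one is copied in, so the previously published copy is never modified in place.

```python
import os
import shutil

def hardlink_tree(src, dst):
    """Recreate src's directory structure under dst, hardlinking the
    files rather than copying their contents (cheap even for large
    ISO images)."""
    for dirpath, dirnames, filenames in os.walk(src):
        rel = os.path.relpath(dirpath, src)
        target = os.path.join(dst, rel)
        os.makedirs(target, exist_ok=True)
        for name in filenames:
            os.link(os.path.join(dirpath, name),
                    os.path.join(target, name))

def publish_image(tree, name, new_image_path):
    """Replace one image in the hardlinked tree without disturbing
    the published copy it was linked from."""
    dest = os.path.join(tree, name)
    if os.path.exists(dest):
        os.unlink(dest)   # break the hardlink to the old image first
    shutil.copy(new_image_path, dest)
```

Skipping the `unlink` would overwrite the old published image through the shared hardlink, which is exactly the failure mode the care-taking in the text refers to.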

Some care needs to be taken with GPL compliance etc. for final release builds (e.g. Ubuntu 6.04 as opposed to milestone CD releases); if an upload changes the Packages file that would be presented when building images for another architecture, then that architecture's CD images will still have to be rebuilt, and source CD images will have to be rebuilt in any case. Importantly, however, this is a social problem, and a technical solution would be inappropriate.

Jigdo generation performance work


Jigdo is a tool that allows users to reconstruct large images from a set of component parts plus a pair of files known as the jigdo and the template. We provide this because a user with a local mirror can reconstruct a CD image much more quickly than they can download it.

Currently, jigdo generation takes approximately three times as long as the rest of the CD building process put together: approximately 45 minutes of a one-hour CD run is spent running the jigdo-file program.

There exists a set of mkisofs extensions written by Steve McIntyre called 'JTE' which are part of the jigit software package. These extensions incorporate the creation of the jigdo and template files into the creation of the ISO file itself. This means that jigdo-file does not have to reverse-engineer the positional information after the ISO image has been created.

Proposal for improving the performance

By applying the JTE patches to the cdrtools package in dapper, we will provide a version of mkisofs in Ubuntu that can create jigdo and template files on the fly while building the CD or DVD images themselves.
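A JTE-enabled mkisofs invocation might look like the sketch below. The file names and CD tree path are hypothetical, and the exact option set should be checked against the patched mkisofs man page; `-jigdo-jigdo`, `-jigdo-template`, and `-md5-list` are the options the JTE patches are understood to add for emitting the jigdo and template files alongside the ISO.

```python
def jte_mkisofs_command(name, tree, md5_list):
    """Build the argument list for a JTE-patched mkisofs run that
    produces the ISO, jigdo, and template files in a single pass."""
    return [
        "mkisofs", "-r", "-J",
        "-jigdo-jigdo", name + ".jigdo",
        "-jigdo-template", name + ".template",
        "-md5-list", md5_list,
        "-o", name + ".iso",
        tree,
    ]

cmd = jte_mkisofs_command("dapper-install-i386", "./cd-tree", "md5.list")
# subprocess.run(cmd, check=True)  # run on a host with JTE-patched cdrtools
```

The single-pass design is what removes the need for the separate jigdo-file step measured above.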

It is our firm belief that this will solve the performance issues relating to jigdo generation in our CD building process.

Contingency planning

In the event that this approach does not deliver the hoped-for performance improvements, we will fall back to running jigdo generation at the end of the CD build process, after publishing ISO images and starting the mirror sync, but before releasing the archive mirror lock; following that, another mirror sync will be triggered. This is inferior to the above solution because it does not improve resource contention issues on the CD image build machine, but it still has the potential to speed up test cycles.

Update: Experimentation with JTE demonstrates that it works and renders the cost of jigdo generation negligible by comparison with total CD image build time. This contingency plan will not be required.

Implementation status


CdBuildProcess (last edited 2008-08-06 16:20:01 by localhost)