2 Motivation

When software packages grow large and are required to work on multiple platforms, they become more difficult to maintain without automation. We spent several years maintaining Amd and Am-utils, as well as fixing, porting, and developing other packages. During that time we noticed how difficult it was to maintain and port such packages, which led us to convert Amd to use autotools. As a result of the conversion, Am-utils became easier to maintain and port. We therefore set out to quantify this improvement in the portability and maintainability of the Am-utils package, and those investigations led to this paper.

There are six reasons why porting such packages to new platforms, adding new features, or fixing bugs becomes difficult enough to call for automatic configuration:

  1. Operating system variability: There are more Unix systems available today, with more minor releases, and with more patches. Flexible software packaging allows administrators to install selective parts of the system, increasing variability. An automated build process can track small changes automatically, and can even account for local changes.

  2. Code inclusion and exclusion: To handle platform-specific features, large portions of code are often surrounded by #ifdef directives. Platform-specific code is mixed with more generic code. Often, system-specific source files are compiled on every system, because there is no automatic way to compile them conditionally.

  3. Multi-level nested macros: To detect certain features reliably, older code uses deeply nested #ifdef directives. This results in complex macro expressions designed to determine features as reliably as possible. The main problem with such macros is that they provide only second-hand or anecdotal knowledge of the system. For example, to test whether a compiler supports ``void *'', some code depends on the name of the compiler (the __GNUC__ macro) rather than directly testing for the feature's existence; see the sketch after this list.

  4. Shared libraries: Many packages need to build and use shared or static libraries. Such packages often support shared libraries only on a few systems (e.g., Tcl before it was autotooled), because of differing shared library implementations. Frequent use of non-shared (static) libraries results in duplicated code that wastes disk space and memory.

  5. Human errors: Manually-configured software is more prone to human error. For example, the first port of Amd to Solaris on the IA32 platform copied the static configuration file from the SPARC platform, incorrectly setting the endianness to big-endian instead of little-endian.

  6. Novice and overworked administrators: With a rapidly growing user base and the growth of the Internet, the average expertise of system administrators has decreased. Overworked administrators cannot afford to maintain and configure many packages manually.
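
As a concrete illustration of items 2 and 3, the following hypothetical C fragment (not taken from Amd) shows the hand-maintained style: deeply nested, platform-keyed macros that infer whether the compiler supports ``void *'' from the identity of the compiler or operating system rather than from a direct test of the feature.

    /*
     * Hypothetical example of hand-maintained feature detection (not
     * copied from Amd): support for "void *" is inferred from compiler
     * and OS identity macros rather than tested directly.
     */
    #if defined(__GNUC__) || defined(__STDC__)
    typedef void *voidp;            /* assume an ANSI compiler */
    #else
    # if defined(_AIX) || (defined(sun) && defined(SVR4))
    typedef void *voidp;            /* guess: this vendor's cc is ANSI */
    # else
    typedef char *voidp;            /* otherwise hope char * is enough */
    # endif
    #endif

Every new platform, compiler release, or vendor patch potentially invalidates such a guess, and the macro expression must then be extended by hand.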

Converting OSS packages to use GNU autotools--Autoconf [5], Automake [6], and Libtool [7]--addresses the aforementioned problems in five ways:

  1. Standard tests: Autoconf has a large set of standard portable tests that were developed from the practical experience of the maintainers of several GNU packages. Autoconf tests for features by actually exercising them, for example by compiling and running small programs that use those features (see the feature-probe sketch after this list). Packages that use Autoconf tests are automatically portable to all of the platforms on which those tests work.

  2. Consistent names: Autoconf produces uniform macro names that are based on features. For example, code that uses Autoconf can test whether the system supports a reliable memcmp function using #ifdef HAVE_MEMCMP, rather than depending on system-specific macros (e.g., #ifndef SUNOS4); see the HAVE_MEMCMP sketch after this list. Autoconf provides a single macro per feature, reducing the need for complex or nested macro expressions. This improves code readability and maintainability.

  3. Shared libraries: By using Libtool and Automake along with Autoconf, a package can build shared or static libraries easily, removing a lot of custom code from sources and makefiles.

  4. Human factors: Building packages that use autotools is easy. Administrators are becoming increasingly familiar with the process and the standard set of features autotools provide (i.e., run ./configure and then make). Administrators do not need to configure the software package manually prior to compilation and they are likely to make fewer mistakes. This standardization speeds up installation and configuration of software.

  5. Extensibility: Finally, software maintainers can extend Autoconf by writing more tests for specific needs. For example, we wrote specific tests for the Am-utils package that detect its interaction with certain kernels. This allowed us to separate the common code from the more difficult-to-maintain platform-specific code.
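
To make item 1 concrete, the following is a minimal sketch of the kind of throwaway test program a configure script compiles to probe a feature directly, here whether the compiler accepts ``void *''; it illustrates the approach and is not an actual Autoconf-generated file. If the compilation succeeds, configure records the feature in config.h; if it fails, the feature is treated as absent.

    /*
     * Sketch of a feature probe (illustrative only, not an actual
     * Autoconf-generated conftest): a tiny program that exercises the
     * feature itself, compiled by configure and then discarded.
     */
    int
    main(void)
    {
        void *p = 0;                /* rejected by pre-ANSI compilers */
        return p != 0;              /* exit status is irrelevant here */
    }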
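
Item 2 can be illustrated with the following hypothetical fragment, which assumes that configure has run a check such as AC_CHECK_FUNCS(memcmp) and recorded the result as HAVE_MEMCMP in config.h; the source then branches on the feature macro alone, with a portable fallback, instead of on a list of operating-system names.

    /*
     * Hypothetical consumer of a feature macro (not from Am-utils):
     * config.h is assumed to define HAVE_MEMCMP when configure found
     * a working memcmp(), e.g., via AC_CHECK_FUNCS(memcmp).
     */
    #include "config.h"
    #include <stddef.h>

    #ifdef HAVE_MEMCMP
    # include <string.h>
    # define BUF_EQUAL(a, b, n)  (memcmp((a), (b), (n)) == 0)
    #else
    /* Byte-by-byte fallback for systems lacking a usable memcmp. */
    static int
    buf_equal(const void *a, const void *b, size_t n)
    {
        const unsigned char *p = a, *q = b;
        while (n-- > 0)
            if (*p++ != *q++)
                return 0;
        return 1;
    }
    # define BUF_EQUAL(a, b, n)  buf_equal((a), (b), (n))
    #endif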

Our experiences with maintaining the Amd package clearly show the benefits of autotools. When we converted the Amd package [12,9] to use autotools, the code size was reduced by more than one-third and the code became clearer and easier to maintain. Fixing bugs and adding new features became easier and faster, even for major features that affected significant portions of the code: NFSv3 [8] support, Autofs [1] support, and a run-time automounter configuration file, /etc/amd.conf. New features that we added immediately worked on many supported systems, and bug fixes did not introduce additional bugs.

