How to Work with Libraries
If you are a client of outside libraries (and let’s face it, who can afford not to be?), you probably never think about how you work with the libraries. You include the headers, instantiate the objects you need or call the API, link, run and go.
In a beautiful, perfect world, everything is sunshine and butterflies and your code works like a dream with the libraries and you never, ever have any issues.
The world is a beautiful place, but it is not perfect, especially not in code. There are two tasks that you will have to face in the product lifecycle and they are both inevitable in a product with any kind of longevity: upgrading and downgrading.
Upgrading is not always easy, but it's straightforward. You check out the header files and the libraries (you are using source control, right? If not, you should probably take the Joel Test, because you have bigger issues), replace them with the updated files, rebuild, run your regression tests, and if all is good, you check in. If something goes wrong, you undo your checkouts.
If you do this, you’re doing it wrong.
Here's why: if you have to downgrade (and trust me, it happens, and not just for bugs: I've had to downgrade a product because of legal entanglements), you've got way more work on your hands. Sure, you can go back to source control and roll the headers and libraries back, but that's a lot more work than it needs to be. Further, if a new release added or removed header files, you have to manage that. What if your client code changed to use or expose new features not previously available, or the library's APIs were changed? What if in the rollback you lose bug fixes that were unrelated to the previous upgrade? Better hope you have good unit tests.
Believe it or not, you don’t want downgrading to be a step backwards. You want downgrading to be a step forwards or at least a side step. You want to be able to turn on a dime and drop in a new library version in short order and be able to switch between existing versions just as quickly or even quicker.
Here’s how I do this:
- Build a directory structure that makes it easy to keep versions straight. While I'm not always consistent, I prefer to do something like this:
- My Project Folder
  - External Library 1.0
  - External Library 2.0
- Never include header files from an external library directly. Instead, make your own wrapper for them and include that. In that header, set up #define macros to make working with the different versions easier.
- Isolate features and activate them based on which version supports them. Consider abstracting the functionality into an object hierarchy so that you can easily build adapters for things that don’t fully exist.
- Set the version in your makefile or build environment so that the compiler picks it up.
- Consider adding unit tests that verify which version of the library your code is actually built and linked against.
Here’s an example header file that does this:
#ifndef H_FrobozzHeaders
#define H_FrobozzHeaders
#pragma once

#if defined(USE_FROBOZZ6) && defined(USE_FROBOZZ7)
#error Multiple Frobozz versions requested in configuration. Did you forget to remove USE_FROBOZZ6 or USE_FROBOZZ7?
#endif
#ifdef USE_FROBOZZ6
#define FrobozzVersion 60
#define FrobozzVersionStr "6.0"
#include "FROBOZZ6/frobozz.h"
#include "FROBOZZ6/frobozz_model.h"
#define HAS_NITFOL 1
#define HAS_FROTZ 0
#endif
#ifdef USE_FROBOZZ7
#define FrobozzVersion 70
#define FrobozzVersionStr "7.0"
#include "FROBOZZ7/frobozz.h"
#include "FROBOZZ7/frobozz_model.h"
#define HAS_NITFOL 1
#define HAS_FROTZ 1
#endif
#ifndef FrobozzVersion
#error Build isn't configured for a version of Frobozz. Did you forget to define USE_FROBOZZ6 or USE_FROBOZZ7?
#endif
#ifndef HAS_NITFOL
#error HAS_NITFOL isn't defined. Perhaps you forgot to put it in the new configuration?
#endif
#ifndef HAS_FROTZ
#error HAS_FROTZ isn't defined. Perhaps you forgot to put it in the new configuration?
#endif
#endif
Things to note:
- USE_FROBOZZx needs to be defined on the compiler command line or in a global list of defines. If it’s not there, compilation will fail. Basically, this header enforces a promise on your part, “I promise that I will declare which version of FROBOZZ I want to use.”
- In each version block, I do three things:
  - define version macros (I don't always do both the number and the string, but you might find it helpful)
  - include the right headers for that version
  - define which features are available. Note that I avoid declaring a feature's existence by #defining it or not: with that approach you can only see what's available, not what's unavailable. Defining every feature macro to 0 or 1 makes both states explicit.
- After the version-specific blocks, I double-check that, at the very least, FrobozzVersion is defined.
- I also double-check that the feature macros are defined.
This up-front work turns an upgrade into this process:
- Make a new directory and place the files
- Check out the single product header, add a new USE_ block and adjust the check for failure to define USE_xxxx
- Remove the previous compiler setting and ensure that the error check works
- Set the compiler setting to the USE_ version you need and change the linker setting to match
- Build/adjust features for compatibility
- Regress
- Check in
This is the work for downgrading:
- Set the compiler setting to the USE_ version you need and change the linker setting to match
- Build/adjust features for compatibility
- Regress
- Check in
Here's why I like this: downgrading is a proper subset of upgrading. Instead of two totally different, potentially error-prone processes, you really have only one, and the formerly hard process of downgrading, which you inevitably have to do in a hurry and under pressure, is now the easy one, with checks built in to cover your back. This is one reason why I jokingly like to use the term "rightgrading" instead of "downgrading" when I have to go back to a previous version.