The blank slate the Alpha.gov.uk team was handed was a huge privilege, but threw up some unusual technical challenges. Normally when embarking on building web apps you'd expect to have some sense of the corpus of content or the core functionality early on and could make the initial technology choices based on that. Here we were starting with some notions, a few sketches, and a determination that we not constrain our user-focus by early commitments to specific technical solutions.
We were lucky to have a development team with experience of a number of different languages and frameworks, and quickly agreed that rather than try to settle on a single one of those we'd build each tool using whichever technology would most quickly get us to a relatively-durable prototype, and then "federate" them. We started with the Python-based framework Django for the Department pages, added Ruby on Rails for a suite of tools focussed on specific tasks, and used Sinatra (another Ruby framework) to glue together our search. If we'd continued for longer and expanded the scope of what we were building it's quite likely that list would have grown.
That got us a few small pieces built, but we needed to join them somehow. While we're building for Google-as-the-homepage, and are committed to consistency-not-uniformity, we needed a reasonable way for our front end developers to introduce that consistency, and to overlay a few site-wide components, like the geo tools that let us target information once you've told us where you are. As we anticipated early on, this was where most of the development pain lay.
Our first pass was a custom proxy that all visits were fed through. It knew which pages should go to which apps, applied a standard template, and tied in those site-wide elements. But as time went on it became clear that it was awkward to develop with, rather slow, and quite brittle. Eventually we broke it into several pieces, and then replaced them each in turn.
We now have a couple of reusable components, included as middleware by each of our apps, that provide templating and shared services. Above that we're using an excellent package called Varnish to direct each visit to the correct app behind the scenes. Varnish is primarily used for caching web requests, so that you don't have to do expensive computation or talk to your database every time you display a rarely-changing web page, but it can also be configured to do the rest of what we needed.
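To give a flavour of the middleware approach (this is an illustrative sketch, not our actual code, and all the names in it are made up), a Rack-style component that wraps each app's output in the shared site template looks something like this:

```ruby
# Illustrative sketch of a shared-templating middleware in the Rack style.
# The class name, header and footer strings are hypothetical examples.
class SharedTemplate
  def initialize(app, header: "<header>Alpha.gov.uk</header>",
                      footer: "<footer>Built by the Alpha.gov.uk team</footer>")
    @app = app
    @header = header
    @footer = footer
  end

  # Standard Rack interface: call the wrapped app, then surround its
  # HTML body with the site-wide header and footer. A real version
  # would also recalculate the Content-Length header.
  def call(env)
    status, headers, body = @app.call(env)
    [status, headers, [@header, *body, @footer]]
  end
end

# Any of the federated apps (Rails, Sinatra, Django behind a shim) can
# then be wrapped without knowing about the site-wide chrome:
tool = ->(env) { [200, { "Content-Type" => "text/html" }, ["<p>Tool output</p>"]] }
stack = SharedTemplate.new(tool)
```

Because the component only depends on the Rack calling convention, each app can pull it in independently, which is what makes the federated approach workable.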
Everything's hosted on Amazon's EC2 cloud servers (in their EU-west cluster). We're also using their Elastic Load Balancer, and we've got some of our data stored on S3. Using those services has meant that we could experiment with our server configurations, add in more as needed, and quickly scale up where necessary. To co-ordinate it all we're using a tool called Puppet, which lets us rapidly change the configuration of a whole suite of servers with a single command.
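As a hedged example of what that looks like in practice (the resource names here are illustrative, not taken from our actual manifests), a Puppet class for one of the front-end boxes might read:

```puppet
# Illustrative Puppet manifest: keep Varnish installed, configured and
# running on every server that includes this class.
class frontend {
  package { 'varnish':
    ensure => installed,
  }

  file { '/etc/varnish/default.vcl':
    ensure  => file,
    source  => 'puppet:///modules/frontend/default.vcl',
    require => Package['varnish'],
    notify  => Service['varnish'],
  }

  service { 'varnish':
    ensure => running,
    enable => true,
  }
}
```

Assigning that class to a group of nodes and triggering a Puppet run is the "single command" mentioned above: every server in the group converges on the same configuration.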
As with every aspect of Alpha.gov.uk, the code that's been written is intended as a proof of concept. We've got some APIs in place (but not yet well documented) for a few of our tools (and our use of ScraperWiki has established quite a few more), we've established a reasonable architecture for our code, and given useful clues as to how a more mature federated system would evolve. But the real measure of what we've done is the degree to which it's allowed the whole team to stay focussed on real user needs, rapidly iterating all the way.
We're aiming to have some of that code ready to open source within the next couple of weeks, and will be giving more detail on some specific components as time goes on.