REST APIs as Data Backends
Some months ago, the Socorro team agreed that our current mix of REST API middleware calls and direct SQL calls from the web interface simply wasn’t meeting our needs. We were faced with an increasing number of data sources, including the coming addition of Elasticsearch to the data storage system, and maintenance was becoming a problem. Thus, the decision was made to move our data layer to our REST API exclusively, removing all direct access to data storage from the web interface.
This is the second such project I’ve been on where an external API is used to retrieve all of an application’s data. It’s a novel concept, but one that certainly takes some getting used to.
There are numerous benefits to the approach we took. First, it allowed us to write our middleware in a different language from our web interface (in this case, Python). It also means that for users of the application (and Socorro has users outside Mozilla), it becomes much easier to strip off the Firefox-focused web interface we’ve built and replace it with something that produces reports aimed at their specific needs. Finally, when we do rewrite the web interface, the job will be that much easier, since the data storage backend will already be in place.
At first I was opposed to the idea of making one or more HTTP calls on every page load, but I came around. I realized that this model made sense both for our application and in general practice: building applications on top of your own API forces you to “dogfood” your own work, and it opens up numerous opportunities down the road, should you wish to open the API to outside developers or build something along the lines of a smartphone or desktop application.
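To make the shape of this concrete, here is a minimal, self-contained sketch of the pattern: a middleware HTTP endpoint serving JSON, and a web-interface-side function that fetches report data over HTTP rather than querying storage directly. The endpoint path and payload are hypothetical illustrations, not Socorro’s actual API.

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

class CrashAPIHandler(BaseHTTPRequestHandler):
    """Middleware side: a REST endpoint that owns all data access."""

    def do_GET(self):
        if self.path == "/crashes/daily":
            # In a real middleware this would query the data storage layer;
            # here we return a canned JSON payload for illustration.
            body = json.dumps({"product": "Firefox", "count": 1234}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_error(404)

    def log_message(self, *args):
        pass  # keep the example's output quiet

def fetch_daily_crashes(base_url):
    """Web-interface side: get report data via HTTP, not direct SQL."""
    with urlopen(base_url + "/crashes/daily") as resp:
        return json.loads(resp.read())

# Run the middleware on an ephemeral local port and call it as a client.
server = HTTPServer(("127.0.0.1", 0), CrashAPIHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
data = fetch_daily_crashes("http://127.0.0.1:%d" % server.server_port)
server.shutdown()
```

The point of the split is that `fetch_daily_crashes` knows nothing about the storage backend; swapping PostgreSQL for Elasticsearch, or opening the endpoint to outside developers, changes nothing on the web-interface side.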
Of course, this has been a long process: retrofitting this architecture onto an application built in a “traditional” style (where the data backend is built into the application) takes a fair bit of planning. Adrian Gaudebert (site in French), who is our Serial Reorganizer, has made this one of his top Quarter 1 goals, and he’s unlikely to complete it this quarter, though not for lack of trying. There’s a tremendous amount of code that interfaces directly with some sort of data source, and it’s difficult to pull that code apart, push it into the API, and piece the web interface back together. For our use case, however, it makes a lot of sense and is worth the effort.
In a world where data increasingly comes from both internal and external sources, this sort of approach makes a lot of sense. It’s one I’ll use in my future web applications, and one I encourage you to use in yours.