I just got into a debate with a few friends about whether the current academic computer systems community is still relevant, or whether we’ve lost our way and are no longer productively matching our efforts to where we can have impact.
My friends—who I should say are very smart people—argue that somewhere between when academics produced nearly all computer systems (including UNIX, the Internet, databases and basically everything else) and now, we have lost our way. If you look at recent kinds of systems—for example: peer-to-peer file-sharing, modern file systems, modern operating systems, modern databases and modern networks—many of the ones in common use are not those developed by academics.
Instead we’ve seen Napster, Gnutella, BitTorrent, and DirectConnect in peer-to-peer; ext3 and ReiserFS in file systems; Windows and Mac OS in operating systems; MySQL and SQL Server in databases; and the list goes on.
One conclusion to draw from this is that the systems we build—be they wireless networking technologies, new internet routing algorithms, new congestion control algorithms, new operating systems, new databases, or new peer-to-peer systems—simply aren’t being used, and that this is out of alignment with how we view ourselves. The natural next step is to argue that we should change our approach: either acknowledge that we aren’t building systems which people are going to use, or figure out why what we’re building isn’t being used.
For a variety of reasons, I think that this conclusion is wrong. First, you don’t have to look far to find recent academic systems which are in widespread use. VMWare and Xen both came directly from academic work on virtualization. Google (or more precisely PageRank) started as an academic system. The list goes on, and this doesn’t count the fact that many systems not directly built by academics are heavily influenced by academic research. The ext3 file system is just ext2 with additions based on the log-structured file system paper by Mendel Rosenblum (who is now a co-founder of VMWare). Linux isn’t an academic venture, but it’s essentially a complete reimplementation of UNIX, which was.
In the areas where academics appear to be “outperformed” there are some very reasonable explanations. In areas like databases and wireless networking, you are looking at the impact of a few grad students and a few million dollars on multi-billion-dollar industries employing thousands of people. The fact that we have any impact at all is impressive.
In areas like peer-to-peer file-sharing, most innovation has been driven not by technical needs, but by legal pressures and the need to make systems hard to shut down. While this is now an area that academics find interesting, expecting academics to have done research into how to circumvent legal machinations seems a bit out of whack.
In the end, I feel like I am more able to contribute to the real world from academia than I would be elsewhere. There is a certain level of respect for ideas and tools produced by academics which is hard to garner elsewhere, there are fewer constraints caused by market pressures, and teams of people are smaller and more agile.
I find it interesting that the reasons you prefer academia are the same reasons I prefer small company/startup jobs: small teams and less direction from market pressure, in particular.
I’m not sure about the “more respect” thing. Lately there seems to be a strong trend in programming toward having equal respect for ideas that are simply interesting and new, regardless of where they come from.
By “more respect” I really mean that when people who aren’t computer scientists look for experts on a topic, they will very often go to academia first and give what academics say more weight.