Review: The Future of the Internet and How To Stop It by Jonathan Zittrain

The New York Times recently asked, "Do We Need a New Internet?":
At Stanford, where the software protocols for [the] original Internet were designed, researchers are creating a system to make it possible to slide a more advanced network quietly underneath today’s Internet. By the end of the summer it will be running on eight campus networks around the country.

The idea is to build a new Internet with improved security and the capabilities to support a new generation of not-yet-invented Internet applications, as well as to do some things the current Internet does poorly — such as supporting mobile users.

The Stanford Clean Slate project won’t by itself solve all the main security issues of the Internet, but it will equip software and hardware designers with a toolkit to make security features a more integral part of the network and ultimately give law enforcement officials more effective ways of tracking criminals through cyberspace.

Ed Felten of Princeton University responds with an orthodox hacker-purist line:

[The first misconception] is the notion that today's security problems are caused by weaknesses in the network itself. In fact, the vast majority of our problems occur on, and are caused by weaknesses in, the endpoint devices: computers, mobile phones, and other widgets that connect to the Net. The problem is not that the Net is broken or malfunctioning, it's that the endpoint devices are misbehaving — so the best solution is to secure the endpoint devices…
It's an appeal to ye-olde Internet mythologie, complete with deferential references to "the founders" and their foresight, as if the Internet were some real-world Seldonian Foundation.

Neither position is good enough, and Jonathan Zittrain's wise book The Future of the Internet And How to Stop It [home page, Open Library entry] does a great job of explaining why and of providing some better ways of thinking about the problem. Yes, designing security into the Internet will inevitably cripple the very flexibility and permissiveness that has made the Internet a continuing source of unpredictable and surprising innovations. But sentimental idealization of the Original Internet and its "end-to-end" design won't do either [p165].

[U]sers are not well-positioned to painstakingly maintain their machines against attack, leading them to prefer locked down PCs [as Felten appears to advocate, ed], which carry far worse, if different, problems. Those who favor end-to-end principles … should realize that intentional inaction at the network level may be self-defeating, because consumers may demand locked-down endpoint environments that promise security and stability with minimum user upkeep. This is a problem for the power user and consumer alike.

…When endpoints are locked down, and producers are unable to deliver innovative products directly to users, openness in the middle of the network becomes meaningless. Open highways do not mean freedom when they are so dangerous that one never ventures from the house.

The passage shows the strengths and weaknesses of the book. Zittrain takes an unusually and refreshingly pragmatic, realistic view of the Internet, rejecting old approaches as and when needed. On the other hand, you can see from the first two sentences that his prose can be repetitious and dry. You have to work your way through some dense thickets to get through this forest – but if you are ready to make the effort, it's a worthwhile journey: one of the best Internet books I've read.

Zittrain's central concern is a dialectical contradiction at the heart of the Internet:

  1. The Internet's success comes from its remarkable ability to repeatedly generate new and unexpected uses.
  2. This "generativity" (yuck, what a clumsy word) comes in turn from the deliberate dumbness of the Internet protocols themselves; they were designed to permit any kind of traffic, for any purpose, to pass between the endpoints of the network.
  3. But the more the Internet becomes "prime time", the more it attracts spammers, virus writers and information thieves to prey on the online population, and the Internet's dumbness is free for them to use as well.
  4. The more these highway predators threaten our online experience, the more we seek to retreat to the safety of a closed and protected world.
  5. A closed and protected world may give us safety, but will spell the end of the Internet as a fount of innovation.

The Internet, like capitalism, contains the seeds of its own destruction. But Zittrain is a digital reformer, not a revolutionary. Inspired by the continuing success of Wikipedia in the face of similar problems, he favours a combination of light regulation (like health and safety standards for the Net) and popular community action (neighbourhood watch). He believes that these, combined, can preserve the creative spirit that has led to so many innovations, while staving off the worst of the security and other problems. And he makes a solid case, buttressed by broad research (50 pages of notes and references) and a careful, undogmatic and pragmatic attitude.

The book is in three parts. Part I is a history of the Internet, told to highlight two design principles that were present in the original TCP/IP architecture. The first is the "procrastination principle", which says that "the network itself should not be designed to do anything that can be taken care of by its users" [p31]. Most features of a network should be implemented at its computer endpoints (the end-to-end principle) rather than "in the middle". The second principle is trust; the Internet is "a bucket-brigade partnership in which network neighbors pass along each other's packets", and the assumption of co-operation and fair dealing is present in its design. So the Internet has no built-in security or identification mechanism; anyone can join the network; and there is no quality-of-service guarantee for the packets it delivers. These two principles have led to what Zittrain calls the "generative dilemma" [p36]. "The idea of a Net-wide set of ethics has evaporated as the network has become so ubiquitous" [p45]. So how do we regain security while maintaining the ability to be creative?
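To make that division of labour concrete, here is a toy sketch of my own (nothing like real TCP/IP code; the function names and checksum scheme are invented for illustration) showing where the end-to-end principle puts the work: the hops in the middle just pass packets along, and only the endpoints check whether anything arrived intact.

```python
import hashlib

def make_packet(payload: bytes) -> dict:
    # The sending endpoint attaches its own integrity check;
    # nothing in the network ever looks at it.
    return {"payload": payload, "digest": hashlib.sha256(payload).hexdigest()}

def forward(packet: dict, hops: list) -> dict:
    # The "bucket brigade": each network neighbour simply passes the packet
    # along. There is no authentication, no inspection of the payload, and
    # no guarantee of delivery here -- exactly the dumbness Zittrain describes.
    for hop in hops:
        packet = hop(packet)  # a hop could also delay, drop, or corrupt it
    return packet

def receive(packet: dict) -> bytes:
    # Only the receiving endpoint decides whether the data survived the trip
    # (the procrastination principle: leave the work to the edges).
    if hashlib.sha256(packet["payload"]).hexdigest() != packet["digest"]:
        raise ValueError("corrupted in transit; the endpoints must sort it out")
    return packet["payload"]

if __name__ == "__main__":
    honest_hop = lambda p: p
    delivered = forward(make_packet(b"hello"), [honest_hop, honest_hop])
    print(receive(delivered))  # b'hello'
```

The point of the toy is the asymmetry: forward() neither knows nor cares what the payload is or who sent it, and that indifference is what the innovators and the spammers both get to exploit.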

Part II outlines what Zittrain sees as some of the dangers facing the Internet. One danger is the rise of Internet appliances, such as many of today's mobile phones, Xboxes, and Kindles. These devices promise a secure environment, but at the cost of restricting programmers' ability to be creative. The second danger sits at the other end of the network, where "Web 2.0" platforms such as Google Maps, Facebook, Salesforce and other hosted services offer only "contingent" environments for programming: the prospect of unilateral changes to terms of service or agreements inhibits creativity.

Zittrain is enthusiastic about much of what the Internet has wrought, but he parts company with current digital orthodoxy on these issues. When it comes to Web 2.0, influential commentators from Shirky to Lessig to Tapscott and even to Benkler gloss over the differences between commercial web sites and non-profits, in an attempt to highlight a unified Internet culture. Lessig, for example, talks optimistically of a hybrid economy in which these motivations muddle along next to each other, and while Benkler does see an opposition of interests between the market economy and the collaborative network economy, he never makes much of it. I haven't seen as clear-minded an analysis of why commercial Web 2.0 platforms threaten creativity as Zittrain provides, and I agree with him wholeheartedly.

The mobile Internet is only beginning to get the kind of attention that Web 2.0 has received. Today's smartphones and yesterday's desktop computers are similar in terms of computational power (see this claim of Windows 3.1 running on a Nokia N95 if you don't believe me). But whereas Microsoft was hauled in front of the courts to keep the desktop computing environment open to non-Microsoft applications and to answer for bundling Internet Explorer, there are no such worries for mobile phone vendors as they keep their proprietary app stores and their managed devices closed in the name of security. There are layers of the iPhone, the BlackBerry – yes, and Android devices too – that are open only to the operating system, and which third-party applications cannot access.

A natural response to Zittrain's worry that we face an Internet of closed appliances (or, as Margo Seltzer calls them, "gizmos") and closed services is that these can exist in parallel with open, "generative" devices and communications. But Zittrain rejects this particular compromise:

[E]ven in a world of locked-down PCs there will remain old-fashioned generative PCs for professional technical audiences to use. But this view is too narrow. We ought to see the possibilities and benefits of PC generativity made available to everyone, including the millions of people who give no thought to future uses when they obtain PCs, and end up delighted at the new uses to which they can put their machines. And without this ready market, those professional developers would have far more obstacles to reaching critical mass with their creations. [p165]

The end-to-end principle, argues Zittrain, has had its day, as "'middle' and 'endpoint' are no longer subtle enough to capture the important emerging features of the Internet/PC landscape" [p167]. In its stead he advocates a more general principle that seeks explicitly to maintain "generativity", so that ISP filtering of viruses may be worth considering, for example.

Zittrain sees an inspiration in Wikipedia, whose shambolic, after-the-fact, bits-and-pieces way of fixing problems has been one of its strengths. He describes two initiatives he has been involved with that could drive a similar approach to security while maintaining generativity. One is Herdict, a browser plugin that collects reports from people around the world to assess web site accessibility. Another is stopbadware.org, which helps to maintain a list of web sites that may host viruses and other badware. These initiatives extend the now commonplace collection of user-experience data by commercial software and service providers by making the collected data universally available. The distinction between, on the one hand, Google's use of our searches and Microsoft's collection of Windows crash reports and, on the other, Herdict's goal of making the collected data public is a big one. It shines a light both on the closed nature of the major corporate collectors (no matter how unevil they may claim to be) and on the possibilities of open data. Public access to current databases of virus reports, spam outbreaks, and so on provides a basis for innovative responses to the worst damage inflicted by malware.
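As a rough illustration of what "making the collected data universally available" means in miniature, here is a hypothetical sketch of crowd-sourced site reporting with an open export step; it is my own invention and does not reflect Herdict's or StopBadware's actual APIs or data formats.

```python
import json
from collections import Counter, defaultdict

# site -> tallies of user reports ("up" / "down")
reports: defaultdict = defaultdict(Counter)

def report(site: str, reachable: bool) -> None:
    # One user, one data point: was the site reachable for them?
    reports[site]["up" if reachable else "down"] += 1

def verdict(site: str, min_reports: int = 5) -> str:
    # Aggregate the herd's experience into a rough judgement.
    tally = reports[site]
    if sum(tally.values()) < min_reports:
        return "not enough data"
    return "likely blocked or down" if tally["down"] > tally["up"] else "reachable"

def publish() -> str:
    # The step Zittrain emphasises: the raw aggregate is open to everyone,
    # not locked inside one company's dashboards.
    return json.dumps({site: dict(tally) for site, tally in reports.items()},
                      indent=2)

if __name__ == "__main__":
    for _ in range(6):
        report("example.org", reachable=False)
    print(verdict("example.org"))  # likely blocked or down
    print(publish())
```

Google and Microsoft already do the collecting; what distinguishes the projects Zittrain describes is that anyone can call the equivalent of publish() and build something new on the result.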

Maintaining a healthy digital environment – both secure and generative – is a commons problem, and while dictatorship is an appealing route to solving the security half of the issue, real transparency provides a second possibility that has more chance of solving both. But it needs to be real transparency, not just the half-hearted efforts of Web 2.0 companies as they dance along the line between community and profit.

There's a lot more in the book. On the technological front, Zittrain sees virtualization technologies as a reason for optimism, and I agree; I'm getting a new computer at work next week and I anticipate running everything in virtual machines. If I get a virus or my registry gets screwed up, I'll roll back to yesterday's state and it will be gone. Well, that's the plan. Let's see how it works.
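For what it's worth, here is roughly how I expect that rollback routine to look, assuming VirtualBox's VBoxManage command-line tool; the VM name and snapshot labels are placeholders of mine, and other hypervisors offer equivalent commands.

```python
import datetime
import subprocess

VM_NAME = "work-vm"  # placeholder: whatever the virtual machine is called

def vbox(*args: str) -> None:
    # Thin wrapper around the VBoxManage CLI that ships with VirtualBox.
    subprocess.run(["VBoxManage", *args], check=True)

def take_daily_snapshot() -> str:
    # Take today's snapshot so there is always a clean state to return to.
    name = "daily-" + datetime.date.today().isoformat()
    vbox("snapshot", VM_NAME, "take", name)
    return name

def roll_back(snapshot_name: str) -> None:
    # Discard the current (possibly infected) state and restore a snapshot.
    vbox("controlvm", VM_NAME, "poweroff")          # the VM must be stopped first
    vbox("snapshot", VM_NAME, "restore", snapshot_name)
    vbox("startvm", VM_NAME, "--type", "headless")  # bring it back up

if __name__ == "__main__":
    take_daily_snapshot()
    # roll_back("daily-...")  # only when something has gone wrong
```

If the registry does get mangled, roll_back() throws away the damaged state rather than trying to repair it, which is the whole appeal of the approach.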
On the social and political side, the book's final pages tend to wander around issues of privacy, reputation, and behavioural norms without saying a whole lot that's new. It's a shame the book finishes on this topic, because it fizzles out a bit. But don't let that deter you – there is a lot of subtle insight and a real breadth of knowledge in these pages that you rarely see in books about the cutting edge of technology, and Zittrain's efforts deserve to generate a big discussion.

10 Comments

  1. Tom, please do play with Herdict Web (one of the ideas of Prof. Zittrain’s that you mention), which we officially launched last Wednesday at http://www.herdict.org. Participation and feedback welcome!

  2. You are a rebel, Tom! Book review without a link to the book. I suspect the mole people are not too pleased.
    Are you going to continue to shun book links or are you still weighing your book link options?

  3. No principle involved – just forgot the link. I’ll fix it. Got to keep those mole people happy.

  4. Thanks for reviewing this. I had dismissed Zittrain as being in the hacker-purist camp, based on an interview I heard, but it sounds like that was premature.

  5. I think you mention a crucial issue which is that this is a shared environment that we all have an interest in but do not independently own, and is thus subject to “The tragedy of the commons”. I would suggest an open source approach to security architecture can benefit the widest range of participants, particularly as we move towards complex webs of services. Take a look at http://www.opensecurityarchitecture.org for more info if you are interested.
    Russ.

  6. This is the best review of Zittrain’s book I’ve seen.
    Sadly, I deeply believe that Zittrain’s treatment and lessons drawn from Wikipedia are incorrect, and the analysis he does is thus seriously flawed. Wikipedia has received an enormous _de facto_ advertising and marketing subsidy from Google’s algorithm (_de facto_ meaning in effect, not that there’s any sort of explicit deal). And that’s very much not applicable to anything else, definitely not from any sort of community best practices.

  7. You link to amazon and to the open library entry. The fulltext (plus readers’ comments and Zittrain’s additional material) is available here: http://yupnet.org/zittrain/. A shorter version (an article in HLR) can be downloaded here: http://www.harvardlawreview.org/issues/119/may06/zittrain.pdf

  8. Thanks for the comments and links.
    Seth. I’m not sure what you are arguing. I realize Wikipedia has got a huge boost from Google search placements (for reasons that still puzzle me), and that other initiatives are not likely to get the same. But I can still see the internal process of Wikipedia as being of value. I know you have issues with the cult, but I do think that the commitment to keep it non-profit and open (forced on it several years ago) has made the operation work well in the aggregate despite, sometimes, the intentions of the inner circle.
    Stephane: the software for the text is interesting – better than Benkler’s wiki, for example. Cheers.

  9. Tom, the issue is the meaning “of value”. The _impression_ people are likely to get from his discussion is that if you follow Wikipedia’s processes, your project will be as fantastically successful as Wikipedia – he explicitly participates in the mystification here, where he quotes “But Wikipedia is the canonical bee that flies despite scientists’ skepticism that the aerodynamics add up.” Now, I know that’s putting it starkly, and I could be accused of making a straw-man. But I do think there are deep flaws in how he “sees an inspiration in Wikipedia”. There’s some relatively small value in Wikipedia’s processes, but they are standard stuff, not a scientific mystery. But the secret ingredient was not those processes, but rather Google’s. Thus, you really can’t learn much new from Wikipedia, in terms of solving big community problems, except that having an enormously powerful sugar-daddy is great.
    One of the less obvious things I’m trying to do with analyzing Wikipedia as a cult is to undercut the idea that its processes represent any sort of overall solution to policy problems. As in, yes, it’s sociologically interesting in how to have a big distributed online cult, and many papers can be written on it, but having cults do work for free is not a very good political program (i.e., this is in the same vein as the idea of not having a strong public sector, rather people should just volunteer and be charitable).

  10. I do agree that Wikipedia’s success is unique, and that the repeated citing of Wikipedia as a paradigm for successful large-group production efforts is overdone, to say the least. I am not sure that I agree with you about Wikipedia being a case of “having cults do work for free”, because it is non-commercial. I worry that if you go down the road you are following, there would seem to be no difference between, say, OpenStreetMap (a real community effort) and Google’s MapMaker (a commercial effort that is what I think of as digital sharecropping). To me there is a significant difference.
    Of course, the volunteer vs public sector issue will always be with us, and using “community effort” as a substitute for real social programs (for example) is going down an unhealthy path.
    But I’ll be interested to see what you come up with and look forward to continuing the conversation.
