Free Software and Surveillance

There is much that is moving and challenging in Jacob Appelbaum’s 29C3 keynote from December 2012 about the surveillance state, and Appelbaum has earned the right to be listened to through his work on the Tor Project. But…

 

At several points Appelbaum asserts that creating free software is a way of acting against institutions such as the NSA, and a way of building a better world. At 12’01”, for example: “It is possible to make a living making free software for freedom, instead of closed source proprietary malware for cops”, and at 40’50”: “everyone that’s worked on free software and open source software… these are things we should try to focus on… When we build free and open source software… we are enabling people to be free in ways that they were not. Literally, people who write free software are granting liberties.”

The picture of hackers versus spooks, positioning free and open source software as an alternative to the surveillance technologies of the NSA, just doesn’t hold up. Appelbaum must know that the NSA has a long history of engagement with open source software, so “closed source proprietary malware for cops” mischaracterizes the technology of surveillance. The NSA’s Boundless Informant data-mining tool proclaims that it “leverages FOSS technology”, specifically the Hadoop Distributed File System, MapReduce (perhaps built on the Apache Accumulo project, which was created by the NSA and contributed to the Apache Foundation), and CloudBase.

These are just the most recent examples: the NSA holds Open Source Industry Days, like this one last year; it developed the SELinux mechanisms for enforcing access control security policies, which have been integrated into the mainline Linux kernel since version 2.6. There’s a good chance that the NSA’s huge new data center in Bluffdale, Utah, which Appelbaum describes at the beginning of his talk, is running open source SELinux software on every computer.

And beyond the NSA, the new set of “Big Data” technologies associated with data acquisition and analysis has strong open source roots. These are the file systems, data storage systems, and data-processing systems built to manage data sets that span so many disks that routine failure of servers is expected, and tolerance for such failure must be built into the system. While much of the initiative came from proprietary systems built at the web giants (Google File System, Google Bigtable, Amazon Dynamo), the open source implementations (the Hadoop Distributed File System, Hadoop MapReduce) and open source databases such as MongoDB and Cassandra are becoming the industry norm. Surveillance is as much an open source phenomenon as a closed source one.
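To make the point concrete, here is a minimal sketch of the MapReduce model these systems implement, written in plain Python rather than against Hadoop’s actual API; the function names and the toy word-count job are purely illustrative:

```python
from collections import defaultdict
from itertools import chain

def map_phase(record):
    """Emit (word, 1) for every word in one input record (e.g. one log line)."""
    for word in record.split():
        yield (word, 1)

def shuffle(mapped_pairs):
    """Group all emitted values by key, as the framework does between map and reduce."""
    groups = defaultdict(list)
    for key, value in mapped_pairs:
        groups[key].append(value)
    return groups

def reduce_phase(key, values):
    """Combine all values for one key; here, a simple word count."""
    return key, sum(values)

if __name__ == "__main__":
    records = ["to be or not to be", "to see or not to see"]
    mapped = chain.from_iterable(map_phase(r) for r in records)
    counts = dict(reduce_phase(k, v) for k, v in shuffle(mapped).items())
    print(counts)  # {'to': 4, 'be': 2, 'or': 2, 'not': 2, 'see': 2}
```

At cluster scale the framework distributes each phase across many machines and simply reruns any task whose server fails; that restart-on-failure behaviour is the fault tolerance described above, and it works the same whether the records being counted are web pages or call logs.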

These are all well-known facts, but maybe they need to be restated. Let me be clear: I’m not reversing Appelbaum’s claim. There is a great deal of closed source software around the surveillance and internet control landscape as well. And (full disclosure), my salary is paid by (but I do not speak for) a company that mainly makes its money from closed source software, so I’m not claiming any moral high ground here. But to trumpet free and open source software as an alternative to the surveillance systems it has helped to build is nothing but wishful thinking.

----

Date: 2013-06-10


17 Comments

  1. I just don’t understand what your faction is trying to achieve here, by pursuing this line of trolling right now. In the face of a significant revelation about state surveillance, you’re still preoccupied with the easiest targets: of course software licenses aren’t, as such, forms of resistance to state repression.

    What I’d like to see is what, from your perspective, is the principled and strategic response here. What combination of political demands and technological tactics *is* suited to an environment where all communication must be assumed to be monitored by corporate and/or government institutions?

    • The point is clarity. Appelbaum’s argument as quoted is a huge and distracting non sequitur. It’s true that OSS can be relied on not to inform on you when you’re not looking. But it’s not true that all closed-source is securocrat malware; it’s also not true that all the software used to spy on us falls into that category.

      Whether you’re writing open source or not, you’re writing stuff that may potentially be used to spy on people or support people who spy on people. If you’re writing closed source, you may also be writing stuff that spies on people of its own accord. But – considering the amount of commercial software development going on out there – you probably aren’t.

    • “I just don’t understand what your faction is trying to achieve here, by pursuing this line of trolling right now.”

      Really? Faction? Trolling? Get a grip there, Sparky; it’s unfair to cause so much eye rolling this early in the morning 😉

    • What Phil and RAD said.

      I’m a bit gobsmacked Peter, and so is my faction. The surveillance scandal gives us a look at the military-industrial complex, circa 2013. The fact that it runs on and participates in F/OSS demonstrates that the rhetoric of closed bad-guys (Bluecoat, Cisco and Ericsson) vs open good guys (F/OSS) is an obfuscating dead end. As for technological tactics, accepting that F/OSS Hadoop is surveillance technology and thinking through the consequences of that would be a start.

  2. I suspect when Appelbaum was talking about your freedom he meant “your” data on “your” computer. On proprietary software, you can never tell when there is a back door that someone else can take advantage of. In free software, you may have security vulnerabilities (and some may have even been put there by the NSA), but at least theoretically, because you can see all the cards, you can control all the data.

    However, the tricky part is that your data isn’t kept on your computer, and data is increasingly about communication. In the past you would rip a movie; later you would download it; today you stream it. The movie essentially remains in the cloud the entire time. In addition, a chunk of the data you generate is created by sharing it with other people, usually in public. People who think about digital liberties haven’t thought through what this means, and it is tricky!

    Free software cannot guarantee your liberties when someone else is using that software on your behalf, and you are only receiving a service. There are variants of the GPL (the Affero GPL) which compel companies to release source code even when they only provide a service. However, even then there is no guarantee of privacy.

    However, you do have some control over your software when it is free (libre). When you get on the internet things are different, but at least you can try to have some control over the communications that are taking place.

    • Yes. Microsoft just announced a machine that will sit in your living room with a camera and microphone trained on you, connected to the Internet, 24 hours a day, with no assurances about what it’s doing with that data. F/OSS isn’t a panacea for privacy issues, but at the very least it protects your data on your machine, which also means surveillance devices attached to your machine. Sometimes your machine is your cell phone: now it’s not just your local data, but your microphone, camera, and GPS data.

      I truly don’t understand Tom’s position here. It reads like trolling to me, because I don’t believe he’s actually that obtuse.

    • Hmm. Well you are both smart people whom I’ve learned from so I’ll have to think about this.

      But you both admit your position is limited to your own computer, and in the modern world that’s a BIG limitation. You are both arguing for the Stallman “Cloud Computing is a trap” position (2008). But I can get even more control over my own data by avoiding computers completely — the “my machine” vision of personal computing has become irrelevant now. What’s more, F/OSS is a big part of what drove its replacement. To position Linux as a “free alternative” to the proprietary software that reports to the NSA (see prism-break.org) seems to me blind to the whole modern computing environment. It ignores the fact that Linux is a big part of the cloud computing infrastructure that is central to the data collection effort. How does “Use Linux: if it’s good enough for the NSA it’s good enough for you” sound? Take a look at the Linux Foundation’s adoption of Xen last month to see what the boundary between open source and proprietary looks like now.

      I suspect that’s going round in circles and not well expressed, but I hope you see what I’m getting at, at least, and that it’s not intended as trolling (of whom, anyway??)

    • I don’t want to conflate my opinions with Picador’s; I actually agree with you to some extent.

      The problem here isn’t one of computers, but one of trust, and also the loosening notion of ownership. If you trust proprietary software, you’re fine, and if you trust services, that’s fine too. You can get lockers in banks, and the theory is that no one can look inside. If you found out that the NSA had a skeleton key into your secret locker in a bank, that erodes the trust, and it means you cannot use that locker as you intended.

      With computers, you might trust proprietary software, but nowadays that trust is often betrayed. iPhone apps will routinely send your contact list up to the cloud, whether through deception or coercion. While you can regain trust through social means (and this is fine), the only way to be “sure” is to use free software. No matter how you connect to the internet, no matter how you redress the trust in institutions on the internet holding your data, if you do not trust the software running on your own machine, then you really can’t trust anything you do on that machine. In this sense, whether or not the NSA uses free software themselves, there is simply no alternative to free software if you do not trust the software on your machine.

      The problem with Stallman is that he has an extreme position on these issues. However, over time these beliefs tend to be vindicated somewhat. In the end most people extend a level of trust to the goods and services they use, so Stallman’s views don’t gain much traction. I’m unsure if that’s a good thing or a bad thing.

  3. Was the Enigma machine evil? Underlying the “malware for cops” assumption is a surveillance-technology-is-evil assumption.

    In these moments of widespread moral outrage it seems that we skip over some of the key underlying assumptions.

    We don’t often see eye-to-eye, Tom, but I do truly appreciate your ability to dissect difficult topics and pull out the elements that require additional scrutiny.

  4. I am flummoxed, but not surprised, by the people who claim this piece is some form of trolling.

    Appelbaum is by no means alone in weirdly conflating the “freedom” of “free software”–which those of us following Stallman closely have always found a remarkably evasive concept to begin with–with political freedom. In fact, Appelbaum is on the restrained side of some of the more strident claims in this direction (see Rushkoff, Bauwens, Shirky, Benkler, Tapscott, Johnson, Jarvis, & these are *still* the more restrained side of things). And yes, even now, we see people claiming that somehow open source software will protect the freedoms enshrined in the Bill of Rights. These arguments are usually (though probably not always) advanced by people whose knowledge and understanding of the Bill of Rights is thin, and that’s putting it charitably. This view is not just wrong; it’s dangerous because it’s wrong, and dangerous because, as Tom hints here, it participates in a “keep computerizing everything” dogma that helps to build exactly the kind of political state that eliminates nearly everything we understand as freedom.

    We don’t need “new proposals”: we need not to abandon the principles that most of us thought were enshrined so deeply in law and practice as to be unquestionable.

  5. The view that F/OSS “protects the data on your machine” is just bizarre, and is exactly the view Tom is and should be contesting. Security software and encryption, along with keeping everything in your storage system away from the Cloud, and rarely if ever using online services, might protect it and might not.

    The disturbing fact is that the very public availability of F/OSS software and architectures (and its accompanying though separate ideology of “information must be free”) means that intelligence agencies know every detail about how and where it operates.

    Further, most of us use commercial ISPs who both monitor all traffic (unless it’s heavily encrypted/anonymized à la Tor) and share it with intelligence agencies.

    The provenance of software is largely irrelevant to security concerns, and to the degree that a software project is open source, unless it is all about providing profound levels of encryption, it may make one’s information *more*, not less, vulnerable to intelligence gathering.

  6. Thanks for the interesting take on this.

    If you start the Appelbaum video a couple of minutes before the 12:01 statement, it seems clear that Appelbaum is making the argument that if you are a talented developer, it is a mistake to think that your only option for making a living is working for a company like FinFisher or helping the NSA figure out better ways to spy on us.

    Same thing with 40:50. Yes, it dramatically overstates the case, but Appelbaum is saying that in the context of projects like Uniprobe, which he mentions right before.

    I take Appelbaum to be urging the attendees to devote more time to “free software for freedom” projects — such as Uniprobe, Tor, OTR chat systems, etc.

    David Golumbia: what exactly would be entailed by abandoning the “keep computerizing everything” dogma?

  7. David: You make some interesting points, but I want to take issue with a technical point you raised:

    “The provenance of software is largely irrelevant to security concerns, and to the degree that a software project is open source, unless it is all about providing profound levels of encryption, it may make one’s information *more*, not less, vulnerable to intelligence gathering.”

    You are talking about security through obscurity here, which is a bad idea™. In fact, it is often argued that the more open the software, the more secure it is, because there are more eyes on it. This is no doubt why the NSA uses free software: it means their own infrastructure is more secure. The “provenance” of software matters because you technically don’t know what closed source software is actually doing, whereas with open source you (at least theoretically) know exactly what’s going on.

  8. Sunny: I wonder how relevant this idea of “using” software is in a software-as-a-service world? Most of the data collection is done from phones, and you can never know what’s going on there unless you root it, which is not a solution for the general public. In other words, back to what David G said.

    Brian: I can see that’s a possible take on what Appelbaum is saying, but to this listener it still sounds more general. I guess that doesn’t matter too much either way, if the more general attitude is widely held, which I believe it is.

  9. Brian: Jacob is a smart guy, and I suspect that, were he to read this piece, he might very well scale back to that claim. I suspect that, if pressed, he might even want to back off the near-religious faith that says that somehow if the words “open” and/or “free” can be attached to something, it is inherently politically good.

    But I remain convinced that that near-religious faith is real, widespread, and dangerously misguided.

  10. Brian, I missed your final question: “what exactly would be entailed by abandoning the ‘keep computerizing everything’ dogma?”

    As far as I’m concerned, this is THE question of our day. It’s ironic that computerization advocates can claim simultaneously that they are (a) transforming everything in the social and political fabric, often in the name of “openness” or “democratization”; and that (b) decisions about whether and how such transformations should take place should be solely determined by technologists, corporate capital, and the marketplace (see: Google Glass).

    In my opinion, the very potential (and proven ability) of technology to transform so much–and here I’ll go beyond computerization–means that we must figure out a much more thorough way to ensure democratic oversight, and maybe even control, of some parts of technological change. I frankly don’t think the FDA has been terrible for the US, and to the degree that it’s been weakened by industry shenanigans, I think that’s been for the worse.

    The problem that must be confronted is that if Google asserts it is going to “organize the world’s knowledge,” and “we” in democracy say “hands off” because “social planning” would be unwelcome, then we are simply handing that social planning over to a concentrated, highly interested particular group, who have no reason to be particularly respectful of the very principles on which democracy is supposed to rest. I actually don’t think it’s better to let Google decide than to impede “progress” (which is very often much less “progressive” than its advocates insist) by letting “the people” decide. How we get there must be a matter of debate, and I don’t think the debate has been anywhere near vigorous or serious enough so far to identify the correct way to proceed.

  11. David Golumbia wrote:

    “The disturbing fact is that the very public availability of F/OSS software and architectures (and its accompanying though separate ideology of “information must be free”) means that intelligence agencies know every detail about how and where it operates.”

    This is not a bug, it is a feature. I primarily use Windows and other closed source software due to convenience, but using open source would be much more secure.

    The advantage of me (or people more technically astute than me whom I trust) being able to analyze the source code far outweighs the negative that the NSA can also audit it.

    Thank you for replying to my other question. I understand your claim, and Cory Doctorow has given some recent speeches highlighting some of the issues you are covering (I want open source encryption–I’m not so sure I want an open source self-driving car).

    I don’t think the sort of democratic control you envision is possible with software (and probably not desirable in most cases). Yes, you could kill Google, but not GitHub or Pastebin in a meaningful way.

    The computerization of all the things is inevitable at this point IMO.
