Interview with Angela Daly about The Surveillance State

Interview for current documentary project.

Angela Daly is a research fellow in media and communications law at the Swinburne Institute for Social Research in Melbourne, Australia. Her PhD thesis is on the corporate dominance of the Internet.

 

Transcript

[0:00:00] Jordan Brown: So the governments and corporations in synergy have a grip on our lives, empowered by technology, like never before. What does that look like?

[0:00:10] Angela Daly: Well I think the kind of government-corporate access looks like the same technology, or the same design, kind of having a dual function: one, a commodifying function, a capitalist function, whereby money is made and profit is gained for the companies; but it also has a secondary function whereby surveillance and control are facilitated for governments as well. Key to this is a kind of vast gathering of data about us, about people, whereby that data is very valuable to corporations because they can make money out of it, and it’s valuable for advertising, to sell on to other companies, and so on and so forth; but it is also incredibly valuable to our governments too, to have this very rich and very personal information about all of us.

[0:01:05] Jordan Brown: Can we talk more about what the corporatism—specifically looking at the Internet—what does that space look like? And I’m thinking now of how corporations have physical control over the infrastructure, they build the devices, a lot of them are rolling out the services. What does that look like? Can you talk about that generally? I’m thinking about Google and Facebook…

[0:01:47] Angela Daly: Well I suppose, I think, I’m not a historian necessarily of the Internet and of technology, but my understanding certainly is that it’s really business models—like Google’s in particular—whereby this data gathering has been key to Google succeeding as a business, and suddenly governments, and particularly the US government and its allies in other parts of the world, got very interested in this huge amount of information that was being gathered, if they weren’t interested before. So it’s hard to tell; it’s a bit of a ‘chicken and egg’ situation about what came first, but my inkling is that this business was set up and then became extremely successful, large amounts of data were gathered, and that was highly convenient for the government surveillance apparatus as well. I think also, it’s a coincidence of various events too. I mean, around the time Google was formed, this is kind of the “post-9/11 era”; there was a great kind of desire from the government’s perspective to engage in a whole lot of surveillance, and justifications that the general public were scared into accepting essentially; and also coinciding with the kind of rampant neoliberalism, the ‘end of history’ from the collapse of the Soviet Union onwards. So I’d say that all these things and more have kind of contributed to the situation that we’re in; to why corporations are doing this and why they’re being allowed to do this and also what the state interest is too, and I don’t think it’s easy to understand without understanding all these other trends going on around the same time.

[0:03:33] Jordan Brown: Sure. I was thinking then, as you were talking, about how there’s a note somewhere that I read: when Google was starting up, they got some sort of venture capital, some kind of injection of money from DARPA; and how later on, Google was found to be—and this is early on in the piece, I can’t remember what year—something called Total Information Awareness, in the States. That was one of their ‘first’ sort of efforts to pool together a whole bunch of third party data-stores and use it for profiling. They [the government] did it across a lot of different data sets—financial records, etc. Google was a player in that, and Google was doing private dealings with the NSA, selling search equipment, etc.

[0:04:33] Angela Daly: I guess for all the rhetoric around ‘small government’ and neoliberalism, that has not actually played out in practice, and certainly not in the tech sphere. Again, for all of the Silicon Valley libertarianism, there seems to be a very symbiotic relationship with certain parts of the US government. So not the kind of regulatory parts as such—these companies will recoil from the idea of more regulation of their business practices—but there has been a kind of symbiotic relationship with other parts of the government, particularly the US government, when it comes to, as you mentioned, selling surveillance apparatus or equipment, or taking money from—working in concert with—these institutions to develop new products and so on. So again, setting aside neoliberalism as an idea and as a phenomenon, there is a complex relationship—not a simple relationship—between companies, corporations, and the government; and as this has played out in the surveillance and Internet area as well, it’s not an easy relationship either.

[0:05:42] Jordan Brown: You mentioned the commercialisation of the Internet, and I want to talk about how we see the pervasiveness of targeted advertising. So maybe we can start really generally with that and then pick it apart. Could you explain: What is it? What is targeted advertising?

[0:06:12] Angela Daly: I suppose it’s advertising kind of coming up on websites that you are visiting, based on your previous browsing history or, in the context of Google, what you’ve been searching for before. So based on your web browsing history, your previous actions online, what you’re seeing supposedly is adverts tailored to you. Of course, I mean, some adverts are better tailored than others, but the idea is, rather than kind of advertising to the mass overall, to everyone—where not everyone will be interested in the same things—trying to advertise to niche interests, to niche groups of people; the idea being this will be more effective from the advertiser’s point of view.

[0:07:10] Jordan Brown: It can be insidious though.

[0:07:12] Angela Daly: Sure. I suppose inherent to targeted advertising is this surveillance and profiling of people and their habits, their online habits, and even to the point of trying to predict what they would be interested in, or not; with, you know, that archetypal example being the woman where Target knew she was pregnant before she even did.

[0:07:41] Jordan Brown: The Dad got really upset.

[0:07:43] Angela Daly: Sure. There are very difficult and unpleasant consequences to that for our privacy and integrity I guess too; our autonomy, and being in control of what we reveal and don’t to others.

[0:08:06] Jordan Brown: I’ve got a quote by Douglas Rushkoff … Digital Nation and Generation Like … picking apart [targeted advertising]—I don’t want to say subtle, but this really insidious and kind of clever way that targeted advertising assembles itself. I mean, sometimes it can be very loud and direct, but other times it’s more… I guess we’re sort of trending towards the concept of the Filter Bubble—the goal of the advertiser is to make the content indistinguishable from other content.

[0:09:04] Angela Daly: Sure. I suppose kind of behind the scenes as well, there’s a total lack of transparency over what data has been collected about people, who is using it, who’s selling it on to others and for what purposes, and I think this has even attracted the attention—in the US of all places—of the Federal Trade Commission, where I believe they’ve recently released a report on data brokers. So these are kind of very shadowy companies, usually, that are selling on to others this data which has been collected by others; and it’s very hard for an individual user to really know what is being collected about them and who is using it and what consequences that may have—particularly somewhere like the US, where that can determine things like your credit rating, health insurance, and so on and so forth.

* * *

[0:10:05] Angela Daly: In Europe, where there is stronger privacy law, data protection law, these kinds of practices are more restricted, but that isn’t to say they don’t happen. The enforcement of these laws is also a real problem, particularly when the Internet is transnational by its very nature. So even for Europeans, their data—they may have been profiled by organisations in other countries such as the US, where although in theory there are protections of privacy, and particularly of Europeans’ privacy through a whole series of agreements, in practice these protections are not well enforced; nor does anyone really know what’s going on, and that’s a real problem—let alone for those who are living in countries where there are not very strict privacy laws. Even the privacy laws that we have are not particularly well adapted to the current situation of vast amounts of data being gathered about all of us whenever we use the Internet and all those other kinds of technology.

[0:11:14] Jordan Brown: I’m thinking now about the data-brokers like ChoicePoint, Acxiom, Quantium, LexisNexis; and them being the big, well-known corporate data-brokers.

* * *

[0:11:30] Jordan Brown: This is Total Information Awareness’s logo. It’s a pyramid with an eye on the top of it… How crazy is that? So creepy…

[0:12:01] Jordan Brown: Unpacking this idea of data analytics, assembling massive amounts of data, mining through it for patterns, trying to predict things; we’re essentially talking about—in one aspect definitely—some kind of manipulation, right? Both on the individual level and on the societal level.

[0:12:26] Angela Daly: Well I think how these trends are playing out is very much in a nefarious way essentially. I personally do not support the vast gathering of data about everyone and so on and so forth, but I do accept that perhaps this kind of idea of ‘big data’ and ‘big data analytics’ may, in certain circumstances, be beneficial. So I wouldn’t necessarily rail against the vast gathering of data per se, since I can see that there are some positive uses of it. But I think we need to be very aware of who really is behind a lot of this data gathering and what the real purpose is; and the fact too that a lot of this data is very badly stored—so it’s very insecurely stored—yet may be very intimate data about individuals. So I think the critique is not necessarily of the technology itself, but really of what the uses of this technology are and for what purpose, and who’s doing it; and I think there is a lack of awareness firstly, and a lack of critique of this.

[0:13:40] Jordan Brown: Definitely. What about the fact, with ‘big data’—sure, we could argue some ‘beneficial uses,’ but at the end of the day, the government can just come in and take that anyway? And the data might be totally innocuous, but this is not the point though. Once you add it with perhaps another data set or maybe look through other patterns of seemingly innocuous data, you can still pull out a lot of richness from that, right?

[0:14:09] Angela Daly: Sure. I guess this is what I mean about, in practice, this being highly concerning given what’s happening, given what seems to be the strong government interest in gathering as much data about all of us as possible, for who knows precisely what purpose. I think as a result we should be resisting this data collection despite its potential positive elements, essentially because I don’t think the people engaging in this data gathering, nor the powers that be above them, have demonstrated that they are trustworthy enough to look at our data, and to manage our data and to use it for ‘good’ or beneficial purposes. I don’t feel that this has really been demonstrated to us and therefore I don’t find them trustworthy at all.

[0:15:00] Jordan Brown: Can I ask you if you think that’s inherent in some of the technologies though, the fact that a lot of these digital technologies create a trail regardless, which then can—and as we see—empowers the surveillance state?

[0:15:17] Angela Daly: I think this goes back to what I was saying at the beginning. I think also the design of this technology is very important (I suppose, I didn’t quite say that). I think the design of technology is very important, and particularly when you look at, for instance, Google’s business, Google’s services have been designed to gather lots of information and data about the users for, I would imagine, initially Google’s own profit-making purposes, but this also has an ancillary—a very strong ancillary—benefit to government surveillance efforts as well. I think what is particularly worrying for me is the transition to mobile devices. So ‘smartphones,’ tablets and even now kind of wearable technologies, things like ‘Fitbit,’ and so on; with which there is no option to opt out of this data gathering when using these devices. So it’s not only the kind of apps that people are using, it’s also the devices themselves. At least on a kind of traditional PC or laptop, particularly if you were off-line, you weren’t necessarily contributing to this data gathering every time you used the machine, whereas with these devices there is no choice. At least in the commercial offerings, you cannot buy a device whereby your data is not harvested at all times that you’re connected to the Internet, and probably even connected to a 3G telecoms tower as well. So I see this as being particularly worrying. Even if you want to protect your privacy in using certain kinds of devices, this is now pretty much impossible. Yet, that’s the way things are going.
And so if you want a mobile phone these days then you don’t really have the option of buying one—at least somewhere like Australia—that is a kind of ‘dumb phone’ rather than a ‘smartphone,’ where every time you use it you are able to protect yourself and your privacy a bit more perhaps… So I think the technological developments, particularly over the last 10 years, have very much been with this design, that all data, or as much data as possible, can be gathered about the user, and there is very little the user can do to kind of opt out of that.

[0:17:46] Jordan Brown: I’m thinking of that old line, “We’re all walking around with a tracking device in our pocket that just happens to make phone calls.” While you were talking, I was thinking about how the same would be true with the ‘Internet-of-Things.’ Maybe you’ve opted-out, and you’re not even carrying your tracking-device-that-just-happens-to-make-phone-calls down the street with you, but what happens when the environment is embedded with sensors, much in the same way as we have prolific CCTV everywhere? It does beg the question of ‘choice.’ I mean, even the choice of having a mobile phone these days is not really much of a choice.

[0:18:34] Angela Daly: Sure. For instance, in the West, in developed countries, opting out of online-anything is incredibly difficult when the government is delivering services online as well. It’s kind of, you have to opt out of society as a whole to opt out of this data gathering and surveillance.

[0:18:55] Jordan Brown: But that’s impossible though, right?

[0:18:57] Angela Daly: I mean, sure, yeah. Actually, I think in Australia, there’s an interesting case of an Aboriginal man who is trying to opt-out of the system entirely, not necessarily for anti-surveillance purposes, but I guess because he does not recognise the legitimacy of Australia in any respect.

[0:19:18] Jordan Brown: Yes, for sure.

[0:18:57] Angela Daly: I think there was an article in The Guardian about him recently which was really interesting, about how he is trying to live without interacting with the state apparatus in any way, and it’s incredibly difficult. So you do have these examples of people who are trying to opt out, and it’s incredibly difficult. Also, going back to technology, what you’re saying is totally right—so even people who don’t carry mobile phones, don’t have a ‘smartphone,’ Facebook and Google accounts and so on, are still being profiled, are still being tracked through others. So there’s this whole phenomenon I guess of what are called shadow profiles on Facebook; whereby, for instance, people who are not on Facebook will have photos uploaded. Facebook uses facial recognition technology—although it’s not supposed to in Europe, but who knows whether it actually does; certainly in other countries—and can identify people’s faces. So if you have friends or relatives who are not on Facebook but people are uploading photos of them, then Facebook can presumably create these kind of private profiles, let’s say, of people. So they have data gathered already about individuals who’ve not opted into the service.
Similarly, now that we all have ‘smartphones’ or tablets with cameras, to what extent these cameras are working when they’re officially ‘off,’ and to what extent images of others are being captured, is a real… for me anyway, I think that’s an interesting issue to explore. Given some of what we do know already about ‘smartphones,’ tablets, and the tracking thereof; and also what we already know about webcams actually being activated when they’re supposed to be inactive—then yeah, I kind of agree with the fact that many of us now have these tracking devices, and even for people who want to opt out of them, it’s more and more difficult to do so, particularly if you’re living in a big kind of modern city, where even if you want to opt out, everyone around you is not.

* * *

[0:21:40] Jordan Brown: Maybe you’ve covered it a little bit, but what are the implications of this working in tandem?

[0:21:59] Angela Daly: What I will say is that I think we’re at a really kind of crucial, key moment right now, in a whole lot of respects, but particularly: this apparatus is all in place, we have a kind of happy marriage between governments and big corporations in a whole wide range of spheres, technology corporations as well; the situation kind of suits them well, but not necessarily the general public. And certainly, when surveys are done about people and their views on privacy, generally people do value privacy; they would be happy to pay for kind of privacy-enhancing alternatives, but the market is not offering this. So…

[0:22:50] Jordan Brown: Sorry, I was even going to interject there. Is that even possible though with what’s already been built? I mean, some talk about encrypting their e-mails and that’s why we’ve got the spooks going head on with, “How do we intercept the private keys, so even if we sit in the middle and intercept that communication,” which is of course what they do by tapping the fibre links, or the connections between those things, even if you’re going after so-called ‘privacy enabling technologies,’ it’s like this arms-race…

[0:23:28] Angela Daly: Yeah, exactly. … I don’t think there’s any interest on behalf of corporations or the government to improve our privacy, or to at least respect our privacy rights, or to enact laws which are going to force either governments or corporations to provide more privacy protection to individuals. And we can see why, because this situation suits both parties very well, even if it’s not satisfactory for the general public as such. But also, I mean, the general public have been persuaded that this is okay, that “they should be willing to give up their privacy because of ‘terrorism’ and serious crime” and so on and so forth, even though there seems to be very little evidence that such vast data gathering and surveillance actually prevents terrorism or serious crimes from happening. Various terrorist attacks have happened in spite of this kind of vast data gathering and surveillance, but nevertheless that’s certainly the rhetoric; the justification very much is one that people are buying into to some degree because they are worried about these things and they want to feel safe, they want to feel secure, and so they have been persuaded that this is a kind-of-okay bargain to make. And I think, nevertheless, in certain countries—I think Australia in particular—when it’s come to discussions of these recent national security anti-terrorism laws and proposed mandatory data retention, the media and even opposition politicians have been too scared and weak—or even not very well informed—to oppose some of these laws and proposals that are highly intrusive of civil liberties and privacy and so on. I think that this rhetoric around security and terrorism does have to be critiqued; it can’t be a subject in society that we’re unwilling to talk about, unwilling to criticise or think about critically, let’s say.
However, the status quo at the moment is very much that these things cannot be questioned or critiqued, and I think that there are too many politicians, particularly in supposed opposition parties, that are kind of playing it too safe with this kind of thing.

[0:26:24] Jordan Brown: A few things I wanted to say there. The ‘terrorism’ drawcard is one that is used often. … One small insight I had into this was, I think a few years ago now, where Nicola Roxon—on ABC Q&A, or maybe it was just in a newspaper article—said something like climate activists are in the same league as terrorists because they pose threats to critical infrastructure. And that blew me away. Because what they’re doing is lumping legitimate political activists into the same league as terrorists, which justifies the use of these vast spook apparatuses against legitimate political expression. That’s massive, right?

[0:27:29] Angela Daly: Well I think, certainly from the UK as well, we can see people labelled as kind of ‘domestic extremists.’ I remember I think we had Green party politicians labelled as ‘domestic extremists’ in the UK. So precisely what being a ‘terrorist’ or an ‘extremist’ means seems to be given a very wide definition, and includes what seems to be legitimate thought, opinion, and political activity as well. And so again I think this is a good reason to be very critical of increased powers for the police and other law enforcement agencies on the basis of preventing ‘terrorism,’ given that this is given a very wide definition. Also, I suppose from the European perspective, we’re at least five if not more years into austerity, financial crisis, and so on and so forth; a lot of very unhappy people and a lot of real social issues in Europe at the moment; and so a lot for people to be very unhappy about and very critical of current political orders, and corporate orders for that matter as well. And when terrorism or extremism is defined so broadly, we may even be kind of contributing to not very good politicians and corporations—such as banks, for instance—remaining in their position in society. So I think there is a kind of protectionism from people who have political or corporate power, and this is one of the ways that they are protecting themselves too: having laws which seem to make certain kinds of political expression difficult if not illegal.

[0:29:20] Jordan Brown: And the vast surveillance state at their disposal to keep that in check.

[0:29:24] Angela Daly: Sure.

[0:29:27] Jordan Brown: That’s crazy. Just thinking on that point, I printed out some news articles—and this is so not extreme—how people that were planning some street theatre when the royal wedding was on, that were rounded up pre-emptively, they hadn’t even done anything yet, they didn’t get to do their action which would’ve been deemed a ‘breach of the peace’ or whatever bullshit; they didn’t do it. This was rounding them up the night before, because they were planning to do something—which was ‘conspiring’ to ‘breach the peace.’ And the way that this was reported, there was an interview with the woman who dressed up as a zombie for the piece of street theatre, she was saying the way the police were questioning her, they would’ve had to have had access to her social media and her emails, because they were asking her about very specific things…

[0:30:24] Angela Daly: Once we start kind of abrogating civil liberties for terrorism or ‘political extremism’ then it’s kind of difficult to stop that. Or, once we start giving up our rights and liberties, it’s difficult to stem that flow. And I think this is what we’re beginning to see—certainly in countries like the UK and the US, and I think other parts of the world as well; that the justifications for becoming more intrusive of rights and liberties begin to proliferate. For instance, in Australia, you now have I think a law in Tasmania preventing certain kinds of environmental protest or secondary boycotts. … Once you start down this path then it can be difficult to turn back and stop, even if so-called threats, or the original threats may not be as threatening anymore, new threats seem to be found or are manufactured.

[0:31:31] Jordan Brown: Yes. It’s like a ‘creeping normalcy,’ slow changes—and they’re not even slow now—changes over time, “We opt into this one inch at a time.” I’m thinking: Is the same true with technology? For instance, once you roll out CCTV everywhere, it’s difficult to take that back, right?

[0:31:51] Angela Daly: Well, at least when it comes to things like, let’s say—I suppose it’s a term which is kind of used in a very specific way, but let’s say: if the Internet infrastructure, either at the kind-of network level or the over-the-top services level, is set up with a particular design, it becomes difficult to change that fundamental design. So the fundamental design is one where data is intercepted and gathered, or even created in the first place, and documented. Then if that’s been the way things have been going for a while, there is so much—secondary services, infrastructure, and so on—which is based on that happening. So it can be difficult to shift. It would be a big paradigm shift I suppose to make such changes. Not impossible, but certainly a big difference.

[0:32:46] Jordan Brown: Yeah. That’s huge too. Because it means not only changing the technological paradigm, but also the social and cultural and political—heaps of things.

[0:33:01] Angela Daly: I suppose it’s ironic as well, given that the Internet was set up on a decentralised basis, given that decentralised networks are more resilient in many ways, or so I’m told anyway; at least if one part of the network sort of stops working or malfunctions in some way, the rest of the network can be okay. But I think, aside from some of these other socio-cultural and economic changes, we’re also seeing a kind of centralisation of the network—of the infrastructure and services—in many ways, which is very different to the Internet’s initial design.

[0:33:37] Jordan Brown: How did that happen? I’m interested in: How does corporatism and the co-opting of that decentralisation, turn into ‘180-opposite,’ total centralisation?

[0:33:52] Angela Daly: Well I suppose certainly from the corporate or economic perspective, to some extent businesses don’t want to be regulated, and certainly big businesses don’t want to be regulated … Corporations don’t want to be regulated; they’ll often make a big song and dance about not being regulated. However, for big corporations—even if that’s the rhetoric they employ—actually, regulation is something that they can afford that their smaller competitors may not be able to afford, and so this kind of centralisation perhaps is an effect of certain regulation being brought in. So things like data retention, for instance—that actually does impose, or is likely to impose, a cost on businesses based on what’s happened in Europe, and certainly here, that’s very much part of the discussion at the moment. The big businesses like Telstra will be able to bear that cost; however, Telstra’s smaller competitors may not be able to bear it. So even if large companies may argue against such regulation based on cost, in many ways it may be beneficial to them—or if they’re playing a longer game, it may be beneficial to them economically as well—in that it will take out smaller competitors and make it more difficult to compete with them, and they will continue to have a large market share. So basically this kind of regulation can be a barrier to entry for potential competitors, and therefore is, from a longer-term perspective, in the interests of existing big corporations. I don’t know if that answers your question…

[0:35:43] Jordan Brown: Sort of. So you were saying that deregulation led to the centralisation of the decentralised model…

[0:35:55] Angela Daly: I guess so. Also, I suppose there’s network effects as well. So, despite the network infrastructure of the Internet being highly decentralised, the over-the-top services, particularly “Web 2.0” services, have been very much based on networks, and kind of social networks in particular; so, for instance, Facebook is the biggest social network, even though it’s been declining in certain respects and maybe doesn’t have as many members or new members as it once did, but still, it’s kind of, I don’t know, got a billion people connected? Something like this?

[0:36:32] Jordan Brown: Yeah, I should’ve looked that up, sorry.

[0:36:34] Angela Daly: So obviously if you kind of join Facebook, you join Facebook because you know a whole lot of people on there already… It’s a big switching cost to switch to another platform where you don’t know anyone, or where you have few friends already online. So that also is a kind of phenomenon that we see on the Internet which promotes some kind of centralisation in certain services; or, for instance, using Skype as a voice-over-IP service. There are other alternatives to Skype, but if all your friends and family are using Skype, then that’s what you’re going to use. There’s also, arguably, a lack of interoperability between different services as well. So I think this promotes centralisation at that point.

[0:37:20] Jordan Brown: And by design too. Because they “don’t want you to go.”

[0:37:23] Angela Daly: Sure. Exactly. So this kind of lack of interoperability certainly is something which is by design because that kind of protects that company against competitors in the market as well.

* * *

[0:37:41] Jordan Brown: What are your concerns with surveillance—but let’s unpack that with: What are we dealing with already, right now? What is that? What’s happening? What’s it look like?

[0:37:49] Angela Daly: Okay, so I think right now, I would say particularly in the English-speaking—or the ‘five eyes’—countries, so the US, UK, Australia, New Zealand, Canada, we’re dealing with a vast surveillance and data gathering apparatus which has been semi-secret for a long time and has only come to light really as a result of various whistleblowers—prominently Edward Snowden, but not only Edward Snowden; he’s one of a line of people who have blown the whistle on what our governments and law-enforcement agencies have been engaging in—stuff which has been very untransparent. So there’s not been very upfront discussion of this kind of surveillance apparatus…

[0:38:40] Jordan Brown: Even actively hidden—they lie about it, and have lied about it.

[0:38:44] Angela Daly: Sure. There’s been a total lack of transparency, and it seems that our governments, law-enforcement agencies, and so on, have only started to become more transparent when they’ve been forced to be, so in response to various leaks and whistleblowing and so on. So certainly, given that these countries profess themselves to be ‘liberal democracies,’ these kinds of activities ought to be subject to scrutiny—or at least knowledge—by the people, if not also by our representatives in parliament, who seem not to have been particularly aware of these developments either. I think it is highly concerning that we seem to have parts of our governments and administrations that have arguably gone a bit rogue from a democratic perspective. And the justification of the moment anyway is the ‘War on Terror,’ and how that has kind of morphed over the last, I suppose, 13-odd years since 9/11 essentially. But of course there’s a long kind of history of surveillance and the threat-of-the-moment too. In the UK it used to be Irish-republican terrorism, now it’s Islamist terrorism. It’s almost even ‘environmental terrorism’ or ‘extremism’ or leftism even…

[0:40:28] Jordan Brown: Yeah, that’s actually a thing: ‘Eco-terrorists.’ They call it that.

[0:40:32] Angela Daly: Yeah, so, I think this does have to be put in a historical perspective as well, that we’re not living in necessarily… All times are exceptional times, and there are threats that are used to justify invasions of civil liberties and rights, and the current justification is kind of the ‘War on Terror.’

TIME CHANGE FROM HERE [0:40:59] Angela Daly: Yeah and if you start protesting against it… Well, what I want to say is that I think it kind of has to be put in a broader historical perspective. Certainly post-World War II, there’s been…this is kind of the latest in a line of…well I suppose developments in surveillance but also developments in kind of military and law enforcement, spy agency cooperation that’s been going on for decades.

[0:43:43] Jordan Brown: And the collusion with the corporate interest in that?

[0:43:47] Angela Daly: Sure.

[0:43:50] Jordan Brown: So, what are your concerns, given what we’ve just talked about?

[0:43:55] Angela Daly: Well I suppose my big concern is the invasion of ordinary citizens’ privacy as a result of…and not just privacy, other rights and liberties as well, but privacy I guess is the big and most obvious one here. So I think that’s highly concerning particularly, I mean, the vast majority of people are not doing anything wrong at all, not that I really think surveillance is kind of justified, kind of… Sorry, what I want to say is: the vast majority of people are subject to this kind of data gathering if not actual surveillance even if they’re not doing anything wrong, or have never been involved in crime, even petty crime, and so it’s totally disproportionate that the rights of normal people who are not engaging in anything nefarious or bad, let alone terrorism are being infringed and impinged upon by this kind of surveillance and data gathering apparatus.

[0:45:02] Jordan Brown: So why should we care then?

[0:45:05] Angela Daly: Well aside I suppose from sort of theoretical or philosophical reasons… What I want to say: aside from the fact our rights are being infringed, aside from the rhetoric around that, there are some practical consequences as well. So firstly, this vast amount of data about all of us is not being stored in a very secure fashion. There have been some high-profile hacks of data stored about individuals this year; earlier this year, I think, there was a leak of data about asylum seekers—so people in a very vulnerable position in various ways who, at least, have argued that they were escaping torture in their own countries, so if they were sent back to those countries then a whole lot of information would be known about them, or could be known about them, as a result of this leak. So that’s one particularly bad example. But if this information were stored securely and only subject to access by very limited people, then perhaps it would be more justifiable. That doesn’t seem to be the case. And even Edward Snowden himself was a government contractor, wasn’t even working directly for the US government. I believe, what he said anyway, was that a huge number of people—thousands and thousands of people—had access to huge amounts of data about people not just in the US but all over the world. And so I think there is very little in the way of security, there is little in the way of checks and balances, and also, I mean, what is this data being used for, or what could it be used for? Once the data is there, it’s difficult to get rid of. There are various ways, technically speaking anyway, but it’s also valuable to the government and also corporations in various ways as well.
So, I suppose it kind of remains to be seen what precisely is going to be done with this over the course of our lifetimes, between the generations that are growing up now who may be photographed—photos of them are put on Facebook from when they’re babies—so what’s going to be done with all of this information? It may be used in ways that are not particularly equitable or democratic, for instance, and that I think is a big worry. And this is all very unnecessary I suppose as well; it’s not necessary that all this information is gathered about all of us, particularly when it happens outside of our control, or where we have only limited control over it. And so I think these are kind of concerns about why this data gathering is at least suspect, if not bad in itself.

[0:47:56] Jordan Brown: I’m thinking now about the prospect of, and even William Binney said this, how the real power in collecting all of this data is the retrospective analysis. So yeah, sure, even if you aren’t doing anything ‘bad’ now, with generations growing up now and that’s all fine, at some point in their lives, say 20 years down the line, or whatever, you know, you can pull up their entire life. And to me, that’s really chilling. Because the implications for that are huge, right? It means you’re not a target now, and you might not be, but you could be at any given point…

[0:48:45] Angela Daly: And I suppose another thing to say too, is that this data may not all be very accurate either. So if decisions are being made based on this, and potentially this alone, that’s also worrying, because it may not actually be an accurate reflection of what’s going on. I mean, ‘big data’—which itself is a very vague concept—there have been criticisms of ‘big data’ techniques, or at least of only big data techniques being used to aid decision-making, because it paints a certain picture of people and reality but not, arguably, the full picture. And so even in academic research, sort of qualitative techniques or ethnographic techniques are still very important to build a full picture of what’s going on in a particular area. So I think there’s also that issue: the data may not be accurate, and even if it is accurate, it only paints a certain picture, not necessarily the full picture, and we should be cautious about basing too much decision-making on it, whether it’s from a law enforcement or national security perspective or even from a kind of corporate decision-making perspective. So, who to hire or fire as employees, decisions by health insurance providers, and so on and so forth.

[0:50:13] Jordan Brown: ‘Big Data’ is something I’m really suspicious of, one because of the reasons that you’ve mentioned, but also because of the cultural thing—that people sort of believe the computer rather than their own experience, or the real world.

[0:50:25] Angela Daly: Well I think there’s a huge amount of rhetoric from Silicon Valley, kind of, very utopian—I mean there’s a whole lot of utopian rhetoric from there anyway—but there is this kind of messianic discussion of big data as being the kind of solution to all problems: so “the more data we have, the less problems we’ll have.” But it’s very…of course there are certain reasons why these arguments are made, I mean it sells certain products, there’s that kind of self-interested reason why we’re seeing some of these arguments coming out of Silicon Valley. But also I think there’s a kind of cultural side too whereby the problems of society are much more complex than can just be solved by more data. And I think there’s kind of a wish not to engage perhaps with some of these complexities which…because engaging with some of these complexities may present rather unpalatable truths to Silicon Valley, for instance, and particularly I’m thinking around inequality, social problems, health and so on and so forth; that’s not just going to be solved by, I don’t know, people having wearable devices that tell them to walk a bit more. The causes of social problems and bad health are much more complex and more kind of overarching as well rather than just kind of the fault of the individual let’s say. But I think that some of this is somewhat lost in discussions of ‘big data’ too.

[0:51:59] Jordan Brown: Yes. Maybe I should ask about the techno-utopianism now…we kind of just did. But what are your thoughts about the prolific optimism of this culture, the deus ex machina if you like, that specific to technology, how the problems that technology causes, the sort of assumption that if you ‘throw more technology at it, we can fix some of these problems,’ social issues and things as well…

[0:52:39] Angela Daly: What I will say is that I think it’s been an interesting 10 years or so. I mean, the Internet in particular has kind of matured as a technology; it’s been publicly available to the masses for a while now—we’re kind of in the second decade. So it’s interesting to see and reflect back on what’s actually happened. I think there have been kind of liberationary aspects of the technology, but of course it doesn’t exist in a vacuum from the rest of society; there are other things that have been going on in different societies throughout the world which have interacted with these technological developments. So I guess I’m thinking about things like the Arab Spring, the so-called ‘Twitter revolutions’ and so on and so forth. There’s a huge amount of commentary on them, both contemporaneous and some years later, where with the benefit of hindsight we can see that, yes, the technology was important but it wasn’t the only determining factor there. Nevertheless, I think that the huge amounts of data gathered, and the surveillance apparatus which supports and facilitates it, do make me somewhat pessimistic about the liberationary possibilities for this technology going forward. I’m more likely to sympathise with the techno-dystopianism of someone like Evgeny Morozov now than perhaps I was 10 years ago.
Nevertheless, I think we still see at the edges of the Internet—which I mean in a kind of non-technical way, so if we think the mainstream of the Internet is mediated by big players like Google and Facebook and Microsoft and so on—things like interesting Peer-to-Peer activities; people trying to opt out of, or to decentralise, what is becoming a more centralised phenomenon, through, you know, things like mesh networks, and also through anonymising techniques—cryptography and so on. Not to say that these are all perfect, or that they all facilitate freedom however conceived; there’s plenty of contestation, not least by various governments trying to break cryptographic techniques as well. But I think you can still see some kind of lawlessness—not sure whether that’s a good thing or not—at the darker parts of the Internet. And so, just as before the Internet, the law is not enforced absolutely in its entirety in every situation, and we see that with the Internet as well. Despite this apparatus, despite centralising tendencies, we’re seeing kind of dissent…and deviation, some of which is good, some of which is not so good. That’s still happening, but perhaps the technology overall is not going to be as ‘Earth-shattering’…

* * *

[0:57:12] Angela Daly: I think that’s a bit dystopian: that we are…our autonomy is eroded somewhat by all of this…

[0:57:26] Jordan Brown: I think our autonomy is eroded by all of this, and the possibility for [real social change]. What I’m trying to get at is the implications of, ‘Okay, we realise that this space is trending badly,’ and if we want to try and do something about that, then we have this big machine against us, right? Waiting to pick us out, to sabotage that event, or round us up before it even happens—as we see…

[0:58:01] Angela Daly: Perhaps I should say that I’m not just techno-dystopian, I’m a bit dystopian in general. I think some of, like I was trying to emphasise before, these trends in technology are not divorced from what’s going on in society overall at all and so there are other dystopian aspects of society at the moment, whether it’s kind of the economic system, whether it’s environmental, whether it’s social and so on and so forth; and certainly, I certainly don’t think what’s happening in technology is happening in isolation from these other trends too.

[0:58:47] Jordan Brown: So while we’re being critical then, does the technology we’ve discussed so far, does it empower all of us—like it claims to—or does it only empower a select few that we’ve been talking about, at the expense of the many?

[0:59:26] Angela Daly: I think there’s a complex answer to this question. I mean it’s undeniable that the many-to-many communication that’s facilitated by the Internet I think is liberationary compared to the one-to-many or one-to-one communications models that we had prior to the Internet, I’m thinking of television or broadcast media, or print media…

[0:59:49] Jordan Brown: Well that would be one-to-many.

[0:59:51] Angela Daly: Yes, but with the Internet, you could have many-to-many communication. Anyway, so I think there definitely have been liberationary aspects of the Internet and other kind of new technologies as well, but we shouldn’t get carried away with that, and I think, increasingly, we’re seeing that yes, it’s liberating, but only up to a certain point, and that point really is this kind of data-gathering, privacy-infringing surveillance apparatus which underpins the Internet as we use it as well.

[1:00:28] Jordan Brown: Or even just the facilitating of more obedient, happy consumers. If we’re talking about targeted advertising…

* * *

[1:04:12] Jordan Brown: What about, turning to the point of inequality then? Sort of the “techno-haves” versus the “techno-have-nots,” that this experience isn’t for everyone, but we’re sort of creating a subset of people—people that are in the online environment and are having this completely separate experience to people that aren’t.

[1:04:34] Angela Daly: Sure, and it even depends on the kind of Internet access that you have as well. So, certainly in kind of Northern Europe anyway there is pretty good access—pretty fast and not too expensive—but obviously not for everyone either; not everyone can afford that. And even more so in Australia, where Internet access is not as cheap or as good as in a whole lot of other, even comparable, countries, despite the ongoing NBN and so on. So I think that particularly the Silicon Valley set are very much in a bubble whereby, you know, there’s very fast Internet, people have got enough money to pay for it, and that’s kind of how they view the world, whereas that isn’t actually how the world is—a lot of people who do access the Internet in the world are accessing it on mobile phones, over 3G networks, which are not particularly fast and are quite expensive. So when we talk about (going to a slightly different area) online education or ‘MOOCs’ taking over the place of universities, for instance—maybe to a small extent, but that was actually a whole lot of hype which seems to have collapsed. I mean, universities have started offering online courses, but I think it’s unlikely that everyone’s going to go online, because you actually need a good enough Internet connection to participate in online courses and access online resources.

[1:06:06] Jordan Brown: So when someone like Tim Berners-Lee says, “The web is humanity connected,” we’re sort of talking about the myths—on the one hand they’re like, “The web or Internet access is like the great levelling.”

[1:06:24] Angela Daly: Oh yeah, well the levelling hasn’t happened I think via the Internet, and again this is where I think we have to move away from techno-utopianism or even techno-dystopianism and see the Internet and technology as a whole as very much part of what’s going on in other areas too. So, inequality still exists despite the Internet; other problems still exist as well. Arguably, some of these problems are actually exacerbated by the Internet or by not having access—the haves and the have-nots, the digital divide. ‘The digital divide’ was a term used a lot 10 years ago, a bit out of favour now, but there are new divides that open up whereby many people have access to the Internet but only via a library, or only when they’re at university or school or work, and they don’t have access at home. So they may not be able to participate in the same way as someone who has got access everywhere, or people who only have access via their mobile phones, versus people with access via other devices. So…

[1:07:35] Jordan Brown: I’m thinking now about the implications of that though—and this is another big idea too—if we’re defining what it means to bring our privileged ‘equality’ to everyone that looks like the complete corporately dominated Internet that we know and understand…

[1:08:07] Angela Daly: Well I suppose this is kind of happening already. So there are schemes like Internet.org whereby in certain countries, particularly kind of emerging economies or developing countries, large Internet corporations—notably Facebook; I think ‘Facebook Zero’ is what the service used to be called—let you get access to certain services for free on your mobile phone, so you’re not spending your data allowance on this, but you’re restricted to a kind of walled garden of a few, usually big, companies.

[1:08:45] Jordan Brown: Yes. That’s my concern too. I mean, we mentioned ‘mesh networks’ a couple of steps back there, and I was thinking about how the ‘Occupy’ movement was really keen to get away from corporately controlled access, the gateways to the Internet, but the irony is that they used their ‘mesh networks,’ which were arguably very clever, to get on Facebook. So, to bring this analogous with the point I’m trying to make about how I think this ‘great levelling’ is a myth, is that we’re just doing the same thing in the so-called ‘third world,’ right? I mean, it’s like, “Come and join the Internet so you can all get on Facebook.” I don’t know if you want to comment on that?

[1:09:29] Angela Daly: I think what is interesting though is…I think it’s this kind of device maybe, this kind of invention from Kenya—people talk about Kenya being sort of the ‘Silicon Valley of Africa’—but nevertheless, it’s certainly outside of the Western global north, and there’s been this interesting device which has been developed there, which is specifically for areas where there are not good connections, so good mobile phone connections, and apparently this device can facilitate Internet access in very remote places. But this is something unlikely to be developed in somewhere like Silicon Valley because there’s no need for that…

[1:10:22] Jordan Brown: Well, Google Loon, I think it is called—where they put routers up in balloons or something crazy, and they have them hovering over the place… But this is my point though: it’s like, “This is so you can come and use Google; welcome to our walled garden.” And I don’t know if I’m being too gritty in my sour distaste of all these things, but the point I’m trying to make is about what I see as a trend—which also underlines a lot of the social and political issues that we’re talking about, how that’s peripheral to technology too—that social issues in the context of the time inform the technology. The point I’m trying to get at, by bringing those ideas together, is how Lewis Mumford talked a lot about ‘authoritarian technics.’ The example I’ve got here is: How would you provide your own access to the Internet without being beholden to the corporate gatekeepers? And then there’s the infrastructure it all runs on—even the root DNS servers, the nameservers that resolve everyone’s domain names, that’s a corporately controlled service; we’re talking about fibre-optic links; we’re talking about wired and wireless infrastructure—that’s all corporately owned. So my point is almost—and this may be a big idea too—that it seems to me that, a couple of layers deep behind this technological progression, these technologies can never be democratic, because it doesn’t fit into the way they’re capable of being designed. Like the military—that [the Internet] started out as a military invention. That’s why we start with the A-B-C. So we’ve got the three power brokers, and they run the show. Everyone else sits around reacting.

[1:12:30] Angela Daly: Sure, but I think people sit around reacting but also subverting; I mean, sometimes politically, sometimes not. And I mean, there’s plenty…everything pretty much ever that’s been invented has been used for purposes which have not been the idea of the person who created it, either in good ways or bad ways. So I think there’s still…we have to kind of believe in the power of our imaginations and creativity to kind of either subvert or reuse…

* * *

[1:10:22] Jordan Brown: Well I was just about to interrupt and say that I’m really critical of that too—because one example I’ve heard before, an old example, how people subverted SMS which was a missed call service that was devised by the corporate power.

[1:13:27] Angela Daly: And in fact, SMS, they never thought it would take off, at all…

[1:13:32] Jordan Brown: Exactly, yeah, so what I’m trying to say is that even if you subvert the technology and try and use it back against itself, A-B-C is waiting there to co-opt it back in to the way they have power and control over the technology.

[1:13:47] Angela Daly: Well I suppose, this is also a little bit out of my area because I’m not really a social theorist…

(Change battery)

* * *

[1:17:15] Jordan Brown: We’ve got all these laws being built, being proposed or coming into effect, which have serious political and human rights implications—so Internet censorship regimes in so-called democracies such as the UK, Australia; laws that make it illegal for journalists and the public alike to talk about what we’re talking about—what the intelligence agencies are doing in secret to keep things secret, this sort of rapacious…

* * *

[1:18:26] Jordan Brown: I was just going to say, what are some of these laws—in broad overview—and what picture does that paint? What sort of world is being built? I guess with the fact that there’s this need to open up and have a bit of transparency, but we’re seeing the cramping down on that and the ramping up of secrecy.

[1:18:46] Angela Daly: Sure. I have to say, I know you don’t want to talk just about Australia, but I think Australia kind of shows…I think when it comes to particularly the last year or so, since the current Abbott government came into power, we’ve seen a whole lot of laws being passed and measures being taken which seem to silence dissent in various ways—and not just at the federal level, but also for instance in Tasmania, as I was talking about, these kind of anti-protest laws. So there seem to be a lot of laws and measures coming into effect which are very damaging to rights and liberties. And Australia is also a particularly bad example given that there isn’t a lot in the way of constitutional rights for Australian citizens. Australia is really unusual in this respect: in, I believe, New Zealand and Canada, definitely the US, and in the UK with the Human Rights Act, there are actually ways of challenging laws—and other measures—which do not respect the human rights of citizens, and in some cases noncitizens as well. But in Australia, I mean, it’s really hard to challenge laws if there’s no actual right…there’s no Bill of Rights here. At least when it comes to free expression, there is an implied right to political speech, implied into the constitution, but there is no constitutional right to privacy or constitutional protection against searches and seizures by the government, as it’s framed in some countries like the US. And so I think this puts Australians in a particularly weak position, and the Australian government is really able to do a lot more, legally speaking, constitutionally speaking, than even in some similar countries.

[1:20:49] Jordan Brown: But that’s crazy, right? I mean, it has huge implications.

[1:20:51] Angela Daly: Yeah and I mean, I’m obviously not from here, but I was told that…I’ve not been here for a very long time, but I was told that there have been debates in the last 10 or 20 years about a Bill of Rights and people, the discussion was very much that, you know, “Parliament is enough to protect us.”

[1:21:07] Jordan Brown: I think that’s bullshit though. I have a friend of mine who does a series of protests against advertising in public space, and in Victoria, just to do this quickly, the state of Victoria is apparently meant to have this charter of human rights and responsibilities which is reflective of the UN charter, and even the courts go out of their way to reinterpret the charter—it says in the first bit that, ‘this only applies to humans,’ but through his particular case, there’s been this really rapacious and consistent ruling to protect property rights [of corporations] over political expression. So, I’d even argue that if there were legal structures in place, there’s still the matter of having the courts get on side.

[1:22:01] Angela Daly: Exactly. Actually, for instance, there’s this ongoing case in the UK with regards to the Tempora program, one of these surveillance programs that was revealed by the Snowden leaks, or the Snowden whistleblowing. There’s just been a decision at first instance, which I think is going to be appealed, but at first instance the judge—or it was maybe even a tribunal, but anyway, the judge or equivalent—found that this was probably legal, or at least that there was a good enough argument it was legal. So if the judges are very deferential towards the administrative powers of the government, which often happens in ‘national security’ cases, then that is very problematic. So even with the rights as they exist—and particularly in the European sphere there are legitimate reasons to infringe rights; they’re usually listed for each of the rights in the European Convention on Human Rights—it nevertheless still is very much dependent on judges’ interpretation. And certainly when confronted with a kind of ‘national security’ justification for infringing or being invasive of certain human rights, particularly privacy, judges seem to be very deferential to that justification, to the point that individual rights may not be well protected. And I think there is a real kind of socio-legal or critical legal exercise to be done in looking at why judges are deciding cases in this way, because certainly in Europe we’ve seen that the Court of Justice of the European Union seems to be more proactive in protecting rights. It ruled earlier this year that the European data retention directive was invalid, in part because of the interference it entailed with the privacy rights and data protection rights of all European citizens basically, or everyone living in the EU.
And it’s a huge judgement, a very interesting judgement as well, but it’s interesting why that would come from Europe rather than, for instance, a domestic UK judge, and I actually think it would be very unlikely and that a judge in the UK would kind of come to a similar conclusion because of this deference to the executive power.

[1:24:40] Jordan Brown: And they’re beholden to that though, yeah? The European court?

[1:24:44] Angela Daly: Yeah, exactly. So this judgement kind of trumps what’s happening in the UK, and that’s another long discussion that can be had…

[1:24:55] Jordan Brown: It reminds me of, sort of, how international treaties or international agreements, or the peripheral—or even the direct effects, forget that—of globalisation. So how domestically, “We don’t want to do some thing,” but, “Screw that, there’s some international agreement that requires you to do this.” I’m thinking of the WTO or the World Health Organisation… But what I really wanted you to do though, was, in a very general way, can you pull all those ideas together? There’s big questions around the functioning of the legal system; we’ve built this sort of crazy surveillance society; things need to change—they’re all really huge, how do you pull them all together? What does it look like?

[1:25:54] Angela Daly: I think it looks depressing. Honestly, I think it’s a depressing picture. But I don’t know whether this is a picture too divorced from what’s going on outside the kind of technical sphere or in other aspects of life as well. I don’t know if we’re yet at a tipping point where things might change, but I think there’s increasingly a kind of interconnectedness of what’s happening here with what’s happening in other areas as well. So, kind of as you mentioned, the fact that kind of surveillance and these anti-terrorism powers are used against environmental protesters sometimes; the vast power of corporations in all aspects of life, so not just kind of the Internet sphere, but also when it comes to opposing regulation in other aspects of…in other spheres as well. I think this certainly cannot be seen in isolation from the other trends, which are I think depressing. I mean, I don’t think there’s a lot to be hopeful about with the current political or corporate set-up that we have at the moment. But maybe, you know, this is just part of the picture that change needs to happen across the whole.

* * *

END OF TRANSCRIPT