Angela Daly

Jordan Brown interviews Angela Daly about the corporatisation of the Internet and its implications for community, power in society, activism and democracy.

angeladaly.com

Angela Daly is a research fellow in media and communications law at the Swinburne Institute for Social Research in Melbourne, Australia. She is also a digital rights advocate. Her PhD thesis is on the corporate dominance of the Internet.

Jordan Brown: Governments and corporations in synergy have a grip on our lives, empowered by technology, like never before. What does that look like?

Angela Daly: Well I think the kind of government-corporate relationship looks like the same technology, or the same design, having a dual function: one, a commodifying function, a capitalist function, whereby money is made and profit is gained for the companies; but also a second function whereby surveillance and control are facilitated for governments as well. Key to this is a kind of vast gathering of data about us, about people, where that data is very valuable to corporations because they can make money out of it; it’s valuable for advertising, to sell on to other companies, and so on and so forth; but it is also incredibly valuable to our governments too, to have this very rich and very personal information about all of us.

Jordan Brown: Can we talk more about that corporatism, specifically looking at the Internet? I’m thinking of how corporations have physical control over the infrastructure, they build the devices, etc. Can you talk about that generally? For example, Google and Facebook.

Angela Daly: Well I suppose, I think, I’m not a historian necessarily of the Internet and of technology, but my understanding certainly is that it’s really business models—like Google’s in particular—whereby this data gathering has been key to Google succeeding as a business, and suddenly governments, particularly the US government and its allies in other parts of the world, got very interested in this huge amount of information that was being gathered, if they weren’t interested before. So it’s hard to tell; it’s a bit of a ‘chicken and egg’ situation about what came first, but my inkling is that this business was set up and then became extremely successful, large amounts of data were gathered, and that was highly convenient for the government surveillance apparatus as well. I think it’s also a coincidence of various events too. I mean, around the time Google was formed, this is the ‘post-9/11’ era; there was a great desire from the government’s perspective to engage in a whole lot of surveillance, and justifications that the general public were scared into accepting essentially; and also coinciding with the kind of rampant neoliberalism of the ‘end of history’, from the collapse of the Soviet Union onwards. So I’d say that all these things and more have contributed to the situation that we’re in; to why corporations are doing this and why they’re being allowed to do this, and also what the state interest is too, and I don’t think it’s easy to understand without understanding all these other trends going on around the same time.

Jordan Brown: When Google was starting up, they got capital from the Pentagon, and later, there was a government project called “Total Information Awareness” where Google searches were instrumental for surveillance. Google was also doing private dealings with the NSA, selling search equipment, etc.

Angela Daly: I guess for all the rhetoric around ‘small government’ and neoliberalism, that has not actually played out in practice, and certainly not in the tech sphere. Again, for all of the Silicon Valley libertarianism, there seems to be a very symbiotic relationship with certain parts of the US government. So not the regulatory parts as such, where these companies will recoil from the idea of more regulation of their business practices, but there has been a kind of symbiotic relationship with other parts of the government, particularly the US government, when it comes to, as you mentioned, selling surveillance apparatus or equipment, or taking money from these institutions, or working in concert with them to develop new products and so on. So again, setting aside neoliberalism as an idea, as a phenomenon there is a complex relationship, not a simple relationship, between companies, corporations, and the government; and as this has played out in the surveillance and Internet area as well, it’s not an easy relationship either.

Jordan Brown: Can we talk about how we see the pervasiveness of targeted advertising?

Angela Daly: I suppose it’s advertising coming up on websites that you are visiting, based on your previous browsing history or, in the context of Google, what you’ve been searching for before. So based on your web browsing history, your previous actions online, what you’re seeing supposedly is adverts tailored to you. Of course, some adverts are better tailored than others, but the idea is that rather than advertising to the mass overall, to everyone, where not everyone will be interested in the same things, you advertise to niche interests, to niche groups of people; and the idea being this will be more effective from the advertiser’s point of view.

Jordan Brown: It can be insidious though.

Angela Daly: Sure. I suppose inherent to targeted advertising is this surveillance and profiling of people and their habits, their online habits, even to the point of trying to predict what they would be interested in, or not; with, you know, that archetypal example being the woman where Target knew she was pregnant before her father did.

Jordan Brown: The Dad got really upset.

Angela Daly: Sure. There are very difficult and unpleasant consequences to that for our privacy and integrity I guess too; our autonomy, and being in control of what we reveal and don’t reveal to others.

Jordan Brown: We’re trending towards the concept of the Filter Bubble, where the goal of the advertiser is to make the content indistinguishable from other content.

Angela Daly: Sure. I suppose behind the scenes as well, there’s a total lack of transparency over what data has been collected about people, who is using it, who’s selling it on to others and for what purposes, and I think this has even attracted the attention—in the US of all places—of the Federal Trade Commission, where I believe they’ve recently released a report on data brokers. These are usually very shadowy companies that are selling on to others data which has been collected by others; and it’s very hard for an individual user to really know what is being collected about them, who is using it and what consequences that may have—particularly somewhere like the US, where that can determine things like your credit rating, health insurance, and so on and so forth.

* * *

In Europe, where there is stronger privacy law and data protection law, these kinds of practices are more restricted, but that isn’t to say they don’t happen. The enforcement of these laws is also a real problem, particularly when the Internet is transnational by its very nature. So even for Europeans, their data may have been gathered and they may have been profiled by organisations in other countries such as the US, where, although in theory there are protections of privacy, and particularly of Europeans’ privacy, through a whole series of agreements, in practice these protections are not well enforced; nor does anyone really know what’s going on, and that’s a real problem, let alone for those who are living in countries where there are not very strict privacy laws. Even the privacy laws that we have are not particularly well adapted to the current situation of vast amounts of data being gathered about all of us whenever we use the Internet and all those other kinds of technology.

Jordan Brown: I’m thinking now about the data brokers like ChoicePoint, Acxiom, Quantium, LexisNexis. They’re the big, well-known corporate data brokers. Unpacking this idea of data analytics, assembling massive amounts of data, mining through it for patterns, trying to predict things; we’re essentially talking about manipulation, right? Both on the individual level and on the societal level?

Angela Daly: Well I think how these trends are playing out is very much in a nefarious way essentially. I personally do not support the vast gathering of data about everyone and so on and so forth, but I do accept that perhaps this idea of ‘big data’ and ‘big data analytics’ may, in certain circumstances, be beneficial. So I wouldn’t necessarily rail against the vast gathering of data per se, given that I can see there are some positive uses of it. But I think we need to be very aware of who really is behind a lot of this data gathering and what the real purpose is; and the fact too that a lot of this data is very badly stored, very insecurely stored, yet may be very intimate data about individuals. So I think the critique is not necessarily of the technology itself, but really of what the uses of this technology are and for what purpose, and who’s doing it; and I think there is a lack of awareness firstly, and a lack of critique of this.

Jordan Brown: Definitely. What about so-called ‘big data’ then? Sure, we could argue some ‘beneficial uses,’ but at the end of the day, the government can just come in and take that anyway. And the data might be totally innocuous, but that’s not the point. Once you combine it with perhaps another data set, or look through other patterns of seemingly innocuous data, you can still pull out a lot of richness from that, right?

Angela Daly: Sure. I guess this is what I mean about this, in practice, being highly concerning given what’s happening, given what seems to be the strong government interest in gathering as much data about all of us as possible, for who knows precisely what purpose. I think as a result we should be resisting this data collection despite its potential positive elements, essentially because I don’t think the people engaging in this data gathering, nor the powers that be above them, have demonstrated that they are trustworthy enough to look at our data, to manage our data and to use it for ‘good’ or beneficial purposes. I don’t feel that this has really been demonstrated to us and therefore I don’t find them trustworthy at all.

Jordan Brown: Can I ask if you think that’s inherent in some of the technologies, that a lot of these digital technologies create a trail regardless, which can then, as we see, empower the surveillance state?

Angela Daly: I think this goes back to what I was saying at the beginning. The design of this technology is very important (I suppose I didn’t quite say that). I think the design of technology is very important, and particularly when you look at, for instance, Google’s business, Google’s services have been designed to gather lots of information and data about the users for, I would imagine, initially Google’s own profit-making purposes, but this also has an ancillary benefit—a very strong ancillary benefit—for government surveillance efforts as well. I think what is particularly worrying for me is the transition to mobile devices: ‘smartphones,’ tablets and even now wearable technologies, things like ‘Fitbit’ and so on, where there is no option to opt out of this data gathering when using these devices. So it’s not only the apps that people are using, it’s also the devices themselves.

So at least on a traditional PC or laptop, you weren’t necessarily contributing to this data gathering every time you used the machine, particularly if you were offline, whereas with these devices there is no choice. At least among the commercial offerings, you cannot buy a device whereby your data is not harvested at all times that you’re connected to the Internet, and probably even whenever connected to a 3G telecoms tower as well. So I see this as being particularly worrying. Even if you want to protect your privacy in using certain kinds of devices, this is now pretty much impossible. Yet that’s the way things are going. And so if you want a mobile phone these days then you don’t really have the option of buying one—at least somewhere like Australia—that is a kind of ‘dumb phone’ rather than a ‘smartphone,’ where every time you use it you are able to protect yourself and your privacy a bit more than perhaps… So I think the technological developments, particularly in the last 10 years, have been very much with this design, that as much data as possible can be gathered about the user, and there is very little the user can do to opt out of that.

Jordan Brown: I’m thinking of that old line, “We’re all walking around with a tracking device in our pocket that just happens to make phone calls.” I’m thinking about how the same would be true with the ‘Internet-of-Things.’ Maybe you’ve opted-out, and you’re not even carrying your tracking-device-that-just-happens-to-make-phone-calls down the street with you, but what happens when the environment is embedded with sensors, much in the same way as we have prolific CCTV everywhere? It does beg the question of ‘choice.’ I mean, even the choice of having a mobile phone these days is not really much of a choice.

Angela Daly: Sure. For instance, in the West, in developed countries, opting out of online anything is incredibly difficult when the government is delivering services online as well. You kind of have to opt out of society as a whole to opt out of this data gathering and surveillance.

Jordan Brown: But that’s impossible though, right?

Angela Daly: I mean, sure, yeah. Actually, I think in Australia there’s an interesting case of an Aboriginal man who is trying to opt out of the system entirely, not necessarily for anti-surveillance purposes, but I guess because he does not recognise the legitimacy of Australia in any respect. I think there was an article in The Guardian about him recently which was really interesting, about how he is trying to live without interacting with the state apparatus in any way, and it’s incredibly difficult. You do have these examples of people who are trying to opt out and it’s incredibly difficult. Also, going back to technology, what you’re saying is totally right—so even people who don’t carry mobile phones, don’t have a ‘smartphone,’ don’t have Facebook and Google accounts and so on, are still being profiled, are still being tracked through others. There’s this whole phenomenon of what are called shadow profiles on Facebook, whereby, for instance, people who are not on Facebook will have photos of them uploaded. Facebook uses facial recognition technology—although it’s not supposed to in Europe, but who knows whether it actually does; certainly in other countries—and can identify people’s faces. So if you have friends or relatives who are not on Facebook but people are uploading photos of them, then Facebook can presumably create these kind of private profiles, let’s say, of people. So they have data gathered already about individuals who’ve not opted into the service. Similarly, now that we all have ‘smartphones’ or tablets with cameras, to what extent these cameras are working when they’re officially ‘off,’ and to what extent images of others are being captured, is for me anyway an interesting issue to explore. And given some of what we do know already about ‘smartphones,’ tablets and the tracking thereof, and also what we already know about webcams actually being activated when they’re supposed to be deactivated, then yeah, I kind of agree that many of us now have these tracking devices, and even for people who want to opt out of them, it’s more and more difficult to do so, particularly if you’re living in a big modern city, where even if you want to opt out, everyone around you is not.

Jordan Brown: What are the implications of this working in tandem?

Angela Daly: What I will say is that I think we’re at a really crucial, key moment right now, in a whole lot of respects, but particularly this: the apparatus is all in place, we have a kind of happy marriage between governments and big corporations in a whole wide range of spheres, and with technology corporations in particular; the situation suits them well, but not necessarily the general public. And certainly, when surveys are done about people and their views on privacy, generally people do value privacy, they would be happy to pay for privacy-enhancing alternatives, but the market is not offering this. So…

Jordan Brown: Sorry, I was going to interject there. Is that even possible though, with what’s already been built? I mean, some talk about encrypting their e-mails, and that’s why we’ve got the spooks going head on with, “How do we intercept the private keys, so that even if we sit in the middle we can read that communication,” which is of course what they do by tapping the fibre links, the connections between those things. Even if you’re going after so-called ‘privacy-enabling technologies,’ it’s like this arms race…

Angela Daly: Yeah, exactly. … I don’t think there’s any interest on behalf of corporations or the government to improve our privacy, or at least to respect our privacy rights, or to enact laws which are going to force either governments or corporations to provide more privacy protection to individuals. And we can see why, because this situation suits both parties very well, even if it’s not satisfactory for the general public as such. But also, the general public have been persuaded that this is okay, that “they should be willing to give up their privacy because of ‘terrorism’ and serious crime” and so on and so forth, even though there seems to be very little evidence that such vast data gathering and surveillance actually prevents terrorism or serious crimes from happening. Various terrorist attacks have happened in spite of this kind of vast data gathering and surveillance, but nevertheless that’s certainly the rhetoric; the justification is very much one that people are buying into to some degree, because they are worried about these things and they want to feel safe, they want to feel secure, and so they have been persuaded that this is a kind-of-okay bargain to make. And nevertheless I think, in certain countries—Australia in particular—when it’s come to discussions of these recent national security and anti-terrorism laws and proposed mandatory data retention, the media and even opposition politicians have been too scared and weak, or even not very well informed, to oppose some of these laws and proposals that are highly intrusive of civil liberties and privacy and so on. I think that this rhetoric around security and terrorism does have to be critiqued; it can’t be a subject in society that we’re unwilling to talk about, unwilling to criticise or think about critically, let’s say. However, the status quo at the moment is very much that these things cannot be questioned or critiqued, and I think there are too many politicians, particularly in supposed opposition parties, that are playing it too safe with this kind of thing.

Jordan Brown: The Attorney-General, Nicola Roxon, said about some climate activists that they are in the same league as terrorists because they pose threats to critical infrastructure. And that blew me away. Because what they’re doing is lumping legitimate political activists into the same league as terrorists, which justifies the use of these vast spook apparatuses against legitimate political expression. That’s massive, right?

Angela Daly: Well I think, certainly from the UK as well, we can see people labelled as ‘domestic extremists.’ I remember I think we had Green party politicians labelled as ‘domestic extremists’ in the UK. So precisely what being a ‘terrorist’ or an ‘extremist’ means seems to have a very wide definition, and includes what seems to be legitimate thought, opinion and political activity as well. And so again I think this is a good reason to be very critical of increased powers for the police and other law enforcement agencies on the basis of preventing ‘terrorism,’ given that this is given a very wide definition. Also, I suppose from the European perspective, we’re at least five if not more years into austerity, financial crisis, so on and so forth; a lot of very unhappy people and a lot of real social issues in Europe at the moment, and so a lot for people to be very unhappy about and very critical of current orders, current political orders and corporate orders for that matter as well. And when terrorism or extremism is defined so broadly, these laws may even be contributing to not very good politicians and corporations—such as banks, for instance—remaining in their position in society. So I think there is a kind of protectionism from people who have political or corporate power, and this is one of the ways that they are protecting themselves too: having laws which certainly seem to make certain kinds of political expression difficult if not illegal.

Jordan Brown: And the vast surveillance state at their disposal to keep that in check.

Angela Daly: Sure. Once we start abrogating civil liberties for terrorism or ‘political extremism’ then it’s difficult to stop that. Or, once we start giving up our rights and liberties, it’s difficult to stem that flow. And I think this is what we’re beginning to see, certainly in countries like the UK and the US, and I think other parts of the world as well: the justifications for becoming more intrusive of rights and liberties begin to proliferate. For instance, in Australia, you now have I think a law in Tasmania preventing certain kinds of environmental protest or secondary boycotts. … Once you start down this path then it can be difficult to turn back and stop; even if the so-called threats, the original threats, may not be as threatening anymore, new threats seem to be found or are manufactured.

Jordan Brown: It’s like a ‘creeping normalcy,’ slow changes, and they’re not even slow now, changes over time, “We opt into this one inch at a time.” Is the same true with technology? For instance, once you roll out CCTV everywhere, it’s difficult to take that back?

Angela Daly: Well, at least when it comes to something like, let’s say (I suppose it’s a term which is used in a very specific way, but let’s say) the Internet infrastructure: once it is set up with a particular design, either at the network level or the over-the-top services level, it becomes difficult to change that fundamental design. So the fundamental design is one where data is intercepted and gathered, or even created in the first place, and documented. Then, if that’s been the way things have been going for a while, there is so much which is based on that happening: secondary services, infrastructure, and so on. So it can be difficult to shift. It would be a big paradigm shift, I suppose, to make such changes. Not impossible, but certainly a big difference.

I suppose it’s ironic as well, given that the Internet was set up on a decentralised basis because decentralised networks are more resilient in many ways, or so I’m told anyway; at least if one part of the network stops working or malfunctions in some way, the rest of the network can be okay. But aside from some of these other socio-cultural and economic changes, we’re also seeing a kind of centralisation of the network, or of the infrastructure and services, in many ways, which is very different to the Internet’s initial design.

Jordan Brown: How did that happen? I’m interested in how corporatism and the co-opting of that decentralisation turn into total centralisation.

Angela Daly: Well I suppose, certainly from the corporate or economic perspective, to some extent businesses don’t want to be regulated, and certainly big businesses don’t want to be regulated … Corporations don’t want to be regulated, they’ll often make a big song and dance about not being regulated; however, for big corporations, even if that’s the rhetoric they employ, regulation is actually something that they can afford that their smaller competitors may not be able to afford, and so this kind of centralisation is perhaps an effect of certain regulation being brought in. Take data retention, for instance: that actually does impose, or is likely to impose, a cost on businesses, based on what’s happened in Europe, and certainly here that’s very much part of the discussion at the moment. The big businesses like Telstra will be able to bear that cost; however, Telstra’s smaller competitors may not be able to bear that cost. So even if large companies argue against such regulation based on cost, in many ways it may be beneficial to them, or if they’re playing a longer game it may be beneficial to them economically as well, in that this will take out smaller competitors and make it more difficult to compete with them, and they will continue to have a large market share. So basically this kind of regulation can be a barrier to entry for potential competitors and is therefore, from a longer-term perspective, in the interests of existing big corporations. I don’t know if that answers your question…

Also, I suppose there are network effects as well. So, despite the network infrastructure of the Internet being highly decentralised, the over-the-top services, particularly “Web 2.0” services, have been very much based on networks, and social networks in particular. So, for instance, Facebook is the biggest social network, even though it’s been declining in certain respects and maybe doesn’t have as many members, or new members, as it once did; but still, it’s kind of, I don’t know, got a billion people connected? Something like this?

So obviously if you join Facebook, you join Facebook because you know a whole lot of people on there already… It’s a big switching cost to switch to another platform where you don’t know anyone, or where you have few friends already online. So that also is a kind of phenomenon that we see on the Internet which promotes some kind of centralisation in certain services; or, for instance, using Skype as a voice-over-IP service. There are other alternatives to Skype, but if all your friends and family are using Skype, then that’s what you’re going to use. There’s also, arguably, a lack of interoperability between different services as well. So I think this promotes centralisation at that point.

Jordan Brown: And by design too. Because they “don’t want you to go.”

Angela Daly: Sure. Exactly. So this kind of lack of interoperability certainly is something which is by design because that kind of protects that company against competitors in the market as well.

* * *

Jordan Brown: What are your concerns with surveillance? What are we dealing with already, right now? What’s happening?

Angela Daly: Okay, so I think right now, I would say particularly in the English-speaking, or ‘five eyes,’ countries—so the US, UK, Australia, New Zealand, Canada—we’re dealing with a vast surveillance and data gathering apparatus which has been semi-secret for a long time and has really only come to light as a result of various whistleblowers—prominently Edward Snowden, but not only Edward Snowden; he’s one of a line of people who have blown the whistle on what our governments and law-enforcement agencies have been engaging in, stuff which has been very nontransparent. So there’s not been very upfront discussion of this kind of surveillance apparatus…

Jordan Brown: Even actively hidden. They lie about it, and have lied about it.

Angela Daly: Sure. There’s been a total lack of transparency, and it seems that our governments, law-enforcement agencies and so on have only started to become more transparent when they’ve been forced to be, in response to various leaks and whistleblowing and so on. So certainly, given that these countries profess themselves to be ‘liberal democracies,’ these kinds of activities ought to be subject to scrutiny—at least knowledge—by the people, not only by our representatives in parliament, who seem also not to have been particularly aware of these developments. I think it is highly concerning that we seem to have parts of our governments and administrations that have arguably gone a bit rogue from a democratic perspective as well. And the justification of the moment anyway is the ‘War on Terror,’ though that has kind of morphed over the last, I suppose, 13-odd years since 9/11 essentially. But of course there’s a long history of surveillance and the threat-of-the-moment too. In the UK it used to be Irish republican terrorism, now it’s Islamist terrorism. It’s almost even ‘environmental terrorism’ or ‘extremism’ or leftism even… So I think this does have to be put in a historical perspective as well, that we’re not living in necessarily… All times are exceptional times, and there are threats that are used to justify invasions of civil liberties and rights, and the current justification is the ‘War on Terror.’

And if you start protesting against it… Well, what I want to say is that I think it kind of has to be put in a broader historical perspective. Certainly post-World War II, there’s been…this is kind of the latest in a line of…well I suppose developments in surveillance but also developments in kind of military and law enforcement, spy agency cooperation that’s been going on for decades.

I suppose my big concern is the invasion of ordinary citizens’ privacy as a result of this; and not just privacy, other rights and liberties as well, but privacy I guess is the big and most obvious one here. So I think that’s highly concerning, particularly because the vast majority of people are not doing anything wrong at all; not that I really think surveillance is justified, kind of… The vast majority of people are subject to this kind of data gathering, if not actual surveillance, even if they’re not doing anything wrong, or have never been involved in crime, even petty crime; and so it’s totally disproportionate that the rights of normal people who are not engaging in anything nefarious or bad, let alone terrorism, are being infringed and impinged upon by this kind of surveillance and data gathering apparatus.

Jordan Brown: So why should we care then?

Angela Daly: Well, aside I suppose from theoretical or philosophical reasons, aside from the fact our rights are being infringed, aside from the rhetoric around that, there are some practical consequences as well. So firstly, this vast, massive amount of data about all of us is not being stored in a very secure fashion. There have been some high-profile hacks of data stored about individuals this year; I think earlier this year there was a leak of data about asylum seekers, so people in a very vulnerable position in various ways, who at least have argued that they were escaping torture in their own countries, so that if they were sent back to these countries then a whole lot of information would be known, or could be known, about them as a result of this leak. So that’s one particularly bad example. But if this information were stored securely and only subject to access by very limited people, then perhaps it would be more justifiable. That doesn’t seem to be the case. And even Edward Snowden himself was a government contractor, not even working directly for the US government. I believe what he said, anyway, was that thousands and thousands of people had access to huge amounts of data about people not just in the US but all over the world. And so I think there is very little in the way of security, there is little in the way of checks and balances; and also, what is this data being used for, or what could it be used for? Once the data is there, it’s difficult to get rid of. There are various ways, technically speaking anyway, but it’s also valuable to the government and to corporations in various ways as well. So I suppose it remains to be seen what precisely is going to be done with this over the course of our lifetimes, with the generations that are growing up now who may have photos of them put on Facebook from when they’re babies; what’s going to be done with all of this information? It may be used in ways that are not particularly equitable or democratic, for instance, and that I think is a big worry. And this is all very unnecessary, I suppose, as well; it’s not necessary that all this information is gathered about all of us, particularly when it happens outside of our control, or when we have only limited control over it. And so I think these are concerns about why this data gathering is at least suspect, if not bad in itself.

Jordan Brown: I’m thinking about the prospect of, and even William Binney said this, how the real power in collecting all of this data is the retrospective analysis. So yeah, sure, even if you aren’t doing anything ‘bad’ now, with generations growing up now and that’s all fine, at some point in their lives, say 20 years down the line, you can pull up their entire life. And to me, that’s really chilling. Because the implications for that are huge. It means you’re not a target now, and you might not be, but you could be at any given point.

Angela Daly: And I suppose another thing to say too is that this data may not all be very accurate either. So if decisions are being made based on this, and potentially this alone, that’s also worrying, because it may not actually be an accurate reflection of what’s going on. I mean, ‘big data’—which itself is a very vague concept—there have been criticisms of ‘big data’ techniques, or at least of only big data techniques being used to aid decision-making, because it paints a certain picture of people and reality but not, arguably, the full picture. And so even in academic research, qualitative techniques or ethnographic techniques are still very important to build a full picture of what’s going on in a particular area. So I think there’s also that issue: the data may not be accurate, and even if it is accurate, it only paints a certain picture, not necessarily the full picture, and we should be cautious about basing too much decision-making on it, whether from a law enforcement or national security perspective or even from a corporate decision-making perspective. So, who to hire or fire as employees, decisions by health insurance providers, and so on and so forth.

Jordan Brown: ‘Big Data’ is something I’m really suspicious of, one because of the reasons that you’ve mentioned, but also because of the culture: that people believe the computer rather than their own experience, the experience of the real world.

Angela Daly: Well I think there’s a huge amount of rhetoric from Silicon Valley that is very utopian—I mean there’s a whole lot of utopian rhetoric from there anyway—but there is this kind of messianic discussion of big data as being the solution to all problems: so, “the more data we have, the fewer problems we’ll have.” Of course there are certain reasons why these arguments are made; it sells certain products, so there’s that kind of self-interested reason why we’re seeing some of these arguments coming out of Silicon Valley. But I also think there’s a cultural side too, whereby the problems of society are much more complex than can just be solved by more data. And I think there’s a wish not to engage with some of these complexities, because engaging with them may present rather unpalatable truths to Silicon Valley; and particularly I’m thinking around inequality, social problems, health and so on and so forth. That’s not just going to be solved by, I don’t know, people having wearable devices that tell them to walk a bit more. The causes of social problems and bad health are much more complex and more overarching as well, rather than just being the fault of the individual, let’s say. But I think some of this is somewhat lost in discussions of ‘big data’ too.

Jordan Brown: Yes. Maybe I should ask about the techno-utopianism. What are your thoughts about the prolific optimism of this culture, the deus ex machina if you like: specific to technology, the sort of assumption that for the problems technology causes, if you ‘throw more technology at it, we can fix some of these problems,’ social issues and so on.

Angela Daly: What I will say is that I think it’s been an interesting 10 years or so. I mean, the Internet in particular has kind of matured as a technology; we’re now in the second decade of it being publicly available to the masses. So it’s interesting to see and reflect back on what’s actually happened. I think there have been liberationary aspects of the technology, but of course it doesn’t exist in a vacuum from the rest of society; there are other things that have been going on in different societies throughout the world which have interacted with these technological developments. So I guess I’m thinking about things like the Arab Spring, the so-called ‘Twitter revolutions’ and so on and so forth. There’s a huge amount of commentary on them, both contemporaneous and some years later, where with the benefit of hindsight we can see that, yes, the technology was important but it wasn’t the only determining factor there. Nevertheless, the huge amounts of data gathered, and the surveillance apparatus which supports and facilitates that, do make me somewhat pessimistic about the liberationary possibilities for this technology going forward. I’m more likely to sympathise with the techno-dystopianism of someone like Evgeny Morozov now than perhaps I was 10 years ago. Nevertheless, if we think of the mainstream of the Internet as mediated by big players like Google and Facebook and Microsoft and so on, then at the edges of the Internet (and I mean ‘edges’ in a non-technical way) we’re still seeing things like interesting peer-to-peer activities; people trying to opt out of, or to decentralise, what is becoming a more centralised phenomenon, through things like mesh networks, and also through anonymising techniques, cryptography and so on. Not to say that these are all perfect, or that they all facilitate freedom however conceived; I mean there’s plenty of contestation, not least by various governments too, trying to decrypt cryptographic techniques as well. But I think you can still see some kind of lawlessness, and I’m not sure whether that’s a good thing or not, at the darker parts of the Internet. And so, even as was the case prior to the Internet, the law is not enforced absolutely in its entirety in every situation, and we see that with the Internet as well. Despite this apparatus, despite centralising tendencies, we’re seeing dissent and deviation, some of which is good, some of which is not so good. That’s still happening, but perhaps the technology overall is not going to be as ‘Earth-shattering’…

* * *

I think that’s a bit dystopian: that we are…our autonomy is eroded somewhat by all of this…

Perhaps I should say that I’m not just techno-dystopian, I’m a bit dystopian in general. I think, like I was trying to emphasise before, these trends in technology are not at all divorced from what’s going on in society overall, and so there are other dystopian aspects of society at the moment, whether it’s the economic system, whether it’s environmental, whether it’s social and so on and so forth; and I certainly don’t think what’s happening in technology is happening in isolation from these other trends.

Jordan Brown: Does the technology empower all of us, like it claims to? Or does it only empower a select few that we’ve been talking about, at the expense of the many?

Angela Daly: I think there’s a complex answer to this question. I mean, it’s undeniable that the many-to-many communication facilitated by the Internet is liberationary compared to the one-to-many or one-to-one communications models that we had prior to the Internet. There definitely have been liberationary aspects of the Internet and other new technologies as well, but we shouldn’t get carried away with that, and I think increasingly we’re seeing that yes, it’s liberating, but only to a certain point; and that point really is this data gathering, privacy-infringing surveillance apparatus which underpins the Internet as we use it.

Jordan Brown: What about, turning to the point of inequality then? Sort of the “techno-haves” versus the “techno-have-nots,” that this experience isn’t for everyone, but they’re sort of creating a subset of people that are in the online environment and are having this completely separate experience to people that aren’t.

Angela Daly: Sure, and it even depends on the kind of Internet access that you have as well. So certainly in Northern Europe, anyway, there is pretty good access, pretty fast and not too expensive, but obviously not for everyone either; not everyone can afford that. And even more so in Australia, where Internet access is not as cheap or as good as in a whole lot of other, even comparable, countries, despite the ongoing NBN and so on. So I think that particularly the Silicon Valley set are very much in a bubble where there’s very fast Internet and people have got enough money to pay for it, and that’s how they view the world, whereas that isn’t actually how the world is. A lot of people who do access the Internet in the world are accessing it on mobile phones, over 3G networks, which are not particularly fast and are quite expensive. So when we talk about (going to a slightly different area) online education, or ‘MOOCs’ taking the place of universities, for instance; maybe to a small extent, but I mean, that was actually a whole lot of hype which seems to have collapsed. I mean, universities have started offering online courses, but I think it’s unlikely that everyone’s going to go online, because you actually need a good enough Internet connection to go online, to have online courses, participate in online courses and access online resources.

Jordan Brown: So when someone like Tim Berners-Lee says, “The web is humanity connected,” we’re sort of talking about the myths. For example, “The web is like the great levelling.”

Angela Daly: Oh yeah, well the levelling hasn’t happened via the Internet, I think, and again this is where I think we have to move away from techno-utopianism, or even techno-dystopianism, and see the Internet and technology as a whole as very much part of what’s going on in other areas too. So, inequality still exists despite the Internet; other problems still exist as well. Arguably, some of these problems are actually exacerbated by the Internet, or by not having access: the haves and the have-nots, the digital divide. The digital divide was a term used a lot 10 years ago, a bit out of favour now, but there are new divides that open up, whereby many people have access to the Internet but only via a library, or only when they’re at university or school or work, and they don’t have access at home. So they may not be able to participate in the same way as someone who has got access everywhere; or people who only have access via their mobile phones, versus people with access via other devices. So…

Jordan Brown: I’m thinking now about the implications of that. If we’re defining what it means to bring our privileged ‘equality’ to everyone, and that looks like the completely corporately dominated Internet that we know and understand…

Angela Daly: Well I suppose this is happening already. So there are schemes like Internet.org, where in certain countries, particularly emerging economies or developing countries, large Internet corporations, notably Facebook (I think the service used to be called ‘Facebook Zero’), offer access to certain services for free on your mobile phone, so you’re not spending your data allowance on this, but you’re restricted to a kind of walled garden of a few, usually big, companies.

Jordan Brown: Yes. That’s my concern too. I mean, we mentioned ‘mesh networks,’ and I was thinking about how the ‘Occupy’ movement was really keen to get away from corporately controlled access, the gateways to the Internet, but the irony is that they used their ‘mesh networks,’ which were arguably very clever, to get on Facebook. So, to bring this back to the point I’m trying to make about how I think this ‘great levelling’ is a myth: aren’t we just doing the same thing in the so-called ‘third world’? I mean, it’s like, “Come and join the Internet so you can all get on Facebook!”

Angela Daly: I think what is interesting though is… I think it’s this kind of device maybe, this kind of invention from Kenya; so somewhere, I mean Kenya, that people talk about as being sort of the ‘Silicon Valley of Africa,’ but which is nevertheless certainly outside of the Western global north. There’s been this interesting device developed there which is specifically for areas where there are not good connections, so good mobile phone connections, and apparently this device can facilitate Internet access in very remote places; but this is something unlikely to be developed in somewhere like Silicon Valley because there’s no need for that…

Jordan Brown: Google Loon, where they put routers up in balloons or something crazy, and they have them hovering over the place; it’s like, “This is so you can come and use Google; welcome to our walled garden.” The point I’m trying to make is what I see as a trend, which also underlies a lot of the social and political issues that we’re talking about: that the social issues in the context of the time inform the technology. Bringing those ideas together is how Lewis Mumford talked a lot about ‘authoritarian technics.’ The example I’ve got here is: how would you provide your own access to the Internet without being beholden to the corporate gatekeepers? For example, the root DNS servers, the nameservers that resolve everyone’s domain names, that’s a corporately controlled service. Fibre-optic links. The wired and wireless infrastructure. It’s all corporately owned. So my point is that these technologies can never be democratic, because that doesn’t fit into the way they’re capable of being designed. For example, the Internet started out as a military invention. A-B-C. So we’ve got the three power brokers, and they run the show. Everyone else is reacting.

Angela Daly: Sure, but I think people are reacting but also subverting; I mean, sometimes politically, sometimes not. And I mean, there’s plenty…everything pretty much ever that’s been invented has been used for purposes which have not been the idea of the person who created it, either in good ways or bad ways. So I think there’s still…we have to kind of believe in the power of our imaginations and creativity to kind of either subvert or reuse.

I know you don’t want to talk just about Australia, but I think Australia kind of shows… I think when it comes to particularly the last year or so, since the current Abbott government came into power, we’ve seen a whole lot of laws being passed and measures being taken which seem to silence dissent in various ways, and not just at the federal level but also, for instance, in Tasmania, as I was talking about, with these anti-protest laws. So there seem to be a lot of laws and measures coming into effect which are very damaging to rights and liberties. And Australia is also a particularly bad example given that there isn’t a lot in the way of constitutional rights for Australian citizens, and this is really unusual; Australia is really unusual in this respect, in that in, I believe, New Zealand, Canada, definitely the US, and in the UK with the Human Rights Act, there actually are ways of challenging laws—and other measures—which do not respect the human rights of citizens, and in some cases noncitizens as well. But in Australia, I mean, it’s really hard to challenge laws if there’s no actual right… there’s no Bill of Rights here. At least when it comes to free expression, there is an implied right to political speech, implied into the constitution, but there is no constitutional right to privacy, or constitutional protections against searches and seizures by the government as it’s framed in some countries like the US. And so I think this puts Australians in a particularly weak position, and the Australian government is really able to do a lot more, legally speaking, constitutionally speaking, than even in some similar countries.

Jordan Brown: That has big implications.

Angela Daly: Yeah, and I mean, I’m obviously not from here, and I’ve not been here for a very long time, but I was told that there have been debates in the last 10 or 20 years about a Bill of Rights, and the discussion was very much that, you know, “Parliament is enough to protect us.”

There’s this ongoing case in the UK with regard to the Tempora program, part of one of these surveillance programs that was revealed by the Snowden leaks, the Snowden whistleblowing. I think there’s just been a decision at first instance, which I think is going to be appealed, but at first instance the judge (or it was maybe even a tribunal, but anyway the judge or equivalent) found that this was probably legal, or at least that there was a good enough argument it was legal. So if the judges are very deferential towards the administrative powers of the government, which often happens in ‘national security’ cases, then that is very problematic. Even with the rights as they exist, particularly in the European sphere, where there are legitimate reasons to infringe rights, usually listed for each of the rights in the European Convention on Human Rights, it still is very much dependent on judges’ interpretation. And certainly when confronted with a ‘national security’ justification for infringing or being invasive of certain human rights, particularly privacy, judges seem to be very deferential to that justification, to the point that individual rights may not be well protected. And I think there is a real socio-legal or critical legal exercise to be done in looking at why judges are deciding cases in this way, because certainly in Europe we’ve seen that the Court of Justice of the European Union seems to be more proactive in protecting rights. It ruled earlier this year that the European Data Retention Directive was invalid, in part because of the interference that it entailed with the privacy rights and data protection rights of all European citizens basically, or everyone living in the EU. And it’s a huge judgement, a very interesting judgement as well, but it’s interesting why that would come from Europe rather than, for instance, a domestic UK judge, and I actually think it would be very unlikely that a judge in the UK would come to a similar conclusion, because of this deference to the executive power.

Jordan Brown: There are big questions around the functioning of the legal system, this sort of crazy surveillance society. Things need to change; they’re all really huge. How do you pull them all together? What does it look like?

Angela Daly: I think it looks depressing. Honestly, I think it’s a depressing picture. But I don’t know whether this is a picture too divorced from what’s going on outside the technical sphere, or in other aspects of life as well. I don’t know if we’re yet at a tipping point where things might change, but I think there’s increasingly an interconnectedness between what’s happening here and what’s happening in other areas as well. So, as you mentioned, the fact that surveillance and these anti-terrorism powers are sometimes used against environmental protesters; the vast power of corporations in all aspects of life, not just the Internet sphere, but also when it comes to opposing regulation in other spheres as well. I think this certainly cannot be seen in isolation from the other trends, which are, I think, depressing. I mean, I don’t think there’s a lot to be hopeful about with the current political or corporate set-up that we have at the moment. But maybe, you know, this is just part of the picture, and change needs to happen across the whole.

* * *

END OF TRANSCRIPT

Jordan Brown is an activist, artist, musician, independent film-maker and freelance journalist whose work focuses on the interface between the dominant culture and the real impact on people, society and the environment. He has won awards and industry accolades for his work, including the 2018 Edward Snowden Award, the 2017 Change Maker Award (NIFF), and the 2016 Honorary Award of the Ministry of Justice (Slovakia).