Resistance Radio

Derrick Jensen: Hi, I’m Derrick Jensen, and this is Resistance Radio on the Progressive Radio Network. My guest today is Jordan Brown. He is an activist, artist, musician, and award-winning independent film-maker whose work broadly looks at the social, political, and environmental implications of digital technologies. He is particularly interested in cultivating a critical view of today’s culture of screens, and has recently completed a feature-length documentary on the subject called Stare Into The Lights My Pretties.

So first off, thank you for your fabulous work, and second, thank you for being on the program.

Jordan Brown: Well thanks, Derrick. Thank you for your work as well, it’s really great, and thanks for having me on, it’s a pleasure.

Derrick Jensen: So tell me, and tell the audience, about the new film, Stare Into the Lights My Pretties.

Jordan Brown: Okay. Recently I completed my first feature-length documentary. It’s a film that takes a critical view of today’s culture of screens. We’re just completely immersed in this society and culture that is completely dependent on digital technologies. And it’s a polemical film, so it’s something that I want to use as a catalyst to open up some bigger questions about this culture’s reliance on technology, but also how the technology influences people, society and the environment—sort of this tiered interplay, I suppose.

Derrick Jensen: Before we go any further, what do you mean by “culture of screens?” Does that include television, then? What are you saying?

Jordan Brown: Well, yeah. There’s a voice in the film, Susan Greenfield. She’s a renowned neuroscientist from the UK. She defines this brilliantly in the film. What she means by screen culture, and what I’ve taken to expand on in the film, is that screen culture is defined not so much by the devices themselves—well, it is that—but more so by the amount of time we spend with screens, and how screens are sort of everywhere. So she includes things like the amount of time people spend in front of a laptop, or with their mobile phones, and with emerging technologies like Google Glass, things like this. She talks about screen culture as being this pervasive phenomenon that has big implications for the brain, and she talks about that from a neuroscience perspective.

Derrick Jensen: You’ve spoken of effects on the person, society and the environment. So what are some of the personal implications of devoting—or let’s back up even before that. When we talk about screen time, do we have any numbers? Do you know how much time people spend on screens today vs. how much time they used to? Let’s go with the background: 200 years ago they spent zero; 80 years ago they spent some, but not as much; and then—do you know? Like, how much time do people spend on screens?

Jordan Brown: Well I know today—and some of these figures are fairly old—I think from 2010, that it’s something in the vicinity of ten hours a day. And that’s sort of the average amount of time that an adult would spend in front of a screen. Perhaps that figure would be much larger for younger people these days, but I think it’s somewhere around that area. There are varying numbers, but I think a more conservative one would be anywhere from six to ten, twelve hours a day in front of a screen.

Derrick Jensen: Which is the majority of our waking hours.

Jordan Brown: Yeah, it’s probably one of the things we spend most of our time doing, interacting with some kind of digital technology in one way or another.

Derrick Jensen: So what are some of the neurophysical and other psychological effects of this? Because that has to have some effects.

Jordan Brown: Okay. I guess one place we could start with this would be just the way that it physically changes the brain. Susan Greenfield, the neuroscientist in the film, talks about this on a scientific basis. There’s this notion in neuroscience called “neuroplasticity,” which is the phenomenon of how your brain physically changes. The connections in your brain are physically influenced by your experience, the way you perceive and understand the world.

Derrick Jensen: A classic example of this, by the way, is that they’ve done brain scans of London taxi drivers. London is supposed to be one of the hardest cities in the world to get around, and for London taxi drivers, learning all those streets physically affects their brains.

Jordan Brown: That’s a really great example. And I guess it’s analogous to doing a physical exercise. If you’re using your arms, say, your muscles develop, and if you’re not using your muscles, they atrophy and you lose that capability. The same thing happens in the brain: if you’re rehearsing a certain set of skills, or having a certain sort of reaction or experience, then that develops and sustains the neuro-connections in your brain, and strengthens those connections, which makes it easier for those pathways in the brain to interact with each other.

So, if we’re spending the majority of our time sitting in front of a screen, we get really good at using a screen, but we also change our brains—there’s this sort of notion of becoming a computer ourselves. We become really used to being bombarded by lots of information. A lot of the online environment is designed, I would say, with obvious elements of distraction. We can talk about stuff like advertising that’s constantly bombarding you when you’re online, or just even the notion of when you’re trying to do something online—say you’re writing someone back, you’re writing an email or something, or you’re looking up something on Google to satisfy one of the random curiosities that you may have at any point in the day. Most likely you’re interrupted. You’re interrupted with a ping of something. Maybe your phone goes off, maybe you get an email or you get a Facebook message or something like this.

So distraction is a big part of the online environment, so I guess one of the things, from a neuroscience perspective, is that if you’re practising being distracted, then that’s what you’re going to do. You’re going to not only become used to being distracted, but it will become a much stronger part of your experience. And I guess that it exacerbates and encourages that kind of thinking, that kind of scatterbrained thinking where you go jumping from one thing to the next all the time. You’re not going deep with anything, which means that you’re not practising and using the ability to think about things linearly or really going deep on anything, because you’re constantly being distracted and jumping around. I imagine that has concerning personal implications, but also we can take it on from there. If we have large groups of people who have short attention spans and are used to being distracted, then surely that has concerning social implications as well.

Derrick Jensen: I interviewed someone else about how long-form thinking is becoming rarer, and then the relationship between an incapacity to do long-form thinking and an incapacity to solve some of the most difficult personal and social questions facing us today.

Jordan Brown: I think that’s really classic. Long-form thinking is probably the first thing we see being lost to the screens. I was thinking about this the other day, even in terms of news reporting, how insane it is—I mean, we can talk about the collapse of journalism and the crisis of mainstream media. That’s a whole other conversation. But there’s the notion that news is being reported in short snippets these days. We have this sort of news culture where institutions want to break the story in real time, and so instead of analysis we just have this rolling, constant flood of tweets, or little snippets of information without much context, without much analysis. We just get bombarded by this tsunami of information that doesn’t really connect to anything. We don’t have time to critically analyse anything; we don’t get the opportunity, or give ourselves the opportunity, to go deep with any information, because we’re just being bombarded. We’re really losing this ability to do long-form thinking, to go deep on an idea and sit with it. To be contemplative, I think.

Derrick Jensen: I watched your film, and I sent you a note about this, that I was only going to watch like the first five or ten minutes and I ended up watching the whole thing in one sitting and just thought it was absolutely fabulous.

Jordan Brown: Well, thank you.

Derrick Jensen: And one of the people in there mentioned the effects of screens on our memories. That we’re seeing already a decrease in the capacity to remember. Because why should you remember when you can just type it into Google?

Jordan Brown: I think Betsy Sparrow is her name, sorry if I have that wrong. “Sparrow” is definitely her last name. [You should look it up on Google!] She did a study which found that if you give people information they expect to be able to look up later, they don’t remember that information as well. And she talks about this phenomenon where people, instead of remembering the information itself, recall the place to find it, which is essentially Google these days. She was talking about the way people would think about Google first, or would think about their computer first, when they’re trying to recall information, when they’re seeking an answer to something. This plugs into the larger question of what we are doing as a culture, when that’s the first thing we think of: when we’re confronted with a question, or something we don’t know, we think about the place to find the answer.

Derrick Jensen: And there is also the question of giving up tremendous power to that single source of information. Instead of relying on our own memories, we have basically outsourced memory to Google.

Jordan Brown: Yes. I think this is really where the film began for me. The germ of the film’s idea came back in 2009, because I could see this emerging then, already, and perhaps it was a new journey for me, but I had this idea that maybe I would make a film using Google as a case study, to talk about the way that one company is having such a tremendous influence over many aspects of our lives. And of course at that stage Google had, I think, just started scanning a bunch of out-of-print books, making this large digital repository of books that was just for themselves. And so one of the concerns I have, and had back then, is: what does it mean when one company privatises the world’s information and uses it in a monopolistic fashion?

And of course, you know, I went off to do interviews and things like this and started developing a film, but really got bowled over by the fact that it’s not just one company. And it’s not just one technology either. One of the other voices in the film, Katina Michael, whose interview was one of the first that I did, broke out into this larger discussion of this culture’s complete fascination with things like artificial intelligence and robotics. So I think I realised pretty early on that it’s not just one company, it’s not just Google; and it’s not just one technology, it’s not just the Internet, it’s not just computers. It’s this larger question of the implications of digital technology in general. That’s where the film really started, I suppose: from the question of what it means, on a personal and societal level, if one company has such extensive influence over the information streams that we’re all tethered to.

Derrick Jensen: Speaking of one company, suddenly I remembered the Lord of the Rings line about “one Ring to rule them all,” so speaking of one company influencing people on a pretty profound scale, you have—one of the parts of the film that I thought was really, really important—the whole thing is important, but one part that just blew me away was the bit about Facebook. There was a line—I’m going to throw the line out, and then if you can fill in behind and in front of this line? The line was about how—something along the lines of “we think that Facebook is helping little Johnny to make friends, but what Facebook is really about is predicting your future, is knowing you so well that they can predict your future wants and get you to pay for them.”

So can you fill in around that? That line was just so brilliant.

Jordan Brown: The line was, that was by Douglas Rushkoff. He was talking about the myth. He was talking about how there’s not really this deep understanding about what the tools we are using are for. So we sort of have this myth that is espoused by a bunch of companies, all the online companies do this. They have these slogans and these big philanthropic ideas and aims. Google presents itself as being this champion of openness, but what they really exist for is to develop really detailed profiles about you, so they can sell you ads. That’s their whole business.

And Facebook is the same. Facebook isn’t so much a place to make friends. I mean, that’s sort of where it came from, but one of the primary purposes of itself as a business is to commodify your social relationships. And so what Douglas Rushkoff was talking about there is that today we don’t even know what the tools we’re using are for. You ask a regular young user of Facebook “what do you think Facebook is for?” and the answer will be “to help me make friends.” But you go to the boardroom of Facebook and what are they talking about there? Are they talking about “How can we make little Johnny make more friends? How can we foster deeper, more meaningful human relationships?” No, they’re not talking about that at all. They’re talking about “How are we going to monetize Johnny’s social graph?” And “How are we going to use that data, mine it and look for patterns to sell Johnny’s future to himself before he knows his own self?”

I think that idea points to something quite large, which I think is concerning about lots of these digital tools. They’re manipulative, and are so by design. It’s even a little bit sinister, in a way, in that there’s this pretence that the tools are designed for one purpose, and maybe they are in some cases, but there’s a strong ancillary purpose, which is to make money and sell advertising. That also opens up this great door for surveillance and manipulation and social control, ideas which are explored in the film.

Derrick Jensen: Let’s be explicit about this. How does Google, or Facebook, monetise our desires or monetise our interests—how exactly do they make money?

Jordan Brown: Google does this in a bunch of ways. One of the primary ways is that every time you search for something on Google, or every time you visit a webpage (and there are many webpages out there that use Google services), if you visit a website that has a Google ad on it, that page is talking to Google’s servers, and Google brings that information in to develop a profile of you. They’re amassing vast amounts of data about what they’re guessing are people. And they keep collecting information to the point where they can guess with pretty good probability that they’re really talking to you, the person on the other end of this screen. So one of the ways they make money is by using that information to sell you products, to show you targeted advertising. Advertising is essentially Google’s primary business model.

The same is true for Facebook. Facebook is all about bringing you in to this digital space where everyone else is. Everyone’s having their conversations there and people are doing their socialising. I mean, they call it the social network, after all. Which has this sort of effect: as more people join Facebook, it becomes more valuable to join, because everyone else is there. They call it the “network effect.” It’s the same thing that happened with the telephone. The telephone wasn’t really useful until everyone had a telephone, which made the appeal of getting a telephone stronger. And there is social influence in that too; I guess “peer pressure” is not the right phrase, but there are social pressures that come from an example like that. But yeah, Facebook operates in the same way, in that they’re gathering all of these really detailed information streams, doing this vast data gathering for the purposes of advertising. And not only that: a company like Facebook is also, as far as I know, selling the data that they have, or there are other companies out there that buy information from companies like Facebook, to amass detailed demographic profiles. I think they call it “psychographics” these days.

So there are a bunch of companies out there that are in the business of, I guess, essentially spying on people. The guy who invented the World Wide Web, Tim Berners-Lee, spoke about this publicly a few years ago, saying that the Internet is now the world’s largest surveillance network, basically run for the purposes of targeted advertising. So there are a whole bunch of companies out there on the web that exist to profile people, to really learn what they’re doing, learn about their interests, learn about where they are in physical space, because we’re all essentially walking around with mobile phones in our pockets that are essentially tracking devices that just happen to make phone calls. So we have all these companies out there that are in the business of vacuuming up all these digital data trails and then using them for a bunch of purposes. One of the really pervasive ones is giving people advertising to sell them stuff in a really targeted and granular way that’s really specific to them, that can capture your own unique personal characteristics. You see what I’m trying to get at?

Derrick Jensen: Yeah. You know, the whole time I was watching your film, and again in this conversation, I kept thinking about how I started my book Welcome To the Machine. The way I started it was that when I was a kid, I was a Christian. I’m talking six or eight, in that area. We were told “God knows your thoughts, and God knows everything about you.” And that’s okay, because God loves you. It’s okay. The Devil can’t read your thoughts, but the Devil can watch you all the time. And even when I was six, seven and eight, I realised that if you can watch someone’s every action, you can effectively know their thoughts. It became very clear to me even in this construct—and of course I couldn’t use this language, I was seven years old—but I understood the concept that if you can surveil someone every moment, and know if they move from here to here, and know the expressions on their face from selfies, whatever; if you can know a person’s external actions, and you get enough data—that’s the real point—if you can collect enough data on a person, you can begin to effectively predict that person’s thinking.

And a small example that’s true of my dog: when I used to leave the house—I mean, I still leave the house, but when I had this one dog, every time I left the house he would get up and walk with me if we went for a walk, but if I went to the car he would not even get up from the porch; he would just sit there. And it took me a couple of years to figure out it was because if I was holding car keys in my hand, he knew I was getting in the car and would not bother to get up. And my point is, if you see somebody with car keys in their hand, you can predict they’re going to get in a car. And that’s just a really trivial thing. If somebody is always looking up some specific thing, if somebody is constantly looking up critics of screen culture, one can begin to surmise that that person is thinking negative thoughts about screen culture.

This whole thing is just a long-winded way of saying that total surveillance means that you can effectively predict someone’s actions. That’s the whole point.

Jordan Brown: Yes. And I would say that’s something that has been purposely designed into some of these technologies. The whole purpose is to encourage disclosure. One of the things we see a lot—I mean, in a lot of online platforms, but particularly Facebook—is that they want to encourage you to disclose. The whole purpose of the technology is to play to certain psychological characteristics so that you are disclosing all the time. And even in other ways, ways you’re not disclosing directly. For instance, say you take a picture on a device such as an iPhone: that photo has a whole bunch of other data embedded into it that you may not even realise is there. You’re just taking a photo of something somewhere, but the GPS coordinates are embedded into the photo, and once you upload it to the Internet that information can be extracted and possibly added to your profile.

It’s sort of been baked into the way these technologies are designed and deployed in the world that they’re pushing for as much disclosure from us as possible. It has come to the point where, yeah, I would argue, as you mention in your book Welcome to the Machine, that this is a panopticon; the way the technology functions is similar, or analogous, to a panopticon.

Derrick Jensen: And people may not know what that is. So can you tell us?

Jordan Brown: Yeah, sure. Well, I think you’ve mentioned before, probably in a much better way, maybe I could paraphrase you. I think the panopticon was developed by Jeremy Bentham, is that correct?

Derrick Jensen: That is correct.

Jordan Brown: So his design—it’s basically a prison design. So you design a prison in a circle, and you have the guard tower in the centre and all the prison cells around the outside, that face the centre. And the tower in the centre is where the people that run the prison are, the people who are looking at all of the cells. I don’t know what they’re called, what do you call them, the people that run the prison?

Derrick Jensen: Guards.

Jordan Brown: Yeah, the guards. So the guards are in the centre tower and the prisoners are all around the outside. The guard tower is really bright; the lights come from there, so the guards can see into all of the cells of the prison in the circle. So there’s no way to hide. There are no interstices of the prison to hide in. But the other effect of this is that if you’re a prisoner you can’t see the guard. You can’t see into the centre because it’s bright. You don’t even know where the centre is. And the other part of this is you also don’t know when you’re being watched. You can be watched at any given time and you don’t know when that is. So there’s this sort of chilling effect on your behaviour that is central to social control, central to controlling people’s behaviour. If you don’t know when you’re going to be observed, you’re going to change your behaviour, because you could be observed at any time. So it’s this neat way to have control over people.

Derrick Jensen: Earlier you mentioned how phones had to get to a sort of critical mass before they were useful; I think you called it the “network effect.” That reminded me of how I’ve read some accounts of when telephones were introduced, there were a lot of people who thought they were incredibly intrusive. Carl Jung, for example, did not like them and felt that they interrupted your capacity to have contemplative time. And it just makes me think about how so often with technologies, at first there can be a discussion of their harmful effects. But then that discussion is swept away and they are presented to us first as a form of convenience or a form of utility, and then somewhere along the line they become an absolute necessity, until, by now, I’m one of the few people I know who does not carry a cell phone everywhere I go. I mean, I hate getting phone calls already, so the notion of actually carrying a phone with me when I’m walking through the forest or running errands or doing something else is just horrifying. But I’m a dinosaur and I recognise that. I was just thinking about this process. I mean, I love the line you said a little while ago about how we’re basically carrying around surveillance devices that happen to be useful as phones. Tracking devices. And so can you talk a little bit about—I’m sorry to go on, but I keep thinking also about one of the Doobie Brothers albums, which was called “What Were Once Vices Are Now Habits.” And I’m thinking about how what was once scary is now accepted. You know, GMOs were the same way. There was this big discussion: “GMOs are going to be really scary,” and then at some point it went from “Well, we need to think about it” to “Oh. Here they are, they’re in your food, and by the way, you don’t even get to know if they’re in your food.”

So I don’t know what I’m getting at. Can you pull any question out of this little rant that I just did?

Jordan Brown: Well, there are a few things there. There are a few ways to go. One of the things I’d like to say first is I haven’t had a phone for seven years. I gave up a mobile phone a long time ago, because I went without it for a period of time and I really liked it. And that was seven years ago and I think things have changed a lot since then. And even then, I’ve noticed the way that that has changed a bunch of my social interactions. The way you meet a new person for the first time, or someone wants to get in contact with you. “What’s your mobile number?” Or “How can I find you on Facebook?” I think there’s this weird thing that happens when you’re telling people “I’m not on Facebook” or “I don’t have a phone” or something. It’s like “Who is this person? Why aren’t they part of this grand digital world?” It raises this weird air with people.

So there’s that, that’s the first thing. Then there’s something I try to explore just a little in the film, though I think it’s a fairly big topic in and of itself, and I hope the film is a catalyst to open up these kinds of discussions. You’ve mentioned that this happens with a bunch of technologies, not just digital technologies, but especially with digital technologies, I think. There’s this effect called “creeping normalcy,” which is about slow, incremental changes (though perhaps with digital technology it’s not even that slow; the pace of technology is so rapid). What creeping normalcy is, is this notion that if things change at a slow enough pace, then they become normalised and pass by unnoticed. So yeah, the examples that you mentioned are quite good. Something comes along that would be seen as objectionable at the time: “This is outrageous!” But it slowly becomes normalised and people sort of forget it, or let go of it. And especially with something like technology, where if you ask even just a question that’s not even critical about technology, you get labelled as a Luddite, or you’re backwards, or you’re an old fuddy-duddy or something like that. There’s this really systemic cultural objection to even questioning technology.

And there’s this great line by Jerry Mander in the film, saying there’s considerable resistance to the very idea of challenging technology, and even just asking questions: “What are we doing? Is what we’re doing with technology a good thing?” Then I would ask, on a social level, how technology influences what society means and how society functions, and what the implications are. What are the negative effects of that? There’s this strong opposition to even asking the question, which I think is problematic. We need to break through that. We need to seriously consider what is happening.

I just saw something in the news today. A former Facebook executive, Chamath Palihapitiya, recently came out saying that he feels tremendous guilt about how he contributed; his job at Facebook at the time, in 2007, was to encourage user growth and get more people on the platform. And I think in November this year he’s come out saying “The short-term, dopamine-driven feedback loops we’ve created are destroying how society works. There’s no civil discourse, no cooperation. We have misinformation, mistruth. And it’s not an American problem. This is not about Russian ads. This is a global problem.”

I think possibly this is a good time for this film to be coming out. We even have former executives of the technology companies themselves being like “Ah shit, what have we created?”

I think I read something not long ago, too, by someone who was working on the iPad or the iPhone, something like that, saying that he has sleepless nights; he has children himself and is really confronted by the fact that lots of young people are completely addicted to their phones, and there’s a whole bunch of negative effects there having to do with addiction. There are stories in, say, South Korea, where people are dying in Internet cafes because they’re playing online games for hours and hours on end without drinking water or going to the toilet, and they’re just dying. Actually there was one famous case in South Korea which has a sad irony to it. There was a young couple with a real-life daughter who were completely addicted to playing an online game in which the purpose was to skill up and care for a little avatar of a girl. They were so addicted to this game that they would sneak out at night, leave the baby at home, and play for days on end, and in the real world their own daughter died of malnutrition and complete neglect. They were just completely addicted to this online game.

Derrick Jensen: We have about seven or eight minutes left. You hit one subject that I really wanted to hit, and I wonder if you can address it, but leave enough time to do another question. It has to do with your saying “dopamine-driven” and mentioning the word “addicted.” That was another part of the film I thought was absolutely brilliant, where you talked about the addictiveness of screen culture. Can you talk a little bit about that, about how you get a little hit of dopamine every time you see that somebody liked your comment?

Jordan Brown: That’s essentially it. It’s the same thing. There’s a segment in there about how there’s been a lot of research done on what drives gambling, what makes gambling so enticing. And there’s a behaviourist towards the end of the film who talks about this. It’s from a different era, I suppose; he wrote a book in the 60s talking about ways that you can encourage and exacerbate different forms of behaviour modification. And one of the analogies that I tried to make there was that the design of the technologies is similar to playing a slot machine, or something like that. Because a slot machine is built on the fact that you don’t know when you’re going to get these exciting and enticing rewards. So you go on playing the slot machine, constantly looking to get that little hit. It’s analogous to the online experience where you check your phone; one of the first things a lot of us do when we wake up is check our phone. What do you get? Sometimes you get an exciting new TED talk, or you get an email from the girl you’re trying to date or something like this. And other times you get nothing. You just get junk, or spam. So it’s a similar kind of phenomenon in the brain, in which you don’t know when you’re going to get that exciting and enticing reward, and that’s what drives the addiction. They call them “stickiness drivers”: the notion of “How can we make our product so enticing that people keep coming back?”

The technologists employ teams of psychologists to tweak very specific things about not only their programs, but the experience of the programs, to make them completely enticing and to keep people coming back. This idea goes way back to the early days of advertising: which colours do people respond to the most? Which media vehicle do we execute certain advertising messages in, to get people to do certain things? But I think the difference now is that with digital technology these insights are so much more valuable to psychologists and advertisers and the people who are working on marketing messages. They know the ways we will react best. They know how to push our buttons because we’ve disclosed so much about ourselves. Possibly in some ways they can know us better than we know ourselves. There are huge implications there; it’s deeply problematic.

Derrick Jensen: One of the things I love about your work—okay, my next question’s going to be: So what do we do about this? One of the things I love about your work, and you did, after all, turn my essay “Forget Shorter Showers” into another great film. I know that your sole solution to this is not going to be “Well, turn it off and step outside.” Because that might work for you or me, but that doesn’t actually solve the social problem. So having said all this—sorry to paint you into a corner here—but having said all this, what do you want people to do with the information? So somebody watches your film, and again, the name is “Stare Into the Lights My Pretties.” Somebody watches your film, what do you want them to do with the analysis and information they gain from this?

Jordan Brown: I guess the first thing would be—I mean, there are a bunch of things. One of the things that I would hope would come out of watching the film would be to look at ourselves, first, and question: how am I being influenced by technology, and what does that mean? What is my understanding of these technologies? How am I being manipulated, what’s happening? And then I guess the next part of it is: well, if I think that’s a bad thing, we do need to turn off the computers, we need to sort of walk away from this. But we also need to, I think, really have some solid reflection about many of the larger questions that stem out of this.

I would like people to ask the question of who benefits from this the most? Because I think it’s clearly not us. The corporations that are completely controlling our online experience have a really strong grip on our lives. I think that’s problematic. And I hope that one of the questions that people have after watching this film, and perhaps if they agree with that, is how do we tackle these huge corporate interests that have extensive influence over our lives? And one of the things that I’d hope would come out of that is basically a strong critique of capitalism, in that we need to—it’s not just about turning off the technologies and walking away. It’s about completely rethinking the social system that gives rise to these technologies and perpetuates them, and changing that, and really putting a spanner in the works. So we’re talking about completely rethinking the social and economic systems that exist in the world. Completely walking away from capitalism, I suppose, and imagining some other world. And I hope that’s possible.

Derrick Jensen: Well, it has to be possible. And the thing we haven’t really talked about is the environmental implications of digital technologies.

Jordan Brown: Well yeah, that’s very important.

Derrick Jensen: And if we can’t figure out a way to solve all this then there’s not going to be a planet.

Jordan Brown: That’s one of the big things, I think. Fundamentally these technologies aren’t sustainable. I think it was John Michael Greer who wrote a book called The Ecotechnic Future. I may not have this right, but I have a memory of him talking about the amount of energy that is required to run the Internet. And I guess we could look at this in other ways too, like the energy that goes into developing computers, or the rare earth minerals that are required to make computers exist. That’s completely not sustainable. But just looking at the Internet, the amount of energy required to make the Internet happen is quite astounding. I think I was reading something about how, at current rates of consumption, by 2020 the web-hosting industry will surpass the airline industry in global environmental pollution. So that’s quite astounding to think about: the total pollution output of the global airline industry will be smaller than what is output by the Internet by 2020, at current rates of consumption. And that ignores growth, so it will probably happen even sooner, given this complete explosion in digital technologies. The growth is insane.

The environmental impacts of technology I think are another thing that isn’t considered or spoken about as often as it should be. The environmental impacts are horrifying. We can talk about things like what’s going on in the Congo, the mining of the rare earth minerals that go into our gaming consoles and smart phones and tablets, all of these technologies. There’s a complete horror that’s happening in that part of the world. Human rights abuses and the civil war, warlords and pogroms, all the horrible things that are happening there, that are basically driven by companies like Sony and Nokia and Apple and so on. The environmental implications of digital technology span out in a bunch of ways. It’s a huge conversation in and of itself. The end point is that this is not something that can continue indefinitely. There is an end point to this. We need to choose if we’re going to have a soft crash and withdrawal from the screens in a way that can wake us up to real life again, or if we’re going to just stay embedded in these computers, completely distracted, withdrawn from the real world, and have a harsh, brutal reawakening to reality, while the real world burns.

Derrick Jensen: Well, I highly recommend that anybody who listens to this—how can people watch your film? That’s the last question.

Jordan Brown: They can go to my website or look for it on YouTube. Or the Internet Archive is probably another more interesting place to find this. It’s out there. You can find it. I hope people enjoy it.

Derrick Jensen: I recommend everybody watch it. And I would like to thank you very much for being on the program. I would like to thank listeners for listening. My guest today has been Jordan Brown. This is Derrick Jensen for Resistance Radio on the Progressive Radio Network.

* * *

END OF TRANSCRIPT

Derrick Jensen is a prolific author and radical environmental activist. He was named one of Utne Reader’s “50 Visionaries Who Are Changing Your World” and won the Eric Hoffer Award in 2008. His work covers topics including the pathology of abuse, domestic violence and the dominant culture; the problems of civilisation; and strategies for resistance. He has taught creative writing at Pelican Bay State Prison and Eastern Washington University, and has written for Orion, Audubon, and The Sun Magazine, among many others.