Stare Into The Lights My Pretties Mon, 20 Nov 2017 15:41:11 +0000




We live in a world of screens. The average adult spends the majority of their waking hours in front of some sort of screen or device. We’re enthralled, we’re addicted to these machines. How did we get here? Who benefits? What are the cumulative impacts on people, society and the environment? What may come next if this culture is left unchecked, to its end trajectory, and is that what we want?

Stare Into The Lights My Pretties investigates these questions with an urge to return to the real physical world, to form a critical view of technological escalation driven by rapacious and pervasive corporate interest. Covering themes of addiction, privacy, surveillance, information manipulation, behaviour modification and social control, the film lays the foundations as to why we may feel like we’re sleeprunning into some dystopian nightmare with the machines at the helm. Because we are, if we don’t seriously avert our eyes to stop this culture from destroying what is left of the real world.


Written and Directed by Jordan Brown. Original camera by Jordan Brown, Masao Tamaoki and James Tomalin. Music by Jore, Sigur Rós, The Cinematic Orchestra, Ólafur Arnalds, Bonobo, Soundsource, Bzaurie, Clark, Rollmottle, Ma Spaventi, Nils Frahm, Max Richter, Eunoia and Seame Campbell.

Additional footage credit where credit is due is made out to respective creators, some of whom are: Em Styles, Katerina Vittozzi, Eric De Lavarène, Isabelle Delannoy, Brian Frank, Yann Arthus-Bertrand, Ivan Cash, Yordan Zhechev, Ron Fricke, Monika Fleishmann, Raymond Delacruz, Rob Featherstone, Michael Mcsweeney, Juan Falgueras, Trevor Hedge, Jean Counet, David Kleijwegt, Godfrey Reggio, Naomi Ture, Chris Zobl, Siddharth Hirwani, Melly Lee, Refik Anadol, Marc Homs, Schnellebuntebilder, Kyle Littlejohn, Tobias Gremmler, Marina Wanderlust, Kristopher Lee, Brandon Johnson, Nicolas Fevrier, Judd Frazier, Ben Stevens, David Fedele, Frank Wiering, Rob Mcbride, Vido Yuandao, Justine Ezarik, David Machado Santos, Vasco Sotomaior, Wolfgang Strauss, Kornhaber Brown, Matthew Epler, James Kwan, China Techy, BigThink, Gigaom, Inc. Magazine, The Guardian, TED, TEDx, BBC, ABC, CNN, Indymedia; and all further credit where credit is due for unknown or unattributed creators whose work appears.

Content creators and/or participants may or may not agree with the views expressed in this film, which was made with no budget, not-for-profit, and is released to the world for free for the purposes of critical discourse, education, and for cultivating radical social and political change.

A very heartfelt thank you to Kyle Magee, Stuart Brown, Katina Michael, M.G. Michael, Debra Protaseiwicz, Nathalie Crawford, Aleisha Manion, Claire Hilton, Christopher Brown, Matthew Brown, Rachel Williams, Liz Shield, Oliver Grabinski, Matilda Stevens, Rachel Clutterbuck, Matthew Storen, Steve Fraser, Andrew Protaseiwicz, Antonietta Melideo, Eric Crouch, Deanne Love, Masao Tamaoki, Lelia Green, Aleks Krotoski, Susan Greenfield, Eli Pariser, Sherry Turkle, Andrew Keen, Nicholas Carr, Roger Clarke, Derrick Jensen, Jerry Mander, Bruce Schneier, Lierre Kieth, Clifford Nass, Douglas Rushkoff, Evgeny Morozov, Jon Ronson, Lewis Mumford, Nafeez Ahmed, Angela Daly, Mikko Hypponen and Rebecca MacKinnon.


1. Introduction
2. “Progress”
3. No Accident
4. Mindset (Screen Culture)
5. It’s All About Me!
6. The Megamachine
7. Creeping Normalcy (Surveillance Camera Man)
8. Vegged Out
9. It’s Full of Sugar and It Tastes So Nice
10. The Real World
11. Credits

Watch Online »






November 26th, 2017 Move Me Productions Film Festival Antwerp, Belgium
December 2nd, 2017 Barcelona Planet Film Festival Barcelona, Spain
December 28th – 31st 2017 The Independent Film Collaborative Dhaka, Bangladesh
January 2nd, 2018 The Mediterranean Film Festival Siracusa, Italy/Sicily
January 19th – 20th, 2018 European Cinematography Awards Warsaw, Poland
January 27th, 2018 Eurasia International Film Festival Moscow, Russia
March 2nd-11th, 2018 Festival international Signes de Nuit Tucumán, Argentina
March 8th, 2018 Plan A at Betahaus Berlin, Germany
March 7th – 11th, 2018 Silk Road International Film Festival Dublin, Ireland
March 9th, 2018 Barcelona Around International Film Festival Barcelona, Spain
March 18th, 2018 Indie Lincs Film Festival Lincoln, United Kingdom
March 22nd – 25th, 2018 Fuori Mercato Independent Film Festival Como Lombardia, Italy
March 28th, 2018 CRCLR House Berlin, Germany
April 11th – 14th, 2018 Festival Internazionale Segni Della Notte Urbino, Italy
May 11th, 2018 Melbourne Documentary Film Nights Melbourne, Australia
June 27th, 2018 Spotlight Documentary Film Awards Atlanta Georgia, United States
June 30th, 2018 Free Speech TV Denver Colorado, United States
August 9th, 2018 Channel 44 Television Adelaide, Australia
August 13th, 2018 Civic Media Centre Gainesville Florida, United States
September 21st, 2018 b-ware! Ladenkino Berlin, Germany
September 23rd, 2018 OmniCommons Oakland California, United States
October 10th – 14th, 2018 Bozcaada International Festival Bozcaada, Turkey
October 17th – 20th, 2018 One World Berlin, Human Rights Film Festival Berlin, Germany
October 19th, 2018 Channel 31 Television Melbourne, Australia
November 7th, 2018 Soapbox Gallery New York, United States
Somnolently Soliloquy by jore Thu, 08 Dec 2016 05:02:54 +0000


About this Track

Released 8th December 2016, independently.

Written, performed, recorded and mixed by Jordan Brown throughout the many rooms of a warehouse on Pitt St, Brunswick…

Instruments used were real drum kit, real bass guitar, real voice, real upright piano, real guitar with unreal computer wash, no synths allowed. Organic dreamy landscape emerges from audiomulch patch, pedal pushings, tape delay and the scissor-hand inside the computer.

Artwork by Jordan Brown, again inspired by cells—roots and tree is a flat scan of human brain neuron branches; sky, landmass and colours derived from lichen ascocarp as clouds and space. Disc is mistletoe stem micrograph by Keith Wheeler.

Thanks to Joshua Lapham, Rachel Clutterbuck, and Matthew Storen.

Dedicated to Aleisha Manion.


Somnolence. “A state of strong desire for sleep, or sleeping for unusually long periods. It can refer to the state preceding falling asleep/dream-like state.” Somnolently (the act of, or being like).

Soliloquy. “Dramatic or literary form of discourse in which a character talks to himself or herself or reveals his or her thoughts by speaking to oneself/internal monologue to the audience.”

Anti© – Creative Commons BY-NC-SA 2016.

Sketches to Remember Sanity by jore Mon, 20 Jun 2016 22:14:24 +0000



01. Uncolonising our hearts and minds (03:46)
02. Fingers glued to the screen which are glued to the eyes which are glued to the screen (03:58)
03. You don’t have me (14:07)
04. Strength to destroy the technics (04:38)
05. I can hear you (02:21)
06. Sleeping in the fluff of The Cloud (Virtual reality is a toxic mimic of Physical Reality) (12:34)
07. Five Eyes, so many eyes, pay attention (03:21)
08. The sound of Predictive Analytics algorithms (02:34)
09. Warning tones (01:36)
10. What’s left of the world (04:01)
11. The Torrent Sea (06:16)
12. Do not give up (03:37)

About this Album

Released 21st June 2016, independently.

A record of soundtracks and landscapes centred on the theme of this culture’s addiction to the virtual world while the real world burns, its people too busy glued to screens like zombies. It’s insane.

This record is a call to build a culture of resistance, for strength and resilience in fighting back, before it’s too late.


Written, performed, recorded, mixed and produced by Jordan Brown. 2016.
Cover artwork inspired by a microscopic slide of the sclereid cells of a pear fruit.
Thanks to Krzysztof Szkurlatowski, Alison Ferrett, Pegs Adams, Derrick Jensen, Nathalie Crawford, and Katina Michael.

Anti© – Creative Commons BY-NC-SA 2016.

Forget Shorter Showers Tue, 25 Aug 2015 07:28:16 +0000



Would any sane person think dumpster diving would have stopped Hitler, or that composting would have ended slavery or brought about the eight-hour workday; or that chopping wood and carrying water would have gotten people out of Tsarist prisons; or that dancing around a fire would have helped put in place the Voting Rights Act of 1957 or the Civil Rights Act of 1964?

Then why now, with all the world at stake, do so many people retreat into these entirely personal “solutions”? Why are these solutions not sufficient? But most importantly, what can be done instead to actually stop the murder of the planet?


Words based on the essay of the same title by Derrick Jensen. Original camera by Jordan Brown and Masao Tamaoki. Music by Jordan Brown, Chris Clark, Alan Whitehead, Amon Tobin, Horace Silver, Bonobo and Murcof. Additional footage credits to respective creators, some of whom are: Chris White, Babak Tafreshi, Melissa Parker, Colin Riche, Alex Delany, Sam Irwin, Jeff Grewe, Chris Pritchard, Ben Knight, Danny McShane, Joel Schat, Simon Waller, Ron Fricke, Franklin López, Michel Benjamin, Dominique Gentil, Emily James, Mårten Nilsson, Lukas Eisenhauer, Josh Fox, Peter Mettler, and Indymedia. And all further credit where credit is due for unknown or unattributed creators whose work appears.



March 5th, 12th-13th, 2016 Project Native Film Festival Massachusetts, United States
April 3rd, 2016 Craque-Bitume Quebec, Canada
April 14th – 15th, 2016 Georgetown University Film Festival Washington DC, United States
April 23rd, 2016 Earth Port Film Festival Massachusetts, United States
May 20th, 2016 Ekotop Envirofilm Festival Bratislava, Slovakia
May 21st, 2016 May Day Sustainability Shorts Film Festival Illinois, United States
July 29th – 31st, 2016 Social Machinery Film Festival Mantova, Italy
July 27th, 2016 Where Is The Horse Film Festival Monterrey, Mexico
August 5th – 7th, 2016 G2 Green Earth Film Festival California, United States
October 6th – 16th, 2016 Awareness Film Fest Los Angeles, United States
October 6th, 2016 Wandering Reel Traveling Film Festival Tofino, Canada and more
October 7th – 9th, 2016 Below Five Zero Melbourne, Australia
October 14th – 16th, 2016 Kuala Lumpur Eco Film Festival Kuala Lumpur, Malaysia
October 15th – 19th, 2016 Ekotop Envirofilm Festival Prague, Czech Republic
October 17th – 23rd, 2016 Life Sciences Film Festival Prague, Czech Republic
October 21st – 31st, 2016 American Conservation Film Festival West Virginia, United States
November 7th, 2016 Myrna Loy Center Montana, United States
November 16th – 20th, 2016 OFF Cinema Festival Poznań, Poland
January 5th, 2017 Egyptian Theatre Idaho, United States
January 12th – 16th, 2017 Wild & Scenic Film Festival Nevada City California, United States
February 12th, 2017 The Siskiyou FilmFest Grants Pass Oregon, United States
March 7th, 2017 The Flying Monkey Movie House Plymouth, United States
March 18th, 2017 Salida Steam Plant Theatre Colorado, United States
March 18th, 2017 Auburn State Theatre California, United States
March 25th, 2017 Picture Farm Film Festival Brooklyn New York, United States
March 25th, 2017 Columbia Schoolhouse Cultural Centre San Juan, United States
April 5th, 2017 Violet Crown Cinema Charlottesville, United States
April 6th, 2017 Harrisonburg Court Square Theatre Virginia, United States
April 14th, 2017 Whitman College Washington, United States
May 11th, 2017 California State University, Chico California, United States
May 13th, 2017 Doris Duke Theatre Honolulu, United States
September 20th – 24th, 2017 Matsalu Nature Film Festival Lihula, Estonia
October 12th, 2017 Greenwich West Arts Centre London, United Kingdom
October 14th, 2017 Reflection Riding Arboretum & Nature Center Tennessee, United States
October 22nd, 2017 Red Mountain Theatre Alabama, United States
October 26th – 29th, 2017 Noosa International Film Festival Queensland, Australia
October 28th, 2017 Soper Reese Community Theatre California, United States
October 28th, 2017 Henderson Campus Center, Allegheny College Pennsylvania, United States
November 2nd – 6th, 2017 CMS VATAVARAN Film Festival and Forum New Delhi, India
November 3rd, 2017 Auburn University Goodwyn Hall Alabama, United States
November 10th, 2017 Museum of Northern California Art California, United States
November 10th, 2017 Flying Monkey Arts Center Huntsville, United States
November 16th, 2017 Roxy Theatre, Missoula Montana, United States
November 16th, 2017 Seacrets Jamaica USA Montana, United States
November 25th, 2017 Columbia Valley Centre British Columbia, Canada
December 8th, 2017 Great Falls Myrna Loy Center Montana, United States
December 9th, 2017 Wimberley Corral Theatre Texas, United States
December 16th, 2017 Auroville Film Festival Tamil Nadu, India
January 19th, 2018 Kino Moviemento Berlin, Germany
January 20th, 2018 Russell C. Davis Planetarium Mississippi, United States
January 24th, 2018 Cottonwood Cinema Montana, United States
January 25th, 2018 Judith Theatre Montana, United States
January 25th, 2018 River Oaks Theatre Texas, United States
January 25th, 2018 Rose Hills Theatre, Pomona College California, United States
February 3rd, 2018 Prince Charles Secondary School Creston, Canada
February 17th, 2018 Rosenheimer Moviemento Kino Munich, Germany
February 28th, 2018 ARTisan Berlin, Germany
April 11th, 2018 Punkfilmfest Berlin Berlin, Germany
April 25th, 2018 Renewal Film Series, Soapbox Gallery New York, United States
July 4th, 2018 Indie Best Film Festival California, United States
August 4th, 2018 Cool Community Hall California, United States
September 6th, 2018 Carrboro Century Center Chapel Hill, North Carolina, United States
September 8th, 2018 Schaible Auditorium Fairbanks, United States
September 13th, 2018 Roxy Theatre Revelstoke, Canada
September 20th, 2018 RA International Independent Film Awards Vilnius, Lithuania
September 21st, 2018 Escalante High School Utah, United States
October 27th, 2018 Sierra Nevada Brewery Chico California, United States
Interview with Angela Daly about The Surveillance State Mon, 15 Dec 2014 09:22:16 +0000


Interview for current documentary project.

Angela Daly is a research fellow in media and communications law at the Swinburne Institute for Social Research in Melbourne, Australia. Her PhD thesis is on the corporate dominance of the Internet.



[0:00:00] Jordan Brown: So the governments and corporations in synergy have a grip on our lives, empowered by technology, like never before. What does that look like?

[0:00:10] Angela Daly: Well I think the kind of government-corporate access looks like the same technology, or the same design, kind of having a dual function: one, a commodifying function, a capitalist function, whereby money is made and profit is gained for the companies; but it also has a secondary function whereby surveillance and control are facilitated for governments as well. Key to this is a kind of vast gathering of data about us, about people, whereby that data is very valuable to corporations because they can make money out of it, and it’s valuable for advertising, to sell on to other companies, and so on and so forth; but it is also incredibly valuable to our governments too, to have this very rich and very personal information about all of us.

[0:01:05] Jordan Brown: Can we talk more about what the corporatism—specifically looking at the Internet—what does that space look like? And I’m thinking now of how corporations have physical control over the infrastructure, they build the devices, a lot of them are rolling out the services. What does that look like? Can you talk about that generally? I’m thinking about Google and Facebook…

[0:01:47] Angela Daly: Well I suppose, I think, I’m not a historian necessarily of the Internet and of technology, but my understanding certainly is that it’s really business models—like Google’s in particular—whereby this data gathering has been key to Google succeeding as a business, and suddenly governments, particularly the US government and its allies in other parts of the world, got very interested in this huge amount of information that was being gathered, if they weren’t interested before. So it’s hard to tell; it’s a bit of a ‘chicken and egg’ situation about what came first, but my inkling is that this business was set up and then became extremely successful, large amounts of data were gathered, and that was highly convenient for the government surveillance apparatus as well. I think also, it’s a coincidence of various events too. I mean, around the time Google was formed, this is kind of the “9-11” or the “post-9/11 era”; there was a great kind of desire from the government’s perspective to engage in a whole lot of surveillance, and justifications that the general public were scared into accepting essentially; and also coinciding with the kind of rampant neoliberalism, the ‘end of history’ from the collapse of the Soviet Union onwards. So I’d say that all these things and more have kind of contributed to the situation that we’re in; to why corporations are doing this and why they’re being allowed to do this, and also what the state interest is too, and I don’t think it’s easy to understand without understanding all these other trends going on around the same time.

[0:03:33] Jordan Brown: Sure. I was thinking then, as you were talking, about how there’s a note somewhere that I read: when Google was starting up, they got some sort of venture capital, some kind of injection of money from DARPA; and how later on, Google was found to be involved in—and this is early on in the piece, I can’t remember what year—something called Total Information Awareness, in the States. That was one of their ‘first’ sort of efforts to pool together a whole bunch of third-party data-stores and use it for profiling. They [the government] did it across a lot of different data sets—financial records, etc. Google was a player in that, and Google was doing private dealings with the NSA, selling search equipment, etc.

[0:04:33] Angela Daly: I guess for all the rhetoric around ‘small government’ and neoliberalism, that has not actually kind of played out in practice, and certainly not in the tech sphere. Again, for all of the Silicon Valley libertarianism, there seems to be a very symbiotic relationship with certain parts of the US government. So not the kind of regulatory parts as such—these companies will recoil from the idea of more regulation of their business practices—but there has been a kind of symbiotic relationship with other parts of the government, particularly the US government, when it comes to, as you mentioned, selling surveillance apparatus or equipment, or taking money from these institutions, working in concert with them to develop new products and so on. So again, setting aside neoliberalism as an idea and as a phenomenon, it’s a complex relationship—not a simple relationship—between companies, corporations, and the government; and as this has played out in the surveillance and Internet area as well, it’s not an easy relationship either.

[0:05:42] Jordan Brown: You mentioned the commercialisation of the Internet, and I want to talk about how we see the pervasiveness of targeted advertising. So maybe we can start really generally with that and then pick it apart. Could you explain: What is it? What is targeted advertising?

[0:06:12] Angela Daly: I suppose it’s advertising kind of coming up on websites that you are visiting, based on your previous browsing history, or, in the context of Google, what you’ve been searching for before. So, based on your web browsing history, your previous actions online, what you’re seeing supposedly is adverts tailored to you. Of course, I mean, some adverts are better tailored than others, but the idea is, rather than kind of advertising to the mass overall, to everyone, where not everyone will be interested in the same things, trying to kind of advertise to niche interests, to niche groups of people—the idea being this will be more effective from the advertiser’s point of view.

[0:07:10] Jordan Brown: It can be insidious though.

[0:07:12] Angela Daly: Sure. I suppose inherent to targeted advertising is this surveillance and profiling of people and their habits, their online habits, and even to the point of trying to predict what they would be interested in, or not; with, you know, that archetypal example being the woman where Target knew she was pregnant before her father even did.

[0:07:41] Jordan Brown: The Dad got really upset.

[0:07:43] Angela Daly: Sure. There are very difficult and unpleasant consequences to that for our privacy and integrity I guess too; our autonomy, and being in control of what we reveal and don’t to others.

[0:08:06] Jordan Brown: I’ve got a quote by Douglas Rushkoff … Digital Nation and Generation Like … picking apart [targeted advertising]—I don’t want to say subtle, but this really insidious and kind of clever way that targeted advertising assembles itself. I mean, sometimes it can be very loud and direct, but other times it’s more… I guess we’re sort of trending towards the concept of the Filter Bubble—the goal of the advertiser is to make the content indistinguishable from other content.

[0:09:04] Angela Daly: Sure. I suppose kind of behind the scenes as well, there’s a total lack of transparency over what data has been collected about people, who is using it, who’s selling it on to others and for what purposes, and I think this has even attracted the attention—in the US of all places—attracted the attention of the Federal Trade Commission, where I believe they’ve recently released a report on data brokers. So, kind of these very shadowy companies usually that are kind of selling on this data to others, which has been collected by others; and it’s very hard for an individual user to really know what is being collected about them and who is using it and what consequences that may have—particularly somewhere like the US, where that can determine things like your credit rating, health insurance, and so on and so forth.

* * *

[0:10:05] Angela Daly: In Europe, where there is stronger privacy law, data protection law, these kinds of practices are more restricted, but that isn’t to say they don’t happen. The enforcement of these laws is also a real problem, particularly when the Internet is transnational by its very nature; and so even for Europeans, their data—they’ve maybe been profiled by organisations in other countries such as the US, where although in theory there are protections of privacy, and particularly of Europeans’ privacy, through a whole series of agreements, in practice these protections are not well enforced; nor does anyone really know what’s going on, and that’s a real problem, let alone for those who are living in countries where there are not very strict privacy laws. Even the privacy laws that we have are not particularly well adapted to the current situation of vast amounts of data being gathered about all of us whenever we use the Internet and all those other kinds of technology.

[0:11:14] Jordan Brown: I’m thinking now about the data-brokers like ChoicePoint, Acxiom, Quantium, LexisNexis; and them being the big, well-known corporate data-brokers.

* * *

[0:11:30] Jordan Brown: This is Total Information Awareness’s logo. It’s a pyramid with an eye on the top of it… How crazy is that? So creepy…

[0:12:01] Jordan Brown: Unpacking this idea of data analytics, assembling massive amounts of data, mining through it for patterns, trying to predict things; we’re essentially talking about—in one aspect definitely—some kind of manipulation, right? Both on the individual level and on the societal level.

[0:12:26] Angela Daly: Well I think how these trends are playing out is very much in a nefarious way essentially. I personally do not support the vast gathering of data about everyone and so on and so forth, but I do accept that perhaps this kind of idea of ‘big data’ and ‘big data analytics’ may, in certain circumstances, be beneficial. So I wouldn’t rail necessarily against vast gathering of data per se, since I can see that there are some positive uses of it. But I think we need to be very aware of who really is behind a lot of this data gathering and what the real purpose is; and the fact too that a lot of this data is very badly stored—very insecurely stored—yet may be very intimate data about individuals. So I think the critique is not necessarily the technology itself, but really what the uses of this technology are and for what purpose, and who’s doing it; and I think there is a lack of awareness firstly, and a lack of critique of this.

[0:13:40] Jordan Brown: Definitely. What about the fact, with ‘big data’—sure, we could argue some ‘beneficial uses,’ but at the end of the day, the government can just come in and take that anyway? And the data might be totally innocuous, but this is not the point though. Once you add it with perhaps another data set or maybe look through other patterns of seemingly innocuous data, you can still pull out a lot of richness from that, right?

[0:14:09] Angela Daly: Sure. I guess this is what I mean about, in practice, this being highly concerning given what’s happening, given what seems to be the strong government interest in gathering as much data about all of us as possible, for who knows precisely what purpose. I think as a result we should be resisting this data collection despite potential positive elements, essentially because I don’t think the people engaging in this data gathering, nor the powers that be above them, have demonstrated that they are trustworthy enough to look at our data essentially, and to manage our data and to use it for ‘good purposes’ or beneficial purposes. I don’t feel that this has really been demonstrated to us and therefore I don’t find them trustworthy at all.

[0:15:00] Jordan Brown: Can I ask you if you think that’s inherent in some of the technologies though, the fact that a lot of these digital technologies create a trail regardless, which then can—and as we see—empowers the surveillance state?

[0:15:17] Angela Daly: I think this goes back to what I was saying at the beginning. I think also the design of this technology is very important (I suppose, I didn’t quite say that). I think the design of technology is very important, and particularly when you look at, for instance, Google’s business: Google’s services have been designed to gather lots of information and data about the users for, I would imagine, initially Google’s own profit-making purposes, but this also has an ancillary—a very strong ancillary—benefit to government surveillance purposes as well. I think what is particularly worrying for me is the transition to mobile devices. So ‘smartphones,’ tablets and even now kind of wearable technologies, things like ‘Fitbit’ and so on, with which there is no option to opt out of this data gathering when using these devices. So it’s not only the kind of apps that people are using, it’s also the devices themselves. So at least on a kind of traditional PC or laptop, every time you were using the machine, particularly if you were offline, you weren’t necessarily contributing to this data gathering, whereas with these devices there is no choice. At least in the commercial offerings, you cannot buy a device whereby your data is not harvested at all times that you’re connected to the Internet, and probably even connected to a 3G telecoms tower as well. So I see this as being particularly worrying. Even if you want to protect your privacy in using certain kinds of devices, this is now pretty much impossible. Yet, that’s the way things are going.
And so if you want a mobile phone these days then you don’t really have the option of buying one—at least somewhere like Australia—whereby it’s a kind of ‘dumb phone’ rather than a ‘smartphone,’ and every time you use it you are able to protect yourself and your privacy a bit more than perhaps… So I think the technological developments, particularly in the last 10 years, have been very much with this design, that all data, or as much data as possible, can be gathered about the user, and there is very little the user can do to kind of opt out of that.

[0:17:46] Jordan Brown: I’m thinking of that old line, “We’re all walking around with a tracking device in our pocket that just happens to make phone calls.” While you were talking, I was thinking about how the same would be true with the ‘Internet-of-Things.’ Maybe you’ve opted-out, and you’re not even carrying your tracking-device-that-just-happens-to-make-phone-calls down the street with you, but what happens when the environment is embedded with sensors, much in the same way as we have prolific CCTV everywhere? It does beg the question of ‘choice.’ I mean, even the choice of having a mobile phone these days is not really much of a choice.

[0:18:34] Angela Daly: Sure. For instance, in the West, in developed countries, opting out of online-anything is incredibly difficult when you know the government is delivering services online as well. It’s kind of, you have to opt out of society as a whole to opt out of this data gathering and surveillance.

[0:18:55] Jordan Brown: But that’s impossible though, right?

[0:18:57] Angela Daly: I mean, sure, yeah. Actually, I think in Australia, there’s an interesting case of an Aboriginal man who is trying to opt-out of the system entirely, not necessarily for anti-surveillance purposes, but I guess because he does not recognise the legitimacy of Australia in any respect.

[0:19:18] Jordan Brown: Yes, for sure.

[0:18:57] Angela Daly: I think there was an article in The Guardian about him recently which was really interesting, about how he is trying to live without interacting with the state apparatus in any way, and it’s incredibly difficult. So there you do have these examples of people who are trying to opt out, and it’s incredibly difficult. Also, going back to technology, what you’re saying is totally right—so even people who don’t carry mobile phones, don’t have a ‘smartphone,’ don’t have Facebook and Google accounts and so on, are still being profiled, are still being tracked through others. So there’s this whole phenomenon, I guess, of what are called shadow profiles on Facebook; whereby, for instance, people who are not on Facebook will have photos uploaded of them. Facebook uses facial recognition technology—although it’s not supposed to in Europe, but who knows whether it actually does; certainly in other countries—it uses this facial recognition technology and can identify people’s faces. So if you have friends or relatives who are not on Facebook but people are uploading photos of them, then Facebook can presumably create these kinds of private profiles, let’s say, of people. So they have data gathered already about individuals who’ve not opted into the service.
Similarly, now that we all have ‘smartphones’ or tablets with cameras, to what extent these cameras are working when they’re officially ‘off,’ and to what extent images of others are being captured, is, for me anyway, an interesting issue to explore. Given some of what we do know already about ‘smartphones’ and tablets and the tracking thereof, and also what we already know about webcams being activated when they’re supposed to be deactivated, then yeah, I kind of agree that many of us now have these tracking devices, and even for people who want to opt-out of them, it’s more and more difficult to do so, particularly if you’re living in a big modern city where, even if you want to opt-out, everyone around you has not.

* * *

[0:21:40] Jordan Brown: Maybe you’ve covered it a little bit, but what are the implications of this working in tandem?

[0:21:59] Angela Daly: What I will say is that I think we’re at a really crucial, key moment right now, in a whole lot of respects, but particularly: this apparatus is all in place, we have a kind of happy marriage between governments and big corporations in a whole wide range of spheres, but technology corporations especially; the situation suits them well, but not necessarily the general public. And certainly, when surveys are done about people and their views on privacy, generally people do value privacy, they would be happy to pay for privacy-enhancing alternatives, but the market is not offering this. So…

[0:22:50] Jordan Brown: Sorry, I was just going to interject there. Is that even possible though, with what’s already been built? I mean, some talk about encrypting their e-mails, and that’s why we’ve got the spooks going head-on with, “How do we intercept the private keys, so that even if we sit in the middle we can read that communication”—which is of course what they do by tapping the fibre links, the connections between those things. Even if you’re going after so-called ‘privacy enabling technologies,’ it’s like this arms-race…

[0:23:28] Angela Daly: Yeah, exactly. … I don’t think there’s any interest on behalf of corporations or the government to improve our privacy, to at least respect our privacy rights, or to enact laws which are going to force either governments or corporations to provide more privacy protection to individuals. And we can see why, because this situation suits both parties very well, even if it’s not satisfactory for the general public as such. But also, the general public have been persuaded that this is okay, that “they should be willing to give up their privacy because of ‘terrorism’ and serious crime” and so on and so forth, even though there seems to be very little evidence that such vast data gathering and surveillance actually prevents terrorism or serious crimes from happening. Various terrorist attacks have happened in spite of this kind of vast data gathering and surveillance, but nevertheless that’s certainly the rhetoric; the justification is very much one that people are buying into to some degree, because they are worried about these things and they want to feel safe, they want to feel secure, and so they have been persuaded that this is a kind-of-okay bargain to make. And I think, nevertheless, in certain countries—I think Australia in particular—when it’s come to discussions of these recent national security anti-terrorism laws and proposed mandatory data retention, the media and even opposition politicians have been too scared and weak—or even not very well informed—to oppose some of these laws and proposals that are highly intrusive of civil liberties and privacy and so on. I think that this rhetoric around security and terrorism does have to be critiqued; it can’t be a subject in society that we’re unwilling to talk about, unwilling to criticise or think about critically, let’s say. 
However, the status quo at the moment is very much that these things cannot be questioned or critiqued, and I think that there are too many politicians, particularly in supposed opposition parties, who are playing it too safe with this kind of thing.

[0:26:24] Jordan Brown: A few things I wanted to say there. The ‘terrorism’ drawcard is one that is used often. … One small insight I had into this was, I think a few years ago now, when Nicola Roxon—on ABC Q&A, or maybe it was just in a newspaper article—said something like, I think it was about some climate activists, that climate activists are in the same league as terrorists because they pose threats to critical infrastructure. And that blew me away. Because what they’re doing is lumping legitimate political activists into the same league as terrorists, which justifies the use of these vast spook apparatuses against legitimate political expression. That’s massive, right?

[0:27:29] Angela Daly: Well I think, certainly from the UK as well, we can see people labelled as kind of ‘domestic extremists.’ I remember I think we had Green party politicians labelled as ‘domestic extremists’ in the UK. So precisely what being a ‘terrorist’ or an ‘extremist’ means seems to be given a very wide definition, and includes what seems to be legitimate thought, opinion and political activity as well. And so again I think this is a good reason to be very critical of increased powers for the police and other law enforcement agencies on the basis of preventing ‘terrorism,’ given that this is given a very wide definition. And also, I suppose from the European perspective, we’re at least five if not more years into austerity, financial crisis, and so on and so forth; a lot of very unhappy people and a lot of real social issues in Europe at the moment; and so a lot for people to be very unhappy about and very critical of current political orders, and corporate orders for that matter as well. And when terrorism or extremism is defined so broadly, we may even be contributing to not very good politicians and corporations—such as banks, for instance—remaining in their position in society. So I think there is a kind of protectionism from people who have political or corporate power, and this is one of the ways that they are protecting themselves too: having laws which seem to make certain kinds of political expression difficult if not illegal.

[0:29:20] Jordan Brown: And the vast surveillance state at their disposal to keep that in check.

[0:29:24] Angela Daly: Sure.

[0:29:27] Jordan Brown: That’s crazy. Just thinking on that point, I printed out some news articles—and this is so not extreme—about people who were planning some street theatre when the royal wedding was on, who were rounded up pre-emptively. They hadn’t even done anything yet; they didn’t get to do their action, which would’ve been deemed a ‘breach of the peace’ or whatever bullshit; they didn’t do it. This was rounding them up the night before, because they were planning to do something—which was ‘conspiring’ to ‘breach the peace.’ And in the way that this was reported, there was an interview with the woman who dressed up as a zombie for the piece of street theatre, and she was saying that from the way the police were questioning her, they would’ve had to have had access to her social media and her emails, because they were asking her about very specific things…

[0:30:24] Angela Daly: Once we start kind of abrogating civil liberties for terrorism or ‘political extremism’ then it’s kind of difficult to stop that. Or, once we start giving up our rights and liberties, it’s difficult to stem that flow. And I think this is what we’re beginning to see—certainly in countries like the UK and the US, and I think other parts of the world as well; that the justifications for becoming more intrusive of rights and liberties begin to proliferate. For instance, in Australia, you now have I think a law in Tasmania preventing certain kinds of environmental protest or secondary boycotts. … Once you start down this path then it can be difficult to turn back and stop, even if so-called threats, or the original threats may not be as threatening anymore, new threats seem to be found or are manufactured.

[0:31:31] Jordan Brown: Yes. It’s like a ‘creeping normalcy,’ slow changes—and they’re not even slow now—changes over time, “We opt into this one inch at a time.” I’m thinking: Is the same true with technology? For instance, once you roll out CCTV everywhere, it’s difficult to take that back, right?

[0:31:51] Angela Daly: Well, at least when it comes to things like, let’s say—I suppose it’s a term which is kind of used in a very specific way, but let’s say—the Internet infrastructure: if either the network level or the over-the-top services level is set up with a particular design, it becomes difficult to change that fundamental design. So the fundamental design is one where data is intercepted and gathered, or even created in the first place, and documented. Then, if that’s been the way things have been going for a while, there is so much—secondary services, infrastructure, and so on—which is based on that happening. So it can be difficult to shift. It would be a big paradigm shift, I suppose, to make such changes. Not impossible, but certainly a big difference.

[0:32:46] Jordan Brown: Yeah. That’s huge too. Because it means not only changing the technological paradigm, but also the social and cultural and political—heaps of things.

[0:33:01] Angela Daly: I suppose it’s ironic as well, given that the Internet was set up on a decentralised basis because decentralised networks are more resilient in many ways—or so I’m told anyway—and at least if one part of the network stops working or malfunctions in some way, the rest of the network can be okay. But aside from some of these other socio-cultural and economic changes, we’re also seeing a kind of centralisation of the network, or of the infrastructure and services, in many ways, which is very different to the Internet’s initial design.

[0:33:37] Jordan Brown: How did that happen? I’m interested in: How does corporatism and the co-opting of that decentralisation, turn into ‘180-opposite,’ total centralisation?

[0:33:52] Angela Daly: Well I suppose, certainly from the corporate or economic perspective, to some extent businesses don’t want to be regulated, and certainly big businesses don’t want to be regulated … Corporations don’t want to be regulated, and they’ll often make a big song and dance about not being regulated. However, for big corporations—even if that’s the rhetoric they employ—actually, regulation is something that they can afford and that their smaller competitors may not be able to afford, and so this kind of centralisation perhaps is an effect of certain regulation being brought in. So things like data retention, for instance: that actually does impose, or is likely to impose, a cost on businesses, based on what’s happened in Europe, and certainly here that’s very much part of the discussion at the moment. The big businesses like Telstra will be able to bear that cost; however, Telstra’s smaller competitors may not be able to bear that cost. So even if large companies may argue against such regulation based on cost, in many ways it may be beneficial to them—or if they’re playing a longer game, it may be beneficial to them economically as well—in that this will take out smaller competitors, make it more difficult to compete with them, and they will continue to have a large market share. So basically, this kind of regulation can be a barrier to entry for potential competitors and therefore is, from a longer-term perspective, in the interests of existing big corporations. I don’t know if that answers your question…

[0:35:43] Jordan Brown: Sort of. So you were saying that regulation led to the centralisation of the decentralised model…

[0:35:55] Angela Daly: I guess so. Also, I suppose there are network effects as well. So, despite the network infrastructure of the Internet being highly decentralised, the over-the-top services, particularly “Web 2.0” services, have been very much based on networks, and social networks in particular; so, for instance, Facebook is the biggest social network. Even though it’s been declining in certain respects, and maybe doesn’t have as many members or new members as it once did, it still has, I don’t know, a billion people connected? Something like this?

[0:36:32] Jordan Brown: Yeah, I should’ve looked that up, sorry.

[0:36:34] Angela Daly: So obviously if you join Facebook, you join Facebook because you know a whole lot of people on there already… It’s a big switching cost to switch to another platform where you don’t know anyone, or have few friends already online. So that also is a kind of phenomenon that we see on the Internet which promotes a kind of centralisation in certain services; or, for instance, using Skype as a voice-over-IP service. There are other alternatives to Skype, but if all your friends and family are using Skype, then that’s what you’re going to use. There’s also, arguably, a lack of interoperability between different services as well. So I think this promotes centralisation at that point.

[0:37:20] Jordan Brown: And by design too. Because they “don’t want you to go.”

[0:37:23] Angela Daly: Sure. Exactly. So this kind of lack of interoperability certainly is something which is by design because that kind of protects that company against competitors in the market as well.

* * *

[0:37:41] Jordan Brown: What are your concerns with surveillance—but let’s unpack that with: What are we dealing with already, right now? What is that? What’s happening? What’s it look like?

[0:37:49] Angela Daly: Okay, so I think right now, I would say particularly in the English-speaking—or the ‘five eyes’—countries, so US, UK, Australia, New Zealand, Canada, we’re dealing with a vast surveillance and data gathering apparatus which has been semi-secret for a long time, and which we only know about really as a result of various whistleblowers—prominently Edward Snowden, but not only Edward Snowden; he’s one of a line of people who have blown the whistle on what our governments and law-enforcement agencies have been engaging in; stuff which has been very untransparent. So there’s not been a very upfront discussion of this kind of surveillance apparatus…

[0:38:40] Jordan Brown: Even actively hidden—they lie about it, and have lied about it.

[0:38:44] Angela Daly: Sure. There’s been a total lack of transparency, and it seems that our governments, law-enforcement agencies and so on have only started to become more transparent when they’ve been forced to be—in response to various leaks and whistleblowing and so on. So certainly, given that these countries profess themselves to be ‘liberal democracies,’ these kinds of activities ought to be subject to scrutiny—or at least knowledge—by the people, if not by our representatives in parliament, who seem also not to have been particularly aware of these developments. I think it is highly concerning that we seem to have parts of our governments and administrations that have arguably gone a bit rogue from a democratic perspective as well. And the justification of the moment anyway is the ‘War on Terror,’ and how that has morphed over the last, I suppose, 13-odd years since 9/11, essentially. But of course there’s a long history of surveillance and the threat-of-the-moment too. In the UK it used to be Irish-republican terrorism; now it’s Islamist terrorism. It’s almost even ‘environmental terrorism’ or ‘extremism,’ or leftism even…

[0:40:28] Jordan Brown: Yeah, that’s actually a thing: ‘Eco-terrorists.’ They call it that.

[0:40:32] Angela Daly: Yeah, so, I think this does have to be put in a historical perspective as well, that we’re not living in necessarily… All times are exceptional times, and there are threats that are used to justify invasions of civil liberties and rights, and the current justification is kind of the ‘War on Terror.’

[0:40:59] Angela Daly: Yeah, and if you start protesting against it… Well, what I want to say is that I think it has to be put in a broader historical perspective. Certainly post-World War II, this is kind of the latest in a line of… well, I suppose, developments in surveillance, but also developments in military, law enforcement and spy agency cooperation that have been going on for decades.

[0:43:43] Jordan Brown: And the collusion with the corporate interest in that?

[0:43:47] Angela Daly: Sure.

[0:43:50] Jordan Brown: So, what are your concerns, given what we’ve just talked about?

[0:43:55] Angela Daly: Well I suppose my big concern is the invasion of ordinary citizens’ privacy as a result of… and not just privacy, other rights and liberties as well, but privacy I guess is the biggest and most obvious one here. So I think that’s highly concerning, particularly… I mean, not that I really think surveillance would be justified even then, but what I want to say is: the vast majority of people are subject to this kind of data gathering, if not actual surveillance, even if they’re not doing anything wrong, or have never been involved in crime, even petty crime. And so it’s totally disproportionate that the rights of normal people who are not engaging in anything nefarious or bad, let alone terrorism, are being infringed and impinged upon by this kind of surveillance and data gathering apparatus.

[0:45:02] Jordan Brown: So why should we care then?

[0:45:05] Angela Daly: Well, aside I suppose from theoretical or philosophical reasons… What I want to say is: aside from the fact our rights are being infringed, aside from the rhetoric around that, there are some practical consequences as well. So firstly, this vast amount of data about all of us is not being stored in a very secure fashion. There have been some high-profile hacks of data stored about individuals; I think earlier this year there was a leak of data about asylum seekers—so people in a very vulnerable position in various ways, who at least have argued that they were escaping torture in their own countries, so if they were sent back to those countries then a whole lot of information would be known, or could be known, about them as a result of this leak. So that’s one particularly bad example. But if this information was stored securely and only subject to access by very limited people, then perhaps it would be more justifiable. That doesn’t seem to be the case. Even Edward Snowden himself was a government contractor—he wasn’t even working directly for the US government. I believe what he said, anyway, was that a huge number of people, thousands and thousands of people, had access to huge amounts of data about people not just in the US but all over the world. And so I think there is very little in the way of security, there is little in the way of checks and balances. And also, what is this data being used for, or what could it be used for? Once the data is there, it’s difficult to get rid of—there are various ways, technically speaking, anyway—but it’s also valuable to the government and also to corporations in various ways as well. 
So, I suppose it remains to be seen what precisely is going to be done with this over the course of our lifetimes—with the generations that are growing up now who may be photographed, whose photos are put on Facebook from when they’re babies—what’s going to be done with all of this information? It may be used in ways that are not particularly equitable or democratic, for instance, and that I think is a big worry. And this is all very unnecessary, I suppose, as well; it’s not necessary that all this information is gathered about all of us, particularly when it happens outside of our control, or when we have only limited control over it. And so I think these are the concerns about why this data gathering is at least suspect, if not bad in itself.

[0:47:56] Jordan Brown: I’m thinking now about the prospect of, and even William Binney said this, how the real power in collecting all of this data is the retrospective analysis. So yeah, sure, even if you aren’t doing anything ‘bad’ now, with generations growing up now and that’s all fine, at some point in their lives, say 20 years down the line, or whatever, you know, you can pull up their entire life. And to me, that’s really chilling. Because the implications for that are huge, right? It means you’re not a target now, and you might not be, but you could be at any given point…

[0:48:45] Angela Daly: And I suppose another thing to say, too, is that this data may not all be very accurate either. So if decisions are being made based on this, and potentially this alone, that’s also worrying, because it may not actually be an accurate reflection of what’s going on. I mean, there have been criticisms of ‘big data’—which itself is a very vague concept—or at least of only big data techniques being used to aid decision-making, because it paints a certain picture of people and reality but not, arguably, the full picture. Even in academic research, qualitative techniques or ethnographic techniques are still very important to build a full picture of what’s going on in a particular area. So I think there’s also that issue: the data may not be accurate, and even if it is accurate, it only paints a certain picture, not necessarily the full picture, and we should be cautious about basing too much decision-making on it, whether it’s from a law enforcement or national security perspective, or even from a corporate decision-making perspective. So, who to hire or fire as employees, decisions by health insurance providers, and so on and so forth.

[0:50:13] Jordan Brown: ‘Big Data’ is something I’m really suspicious of, one because of the reasons that you’ve mentioned, but also because of the cultural thing—that people sort of believe the computer rather than their own experience, or the real world.

[0:50:25] Angela Daly: Well I think there’s a huge amount of rhetoric from Silicon Valley, kind of very utopian—I mean there’s a whole lot of utopian rhetoric from there anyway—but there is this kind of messianic discussion of big data as being the solution to all problems: “the more data we have, the fewer problems we’ll have.” But of course there are certain reasons why these arguments are made—it sells certain products; there’s that kind of self-interested reason why we’re seeing some of these arguments coming out of Silicon Valley. But also I think there’s a cultural side too, whereby the problems of society are much more complex than can just be solved by more data. And I think there’s a wish not to engage with some of these complexities, because engaging with them may present rather unpalatable truths to Silicon Valley, for instance—and particularly I’m thinking around inequality, social problems, health, and so on and so forth. That’s not just going to be solved by, I don’t know, people having wearable devices that tell them to walk a bit more. The causes of social problems and bad health are much more complex and more overarching as well, rather than just being the fault of the individual, let’s say. But I think that some of this is somewhat lost in discussions of ‘big data’ too.

[0:51:59] Jordan Brown: Yes. Maybe I should ask about the techno-utopianism now… we kind of just did. But what are your thoughts about the prolific optimism of this culture—the deus ex machina, if you like—specific to technology: the sort of assumption that with the problems technology causes, if we ‘throw more technology at it, we can fix some of these problems,’ social issues and things as well…

[0:52:39] Angela Daly: What I will say is that I think it’s been an interesting 10 years or so. The Internet in particular has kind of matured as a technology—I mean, it’s been publicly available to the masses for, we’re kind of now in the second decade. So it’s interesting to see and reflect back on what’s actually happened. I think there have been liberationary aspects of the technology, but of course it doesn’t exist in a vacuum from the rest of society; there are other things that have been going on in different societies throughout the world which have interacted with these technological developments. So I guess I’m thinking about things like the Arab Spring, the so-called ‘Twitter revolutions,’ and so on and so forth. There’s a huge amount of commentary on them, both contemporaneous and some years later, where with the benefit of hindsight we can see that, yes, the technology was important, but it wasn’t the only determining factor there. Nevertheless, I think that the huge amounts of data gathered, and the surveillance apparatus which supports and facilitates that, do make me somewhat pessimistic about the liberationary possibilities for this technology going forward. I’m more likely to sympathise with the techno-dystopianism of someone like Evgeny Morozov now than perhaps I was 10 years ago. 
Nevertheless, I think we still see, at the edges of the Internet—which I mean in a non-technical way; if we think of the mainstream of the Internet as mediated by big players like Google and Facebook and Microsoft and so on—still, at the edges, we’re seeing things like interesting peer-to-peer activities: people trying to opt-out, or to decentralise what is becoming a more centralised phenomenon, through things like mesh networks, and also through anonymising techniques—cryptography and so on. Not to say that these are all perfect, or that they all facilitate freedom however conceived—there’s plenty of contestation, not least by various governments trying to break these cryptographic techniques as well. But I think you can still see some kind of lawlessness—I’m not sure whether that’s a good thing or not—at the darker parts of the Internet. And so, even as it was prior to the Internet, the law is not enforced absolutely in its entirety in every situation, and we see that with the Internet as well. Despite this apparatus, despite centralising tendencies, we’re seeing dissent and deviation, some of which is good, some of which is not so good. That’s still happening, but perhaps the technology overall is not going to be as ‘Earth-shattering’…

* * *

[0:57:12] Angela Daly: I think that’s a bit dystopian: that we are…our autonomy is eroded somewhat by all of this…

[0:57:26] Jordan Brown: I think our autonomy is eroded by all of this, and the possibility for [real social change]. What I’m trying to get at is the implications of, ‘Okay, we realise that this space is trending badly,’ and if we want to try and do something about that, then we have this big machine against us, right? Waiting to pick us out, to sabotage that event, or round us up before it even happens—as we see…

[0:58:01] Angela Daly: Perhaps I should say that I’m not just techno-dystopian, I’m a bit dystopian in general. I think some of, like I was trying to emphasise before, these trends in technology are not divorced from what’s going on in society overall at all and so there are other dystopian aspects of society at the moment, whether it’s kind of the economic system, whether it’s environmental, whether it’s social and so on and so forth; and certainly, I certainly don’t think what’s happening in technology is happening in isolation from these other trends too.

[0:58:47] Jordan Brown: So while we’re being critical then, does the technology we’ve discussed so far, does it empower all of us—like it claims to—or does it only empower a select few that we’ve been talking about, at the expense of the many?

[0:59:26] Angela Daly: I think there’s a complex answer to this question. I mean it’s undeniable that the many-to-many communication that’s facilitated by the Internet I think is liberationary compared to the one-to-many or one-to-one communications models that we had prior to the Internet, I’m thinking of television or broadcast media, or print media…

[0:59:49] Jordan Brown: Well that would be one-to-many.

[0:59:51] Angela Daly: Yes, but with the Internet, you can have many-to-many communication. Anyway, so I think that there definitely are, there have been, liberationary aspects of the Internet and other new technologies as well, but we shouldn’t get carried away with that, and I think increasingly we’re seeing that yes, it’s liberating, but only up to a certain point, and that point really is this kind of data-gathering, privacy-infringing surveillance apparatus which underpins the Internet as we use it as well.

[1:00:28] Jordan Brown: Or even just the facilitating of more obedient, happy consumers. If we’re talking about targeted advertising…

* * *

[1:04:12] Jordan Brown: What about turning to the point of inequality then? Sort of the “techno-haves” versus the “techno-have-nots”: that this experience isn’t for everyone, but we’re sort of creating a subset of people—people that are in the online environment and are having a completely separate experience to people that aren’t.

[1:04:34] Angela Daly: Sure, and it even depends on the kind of Internet access that you have as well. So, certainly in Northern Europe anyway, there is pretty good access—pretty fast and not too expensive—but obviously not for everyone either; not everyone can afford that. And even more so in Australia, where Internet access is not as cheap or as good as in a whole lot of other, even comparable, countries, despite the ongoing NBN and so on. So I think that particularly the Silicon Valley set are very much in a bubble whereby there’s very fast Internet, people have got enough money to pay for it, and that’s how they view the world, whereas that isn’t actually how the world is—a lot of people who do access the Internet in the world are accessing it on mobile phones, over 3G networks, which are not particularly fast and are quite expensive. So when we talk about—going to a slightly different area—online education or ‘MOOCs’ taking over the place of universities, for instance: maybe to a small extent, but that was actually a whole lot of hype which seems to have collapsed. I mean, universities have started offering online courses, but I think it’s unlikely that everyone’s going to go online, because you actually need a good enough Internet connection to participate in online courses and access online resources.

[1:06:06] Jordan Brown: So when someone like Tim Berners-Lee says, “The web is humanity connected,” we’re sort of talking about the myths—on the one hand they’re like, “The web or Internet access is like the great levelling.”

[1:06:24] Angela Daly: Oh yeah, well the levelling hasn’t happened via the Internet, I think, and again this is where we have to move away from techno-utopianism, or even techno-dystopianism, and see the Internet and technology as a whole as very much part of what’s going on in other areas too. So, inequality still exists despite the Internet; other problems still exist as well. Arguably, some of these problems are actually exacerbated by the Internet, or by not having access—the haves and the have-nots, the digital divide. ‘The digital divide’ was a term used a lot 10 years ago, a bit out of favour now, but there are new divides that open up, whereby many people have access to the Internet but only via a library, or only when they’re at university or school or work, and they don’t have access at home. So they may not be able to participate in the same way as someone who has got access everywhere; or people who only have access via their mobile phones, versus people with access via other devices. So…

[1:07:35] Jordan Brown: I’m thinking now about the implications of that though—and this is another big idea too—if we’re defining what it means to bring our privileged ‘equality’ to everyone that looks like the complete corporately dominated Internet that we know and understand…

[1:08:07] Angela Daly: Well I suppose this is kind of happening already. So there are schemes, in certain countries—particularly kind of emerging economies or developing countries—where large Internet corporations, notably Facebook (I think ‘Facebook Zero’ is what the service used to be called), let you get access to certain services for free on your mobile phone. You’re not spending your data allowance on this, but you’re restricted to a kind of walled garden of a few, usually big, companies.

[1:08:45] Jordan Brown: Yes. That’s my concern too. I mean, we mentioned ‘mesh networks’ a couple of steps back there, and I was thinking about how the ‘Occupy’ movement was really keen to get away from corporately controlled access, the gateways to the Internet, but the irony is that they used their ‘mesh networks,’ which were arguably very clever, to get on Facebook. So, to bring this back to the point I’m trying to make about how this ‘great levelling’ is a myth: we’re just doing the same thing in the so-called ‘third world,’ right? I mean, it’s like, “Come and join the Internet so you can all get on Facebook.” I don’t know if you want to comment on that?

[1:09:29] Angela Daly: I think what is interesting though is…I think it’s this kind of device maybe, this kind of invention from Kenya—I mean, people talk about Kenya being sort of the ‘Silicon Valley of Africa,’ but nevertheless, it’s certainly outside of the kind of Western global north. There’s been this interesting device which has been developed there, which is specifically for areas where there are not good connections, so good mobile phone connections, and apparently this device can facilitate Internet access in very remote places. But this is something unlikely to be developed in somewhere like Silicon Valley, because there’s no need for that…

[1:10:22] Jordan Brown: Well, Google Loon, I think it is called—where they put routers up in balloons or something crazy, and they have them hovering over the place… But this is my point though: it’s like, “This is so you can come and use Google; welcome to our walled garden.” And I don’t know if I’m being too gritty in my sour distaste of all these things, but the point I’m trying to make is about what I see as a trend—which also underlines a lot of the social and political issues that we’re talking about, how that’s peripheral to technology too—that social issues in the context of the time inform the technology. The point I’m trying to get at, by bringing those ideas together, is how Lewis Mumford talked a lot about ‘authoritarian technics.’ The example I’ve got here is: How would you provide your own access to the Internet without being beholden to the corporate gatekeepers? And that’s before we even get to the infrastructure it all runs on—even the root DNS servers, the nameservers that resolve everyone’s domain names, are a corporately controlled service; we’re talking about fibre-optic links; we’re talking about wired and wireless infrastructure, and that’s all corporately owned. So my point is almost—and this may be a big idea too—that it seems to me, a couple of layers deep behind this technological progression, that these technologies can never be democratic, because democracy doesn’t fit into the way they’re capable of being designed. Like the military—the Internet started out as a military invention. That’s why we start with the A-B-C. So we’ve got the three power brokers, and they run the show. Everyone else sits around reacting.

[1:12:30] Angela Daly: Sure, but I think people sit around reacting but also subverting; I mean, sometimes politically, sometimes not. And I mean, there’s plenty…pretty much everything that’s ever been invented has been used for purposes which were not the idea of the person who created it, either in good ways or bad ways. So I think there’s still…we have to kind of believe in the power of our imaginations and creativity to either subvert or reuse…

* * *

[1:10:22] Jordan Brown: Well I was just about to interrupt and say that I’m really critical of that too—because one example I’ve heard before, an old example, is how people subverted SMS, which was a missed-call service devised by the corporate power.

[1:13:27] Angela Daly: And in fact, SMS, they never thought it would take off, at all…

[1:13:32] Jordan Brown: Exactly, yeah. So what I’m trying to say is that even if you subvert the technology and try and use it back against itself, A-B-C is waiting there to co-opt it back into the way they have power and control over the technology.

[1:13:47] Angela Daly: Well I suppose, this is also a little bit out of my area because I’m not really a social theorist…

(Change battery)

* * *

[1:17:15] Jordan Brown: We’ve got all these laws being built, being proposed or coming into effect, which have serious political and human rights implications—so Internet censorship regimes in so-called democracies such as the UK and Australia; laws that make it illegal for journalists and the public alike to talk about what we’re talking about—what the intelligence agencies are doing in secret—to keep things secret, this sort of rapacious…

* * *

[1:18:26] Jordan Brown: I was just going to say, what are some of these laws—in broad overview—and what picture does that paint? What sort of world is being built? I guess with the fact that there’s this need to open up and have a bit of transparency, but we’re seeing the clamping down on that and the ramping up of secrecy.

[1:18:46] Angela Daly: Sure. I have to say, I know you don’t want to talk just about Australia, but I think Australia kind of shows…particularly in the last year or so, since the current Abbott government came into power, we’ve seen a whole lot of laws being passed and measures being taken which seem to silence dissent in various ways—and not just at the federal level, but also for instance in Tasmania, as I was talking about, with these anti-protest laws. So there seem to be a lot of laws and measures coming into effect which are very damaging to rights and liberties. And Australia is also a particularly bad example given that there isn’t a lot in the way of constitutional rights for Australian citizens, and Australia is really unusual in this respect: in, I believe, New Zealand and Canada, definitely the US, and in the UK with the Human Rights Act, there actually are ways of challenging laws—and other measures—which do not respect the human rights of citizens, and in some cases noncitizens as well. But in Australia it’s really hard to challenge laws if there’s no actual right…there’s no Bill of Rights here. There is, at least when it comes to free expression, an implied right to political speech, implied into the constitution, but there is no constitutional right to privacy, or constitutional protections against searches and seizures by the government as it’s framed in some countries like the US. And so I think this puts Australians in a particularly weak position, and the Australian government is really able to do a lot more, legally and constitutionally speaking, than even in some similar countries.

[1:20:49] Jordan Brown: But that’s crazy, right? I mean, it has huge implications.

[1:20:51] Angela Daly: Yeah, and I mean, I’m obviously not from here—I’ve not been here for a very long time—but I was told that there have been debates in the last 10 or 20 years about a Bill of Rights, and the discussion was very much, you know, “Parliament is enough to protect us.”

[1:21:07] Jordan Brown: I think that’s bullshit though. I have a friend who does a series of protests against advertising in public space, and—just to do this quickly—the state of Victoria is apparently meant to have this charter of human rights and responsibilities which is reflective of the UN charter, and even the courts go out of their way to reinterpret the charter. It says in the first bit that ‘this only applies to humans,’ but through his particular case, there’s been this really rapacious and consistent ruling to protect property rights [of corporations] over political expression. So, I’d even argue that if there were legal structures in place, there’s still the matter of having the courts get on side.

[1:22:01] Angela Daly: Exactly. Actually, for instance, there’s this ongoing case in the UK with regards to the Tempora program, one of the surveillance programs that was revealed by the Snowden leak, or the Snowden whistleblowing. There’s just been a decision at first instance, which I think is going to be appealed, but at first instance the judge—or it was maybe even a tribunal, but anyway, the judge or equivalent—found that this was probably legal, or at least that there was a good enough argument it was legal. So if the judges are very deferential towards the administrative powers of the government, which often happens in ‘national security’ cases, then that is very problematic. Even as the rights exist—and particularly in the European sphere, there are legitimate reasons to infringe rights, usually listed for each of the rights in the European Convention on Human Rights—it still is very much dependent on judges’ interpretation. And certainly when confronted with a ‘national security’ justification for infringing certain human rights, and particularly privacy, judges seem to be very deferential to that justification, to the point that individual rights may not be well protected. And I think there is a real socio-legal or critical legal exercise to be done in looking at why judges are deciding cases in this way, because certainly in Europe we’ve seen that the Court of Justice of the European Union seems to be more proactive in protecting rights. It ruled earlier this year that the European data retention directive was invalid, in part because of the interference it entailed with the privacy rights and data protection rights of basically all European citizens, or everyone living in the EU.
And it’s a huge judgement, a very interesting judgement as well, but it’s interesting why that would come from Europe rather than, for instance, a domestic UK judge. I actually think it would be very unlikely that a judge in the UK would come to a similar conclusion, because of this deference to the executive power.

[1:24:40] Jordan Brown: And they’re beholden to that though, yeah? The European court?

[1:24:44] Angela Daly: Yeah, exactly. So this judgement kind of trumps what’s happening in the UK, and that’s another long discussion that can be had…

[1:24:55] Jordan Brown: It reminds me, sort of, of international treaties or international agreements, or the peripheral effects—or even the direct effects, forget that—of globalisation. So how domestically, “We don’t want to do some thing,” but, “Screw that, there’s some international agreement that requires you to do this.” I’m thinking of the WTO or the World Health Organisation… But what I really wanted you to do though, was, in a very general way, can you pull all those ideas together? There are big questions around the functioning of the legal system; we’ve built this sort of crazy surveillance society; things need to change—they’re all really huge, so how do you pull them all together? What does it look like?

[1:25:54] Angela Daly: I think it looks depressing. Honestly, I think it’s a depressing picture. But I don’t know whether this is a picture too divorced from what’s going on outside the kind of technical sphere or in other aspects of life as well. I don’t know if we’re yet at a tipping point where things might change, but I think there’s increasingly a kind of interconnectedness of what’s happening here with what’s happening in other areas as well. So, kind of as you mentioned, the fact that kind of surveillance and these anti-terrorism powers are used against environmental protesters sometimes; the vast power of corporations in all aspects of life, so not just kind of the Internet sphere, but also when it comes to opposing regulation in other aspects of…in other spheres as well. I think this certainly cannot be seen in isolation from the other trends, which are I think depressing. I mean, I don’t think there’s a lot to be hopeful about with the current political or corporate set-up that we have at the moment. But maybe, you know, this is just part of the picture that change needs to happen across the whole.

* * *


Some Kind of Anthropocene Tue, 04 Nov 2014 20:21:51 +0000 Some Kind of Anthropocene by jore



01. Unplatitudes (04:07)
02. Following Along The Output (To the Transhumanist) (04:10)
03. Quorum Sensing for Cultural Memory (Nostalgia of the Young) (04:42)
04. The Spectacle to Distract (04:34)
05. Imbue (04:52)
06. Reeling Through the Data Sets at High Speed (01:42)
07. Positive Feedback (04:45)
08. A Delicate Fluke (04:05)
09. The River Song (04:37)
10. Great Space (05:05)
11. Good Effort, Parsecond (05:27)

About this Album

Released 5th November 2014, independently.

This album was written, performed, recorded and mixed by Jordan Brown over the many on-and-off months spanning May 2011 to September 2014, in the many on-and-off rooms of a former lingerie factory with an asbestos roof on Pitt St in so-called Brunswick, Melbourne—which is really Iramoo on Wurundjeri land still under duress, never ceded…

Notable musical instruments were: real drum kit, real electric and acoustic guitar, real electric bass guitar, real human voice, real upright piano, real shitty keyboard; unreal audiomulch, unreal electronic patch, and the unreal hand icon inside the computer making use of all the above in simulation.*

Artworks for this record were graciously created alongside in both digital and analogue format by Michelle Chorny, in collaboration—all with continued realisations that the natural world is indeed phenomenally and perplexingly never-endingly beautiful and primary. Yet, the world burns and the simulation turns, toxic-mimics. Ink drawings about this by Willow Darling; likewise, pencil tree and tree stencils by Nathalie Crawford. A5 layout inspired by the good work of Becca Kellaway with amazing construction ideas by Michelle Chorny—including the symbolic leaf and hessian screenprint. A very heartfelt and extended thanks to all for your amazing work, generous sincerity and support.

Life was made more pleasant throughout the creation of this record by Antonietta Melideo, Rachel Williams, Kyle Magee, Nathalie Crawford, Michelle Chorny, Joshua Lapham and Callum Bryant. Thank you, and thank you all for your work, and for being alive; for being who you are.

Earnestly and with love,

November 2014.


* Indeed this whole recorded experience (upon playback) is itself simulacra and entirely insufficient. Though rest assured I am made successively soberingly more humbled every day by the continued cyclical realising of how sad this is. For we are all real living beings not in simulation, in a world that is seriously in drawdown from the extremely real consequences of that very same simulation. The threat is existential. ‘To see it, to save it, to love it.’


Although still dialectically incorrect and still fatally flawed in definition and epistemology, the title Some Kind of Anthropocene attempts to mark this distinction, in short form—that it is not only incorrect and flawed, but arrogant, narcissistic and brash to subsume innate human beings as a whole into the culpability for the substantiative global impact on the natural world—especially as we know that it is the humans of this culture, this dominant culture, and the lived ideological hegemony of its Great Death Urge (the commodification of life, the conversion of the living to the dead); of infinite growth on a finite planet; of the love of an economic system and technology and science that is, in fact, causing the ‘substantiative global impact on the natural world,’ amongst many other things and beings. This is not human nature, it’s not humanity. It’s an old story. It’s the control of the many by the few. The few who are sociopaths with great power. They are not and have never been the collective us. It didn’t and doesn’t have to be this way. Let’s get to work…

A chat with Derrick Jensen about The Panopticon Wed, 04 Dec 2013 18:14:24 +0000


Interview for current documentary project.

Derrick Jensen is a prolific author and radical environmental activist. The topics of some of his work include: science, surveillance and control; the pathology of abuse, domestic violence and dominant culture; and the problems with civilisation and strategies for resistance, to name a few. He has also taught creative writing at Pelican Bay State Prison and Eastern Washington University.



0:05:04 The Panopticon
0:09:51 If you’ve got nothing to hide, you’ve got nothing to fear
0:13:12 Surveillance and the unequal relationship of power
0:15:24 Defining the technology, Lewis Mumford
0:16:34 Democratic and authoritarian technics
0:21:52 The computer is an authoritarian technic
0:25:10 Abusive culture, the abusive mindset
0:26:26 This culture is completely insane
0:32:54 Science and domination
0:37:12 R.D. Laing and the three rules, the hierarchy
0:39:27 Why we don’t talk about this culture destroying the planet
0:43:14 The Congo and why
0:46:23 Technological deus ex machina: Transhumanism and solar panels
0:54:02 You fight for what is important to you
0:55:14 The ‘logic’ of the system
0:59:03 Back to deus ex machina
1:01:52 Solar panels: Who benefits and who is harmed?
1:05:47 Personal change does not equal political change

Interview with Susan Greenfield about Screen Culture Thu, 05 Sep 2013 09:22:05 +0000


Interview for current documentary project, carried out by an interviewer by proxy (UK).

Susan Greenfield is a neuroscientist, writer, broadcaster and member of the British House of Lords. Specialising in the physiology of the brain, Susan researches the impact of 21st-century technologies on the mind.



[0:00:00] Interviewer: Can you please outline and explain what screen culture is?

[0:00:04] Susan Greenfield: Screen culture is, for me, as its name suggests, a whole way of life revolving around digital devices. So when we say screen culture, whilst I wouldn’t exclude television, it’s more the interactive nature and the mobile nature of digital devices—which of course can embrace interactive TV nowadays—but it’s more specifically when you think about the amount of hours people spend either with their Xbox, or their laptop, or indeed their mobiles, or perhaps even nowadays with Google Glass.

[0:00:56] Interviewer: Could you take us through the concept of plasticity and what that means for a mind immersed in a pervasive screen environment?

[0:01:01] Susan Greenfield: As a neuroscientist, the reason I’m particularly fascinated—and, I think in equal measure, both excited and alarmed—by the impact of screen culture on the human brain is because I know that the human brain is changing: it’s highly plastic, as we say. That’s not to mean it’s made of plastic of course, but more that it’s very dynamic; it will adapt to the environment. We’ve got the evolutionary mandate to occupy more ecological niches than any other species on the planet because of this wonderful ability—one that other brains have, but that our species has superlatively—to adapt to the environment. So this is what is meant by plasticity, and one very famous example is of piano players, where they had three groups of adult human volunteers—none of whom could play the piano—and even over five days you could see that whereas the controls who were just staring at the piano showed no change in their brains (the brains were literally unimpressed), those people who were taught five-finger piano exercises showed an astonishing change in the brain territory and functional area relating to digits, even over five days. But the even more exciting group was the third, who just imagined they were playing the piano, and astonishingly, their brains showed similar changes to those that had done the physical exercise. So, I think this shows you how everything you do, even a ‘mere thought,’ will literally leave its mark on your brain; and therefore, if we are so sensitive in adapting to the environment, and the environment is changing in an unprecedented way—as I argue it is with the cyber world—then it follows that the brain will change in a similarly unprecedented way.

[0:02:47] Interviewer: Can you take us through identity and how screen culture might be redefining or changing it?

[0:03:01] Susan Greenfield: Of the many issues that I think arise from what I call mind change—that is to say, the impact of screen culture on our lives—I think the issue of identity is one of the most important ones, and more specifically the impact of social networking sites. We have to first think of what identity is in any event, and of course this is a very complex question. For me, identity goes beyond merely having a mind (we have the two words, after all). Whilst on a desert island, for example, you wouldn’t—at least at the beginning—lose your mind; you’d still be you, you’d still interpret the world in certain ways. But would you have an identity on a desert island? I’d like to think that identity arises always from interaction with others, and from a particular context. So you might have an identity normally as a father or as a son, or as the boss or as the employee, or as the goalie or as another member of the choir; and according to the context in which you are at any particular moment, that will be the dominant identity that you are experiencing. And I think that up until now at least, all these different contexts, different scenarios, are subsumed in a sort of narrative that gives you an overarching, generalised sense of being you and of being the person you are; whether it’s being a father or son or a team player or a singer or the boss, nonetheless they’re all aspects of you, and they are this greater idea of you. Now this requires, in our neural infrastructure, a very robust set of connectivity in the brain that has access to memories, to personalised experiences that make you the unique individual that you are.
And nonetheless, although you are influenced by and interacting with the outside world, there’s this enduring sense that you’re still you: your boss can shout at you but you’ll still be you; you can fall in love but you’ll still be you; you can have a really good game of football but you’re still you—because there’s this inner sense of robust identity irrespective of the immediate ongoing thing that you are doing. Now what I’m concerned about is that with social networking sites, you are encouraged to construct your identity externally. That is to say that instead of internalising this robust private life that is you, now everything, on an almost momentary basis, is subject to the approval and the judgement and evaluations of others, and these others are not so much friends but more an audience. And if you’re doing that, if you are constantly out there, where everything you’re thinking and feeling is subject to the audience, then I think identity might be more fragile, more vulnerable, more changeable than it would be if it’s internalised.

[0:05:52] Interviewer: Do you see perhaps an erosion of a robust identity in the world of the screens?

[0:05:55] Susan Greenfield: So this would mean that for those people who are devotees of social networking—and I want to exclude people who’ve perhaps gone to Australia from the UK and are keeping up with their friends on the social networking sites, where the communication is based on erstwhile friendship; leaving that kind of relationship aside—the sort of relationship where you have x-hundred friends whom you’ve never met, that is where the concern comes in. Indeed there’s lots of data out there now showing that there is an increase in narcissism, sadly coupled with low self-esteem, because the more you seek approval from others, the more you’ll idealise yourself. The more you idealise yourself, the more you depart from the ‘real you,’ so the lonelier you will feel, and the more you will go on to social networking sites—because in order to appease and get the approval of this vast audience of hundreds of so-called friends, you’re not going to be the real you.

[0:06:50] Susan Greenfield: And I think the other thing that I fear is that normally when you’re friends with someone, when you have an identity in relation to other people, that’s because you’re looking someone in the eye; you’re perhaps touching them on the arm or wherever, depending on the sort of relationship you have; you’re registering their voice tone, if they look away, if they fold their arms. That will restrain you for a little while when you first meet someone—to sort of hold back a bit, so you won’t self-disclose everything—because evolution and biology have introduced body language to enable you to have these safeguards: not to disclose everything to everyone, because if you do that you’ll be very vulnerable; they might take you over. So this means you’ll use body language to calibrate how close you can get to someone, when you’ll get close to someone, how much you disclose to someone. Now imagine a scenario, a screen scenario, where that constraint is removed because you’re not using eye contact, you’re not using body language, you’re not using voice tone, you’re not touching someone. So now you’ll just self-disclose because that’s a fun thing to do—it combats loneliness—but by so doing you’ll let more out, and it will perhaps not be real, because you want to impress this endless audience that is voraciously waiting to put their thumbs up or down to what you’re doing; and that will give you a vicious cycle. You’ll feel lonelier, you’ll feel less secure, and therefore you’ll seek approval even more. And that’s where I think there is a shift—a quantitative shift, not a qualitative one—whereas identity before was something that was reasonably robust and internalised, now it’s fragile and externalised.

[0:08:35] Interviewer: Speaking of identity, and in the context of consumerism and individualism, can you please unpack for us what you’ve dubbed as the ‘someone’, the ‘anyone’ and the ‘nobody’, and how screen culture provides the opportunity to be a ‘nobody’ on a scale like never before?

[0:08:52] Susan Greenfield: If one looks away from the screen for a moment and thinks about identity more generally, and the impact of culture and civilisation on the options people have to express themselves, let’s go back to the 20th century. I think that there were two very dominant options there. One was what I call the ‘somebody’ option, and that is where you are literally a ‘somebody’: you are defined by what you own. Now, what’s very interesting is that this goes back even further, to the 1920s, with the nephew of Freud, someone called Bernays. Bernays had a tricky problem. He was in advertising and he wanted to get people to buy things they didn’t need, because this was the start, if you like, of capitalism after the First World War. One particular example, and I think a telling one, was to try and get women to smoke, because clearly there was a vast market there to sell cigarettes to half the population that weren’t buying them. And instead of saying, “Oh, smoking is great, it makes you feel good” and so on, he had a much cleverer plan. What he had was beautiful young girls holding cigarettes, and the slogan was ‘the torch of freedom.’ What Bernays had done—and this is in advertising still today—is to imply that by doing something or owning something, it will say something about you. It will say that you are, in this case, liberated and free and modern, and so on. And you can look on the web nowadays and still see the slogan ‘as individual as you are,’ or the product that says something about you—and that can apply to fluffy toys, to kitchens, to clothes, to everything. It’s something that in our consumer world, in our capitalist world, is a means of expressing our identity. And as Oliver James, a British psychologist, brilliantly wrote in a book called ‘Affluenza’—amazingly, guess what—it doesn’t bring happiness.
Because you get the trainers that are as individual as you are, that say something about you, and guess what—your neighbour has the same, and you enter into this arms race, competing to try and be a ‘somebody.’

[0:10:51] Susan Greenfield: We talk about being a ‘somebody’ and a ‘someone.’ I think that this is an option where you perhaps are individual but you are never fulfilled. The other option that dominated in the 20th century was what I call the ‘anyone’ scenario, and this is dominant both in fascism and in communism, and in any ideology, whether it’s political or religious—where the individual is subsumed under a collective; where what is really important is the narrative of the ideology, of the movement. And that’s still dominant of course today in certain fundamentalist organisations. Now the problem there is that you might be fulfilled—you’re in some great narrative that has a noble cause, whatever that might be—but you’re no longer individual, to the extreme that you might be expected to sacrifice your own life in favour of the greater cause. So of course, the ‘anyone’ scenario isn’t that attractive either for most of us as an option. So that’s ‘anyone’ and that’s ‘someone.’ There’s also a third option—being ‘nobody’—which had possibilities of expression in the 20th century but which I think has now been amplified by screen culture.

[0:11:50] Susan Greenfield: By being nobody, I mean you literally let yourself go—you blow your mind; you lose your mind. Now of course, human beings have always done this. We’ve always had wine, women and song, incarnated now as drugs, sex and rock ’n’ roll, where we choose to put ourselves in environments where we’re no longer ourselves—we’ve let ourselves go; we’ve blown the mind that’s so personal to us. We’re having a ‘sensational time.’ No-one wants to go out and have a ‘cognitive time.’ But you go out and have a sensational time—it’s where the senses dominate. So let’s take dancing or drinking or sex—in all those examples, you are not self-conscious; you’re conscious, but you’re not self-conscious. And it’s always fascinated me that humanity has always had this other mode that from time to time we’ve wanted to enter, where we’ve abrogated our sense of self—and we pay money to do this.

[0:12:40] Susan Greenfield: Now, what interests me a lot is that I think for the first time, videogames in particular but screen culture generally is giving people this opportunity to abrogate the sense of self, to blow their minds, and to have, at a premium, strong sensations rather than a notion of a personal narrative. It’s a time when you are just experiencing, in a sort of ‘yuk and wow’ way. One example of that was when I was saying this to a journalist a year or two ago, and I said, “Are we going to live in a society of ‘yuk and wow,’ where people just say either ‘yuk’ or ‘wow’ and are just reacting in that monosyllabic way?” And because I talk fast she mistyped it as ‘yukawow,’ and I was astonished that within 24 hours it went viral. And I think it was attractive—people were selling T-shirts (sadly they’d taken out the domain name) saying, “Welcome to the church of yukawow—a breezy world of no consequences.” The whole notion of it, and why it was attractive, was this silly word: “The first Church of ‘yukawow’ welcomes you to no consequences.” And I think that that has an appeal that is exaggerated beyond the normal cultures and societies that we witnessed in the previous century.

[0:14:10] Interviewer: As human beings we’ve got a desire to connect and have meaningful relationships with each other—could these things perhaps be drivers in the reasons we turn (and return) to the screen?

[0:14:19] Susan Greenfield: Yes, I think people turn and return to the screen because, unlike in real life, you inevitably will get instant feedback; you’ll get someone writing in to you irrespective of the time or place that you’re in; you’ll always have someone there who will comment on you.

[0:14:42] Susan Greenfield: I think certainly the obsession with checking for text messages or Facebook updates is that it fulfils—it’s almost like scratchcards: you’re getting a little bit of excitement but it’s never enough, and there’s always the possibility of another, and you don’t quite know when it’s going to be. Now we know that that’s just the situation in the brain that precipitates the release of a chemical called dopamine, which is related to addiction and reward.

[0:15:08] Interviewer: Do you think that screen culture, as it is today, on the whole, exacerbates insular or introspective traits in people?

[0:15:44] Susan Greenfield: I don’t know why people—as some have—say that perhaps the screen culture is encouraging introspection. I think probably it’s quite the opposite: if you’re in a world where what you see is what you get; where everything is reduced to simplified symbols or simplified statements—even the very word ‘YOLO’ (you only live once), which is apparently put at the end of statements that you make; where you show and tell; where you just upload the chocolate cake that you’re about to eat without saying anything about the chocolate cake—this isn’t encouraging introspection or individualisation; this is just being in the here and now, reacting to things as they happen. And therefore I think perhaps the opposite: it will discourage people from reflecting and having the time to think about who they are.

[0:16:31] Interviewer: What about addiction? And how does that addiction work? But before we do that, perhaps speak a little about how we can modify and project the ‘best sense of self’: the thin waistline; the dragon slayer instead of an accountant in Second Life; how you can edit and revise communications in e-mail, unlike in real life; and how people with autistic spectrum disorders are therefore very comfortable in Second Life, etc.

[0:17:10] Susan Greenfield: If we think that with social networking sites you’re perhaps not rehearsing body language, then you’re not very good at reading the signs that most of us use in face-to-face conversation to establish empathy, which goes beyond words—it’s being able to interpret the body language and the voice tone. What’s very interesting is that people with autistic spectrum disorder have problems with empathy; they have great impairment in interpreting, just by interacting with someone, how that person might be feeling. And people with autistic spectrum disorder are particularly comfortable in the cyber world. My own view is that that is because in the cyber world we are all in the same position—the playing field has been levelled—that is to say, we are not using all the tricks of body language to interpret how someone else might be feeling. That is why, for autistic people, to live in a world where ‘what you see is what you get’ and where actions speak louder than words is probably more comfortable than the real world, where you’re aware that people are picking up on something that you yourself are not. Moreover, to go further—and this is highly controversial, but I think something worth reflecting on—is whether or not people who are compulsively interacting with the screen might themselves develop autistic-type traits; and I say autistic-type, not clinically autistic. One example of that is a study where they were recording the EEG from people when they were shown a face as opposed to a table. Now, in adults who are not autistic, when people see a face, the EEG response is more marked than when they see an object like a table. And for autistic people the EEG is the same—they don’t differentiate between a face and a table in terms of excitement or importance.
Guess what: in people who are spending a lot of time on the Internet, the EEG doesn’t differentiate between the face and the table either. So I think clearly this is an area that must be explored, but it does follow—it would make sense to me that if you’re not rehearsing empathy, you’re not going to be very good at it, and therefore there might be these consequences of not being so good at interpersonal interaction.

[0:19:35] Interviewer: Talking about addiction, the thrill of the moment trumping long-term consequences…

[0:19:44] Susan Greenfield: I think that when you’re playing videogames, for example, one of the tenets of videogame design is that there shouldn’t be long-term consequences, and what concerns me there is that if you’re living in a world where people can be obligingly undead the next time around, this is not a particularly good lesson to learn about life. It might make you more reckless, certainly, but at the same time you don’t have a sense of enduring long-term meaning or significance, because if everything is reversible it’s not significant or meaningful. If I drop a piece of paper on the floor and pick it up again, that’s not very meaningful, because I’ve reversed it. If I punch someone in the face and their teeth fall out, that’s highly meaningful because it’s irreversible—they’ll never have those teeth again. What’s very interesting about that world, that exciting world where you are aroused and you’re in the moment, is that we know that that’s a good condition within the brain for releasing the chemical dopamine. And we know that dopamine is a very useful transmitter in the brain—a chemical messenger—it mediates things like movement, and is in excess in schizophrenia; but among its many jobs, one of the things it seems to participate in is reward systems, and also addiction. Every drug of addiction—every drug—irrespective of its chemical status and its immediate target, will cause a release of dopamine in the brain. So we know that if there’s lots of dopamine being released in the brain, that could tie in with feelings of reward and also possibly addiction. Now, one has to be careful, because ‘addiction’ has a narrow definition—one might rather use words like ‘compulsive’—but the American Psychiatric Association now has Internet addiction on its list to be considered as a psychiatric condition.
Numbers vary around the world, but there are estimates that roughly 10% would be classed as what would be called addicted, for what that’s worth.

[0:21:39] Interviewer: Exploring the concept of process over content in the context of screen culture: What do you mean by process over content?

[0:21:43] Susan Greenfield: I’ve often spoken about the benefits of screen culture being one of agile processing, but how that mustn’t be confused with content. There is a book written a while ago by someone called Steven Johnson, called Everything Bad Is Good For You, which extolled the benefits of screen culture. One of them is that it could be linked to high IQ, because the skills that you rehearse when you play videogames are similar to those that are required to do well in an IQ test—that is to say, you don’t need a lot of facts, or infrastructure, or hinterland; but you do have to be very agile at looking at patterns and connections and getting to an answer in a rather fast timeframe. So the idea is that because the human brain is always good at what it rehearses, if you’re rehearsing that, you’re going to be good at it, and so then you go to an IQ test and guess what—you’ll be good. But as even Steven Johnson says, although, as many claim, we’re seeing an increase in IQ scores in certain societies, we’re not seeing an increase in empathy or understanding or creativity or insight; we’re not seeing an increase in brilliant novels being written, or insights into the economic woes of the world or the ‘Middle East’ crisis—no one has suddenly come up, with their superior cognitive abilities, with insights that we would regard as important. So on the one hand, yes, it’s very good for mental processing—for what used to be called ‘fluid intelligence’, where the emphasis is on giving a right response to an input. My own little brother, for example: when he was three and I was 16, I used to force him to learn Shakespeare, and he could recite great swathes of it, in a way that an adult perhaps wouldn’t have done so efficiently, because he had learnt it by heart, because I’d taught him—he was like a parrot.
So we know that processing—fluid intelligence—might be linked to rehearsal with those kinds of activities on the screen, but that’s not the same as understanding.

[0:23:36] Susan Greenfield: Information is not knowledge, and I think this is what people confuse. By knowledge, I refer to content, and content for me is true intelligence, which is where you can see one thing in terms of another. So, for example, the line in Macbeth, “Out, out, brief candle”—in order to really understand that, you have to see the analogy between the extinction of the candle and the extinction of life. You can’t just take the candle literally. And I think that that’s what we are missing out on; that’s what might be in jeopardy; that’s certainly not enhanced by doing very fast, clever things with videogames. Even though you might be able to give impressive responses very quickly—correct responses very quickly—that’s not the same as understanding. And I think we have to be careful about differentiating them.

[0:24:20] Interviewer: Can you talk a little bit about the aspects of learning versus simply accessing facts, contrasted to building knowledge and understanding?

[0:24:28] Susan Greenfield: So, following on from that idea, we think about learning. Let’s take the example of the British Education Minister, Michael Gove, who recently said, “Yes, young children now—everyone should learn a poem.” And I think that’s wrong; I think the emphasis should be that they should understand the poem, because a child can be a parrot, as my brother was. So really what we need is not just to access facts; we need to see one fact in terms of something else. Facts on their own are pretty boring—you know, who cares about the height of a mountain, or the date of a battle, or the name of a king? These things are only relevant if you relate them to others and you see a trend and can generalise; or, if you’re learning the name of a king, you know more about that king and why he was important. A fact on its own is why Trivial Pursuit and pub quizzes are not held up as the pinnacle of intellectual achievement. Facts on their own are not the same as interpreting the facts, putting the facts into a framework. And even more important than facts are ideas; and you do not automatically get ideas coming out of an iPad. You won’t have that. You’ll have access to facts, but it requires an inspired teacher, it requires your thinking processes, it requires something in addition for you personally to join up the dots. And that, for me, is real knowledge.

[0:25:45] Interviewer: So just amplifying that, can we just talk about not so much internalising and understanding the facts, because you can just Google them on impulse…

[0:25:51] Susan Greenfield: Recently there was some work done by someone called Sparrow, where she claimed that we are changing our memory processing because you can just look something up on Google. What she found was that people remembered very well where to access things, but not the facts themselves, and that does concern me. Let’s take that ad absurdum: if you feel you can look anything up and you don’t have to learn anything, conversation is going to be pretty clunky, because if I meet someone of roughly my generation and culture, I will assume they know where Barcelona is; I will assume they know who Napoleon was, or who Henry VIII was; I will assume they’ll know where New York is—so you can have a conversation. And we all know the delight of having conversations with people where you share a lot of background knowledge that can develop ideas. But imagine having a conversation with someone who knew nothing; who didn’t know who Hitler was; who hadn’t heard of fascism; who had to look it up each time. You couldn’t really have a very fluid or interactive conversation. I know that sounds extreme, but if we are always having recourse to an external source of memory, then I think it’s going to have a severe impact on how we interact, how fast we have ideas, and what we do with them; and I think the younger generation perhaps might be disadvantaged in not having such ready, agile processes, and have a much more cut-and-paste mentality towards what they’re learning.

[0:27:40] Susan Greenfield: One of the most alarming and, I think, insidious developments, which people haven’t really talked about a lot, is the advent of Google Glass, which hasn’t really taken off yet—I gather it’s going to hit the mass market towards the end of 2013, the end of this year. I’m sure most people know this, but anyway: you wear these rimless glasses with a funny black oblong device in one corner, which of course will augment reality for you. So as you’re going around, you’ll know more than your five senses are telling you. That seems a fantastic setting, but imagine living in a world that is unremittingly giving you additional information all the time, where you don’t have time to digest it or do anything with it because you’re onto the next new thing that comes along. Moreover, it can give a readout of where you are and what you’re thinking. So it’s like a constant Facebook or a constant Twitter feed where all the time you are interactive, you’re plugged in, you’re hyper-connected to everyone else; and I think this may make people more vulnerable. First of all, it will become compulsive, so that, not having it, you’ll suddenly feel that you’ve lost a sense, or you’ve lost something, or you’re at some disadvantage. So I predict that once these things come onto the mass market it will be like mobile phones—everyone will have to have one, and it will change the culture. But in a way that is passive: with a mobile phone, at least, you have to access the thing; you have to take it out of your pocket, you have to press the screen, you have to do something. Here, you won’t have to do anything. You’ll just be walking around and there it will be, this augmented reality, all the time. And it also means that perhaps you could be manipulated yourself according to the feed that you’re giving—according to where you are and what you’re doing—perhaps even more than is the case at the moment with Google.
Things can be tailored to be put in. We now know that when you go shopping online your tastes are logged, so that things can be personalised for you in a prospective way. So imagine that amplified even more, where everything you’ve done is going into some horrible, vast, Big Brother database where you can then be manipulated; it can be accessed. I don’t think people have realised just how pervasive Google Glass is going to be once it really takes off, but it’s something that for ethical reasons concerns me. As a neuroscientist it concerns me because I think it’s the final barricade of privacy that will be stormed; and at the same time I can see people finding it so alluring and so exciting that they won’t want to use it just intermittently.

[0:30:15] Interviewer: Can you talk a little about how Googling things affects deep thinking?

[0:30:20] Susan Greenfield: Even now, I think, when you Google something, it really has turned the tables around. In the old days, I remember, as a student I lived in a question-rich, answer-poor world. That is to say, in order to find something out I’d have to go to the library; I would have to forage around the books; perhaps some other student had taken out that book and I’d have to wait; I’d have to order it up from the stack—I had to work very hard to get access to certain facts and certain information. But nonetheless I had the time, therefore, to know exactly what I wanted to know; to formulate a very crisp question; to know exactly what would satisfy me or not, comparing it with other answers. Now we are living in a world where we are bombarded with answers—so bombarded with answers that I don’t think we have time to formulate questions anymore. And we are perhaps so transported by the experience of accessing Google, and of surfing, and of watching YouTube—most of which has no real significance. Watching a dog as a trick cyclist—so what? Watching people planking, watching the ‘Harlem Shake’—why are we doing this? Because the experience of it is obviously, by definition, pleasurable; but we’re not doing it to find anything out, because we’re not on a quest anymore—we know we can go on to the next thing and the next thing. In a sense the means has outpaced the ends, and I think that again this will have serious implications for education.

[0:31:49] Interviewer: What about distraction? And short attention spans? Can you take us through that?

[0:31:56] Susan Greenfield: So of course one asks: why is the screen environment so appealing compared to the real world? After all, it’s only stimulating your hearing and your vision; it’s two-dimensional—why should that be more attractive than climbing a tree, or feeling the sun on your face, or the wind in your hair, or the smell of the ocean? Why should it be better than that? And I think the answer is because it’s fast and furious. Because the experiences are supra-sensory. You only have to look at modern videogames to see how fast and bright and loud and fast-paced and exhilarating and arousing they are for many people. Now, the price you pay for that supra-sensory stimulation is that because it’s so fast, you as a human being—always obligingly adapting to your environment—will adapt to an environment that mandates a short attention span. And I think the increase in prescriptions for methylphenidate—the most common example being Ritalin—should make us think about whether there is a link between attentional disorders and screen culture, videogames in particular.

[0:33:05] Susan Greenfield: If we look at how prescriptions of methylphenidate—that’s Ritalin—and of other drugs prescribed for attentional problems have risen, it could be that the drugs are being given more liberally; it could be that ADHD has been medicalised in a way it wasn’t before; or it could just be that if you take a young brain with the evolutionary mandate to adapt to the environment, and the environment is a fast-paced one, you’ll adapt to that—of course that’s what you do. And then you’ll go to school and you’ll fidget a bit, and someone will say you have an attentional problem. What’s interesting is that people are suggesting a bidirectional causality between videogames in particular and attention. It turns out that people who play videogames a lot have a history of attention issues, and vice versa. And it could be that when you play a videogame you’re releasing dopamine in the brain in a way that simulates what Ritalin does, which is to release dopamine in the brain. So it could almost be that kids with this mindset are self-medicating when they play videogames.

[0:34:08] Interviewer: Please take us through how screen experiences are literal—a ‘sensational time’. How does that contrast with other forms of interaction or brain processes?

[0:34:14] Susan Greenfield: Now of course screens stimulate only hearing and vision—that’s all they can do—which means that, leaving aside issues such as the Kindle and reading from the screen (and I’m happy to talk about that later), on the whole, especially for younger people, much of their screen experience is very visual. It’s not just words. And I think if you are encouraged always to traffic in symbols and icons and vision, then that is how your world will be—it will be a very literal world where things don’t have a significance or a meaning; what you see is what you get. So let’s take the example of a princess in a videogame. When you play the videogame to rescue the princess, she’s an icon. She doesn’t mean anything; she’s just there; she’s the thing that you rescue. Whereas when you read about a princess in a novel, the princess has a past, like you; she has relationships, like you; she has a future, hopefully, like you. She therefore has a meaning and a significance, which is the reason why you keep turning the pages—you want to find out what happens to the princess. She has an identity and a significance, because she’s this abstract persona that you can’t see but can imagine—a significance that is much more important than that of the icon in a videogame, which literally means nothing. And I think that if you’re always trafficking in icons that have no significance—there’s no context, there’s no relevance, they’re just there—then the world will just be there. It will be like a child’s world. You know how, for small children, the world is a literal one: as soon as someone goes away they’re no longer there, and you look at the next thing; you look at the ice cream, or you look at the bird, or you’re on to the next immediate thing in your environment.
You haven’t been able to internalise anything, because you’re still at an early stage of your development where you’re just reacting, as animals do, to your immediate environment.

[0:36:11] Susan Greenfield: I fear that if we give people, young people especially, an environment where you’re reacting to an external, visual, literal input, then we might be denying them what until now has been the birthright of humans when they read books, which is the ability to have a longer attention span and, above all, imagination.

[0:36:30] Interviewer: Where screen interaction is literal, what are some of the impacts specifically on communication?

[0:36:47] Susan Greenfield: So, I think that when people are literally ‘rescuing the princess’, or playing a fast-paced first-person-shooter game, or even adopting an avatar persona in Internet videogames where you’re playing with a lot of other people continuously, this is in a sense taking you into a different world from the real, messy world—where there is the real you; where actions do have consequences; where things don’t cooperate; where things don’t feed back immediately. And that world is much more complex and messy and less satisfactory than the bright, fast, immediate world where you are important and special and, anyway, things don’t matter when you do them. It’s a very different world, and I think it’s very much the world of children. What really gets me about videogames is that it’s always, you know, intergalactic warfare, or princesses, or demons—fairy tales, almost, of a level that normally only a small child would read about, rather than the real world. And I think that by living in this world of magic and superhuman properties, one is overlooking what really makes a person a person. I can’t ever imagine, for example, Pride and Prejudice as a videogame—it would never happen—or King Lear as a videogame. No, of course not. So what I worry about is this simplified world which is fast, and where you can give, cosmetically and superficially, the impression you’re being very agile and clever by giving responses; but are you not actually missing out on the sorts of things that previous generations have gained from King Lear and Jane Austen?

[0:38:26] Interviewer: Does this also mean there’s been a decline in empathy as well?

[0:38:28] Susan Greenfield: I would also predict that, if you’re not rehearsing the skills that have been your biological mandate—look at a very small child: they will just stare at you; they don’t automatically have interpersonal skills. These are the things you learn as you play with other kids: you learn to interpret body language, eye contact, voice tone. If you’re not rehearsing those things, how can you be good at them?

[0:38:51] Interviewer: Don’t you think your approach is luddite? You know, technology has always been around, doesn’t that mean that things get better?

[0:38:58] Susan Greenfield: Some people have, obviously, inevitably… if you have any idea, people are going to criticise you. And that’s healthy and appropriate and good. One of the criticisms that has been lodged against me is that I’m just some old baby boomer, and that technology has always been there—look at the printing press and the television, and so on. So let’s have a think about that, because I’m sufficiently ancient to remember the advent of the television, which came on at six in the evening or thereabouts, or five, and went off at 11 o’clock; there were two channels—but most importantly, most importantly, the whole family would gather around one TV set. The TV was much more like the piano in Victorian times: it was the catalyst for family interaction and family discussion; it was part of a family activity. It wasn’t something you did in isolation in your bedroom. So when one talks about, for example, the TV, it was used in a very different way. It’s how it’s used. And similarly, when you think of other technologies—the printing press, the car, the refrigerator—they’ve all been means to an end. Books give you insights for real life, whether it’s fact or fiction you’re reading; they give you information and insights into how other people think and feel. Moreover, very few people read books all day continuously, whereas with the screen technologies people can be living on them for up to 10 hours a day—in theory you could wake up in the morning, do your emails, go dating, go shopping, learn things, go surfing; everything can be done short of actually eating food and going to the loo, those two biological needs. You could spend your entire life living in parallel, doing the things people do in the real world, but online. So this, for me, is very different. Whereas technology in the past was a means to an end, now it’s an end in and of itself. And that’s where I worry.
Obviously people will personally attack you, rather like in the ’50s they attacked people who dared to say that smoking might be linked to cancer; because if people are having fun and other people are making money out of them, the last thing you want for this happy relationship is someone saying, “Well, actually, is this right? Is there not a problem here?”

[0:41:11] Susan Greenfield: I think that we shouldn’t be complacent and assume that everything is just fine; we should question and think about it. And if in the end it turns out that it is all fantastic, then so be it—but we have to have the debate first.

[0:41:25] Interviewer: How does screen culture shake up our concept of space and time?

[0:41:29] Susan Greenfield: What’s very interesting is that, as we have lived our lives up until now, space and time have been the great coordinates by which everything is measured, everything is done. We have episodes at a certain time in our lives that occurred in a certain place in our lives; and the linking of these very specific episodes in space and time—especially for humans—is, if you like, our whole construct of reality. We know that very small children, and indeed animals, don’t have this very strong memory for very specific episodes. We know that if you damage the frontal part of the brain, which is underdeveloped in children, you get something called source amnesia, where you can have generic memories but you can’t place them in space or time. Now, my own concern is that in the screen culture you can of course go online at any time—there’ll always be people answering you—and space has shrivelled; I mean, it doesn’t exist anymore: you can talk to someone in Australia at any time of the day. So the normal constraints of having an episode in your life that was defined by being at a certain time of day, or in a certain place, are no longer there. Cyberspace—where is cyberspace? It’s somewhere you’re just out there. And I think that whilst in moderation that’s fine, if it becomes the default condition, then it might make you a little bit more like—dare I say it—the infant, or the nonhuman, living in a generic world that you don’t anchor in specific episodes of your life, because it’s all a kind of cyber-fuzz.

[0:43:13] Interviewer: Where do you see the development of technology headed? Transhumanism?

[0:43:16] Susan Greenfield: Of course, the interesting and important question we should ask is: where is this cyber-culture headed? I think we have to put it into a wider context: even the geekiest will have illnesses, get old, and eventually die. And what I find interesting, if we look at biotechnology, is how it is now transforming the generations in terms of what we look like, how long and how healthily we’re going to live, and eventually reproduction—we can all reproduce for longer now, and perhaps eventually everyone of any age, of any sexual orientation, will be able to have a child with genetic material extracted from any cell in their bodies. So I think there are some science-fiction-type scenarios just round the corner, beyond the screen technologies—even nanotechnology, where we can have smart devices within our bodies that perhaps breach the firewall that previously differentiated you from the outside world. So I think those technologies should be viewed hand-in-hand with information technology. What’s very interesting is so-called transhumanism: the more ambitious we become with manipulating small sensing devices, or small implants in the body, the more there are possibly the grounds for actually making you ‘better’, making you ‘superhuman’—that’s what transhumanism is. This was dubbed by one magazine as one of the most dangerous ideas. The problem with transhumanism is that it does beg the question: why? Why do you want to run faster than human beings normally run, or see beyond the visible spectrum? Is it because you want to be better? You want to outperform others? In which case, isn’t that a rather pernicious journey to embark on? Surely the better questions would be: Couldn’t you be more individual? Couldn’t you be more creative? Couldn’t you have better ideas than the other person? Not: Can’t you see something they can’t see? And I find it rather sinister, this rather primitive idea of one-upmanship rather than an emphasis on the individual.
And then, of course, these technologies are expensive, and we’d be facing a colonialism that outpaced the old colonialism, where you have the techno-haves and the techno-have-nots. So transhumanism isn’t necessarily going to ensue from the screen technologies, or screen culture; it’s there because people will always want to be special and better than other people. But I think we can have a culture where we don’t applaud it—where we applaud individuality instead.

[0:45:41] Interviewer: So you’re especially concerned about this distinction between the techno-haves and the techno-have-nots?

[0:45:42] Susan Greenfield: Of all my worries, I think the distinction between the techno-haves and the techno-have-nots is not one of my more pressing ones, because in terms of the immediate digital technologies this has actually had a huge benefit in the developing world. Mobile phones have done a lot, and one can now access many, many things on the Internet that would have been very hard to establish as physical infrastructure. So I’m not so worried about that. What I’m worried about more is the mindset that creates the demand for things—rather like when one looks at Bernays in the 1920s saying, “This is the torch of freedom”: that you wanted to say something about yourself by having or owning something. I’m concerned that perhaps in the future people will feel so insecure that, instead of putting a premium on being the wonderful individual that each of us is, they’ll just want to be better than other people; and I think that would be a very sad way to go.

* * *


The Erase Tapes by Jore, Tue, 13 Aug 2013 17:49:40 +0000



01. In Blur (03:41)
02. Lonely (07:15)
03. Just Afterwards (10:08)
04. Counterintuitive (06:20)
05. Snow Sneaking (Cold Both Ways) (06:52)
06. A Cautious Afternoon (08:52)
07. Glimpses of Warm (05:05)
08. Crumbly Cake and Eat It Too (06:51)
09. The Erase Tapes (10:41)
10. Feel of Poppies (05:59)

About this Album

Released 14 August 2013, independently.


The Erase Tapes is a record of slow, washed soundscapes and textures, representing a journey through the multifaceted aspects and emotions of what it means to be alive during the collapse of industrial civilisation. Written, performed and recorded by Jordan Brown over many months spanning 2012–2013. The cover photograph is a mash-up of work by Roksana Mical and Ken Rosenthal.

Anti© – Commons BY-NC-SA 2013.

Feel of Poppies, Sun, 14 Jul 2013 17:39:31 +0000



A short social commentary inspired by the reactions on social media to a character's death in the television series Game of Thrones at the time, while the real world burns. So which world do we live in and relate to most: the TV world, or the real physical world and our very own lives?


Words by A. Person. Original music, edits and talking by Jordan Brown. Additional footage taken from various sources with credits to respective creators, some of whom are Ron Fricke, Michel Benjamin, Dominique Gentil, Micha Peled, Patrick Forestier, Peter Mettler, Noah Weinzweig, Lukas Eisenhauer, Mårten Nilsson, Jean Counet, and Wikileaks. And all further credit where credit is due for unknown or unattributed creators whose work appears.
