At the time, in Helsinki, I was putting together a talk for Web Directions 2014 in Australia, which was a month down the road. Walking over the bridge to Aalto University and back over the next few days, this vision of a domestic landscape of 'haunted machines' started to form. Machines whose 'thingness' is hidden from us, either because they're dead or because we're not told enough and we become distant from them, assigning meanings to meaningless and impenetrable behaviours. This line of thought became the Haunted Machines keynote I gave at Web Directions and subsequently at KIKK and numerous other places. Later, with my friend and colleague Natalie Kane, it became the ongoing Haunted Machines project and conference, where we expanded the scope to include all sorts of haunted and magical goings-on in technology and design.
So, well, ten months later the video for the talk was finally released and I reckoned, since I used to do this and it always seems an effective way to begin discussion, I would write it up. I'm not going to go verbatim; this writeup will be different from the original talk (which you should still check out). Almost a year has passed and my interpretation has probably changed somewhat since I originally wrestled the idea together. I guess you could imagine this as... what if I had the same slides and was doing the talk a year later?
(post-writing note: I have edited a handful of things out for flow while writing.)
I'm an artist and designer from London. I generally work and think around an area called 'technopolitics' which is, broadly, the technologies of politics and the politics of technologies. Luckily for me, that encompasses most of human civilisation so I get a pretty wide range of stuff I'm able to talk about. I was going to come here and show some pictures of protests and be all 'oh my god, Facebook.' But today is Halloween, and I'm a huge fan of the horror genre, so I couldn't resist using it as the gimmick that runs through my take on the current, near-future weird of the connected world.
I'm not just throwing horror tropes around arbitrarily. It's about more than the visceral thrill of a good ghost story: I love the horror genre because it's about us, about our fears and misunderstandings. Horror stories are one of the tools we use to tackle the unknown, and it seems like we're surrounded by more and more of an unknown, a 'technological sublime', all the time. In such times, horror becomes particularly relevant.
So some time ago, around the time of the Snowden revelations, I started to realise my computer was doing lots of things that weren't apparent on the surface. It was having conversations with other entities and channelling foreign energies well beyond my control. When the motherboard started to pack in I took it to the store, only to find that the folk there could access all sorts of parts of it and perform all sorts of processes that I wasn't allowed to.
I realised that my relationship with it was less the kind of smug-and-snug work-buddy aesthetic promulgated by its manufacturers and more of an occult seance. My relationship with it is mediated by this ritualistic interface (that is, an interface which is, in itself, composed of rituals distant from their practical effect), both in the actual GUI itself and with the company that built it and the permitted level of literacy I'm allowed to have. The actions I and others perform around it aren't having any technical effect; the whole thing is a spectacle to distract from the reality. My machine is haunted by the ghost of my perception of it.
The idea of the 'ghost in the machine' is hardly new. This is a pretty cliched image to use, but the Mechanical Turk, more contemporaneously known as just 'The Turk' or 'The Automaton', was a seemingly mechanical chess player that at various times contained some of the world's greatest chess players. Crowds would gather to see the Turk play opponents as illustrious as Napoleon, all the while unaware that the Turk was in reality 'piloted' by a human hidden in the cabinet. To this day there are various competing theories on the workings of the Turk, and it's notable that Edgar Allan Poe, one of the horror genre's greats, began his method of 'literary analysis' with a piece deconstructing the workings of the Turk - Maelzel's Chess Player (1836):
No exhibition of the kind has ever elicited so general attention as the Chess-Player of Maelzel. Wherever seen it has been an object of intense curiosity, to all persons who think. Yet the question of its "modus operandi" is still undetermined. Nothing has been written on this topic which can be considered as decisive--and accordingly we find every where men of mechanical genius, of great general acuteness, and discriminative understanding, who make no scruple in pronouncing the Automaton a "pure machine," unconnected with human agency in its movements, and consequently, beyond all comparison, the most astonishing of the inventions of mankind. And such it would undoubtedly be, were they right in their supposition.
The Turk alludes to another cliche, Arthur C. Clarke's third law, which I probably don't need to bring out. There are various ways to interpret this law, but the simplest is that once technology attains a sufficient level of advancement, such that its workings are impossible for someone to mentally reverse-engineer, it may as well be magic. By this definition, magic is just technology (technique) whose effects are observable and understandable but whose method is unknown or misinterpreted: the technology loses its 'techniqueness.'
We can look to the Pacific Cargo Cults as an interesting artifact of this disconnect. These were particularly prevalent in the post-war period but are now largely just tourist attractions, as in the one above. During the Second World War, the US Army placed airbases all over the Pacific Islands during the campaign against Japan. They shipped supplies and food into these bases to support the forces based in the area. Inevitably, some of these supplies ended up in the hands of the islanders. At the end of the war most of the bases were shut down, the planes stopped coming and with them the supplies.
The islanders, in an attempt to summon back the supplies, began to re-enact and reconstruct the artifacts and rituals they had observed amongst the soldiers, resulting in scenes like this image - native islanders dressed in blue jeans, with 'US' painted on their chests, parading with carved 'rifles.' The islanders didn't understand the concept of a nation-state, let alone one with an army or flight. To them, this was magic, and so they performed the rituals they had observed to achieve the effect of food. The rituals carry none of the technique that actually brought in food, but to someone alien to the technology, that distinction is invisible.
Like I said, most of these cargo cults are just tourist attractions now and it seems somewhat ridiculous to us, but look at, for instance, this convincing Photoshopped advert from 4chan last year. 4chan managed to convince loads of (mostly) American teenagers that upgrading to the new iOS would make their phones waterproof. YouTube is littered with videos of teens dumping their iPhones in the sink.
This is funny and kinda stupid, but it speaks volumes about technological literacy in pop culture when folk think a firmware update makes electronics waterproof, or circumvents the popularly held understanding that metal plus microwaves equals explosions. A large part of this is the authenticity of the fake ads. They look convincing, they carry the same visual language - think again of the cargo cults and how it's all about perception and not technique.
So the first section of this talk is called 'Black Boxes and Demon Runes.' For those that don't know, this is a Lemarchand's Box from the Hellraiser series. In the films, a hapless and usually greedy Faust-wannabe solves the puzzle box to open a gateway to hell with... unenviable results. In the spirit of using the horror genre as a running gag, this is a stand-in metaphor for your phone.
In a tech-theory sense, what we're talking about here is the Black Box. Bruno Latour came up with this idea that basically as technology becomes more advanced, it becomes less and less knowable:
...the way scientific and technical work is made invisible by its own success. When a machine runs efficiently, when a matter of fact is settled, one need focus only on its inputs and outputs and not on its internal complexity. Thus, paradoxically, the more science and technology succeed, the more opaque and obscure they become.

So, for instance, you put 'kittens' into Google and you instantly get kittens, yet have very little understanding of how that happened. Nonetheless, Google is the number one source for kittens for a reason; it's very good at delivering on kittens.
But now it's more than just you putting 'kittens' in and getting kittens back. All sorts of other things happen: Google takes that data and uses it to figure out how popular kittens are (not hard), other organisations request that data and get it sent off, your searches and browser activity might be logged with ISPs, you might have spyware, the data gets fed back into targeted advertising. Basically, the Black Box is so black that you can't actually see that other people are taking part in the process, that there are other forces who do know the workings of the black box, who do have access to it and are using that to their advantage. The Black Box in your pocket is doing all sorts of things without your knowledge, request or explicit permission.
Here's a recent and pretty explicit example: in the Yosemite update, Apple shipped an opt-out feature that sent everything you typed into Spotlight back to Apple. This really should have been an opt-in feature, and of course Apple covered for it with the Glomar response of the tech world: 'improving consumer experience.' The point is that there is no way you would know about this stuff; it's buried so deep in layers of stuff we as consumers are either not allowed to know about or simply aren't told about. Plausible deniability-chic.
We now have to talk about the fact that there's a very different orientation and set of definitions around possession, ownership and control in the connected world. These three concepts are now very separate and very differently understood and legislated for. The 'modern' world of physically mass-produced objects where possession was nine-tenths of the law and those things were largely bundled together is dead - the value of consumers is now in data, not in objects. And that has shifted these principles around.
A classic example, in the instance of Apple and Amazon, is eBooks: you don't actually own an eBook from Amazon. You own a licence to access an eBook. Now, on an everyday basis this isn't such a notable distinction. You can't easily lend it to your friends or peruse it on a bookshelf, but we're kinda used to that now. It only really comes up if, for instance, you go to another country where the book is banned (where it suddenly disappears from your library) or it is stealth-edited by the legal owners. Here you have purchased a licence for an eBook that you don't own and ultimately can't control except within the architecture of the platform you've signed up for. It isn't yours outright; you merely possess it.
The reason why so much of this granular detail is lost on us is that most of it is buried deep in the terms of service and terms and conditions of these services and platforms. This is an example of a bike rental place in Canada that wanted the user to read through 128 pages of terms of service before they were allowed a bike. It's well known that we don't read this stuff, and that's resulted in many amusing anecdotes. Now, this is probably largely a UI problem someone could solve if they took the time, but no-one has. And there's a reason for that. This legalese is where the power dynamics are established, where your position and legal rights as a consumer are laid out relative to the supplier and the legal jurisdiction of the state. Ultimately, this legalese is the language of power and the place where that power establishes its legitimacy. So of course it needs to be protected, even if that protection is via the stage magic of spectacle to distract from the mechanics of the wider sociocultural system - the terms and conditions.
This is one of the classic horror tropes, particularly memorable in the Scream series and regrettably killed off by Caller ID: the victim receives a threatening call from the murderer, but they're calling from inside the house the whole time. And yes, I'm using this to talk about the 'smart home', ok?
Earlier this year a couple of infosec folk managed to replace the firmware on a home printer with the game Doom, over wi-fi. It's just office stationery; so what? But it exposes a massive flaw in the Internet of Things rhetoric: the system upon which we're ostensibly meant to be trusting our lives as biological animals is fundamentally untrustworthy. What if this wasn't just a printer but your car or fridge? Things that could actually be used to kill you. Or something more prosaic: a disagreeable and obnoxious toaster? When I talk about 'haunted machines' that begin to perform inexplicable and impenetrable actions without our control, this is a prefiguration of that.
So we have to revisit that Arthur C. Clarke law, which, you know, is over fifty years old at this stage, and rephrase it for our specific situation here: any sufficiently advanced hacking is indistinguishable from a haunting. In the same way that many Internet of Things objects are referred to as 'enchanting' or 'magical,' with an intervention they can very quickly become haunted.
Nest, which is kind of the poster child of this future, has been hacked a couple of times now, with all the good humour of hackers everywhere. What's more significant is that last year marked the first time a Nest was used as a zombie node in a DDoS attack. In the same way that these devices can be used to haunt you, they can be hijacked and used to haunt others without you knowing. Think again of that modified Black Box, connected to all the other black boxes, and none of us (bar the privileged) able to see what they're talking about.
Facebook pulled out the 'improving consumer experience' card when they were caught with their hands in the till earlier this year. Turns out they were conducting psychological experiments on users. Now, again, they got away with this because, medical ethics aside, it was all laid out in the terms and conditions of use. But being gaslit by your social network, through which you interpret and visualise the world, is always going to come at a human cost. Facebook and others like it are the systems upon which the whole narrative of 'Smart' is based. If Facebook can psychologically experiment on you then don't even doubt that your smart home will; psychological experimentation will be boot-loaded into the operating system.
The Smart Fridge is like the great resounding myth of technology. It's the kind of proving ground: once we have the smart fridge, then we have 'smart.' But this is a device you need to live. At the moment only you can access your fridge. You decide what food to put in it, what food to take out, and when. It's your fridge. Think of those eBooks - there's a sheer insanity in relinquishing that control to a company that is going to decide what you eat and how you live based only on alienated data. I'm going to touch on the existential elements of this later, but in our haste to smart everything we're forgetting that, given the choice of people you'd trust to keep you alive, in most cases you'd probably pick yourself first, and that's fine and it works.
The friendly ghost is kinda the flipside of the haunting poltergeist. Think here of Casper or Obi-Wan or Patrick Swayze (who was a very friendly ghost.) The friendly ghost is usually that unnoticed, unremarked-upon presence that you only really notice when it's not there, like a silent guardian or helper.
Nest again is a great example of this. It's a truly awesome idea - this device that learns your habits to best optimise your energy use. Good for you, good for the planet. But my problem with it is one of existentialism. Much like the fridge, looking after your home and your environment is really your problem. Nest offsets the existential burden of being human onto an algorithm. Sure, that's probably more efficient than you constantly running back to your thermostat with incremental degree changes but you then lack the crucial awareness of how everything you do impacts the world. You're laying that burden on a machine to make your life easier and guilt free. On some level, if we're going to be better humans then we need to practise being better humans, not invent machines that do it for us. In a purely hauntological interpretation, we are haunting Nest.
Incidentally, Nest just released this 'Works With Nest' promo. They're kind of doing the Internet of Things right in building partnerships and collaborations rather than just doing takeovers, even though they themselves are owned by Google now. Otherwise you just end up stuck in some horrifically proprietary architecture that has full run of your house.
I just like the idea of this house where the smoke alarm goes off and in your bleary-eyed semi-woken state the walls are flashing red with blood.
More importantly, it transpired that Bloomberg Businessweek reckoned 95% of cash machines were running on XP at the time. This resulted in a massive scramble to update the operating systems of cash machines, which would suddenly stop receiving security updates. Unsurprisingly, the same thing had happened a few years previously with Windows 95 as well. This is a classic example of where progress and advancement force out something that was actually working quite well, but also where the web of technological, legal and financial dependencies goes into brief conflict with itself after years of general amenability. This exposes another problem with the connected vision: objects like smart fridges, thermostats or whatever work on two cycles - your own life cycle, where you replace white goods maybe every five to ten years, and then business models that expect new products, services and obsolescence every one to two years. This web of dependencies is not suited to keeping you alive; it's suited to spectacle and profit. We rely on friendly ghosts that might be snatched away at any time.
HAL 9000 has kind of become the benchmark of how we think about the visual aesthetic of machine intelligence - the sinister glowing eye of the machine represented in the film 2001: A Space Odyssey. (Clearly no-one at Nest had ever actually seen 2001.) 2001 is often pointed to by designers as just an incredibly well-designed film that really considered how these interactions with machines and devices would start to look, and it was amazingly prescient in many ways.
Moon follows 2001 in this tradition. It's really hard to design a machine intelligence without paying respect to HAL 9000, so it's to Duncan Jones' credit that he managed to slightly prefigure the popularity of emojis with Kevin Spacey's voice. GERTY uses emojis because we get so much information from faces, and emojis are specifically designed to communicate quite complex subliminal meanings in a simple symbol, making them a lot more effective than the written or spoken word. This was at the root of some of the sinister quality of the disembodied voice of HAL 9000. It's impossible to interpret the meaning of what HAL is saying without a face to go with it; it's just a voice.
GERTY obviously went on to inspire something of BAXTER, the production-line robot. BAXTER is endowed with simple facial expressions in order to make it seem less sinister and more human to other workers on the production line. (Less sinister?) The truth is that BAXTER is still operated by engineers who understand its inner workings and code and understand how it really works. The workers next to it are presented with The Turk - the spectacle of a human-like presence that cheats them of the reality of the machine. Whilst probably fucking terrifying them. BAXTER presents a disparity between reality and reality-as-experienced. I'm heavily reminded here of the Chinese Room problem - how do you get two groups, with fundamentally different languages, to talk to each other? At some stage there has to be a concession, and so far the machines have made most of the concessions, building a picture of machine interaction which is predicated on a constructed facade of communication.
Paro is another example of this disparity. Paro was developed in response to Japan's ageing crisis. For years, Japanese robotics companies worked on the rich market potential of care robots for elderly people and in the process created a whole host of appalling monstrosities. The only one to achieve real success was Paro.
Paro serves no practical function except companionship. It purrs when stroked, vibrates, warms up, wiggles around and basically simulates the effect of being a real seal pup. This has remarkably therapeutic and calming effects on the people with whom it interacts, but it is, essentially, an algorithm in a seal suit. Think back to Nest: what happened that, instead of encouraging greater human interaction or even working with real animals, we invented a robot to bring emotional joy to others? There are all sorts of circumstances where this is a good idea and, like I said, it does have positive effects, but we're essentially lying to old people. A lot of the value of the experience is based on the assumption that it's two-way, that this seal genuinely enjoys your company. It doesn't; it's an assemblage of sensors programmed to respond in a certain way - a disparity between reality and reality-as-experienced in a cute furry object - a constructed fantasy.
Our robots are starting to come in all sorts of exciting forms now, far beyond that image of the 'robot' we might find in the classic literature. Horse eBooks was this incredible bot that scraped eBooks about horses and just blurted them out on Twitter. If the Internet did poetry, this would be it. It was, and is, a hugely important cultural artifact.
Except that it wasn't a bot. It was a guy from Buzzfeed. When Horse eBooks stopped tweeting, he did a load of interviews and people were pretty annoyed and somewhat bemused. People loved this thing. But it goes to show you how the interface can deceive. How this twisted, mutilated language presented by something claiming to be a bot convinced us that this was the Internet talking to us, not Jacob from Buzzfeed.
There's something about the fidelity of how the machines talk to us here as well: whether we assign faces or interpret machine babble from carefully constructed click-bait, we bundle in expectations and meanings in order to make that concession to incommunicado work.
It's this ease of deception and unaccountability that forms the basis of JTRIG's Gambits for Deception - a guidebook for spies on how to deceive and emotionally manipulate people online. Think back to our haunted machines: just as these platforms and services can be used to bring you great joy and happiness, so too can they be used to manipulate and destroy you, as has happened uncountable times at the hands of individuals, and probably states and corporations. The distance that the web allows us doesn't just make us all dogs on the Internet; it makes us all the potentially gaslit, and potential gaslighters.
This is the fascinating story of the robot cold-call service. Essentially, this cold-call line uses voice recognition software to generate responses to people it speaks to. Except it's only got a limited number of pre-recorded responses so it's quite easy to force it to loop and repeat, as shown in the clip.
But. But the reality is, it's not a robot call centre. It is in fact a real person sat behind a keyboard with a set number of responses they can push to activate based on what they hear you saying. You're talking to a real person through the mediation of a limiting machine. Think back to Paro: it's pretty sad that we're at a stage where machine-mediated human contact is preferable to just talking to each other, even if it's for something petty and annoying like cold-calls. And, again, it's indicative of our cultural assumptions that we hear these looping responses and assume 'machine!', or see mindless twittering on horses and think 'machine!', whilst celebrating one and condemning the other. We're haunted by these assumptions into twisted and deceptive relationships.
Not really a horror trope so much as a Queen lyric, but I've been talking a lot about the confusion of realities and I just kind of wanted to bang on about it a bit more because it's important.
I found out that 26% of Ikea catalogue images are renders. This is hardly surprising - CAD, CGI and simulation technology is so good now that it's far cheaper to hire some folk to knock up a controlled environment on screen, out of which you can make your perfect image, than to spend time hiring studios, getting a crew together and taking the time to do a shoot. But I find it a weird disparity that, more and more, the world is filled with renders that we're meant to aspire to, with imagined realities that were never real in the first place - with second-order simulacra.
So I want to revisit Clarke again and say that any sufficiently advanced render is indistinguishable from reality. In other words, any VR, 3D render or CGI effect done to a sufficiently high degree that the seams become invisible, is indistinguishable from stuff that actually happened. This becomes particularly important when these renders are perceived as realities in situations where that distinction is vital to wider context.
Here's an example. This was one of the earliest and most prevalent images of a drone, pasted everywhere when the mainstream media began to talk about drone warfare. You can search Google Images and it still comes up pretty high in the ranking. But after a bit of investigation, the artist James Bridle discovered this wasn't a drone at all but a rendering of a drone. It wasn't done for any sinister reason, there was no deception intended; the creator was simply an avid 3D artist. But on the Internet, devoid of context and distinction, it became a real drone for the purposes of images of drones. It became an objective reality of a thing that never happened, a drone that never existed, never flew, never went over those Afghan mountains.
When we're talking about something as contentious as drone warfare, the use of unrealities to talk about potential truths is problematic. This image was never presented as legal evidence (as far as I know) but it serves to construct a reality in the minds of those who encounter it which, even if only in a slight way, is false.
I'm not entirely sure why this is relevant but it's too good not to talk about: the amazing subculture of folk who construct computers in Minecraft. Minecraft is such a thing of the network, all based on voxels that do simple things according to dependable rules with very little in the way of variation or randomness. This is a 16-bit computer with a 128-byte hard drive. These things really advance at the speed of Moore's law as well, and while looking at this (a lot) I came across a load of folk who want to build a computer good enough to play Minecraft, in Minecraft.
The other thing about these is how beautiful they are. If the Internet had a physical landscape, it would be Minecraft computers.
We often underestimate how realities-as-imagined start to shape realities-as-experienced. A good example of this is the entanglement between pop culture, particularly science fiction, and technology. Minority Report is a prime instance of this. Similarly to 2001 and Moon, it had a great team of designers who really thought about and considered what kind of technologies and interfaces might populate the world and how they would work. Now, despite the fact that Tom Cruise got pretty significant injuries from using his hand-wavey interface thing, it inspired a whole generation of UI designers to make it a reality. (If you build it, they will come.)
And it went on to inspire over a decade of tech-journalism. Ten years of CES fairs of almost Minority Report technology. (Not to mention the huge military-industrial investment in pre-crime software.) This imagined reality was constructed almost 15 years ago and yet technological innovation is going toward it. These visions have a powerful hold over those who go on to shape the world. When a reality-as-imagined is so good, so convincing, so goddamn tasty, people will try and build it.
And this complex entanglement between realities is the reason why when I see Google's Street View camel dutifully documenting the Abu Dhabi desert...
...I can only wonder how much of the inspiration for it came from Dino-Riders. We all grew up on the same stuff, and form has a tendency to repeat itself.
So how do we start to challenge these ghosts and visions and hauntings? How can we start to expose and understand the inner workings of the machine? To see past the spectacle and the magic and into the gears of the automaton?
Well, language is pretty important. I talked earlier about how jealously the legalese of terms of service is guarded. That's where the power happens. But even an understanding of how words are used to obfuscate opens us up to a critical view of the systems at play. The 'Cloud', for instance, this mythical haven of data, is actually a shit-ton of massive server centres using 2-4% of the world's energy, all heavily guarded behind private armies and propped up by a surveillance state. Question your ownership of the cloud - it's limited, and you give away a lot when you trust a company which has a vastly different plan for you than you do.
Some folk won't even use the term 'Internet' any more. It harks back to the image of a global village of connected horizontal democracy that the early hippy pioneers dreamt of. The series of increasingly expensive walled gardens where we're spied on isn't 'Internet', it's 'Shitnet.'
We can begin to grasp the physicality of this stuff. This is my website, it's in a building in San Antonio. All my stuff is in there being looked after by people I've never met and don't know. A friend of mine calls it my horcrux.
And it's only one part of the vast piece of giga-infrastructure that constitutes the Internet. There's no magic here, just a lot of engineering.
And that super-structure leaves its scars. Vast open pit mines like this one. The stuff that comes out of here, the copper and the gold, goes into your phones and laptops. It leaves huge physical scars on the surface of the planet which are directly connected to the very digital things you build...
...and I'm tempted to say it's worth it. The very stuff of the network - a cat, on a Roomba, in a shark costume, chasing a duck, uploaded to YouTube and watched millions of times. All that from an open-pit mine.
A lot of the stuff I've talked about here has been sinister, dark, alarming and sometimes downright scary. The Internet is scary, a lot of the things happening on it are terrifying. But there are no ghosts. We are them. We are the ghosts. It's just that we're so separated by fallen curtains, interfaces, literacy and distance that when we see the curtain twitch or something slither out of the corner of our eye, we call it a ghost. It's how we've dealt with the unknown for 20,000 years. The Internet is just another ghost story so far. But behind every interface, every emoji, every robot is just another person.
I'm also deeply aware that I come to a conference of web devotees as a jaded cynic. Rest assured, I love the Internet and some of the stuff here has been completely amazing and awe-inspiring. But I must leave you with one very dark thought before we all start drinking. I am sorry. But: you control the perception of objective reality.
You control the perception of objective reality. It's a meaty and difficult one: the people who visit your apps, your websites, your services and platforms construct their understanding of the world through them. And so you have to figure out where you sit on the sliding scale between making magic, which is a fun spectacle that might bring people joy but also potentially horror, and making sense, which might not be as enchanting but is, in some ways, more real.