When god wishes to punish a man he first deprives him of reason.
I really don’t know what to write about this week. Normally I start and then it all sort of falls into place, but I’ve been feeling scattered and tired and I’m not sure I have it in me. I’ve watched a lot of horror on TV, which is a nice escape; have you got any good recommendations? I’ve been enjoying the Netflix haunting anthologies. I’m doing a lot of cycling this weekend: overnight to Whitstable and back, which is around 200km, then a club ride of 80 on Sunday. I’m just looking forward to those, to be honest. Is escapism a mark of age?
I’m sure you saw two blunders this week from the UK government in their inability to qualify the arts: first, a skills-identifying quiz that decided everybody should be a boxer, and then a quickly rescinded campaign suggesting ballet dancers retrain in ‘cyber.’ There’s nothing I can write that hasn’t been said elsewhere and better, but I worry that in this crisis we lose sight of value. I’ve seen designers and artists fall into this trap recently too, extolling the financialisation of education and design and positioning graduates as arch-capitalists who should be ‘hustling.’ The financial dimensions of design and academic practice are real but are not ends in and of themselves. You need to understand them, but only so that you can move through and change them. Competitiveness is only going to be more and more encouraged; government believes in herd immunity for culture too.
Here’s Sony’s own teardown of the PS5. I actually uttered ‘Christ, look at the size of those fans.’
I was in Otford on the bicycle last week and came across Uranus in the village’s scale model of the solar system. I had no idea this thing existed: the model is based in Otford but its furthest extents are in New Zealand and Los Angeles.
Rain on a window has this strange effect of collapsing sound and space. You can’t hear dogs and sirens and traffic anymore – the things that give the city scale and make you feel small in it. There’s just the immediacy of rippling and tapping water on the window, as if there’s nothing and no-one else at all.
Sorry, this is another one in the ‘I got too lazy to stop’ type of post. Thank you, 148-ish weekly readers. If you like this, let me know; if you don’t, keep it to yourself. [The first bit here is a bit introspective, self-aggrandising, pathetic and reflexive, though it does talk a bit about how I go about making these sketches. If you’d rather some maths and a fun speculation, skip to part 2.]
I Can’t Keep Up With My Dissatisfaction
I was watching Grayson’s Art Club last night. I really like it in the same way that I like watching videos of animals being rescued on Facebook; it’s sort of meaningless but it makes you feel good. It’s probably quite staged, or at least misrepresented, but it feels intimate and real and I’m perfectly happy to shut down my critical faculties to smile a bit. Anyway, he was talking about flow and where to start and where to stop when making art, and I was disheartened at how satisfying he finds that process and those questions. I find it endlessly frustrating. I’m not trying to claim these sketches as art, so perhaps the comparison is moot, but I get to this horrible point every week where the idea of the sketch starts receding into impossibility. Like this week: I started with a really simple premise, ‘What would a frozen explosion look like?’ Not a bullet-time Hollywood thing, more like: what if someone froze an explosion and put it in a museum?
I was inspired by the images of the Challenger shuttle disaster, which are almost like a museum piece; a moment in time captured by dozens of cameras from dozens of angles. It’s profoundly sad for many reasons but it’s a spectacle in its sadness. The subject also provided a technical challenge, which was to use Blender’s new Mantaflow smoke simulator to try and get as close as possible to a specific explosion. So, I began by endlessly reiterating the simulation; running it, tweaking, running it, tweaking, running it ad infinitum. Most of my Saturday morning was just frame-ing (I’m using this word I just made up in the same way that you might say ‘inching’ but moving a frame at a time) through keyframes to get the effect as close as possible.
There was lots of new stuff to figure out here. Mantaflow is quite different from Blender’s previous in-house system. It’s faster and more accurate but needs more care to get there. Then I wanted to get the lighting right, so I imagined it would be artificially lit, with blue indirect sky light and yellow-white from the sun. So I set up area lights and positioned them, and then modelled big stadium floodlights. Then it needed something to control it, so I put in rods to hold it up and a server to keep the fire ‘warm.’ Anyway, it goes on and on, and after six or seven hours I just realise that the point where I’m happy is only getting further away. And that’s when I choose to stop and send it to render. And it’s the same every time. I’ve never reached the fabled ‘yeah, I’m happy with that, time to stop.’ I give up at ‘I can’t keep up with my dissatisfaction.’
It’s a feeling I get a lot in other things too; that I can keep working forever and never get it right, so I stop there. It’s the same, for example, with hard books. If after a few chapters I’m still in ‘no, I don’t get it’ I give up because I know there’s just more to re-read if I keep going. The acceleration of my not-understanding outpaces the possibility of my understanding and there’s only so much damn time.
Anyway, this all made me think about another accelerating relative cost: the energy required to accurately simulate energy.
The Solar Cost of Realism
This week’s digital sketch was produced in Cycles, which is Blender’s ray-tracing engine. As a quick explanation of what those words mean: Blender is a 3D rendering program that can model and simulate things in 3D. You can use it for architecture or 3D printing, but it’s best geared for images and animation, which is what I do. Blender has a couple of engines, which are basically the ways it runs certain processes. An engine is a good analogy actually; it’s similar to having a car which you can switch between electric and diesel. It moves forward, but the way in which it does that is different and has different qualities. (This is a bad analogy in that electric engines are clearly superior to diesel engines, while the different engines in Blender serve different purposes.)
Last year, Blender released Eevee to great excitement, and rightly so. Eevee is an attempt to compete with Epic’s Unreal engine, which is basically what a load of video games and an increasing amount of cinematic special effects are made in. I’ve been playing around with Eevee since its release last summer and all the sketches I’ve done here have been made with it. The difference between Eevee and Cycles is simple: Eevee is quick, Cycles is accurate. How they each manage that is a bit more complex, but not too complex:
Cycles uses ray-tracing. This is a process where the engine attempts to trace the path of all the light rays from all the light sources as they bounce around all the stuff in the scene and end up in the camera. This means you get a greater degree of accuracy the more sampling you do: the more time you leave the engine to iterate over and over again on the rays being traced, the more accurate the lighting. This week’s rendering used 256 samples; each pixel has been checked 256 times for the right colours based on the light bouncing around the scene. This takes about eight and a half minutes per full-HD frame. That is emphatically not real-time rendering, when you consider that a frame is about a twenty-fifth of a second on a good day.
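As a toy illustration of the sampling idea – this is emphatically not how Cycles is implemented, just the statistics behind it – here’s a sketch where a pixel’s brightness is estimated by averaging many noisy one-ray samples, and the error shrinks as the sample count grows:

```python
import random

def shade_pixel(samples):
    """Estimate a pixel's brightness by averaging noisy samples.

    The pixel's 'true' brightness here is 0.5; each sample adds random
    noise, standing in for one randomised light path bouncing through
    a scene. More samples -> the average converges on the true value.
    """
    total = 0.0
    for _ in range(samples):
        total += 0.5 + random.uniform(-0.4, 0.4)  # one noisy path estimate
    return total / samples

random.seed(1)
for n in (1, 16, 256):
    estimate = shade_pixel(n)
    print(f"{n:3d} samples -> {estimate:.3f} (error {abs(estimate - 0.5):.3f})")
```

One sample can be badly off, 256 gets close, which is why render time scales roughly linearly with the sample count you ask for.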
Eevee is a real-time (ish) rendering engine that uses light-mapping, AKA rasterizing, AKA direct lighting, instead of ray-tracing. Instead of tracing the path of the light sources, the engine sort of ‘paints’ things like shadows onto the surfaces receiving them and generates a flat image (or ‘map’) of the colours of those surfaces. This is how video games generally work, although contemporary hardware is getting pretty good at real-time ray-tracing. The downside of this process is a loss of accuracy; reflections, bounces, indirect lighting, ambient light and other incidental artefacts of physical reality are very difficult to get in real time. But a frame takes about 20 seconds to render.
Unreal is a remarkable exception to this law of cost versus accuracy: they’re very close to closing the gap between accuracy and speed, as the Unreal 5 demos and the engine’s use in recent on-set special effects show. So, while tapping my feet for 58 hours waiting for this week’s sketch to render, I wondered about the relative costs. How far do we go for realism? When does realism actually start to tug at reality in the cost of producing it? How much light do you need to produce realistic light?
Well, Cycles runs off a graphics card and I can calculate, to a reasonable degree of accuracy, the power that card draws…
The NVIDIA software on the PC shows it averaging 67.5% of its 150-watt maximum power: 150 x 0.675 = 101.25 watts.
The rendering is taking about 8 minutes 30 seconds per frame for 415 frames: 8.5 x 415 = 3527.5 minutes or 58.791667 hours or 58 hours and 47 minutes.
So, in kilowatt-hours, that’s 101.25 x 58.791667 = 5,952.7 watt-hours or about 5.95 kilowatt-hours.
How much power a solar panel produces is 🤷 dependent on a bunch of stuff, but seems to be in the range of 250-400 watts. If we had continuous full sunlight for those 58.8 hours, a single panel would produce 14.7-23.5 kilowatt-hours, and this render would use roughly a quarter to 40 per cent of it. At the moment the UK is getting around 11.5 hours of sunlight a day but… it’s Britain. On a cloudy day, solar panels operate at approximately 10-25% of their rated output. So…
Best-case scenario: 400 watts x 0.25 x 11.5 hours = 1150 watt-hours or 1.15 kilowatt-hours.
Worst-case scenario: 250 watts x 0.1 x 11.5 hours = 287.5 watt-hours or 0.2875 kilowatt-hours.
Averaging those two scenarios gives 0.71875kwh per day.
It’s going to take (58.791667 / 24) = 2.4496527917 days to render. 0.71875 x 2.4496527917 = 1.7607kwh produced by our imaginary solar panel in that time in British weather, while we need 5.95kwh. So we’d need three or four solar panels just to run the GTX 1070 to produce this week’s digital sketch. This obviously doesn’t include any other part of the computer, the screen or my flat.
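For anyone who wants to poke at the sums, here’s the whole back-of-the-envelope as a short script. The panel wattages, utilisation figure and cloudy-day percentages are the rough guesses above, not measurements, and it takes a straight average of the best and worst solar cases:

```python
# Back-of-the-envelope: GPU energy for the render vs. one solar panel's output.
GPU_MAX_WATTS = 150          # GTX 1070 maximum board power
GPU_UTILISATION = 0.675      # average reported by the NVIDIA software
FRAME_MINUTES = 8.5          # render time per frame
FRAMES = 415

render_hours = FRAME_MINUTES * FRAMES / 60
render_kwh = GPU_MAX_WATTS * GPU_UTILISATION * render_hours / 1000

# Solar guesses: 250-400 W panels, 10-25% of rated output on a cloudy UK day,
# 11.5 hours of daylight.
DAYLIGHT_HOURS = 11.5
best_kwh_per_day = 400 * 0.25 * DAYLIGHT_HOURS / 1000
worst_kwh_per_day = 250 * 0.10 * DAYLIGHT_HOURS / 1000
avg_kwh_per_day = (best_kwh_per_day + worst_kwh_per_day) / 2

render_days = render_hours / 24
panel_kwh = avg_kwh_per_day * render_days   # one panel's output over the render
panels_needed = render_kwh / panel_kwh

print(f"Render: {render_hours:.1f} h, {render_kwh:.2f} kWh")
print(f"One panel over that time: {panel_kwh:.2f} kWh")
print(f"Panels needed: {panels_needed:.1f}")
```

Swap in your own card’s power draw or your local daylight hours to see how the panel count moves.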
So Cycles is terrible for the environment and I’m never happy anyway, so it’s back to Eevee next week. At least it doesn’t put a big dent in the planet to use it.
Short Stuff
I already opined about the ‘oh my god they kept all the health data in Microsoft Excel’ story over on the hateful Irish microblog. Listen, what did you think big data was, exactly? It was always just bigger Excel files. In parallel, this story about Palantir’s upcoming IPO came out recently. The short version is that Palantir basically makes nice interfaces and isn’t actually that smart.
I loved the new Watchmen series. I don’t know if that’s the right thing to say or if it’s problematic or not, but I just thought it was so fucking smart. I’m sure someone will tell me why I’m wrong.
OpenAI continues to undermine its credentials, its mission and, you know, its actual name by licensing GPT-3 exclusively to Microsoft. (I made a promise, which I’ve definitely told you about before, to stop spending energy being outraged at hypocrisy, so this is just an FYI rather than a ‘can you fucking believe it?!’)
I’ve been spending a lot of time on the proper bike and not much on the fixed gear; I got on it the other day to go into town and was shocked at how hard it was to push along, for something that for the last fifteen years has been as natural as walking. Anyway, I’ve definitely posted this before, but here again is Ana Puga’s Hotline, which I always turn to for inspiration. She’s an incredible mover; proper ‘I’m a leaf on the wind’ stuff. Which is nice to see again after how janky men’s racing has been over the last few weeks.
I’ve been trying to call people I haven’t spoken to in ages just to chat. Lots of folks have reflected on how the various states of lockdown have given them perspective and I realised that I don’t spend time talking to the people who inspire me enough. Anyway, if you’d like to just have a chat, let me know. I’d love to chat with you. If not, I’ll write you again next week. Love you as always.
I like waking up in the dark. It feels like I have a secret few hours all to myself where no-one knows I’m awake, just laying out the seating. At about 0731 every morning I can hear the squeak of my neighbours pulling up their blinds, and from then on the whole orchestra of the city begins to warm up: I can hear the muffled mechanical noises of kettles and toasters being tuned, windows glint as they are swung back and forth to get the pitch of light and heat just right and (somewhere, presumably) the conductor shakes out their arms for another performance.
Hardware Lottery
The lamentable failure and forgetting of the sodium vapour process that I mentioned last week has stuck with me as a metaphor. As a reminder: sodium vapour lighting was a doomed method of post-production that was better than blue or green screening for many decades. The exact colour of the sodium light is easily isolatable in post-production, making it easy to matte in special effects. It was most famously used in Mary Poppins (1964). I don’t know if cinema would have been radically different had Disney not held on to it so tightly and instead allowed it to be used, experimented with and iterated on. But it’s worth entertaining the idea that the processes, pipelines and even the words we take for granted today in visual image production might have been different had an entirely different process taken root.
‘What if things had been different?’ is a well-known sub-genre of speculation, but one that’s not as explored as future-oriented work. Sascha Pohflepp’s Golden Institute and Tim Clarke’s High Speed Horizons are both great examples of where a different social or political decision results in different hardware. In Pohflepp’s case, Jimmy Carter wins the 1980 US presidential election and begins an agenda of innovation in environmental technologies. In Clarke’s alternative present, commercial flight hasn’t evolved from military-industrial innovations but from other places. Phillip Ronnenburg’s Post-Cyberwar Series, another great project, explores an Internet run over TV radio waves and a GPS system based on seismic sensors.
Often what is total serendipity is chalked up after the fact to some sort of inevitability. The story of sodium vapour lighting isn’t told any more because it didn’t take hold, even though for decades it was the easiest and most accurate matte process. Instead, in stories about plucky underdogs like Industrial Light and Magic, blue screening is narrated as a sort of inevitable innovation that we now take for granted everywhere, while at the time it was a niche and unlikely successor. Katrin Fritsch has written about this tendency to post-rationalise serendipitous advances as inevitable, or as part of a myth of progress, in machine learning research. In her interpretation it’s a way of blending the often unachievable hype with technical reality – a sort of ‘yeah, that was what I intended all along’ – catching the falling glass of innovation at the last minute.
This idea of chance and rationalisation came up a lot this week. A morbid conversation on opportunistic technical relations with Mrs Revell the elder and Mrs Revell the younger the other day turned to suicide, and we discussed how the change from coal gas to natural gas in late-sixties Britain led to a sudden drop in suicide rates: coal gas, once used in domestic ovens and heating, is significantly more lethal than natural gas because of its high carbon monoxide content. Researchers exploring the near-halving of the suicide rate (from 5,714 to 3,693 a year between 1963 and 1975) concluded that ‘means reduction saves lives.’ In other words, opportunity was as much a driver as determination. Mrs Revell the younger cemented this idea with the Dorothy Parker poem Resumé:
Razors pain you;
Rivers are damp;
Acids stain you;
And drugs cause cramp.
Guns aren’t lawful;
Nooses give;
Gas smells awful;
You might as well live.
Dorothy Parker – Resumé
All of this wraps up nicely with a paper I found from Sara Hooker (I can’t remember where I saw it, sorry) called The Hardware Lottery, which examines the outsize role that hardware plays in artificial intelligence research. Hooker proposes that the serendipitous nature of working with whatever is easily to hand is as responsible for advances in AI as anything else. Importantly for Hooker’s argument, research directions are pursued not because they are superior or more promising but because they are more technically feasible with the tools to hand.
For example, she writes of how the Graphics Processing Unit (GPU), a niche tool developed for games and 3D graphics, just happened to conform to the specifications of computation that AI researchers had been clamouring for. Machine learning in the late 20th century was a small niche within the niche of AI research, but the serendipitous arrival of the GPU catapulted this subfield to the top of the research agenda. Until that point, researchers were chaining together Central Processing Units (CPUs) to get the level of parallel processing needed. Over the coming decades, the GPU increasingly took the lead in innovation: Hooker describes how in 2012 it took 16,000 CPUs to do what four GPUs could do in 2013.
Of course, the advances in machine learning aided by GPUs had a similarly serendipitous effect on GPU manufacturers. NVIDIA, the prime mover in this space – serving games (the world’s largest media industry), crypto-currencies (a hip new fad the teenagers are into) and AI research – dominates commercial computation, particularly with its recent acquisition of ARM, even if Wikipedia still refers to it as a ‘video game company.’ This domination has seen journalists call for the replacement of Moore’s Law – the notion, named for Gordon Moore, co-founder of Fairchild Semiconductor and Intel, that the number of transistors on a chip doubles every two years – with Huang’s Law (named after NVIDIA CEO Jensen Huang), under which the power of GPUs doubles every two years, although there’s more nuance to it than that.
I want to highlight that Hooker’s paper also acknowledges that though these advances are exciting, they are running into limits. The software architectures designed for GPUs are still incredibly energy-inefficient and expensive compared to the human brain, which ‘runs on the equivalent of an electric shaver.’ Hooker suggests that though technical advances are heavily influenced by the convenience of available hardware, when that hardware landscape is too homogenous it prohibits new serendipitous advances. If the biggest discussion out there is machine learning, GPUs and AI, what else are we precluding? Are we hitting the limits of what machine learning can actually usefully do? Huang’s Law can continue to accelerate GPU power, but what if there are radical and untouched methods to be explored, sitting where machine learning was 30 years ago? GPT-3, which has people all in a tizz, cost 12 million dollars just to train and it’s still racist and a bit rubbish. We’ve had semi-convincing nonsense-generating text machines since the Oulipo and GPT-3 doesn’t really do more than that. As Dan Hon twittered, GPT-3 is “…like my kids running to me and saying LOOK AT ALL THE STICKS WE FOUND and carefully telling me about how each one is interesting and… they’re not wrong.”
Short Stuff
About the only time I ever credit Elon Musk with something is his insistence that people not use acronyms – that they exclude people from conversations. Returning to work, even I’m struggling to remember all the acronyms a large organisation develops. They’re counter-revolutionary. There must be some sort of law that governs the ratio of acronyms to the scale of an organisation.
I have a new job advert out, let me know if you want to talk about it. It’s a really exciting time for this course so think about it.
Semisopochnoi Island, off the Alaskan island chain, is both the most eastern point in the United States and one of its most western points. Here’s an XKCD as a clue.
I was one of like three people to watch this talk from Sheldon Brown live at MAAT the other week, which is lucky because it was a stage-by-stage deconstruction of what my PhD research is about.
That was a really short Short Stuff to make up for the longer ones of recent weeks. I am now out of Content. Love you, speak to you next week.