Because he loves his mother

Watching Jeff Chan’s Code 8 for New Scientist, 7 May 2020

AROUND 4 per cent of humans are Special. Connor is one of them. Lightning shoots from his hands. His mother is Special, too. She freezes things, including – since a tumour began pressing on her brain – patches of her own skin. Connor needs money to save his mother. And, since Specials have been pushed to the social margins, this means he needs to rob a bank.

Code 8's director, Jeff Chan, is a relative newcomer whose screenplays co-written with producer Chris Pare fold well-trodden movie ideas into interesting shapes. Grace: The Possession from 2014 was a retread of The Exorcist seen from the possessed girl's point of view. Code 8, released to streaming services all over the world last December (but not, for some reason, in the UK until now), is a low-budget sci-fi crime thriller.

Connor, played by Robbie Amell, works in construction, wiring up houses with his bare hands. A nicely understated sequence sees his workmates walking past with concrete bollards tucked under their arms; then a police raid on "illegals" drops robots from the sky, and one of them shoots a worker in the back.

After this, Connor decides he can’t take any more and ends up under the wing of Garrett (Stephen Amell, Robbie Amell’s cousin in real life), a thief whose professionalism is sorely tested by his boss, the telepathic drug lord Marcus (Greg Bryk).

Code 8 is a masterclass in how to wring a believable world out of unbelievably few dollars. This doesn’t come from its premise, which is so generic that it is hardly noticeable. Instead, what sets the film apart is the way it marries contemporary American crime fiction to sci-fi. This fusion is harder than it looks.

Since James M. Cain wrote The Postman Always Rings Twice in 1934, American crime fiction has primarily been an exercise in social realism. It’s about life at the bottom, steeped as it is in poverty, addiction, ignorance and marginalisation. The American crime genre tries to tell the truth about these things, and the best of it succeeds.

Science fiction, on the other hand, is a literature of ideas. Detective plots are tempting for science fiction writers. Put a detective in a made-up world and get them to ask the right questions, and they can show your audience how your made-up world operates.

But that, of course, is precisely the problem: it’s only a made-up world. We aren’t being told anything about the way the real world ticks. Inventive sci-fi can feel an awful lot like under-researched crime fiction.

Somehow, Code 8 manages to be both a cracking crime caper and a solid piece of science fiction. While spotting influences is a hazardous game, my guess is it is an homage to Michael Mann’s L.A. Takedown, a fabulous TV pilot from 1989 that provided the skeleton for Mann’s much more famous 1995 blockbuster Heat.

But it is Code 8's science-fiction element that impressed me most: a cleverly underplayed cat's-cradle of a plot, tangling superpowers, social prejudice, drug addiction and state prohibition so as to create a set of intractable social problems that are both strange and instantly familiar.

Robbie and Stephen Amell have championed the film and its ideas since working on the 2016 short film of the same name. Now a TV spin-off is in the works. I do hope Stephen, in particular, attaches his name to this. Anything to get him out from under his role as the DC Multiverse’s Green Arrow…

Goodbye to all that

Reading Technologies of the Human Corpse by John Troyer for the Spectator, 11 April 2020

John Troyer, the director of the Centre for Death and Society at the University of Bath, has moves. You can find his interpretative dances punctuating a number of his lectures, which go by such arresting titles as ‘150 Years of the Human Corpse in American History in Under 15 Minutes with Jaunty Background Music’ and ‘Abusing the Corpse Even More: Understanding Necrophilia Laws in the USA — Now with more Necro! And more Philia!’ (Wisconsin and Ohio are, according to Troyer’s eccentric-looking and always fascinating website, ‘two states that just keep giving and giving when it comes to American necrophilia cases’.)

Troyer’s budding stand-up career has taken a couple of recent knocks. First was the ever more pressing need for him to crack on with his PhD (his dilatoriness was becoming a family joke). Technologies of the Human Corpse is yanked, not without injury, from that career-establishing academic work. Even as he assembled the present volume, however, there came another, far more personal, blow.

Late in July 2017 Troyer’s younger sister Julie was diagnosed with an aggressive brain cancer. Her condition deteriorated far more quickly than anyone expected, and on 29 July 2018 she died. This left Troyer — the engaging young American death scholar sprung from a family of funeral directors — having to square his erudite and cerebral thoughts on death and dead bodies with the fact he’d just kissed his sister goodbye. He interleaves poetical journal entries composed during Julie’s dying and her death, her funeral and her commemoration, between chapters written by a younger, jollier and of course shallower self.

To be brutal, the poems aren’t up to much, and on their own they wouldn’t add a great deal by way of nuance or tragedy. Happily for us, however, and to Troyer’s credit, he has transformed them into a deeply moving 30-page memoir that now serves as the book’s preface. This, then, is Troyer’s monster: a powerful essay about dying and bereavement; a set of poems written off the cuff and under great stress; and seven rather disconnected chapters about what’s befallen the human corpse in the past century or so.

Even as the book was going to print, Troyer explains in a hurried postscript, his father, a retired undertaker, lost consciousness following a cardiac arrest and was very obviously dying:

“And seeing my father suddenly fall into a comatose state so soon after watching my sister die is impossible to fully describe: I understand what is happening, yet I do not want to understand what is happening.”

This deceptively simple statement from Troyer the writer is streets ahead of anything Troyer the postgrad can pull off.

But to the meat of the book. The American Civil War saw several thousand corpses embalmed and transported on new-fangled railway routes across the continent. The ability to preserve bodies, and even lend them a lifelike appearance months after death, created a new industry that, in various configurations and under several names, now goes by the daunting neologism of ‘deathcare provision’. In the future, this industry will be seen ‘transforming the mostly funeralisation side of the business into a much broader, human body parts and tissue distribution system’, as technical advances make increasing use of cadavers and processed cadaver parts.

So how much is a dead body worth? Between $30,000 and $50,000, says Troyer — five times as much for donors processed into medical implants, dermal implants and demineralised bone matrices. Funds and materials are exchanged through a network of body brokers who serve as middlemen between biomedical corporations such as Johnson & Johnson and the usual sources of human cadavers — medical schools, funeral homes and mortuaries. It is by no means an illegal trade, nor is it morally problematic in most instances; but it is rife with scandal. As one involved party remarks: ‘If you’re cremated, no one is ever going to know if you’re missing your shoulders or knees or your head.’

Troyer is out to show how various industries serve to turn our dead bodies into ‘an unfettered source of capital’. The ‘fluid men’ of Civil War America — who toured the battlefields showing keen students how to embalm a corpse (and almost always badly) — had no idea what a strange story they had started. Today, as the anatomist Gunther von Hagens poses human cadavers in sexual positions to pique and titillate worldwide audiences, we begin to get a measure of how far we have come. Hagens’s posthumous pornography reveals, says Troyer, ‘the ultimate taxonomic power over nature: we humans, or at least our bodies, can live forever because we pull ourselves from nature’.

Technologies of the Human Corpse is a bit of a mess, but I have a lot of time for Troyer. His insights are sound, and his recent travails may yet (and at high human cost — but it was ever thus) make him a writer of some force.


“Fat with smell, dissonant and dirty”

The revolt against scentlessness has been gathering for a while. Muchembled namechecks avant-garde perfumes with names like Bat and Rhinoceros. A dear friend of mine favours Musc Kublai Khan for its faecal notes. Another spends a small fortune to smell like cat’s piss. Right now I’m wearing Andy Tauer’s Orange Star — don’t approach unless you like Quality Street orange cremes macerated in petrol…

Reading Robert Muchembled’s Smells: A Cultural History of Odours in Early Modern Times and Isabel Bannerman’s Scent Magic: Notes from a Gardener for the Telegraph, 18 April 2020

Cog ergo sum

Reading Matthew Cobb’s The Idea of the Brain for New Scientist, 15 April 2020

Ask a passer-by in 2nd-century Rome where consciousness resided — in the heart or in the head — and he was sure to say, in the heart. The surgeon-philosopher Galen of Pergamon had other ideas. During one show he had someone press upon the exposed brain of a pig, which promptly (and mercifully) passed out. Letting go brought the pig back to consciousness.

Is the brain one organ, or many? Are our mental faculties localised in the brain? Sixteen hundred years after Galen, a Parisian gentleman tried to blow his brains out with a pistol. Instead he shot away his frontal bone, leaving the anterior lobes of his brain bare but undamaged. He was rushed to the Hôpital St. Louis, where Ernest Aubertin spent a few vain hours trying to save his life. Aubertin discovered that if he pressed a spatula on the patient’s brain while he was speaking, his speech “was suddenly suspended; a word begun was cut in two. Speech returned as soon as pressure was removed.”

Does the brain contain all we are? Eighty years after Aubertin, Montreal neurosurgeon Wilder Penfield was carrying out hundreds of brain operations to relieve chronic temporal-lobe epilepsy. Using delicate electrodes, he would map the safest cuts to make — ones that would not excise vital brain functions. For the patient, stimulation of the tiniest regions called up the strangest experiences. A piano being played. A telephone conversation between two family members. A man and a dog walking along a road. They weren’t memories, so much as dreamlike glimpses of another world.

Cobb’s history of brain science will fascinate readers quite as much as it occasionally horrifies. Cobb, a zoologist by training, has focused for much of his career on the sense of smell and the neurology of the humble fruit fly maggot. The Idea of the Brain sees him coming up for air, taking in the big picture before diving once again into the minutiae of his profession.

He makes a hell of a splash, too, explaining how the analogies we use to describe the brain both enrich our understanding of that mysterious organ, and hamstring our further progress. He shows how mechanical metaphors for brain function lasted well into the era of electricity. And he explains why computational metaphors, though unimaginably more fertile, are now throttling his science.

Study the brain as though it were a machine and in the end (and after much good work) you will run into three kinds of trouble.

First, you will find that reverse-engineering very complex systems is impossible. In 2017 two neuroscientists, Eric Jonas and Konrad Paul Kording, applied the techniques they normally used to analyse the brain to the MOS 6507 processor — a chip found in machines of the late 1970s and early 1980s that ran video games such as Donkey Kong, Space Invaders and Pitfall. Despite their powerful analytical armoury, and despite the fact that there is a clear explanation for how the chip works, they admitted that their study fell short of producing “a meaningful understanding”.

Another problem is the way the meanings of technical terms expand over time, warping the way we think about a subject. The French neuroscientist Romain Brette has a particular hatred for that staple of neuroscience, “coding” — a term first invoked by Adrian in the 1920s in a strictly technical sense, to denote a link between a stimulus and the activity of a neuron. Today almost everybody thinks of neural codes as representations of that stimulus, which is a real problem, because it implies that there must be an ideal observer or reader within the brain, watching and interpreting those representations. It may be better to think of the brain as constructing information, rather than simply representing it — only we have no idea (yet) how such an organ would function. For sure, it wouldn’t be a computer.

Which brings us neatly to our third and final obstacle to understanding the brain: we take far too much comfort and encouragement from our own metaphors. Do recent advances in AI bring us closer to understanding how our brains work? Cobb’s hollow laughter is all but audible. “My view is that it will probably take fifty years before we understand the maggot brain,” he writes.

One last history lesson. In the 1970s, twenty years after Penfield’s electrostimulation studies, Michael Gazzaniga, a cognitive neuroscientist at the University of California, Santa Barbara, studied the experiences of people whose brains had been split down the middle in a desperate effort to control their epilepsy. He discovered that each half of the brain was, on its own, sufficient to produce a mind, albeit with slightly different abilities and outlooks in each half. “From one mind, you had two,” Cobb remarks. “Try that with a computer.”

Hearing the news brought veteran psychologist William Estes to despair: “Great,” he snapped, “now we have two things we don’t understand.”

Pluck

Reading Gunpowder and Glory: The Explosive Life of Frank Brock OBE by Harry Smee and Henry Macrory for the Spectator, 21 March 2020

Early one morning in October 1874, a barge carrying three barrels of benzoline and five tons of gunpowder blew up in the Regent’s Canal, close to London Zoo. The crew of three were killed outright, scores of houses were badly damaged, the explosion could be heard 25 miles away, and “dead fish rained from the sky in the West End.”

This is a book about the weird, if obvious, intersection between firework manufacture and warfare. It is, ostensibly, the biography of a hero of the First World War, Frank Brock. And if it were the work of more ambitious literary hands, Brock would have been all you got. His heritage, his school adventures, his international career as a showman, his inventions, his war work, his violent death. Enough for a whole book, surely?

But Gunpowder and Glory is not a “literary” work, by which I mean it is neither self-conscious nor overwrought. Instead Henry Macrory (who anyway has already proved his literary chops with his 2018 biography of the swindler Whitaker Wright) has opted for what looks like a very light touch here, assembling and ordering the anecdotes and reflections of Frank Brock’s grandson Harry Smee about his family, their business as pyrotechnical artists, and, finally, about Frank, his illustrious forebear.

I suspect a lot of sweat went into such artlessness, and it’s paid off, creating a book that reads like fascinating dinner conversation. Reading its best passages, I felt I was discovering Brock the way Harry had as a child, looking into his mother’s “ancient oak chests filled with papers, medals, newspapers, books, photographs, an Intelligence-issue knuckleduster and pieces of Zeppelin and Zeppelin bomb shrapnel.”

For eight generations, the Brock family produced pyrotechnic spectaculars of a unique kind. Typical set piece displays in the eighteenth century included “Jupiter discharging lightning and thunder, Two gladiators combating with fire and sword, and Neptune finely carv’d seated in his chair, drawn by two sea horses on fire-wheels, spearing a dolphin.”

Come the twentieth century, Brock’s shows were a signature of Empire. It would take a writer like Thomas Pynchon to do full justice to “a sixty foot-high mechanical depiction of the Victorian music-hall performer, Lottie Collins, singing the chorus of her famous song ‘Ta-ra-ra-boom-de-ay’ and giving a spirited kick of an automated leg each time the word ‘boom’ rang out.”

Frank was a Dulwich College boy, and one of that generation lost to the slaughter of the Great War. A spy and an inventor — James Bond and Q in one — he applied his inherited chemical and pyrotechnical genius to the war effort — by making a chemical weapon. It wasn’t any good, though: Jellite, developed during the summer of 1915 and named after its jelly-like consistency during manufacture, proved insufficiently lethal.

On such turns of chance do reputations depend, since we remember Frank Brock for his many less problematic inventions. Dover flares burned for seven and a half minutes and lit up an area three miles in radius, as Winston Churchill put it, “as bright as Piccadilly”. U-boats, diving to avoid these lights, encountered mines. Frank’s artificial fogs, hardly bettered since, concealed whole British fleets, entire Allied battle lines.

Then there are his incendiary bullets.

At the time of the Great War a decent Zeppelin could climb to 20,000 feet, travel at 47 mph for more than 1,000 miles, and stay aloft for 36 hours. Smee and Macrory are well within their rights to call them “the stealth bombers of their time”.

Brock’s bullets tore them out of the sky. Sir William Pope, Brock’s advisor, and a professor of chemistry at Cambridge University, explained: “You need to imagine a bullet proceeding at several thousand feet a second, and firing as it passes through a piece of fabric which is no thicker than a pocket handkerchief.” All to rupture a gigantic sac of hydrogen sufficiently to make the gas explode. (Much less easy than you think; the Hindenburg only crashed because its entire outer envelope was set on fire.)

Frank died in an assault on the mole at Zeebrugge in 1918. He shouldn’t have been there. He should have been in a lab somewhere, cooking up another bullet, another light, another poison gas. Today, he would surely be suitably contained, his efforts efficiently channelled, his spirit carefully and surgically broken.

Frank lived at a time when it was possible — and men, at any rate, were encouraged — to be more than one thing. That this heroic idea overreached itself — that rugby field and school chemistry lab both dissolved seamlessly into the Somme — needs no rehearsing.

Still, we have lost something. When Frank went to school there was a bookstall near the station which sold “a magazine called Pluck, containing ‘the daring deeds of plucky sailors, plucky soldiers, plucky firemen, plucky explorers, plucky detectives, plucky railwaymen, plucky boys and plucky girls and all sorts of conditions of British heroes’.”

Frank was a boy moulded thus, and sneer as much as you want, we will not see his like again.


Are you experienced?

Reading Wildhood by Barbara Natterson-Horowitz and Kathryn Bowers for New Scientist, 18 March 2020

A king penguin raised on South Georgia, in the sub-Antarctic. A European wolf. A spotted hyena in Tanzania’s Ngorongoro Crater. A north Atlantic humpback whale born near the Dominican Republic. What could these four animals have in common?

What if they were all experiencing the same life event? After all, all animals are born, and all of them die. We’re all hungry sometimes, for food, or a mate.

How far can we push this idea? Do non-human animals have mid-life crises, for example? (Don’t mock; there’s evidence that some primates experience the same happiness curves through their life-course as humans do.)

Barbara Natterson-Horowitz, an evolutionary biologist, and Kathryn Bowers, an animal behaviorist, have for some years been devising and teaching courses at Harvard and the University of California at Los Angeles, looking for “horizontal identities” across species boundaries.

The term comes from Andrew Solomon’s 2014 book Far from the Tree, which contrasts vertical identities (between you and your parents and grandparents, say) with horizontal identities, which are “those among peers with whom you share similar attributes but no family ties”. The authors of Wildhood have expanded Solomon’s concept to include other species; “we suggest that adolescents share a horizontal identity,” they write: “temporary membership in a planet-wide tribe of adolescents.”

The heroes of Wildhood — Ursula the penguin, Shrink the hyena, Salt the whale and Slavc the wolf — are all, loosely speaking, “teens”, and like teens everywhere, they have several mountains to climb at once. They must learn how to stay safe, how to navigate social hierarchies, how to communicate sexually, and how to care for themselves. They need to become experienced, and for that, they need to have experiences.

Well into the 1980s, researchers were discouraged from discussing the mental lives of animals. The change in scientific culture came largely thanks to the video camera. Suddenly it was possible for behavioral scientists to observe, not just closely, but repeatedly, and in slow motion. Soon discoveries arose that could not possibly have been arrived at with the naked eye alone. An animal’s supposedly rote, mechanical behaviours turned out to be the product of learning, experiment, and experience. Stereotyped calls and gestures were unpacked to reveal, not language in the human sense, but languages nonetheless, and many were of dizzying complexity. Animals that we thought were driven by instinct (a word you’ll never even hear used these days), turned out to be lively, engaged, conscious beings, scrabbling for purchase in a confusing and unpredictable world.

The four tales that make up the bulk of Wildhood are more than “Just So” stories. “Every detail,” the authors explain, “is based on and validated by data from GPS satellite or radio collar studies, peer-reviewed scientific literature, published reports and interviews with the investigators involved”.

In addition, each offers a different angle on a wealth of material about animal behaviour. Examples of animal friendship, bullying, nepotism, exploitation and even property inheritance arrive in such numbers and at such a level of detail, it takes an ordinary, baggy human word like “friendship” or “bullying” to contain them.

“Level playing fields don’t exist in nature”, the authors assert, and this is an important point, given the book’s central claim that by understanding the “wildhoods” of other animals, we can develop better approaches “for compassionately and skillfully guiding human adolescents toward adulthood.”

The point is not to use non-human behaviour as an excuse for human misbehaviour. Lots of animals kill and exploit each other, but that shouldn’t make exploitation or murder acceptable. The point is to know which battles to pick. Making young children’s school playtimes boring by quashing the least sign of competitiveness makes little sense, given the amount of biological machinery dedicated to judging and ranking in every animal species from humans to lobsters. On the other hand, every young animal, when it returns to its parents, gets a well-earned break from the “playground” — but young humans don’t. They’re now tied 24/7 to social media that prolong, exaggerate and exacerbate the ranking process. Is the rise in mental health problems among the affluent young triggered by this added stress?

These are speculations and discussions for another book, for which Wildhood may prove a necessary and charming foundation. Introduced in the first couple of pages to California sea otters, swimming up to sharks one moment then fleeing from a plastic boat the next, the reader can’t help but recognise, in the animals’ overly bold and overly cautious behaviour, the gawkiness and tremor of their own adolescence.

Pollen count

THEY are red, they have stalks that look like eels, and no leaves. But Karl, the boss of the laboratory – played by the unsettling David Wilmot – has his eye on them for the forthcoming flower fair. He tells visiting investors that these genetically engineered creations are “the first mood-lifting, antidepressant, happy plant”.

Ben Whishaw’s character, Chris, smirks: “You’ll love this plant like your own child.”

Chris is in love with Alice, played by Emily Beecham, who is in love with her creations, her “Little Joes”, even to the point of neglecting her own son, Joe.

Owning and caring for a flower that, treated properly, will emit pollen that can induce happiness, would surely be a good thing for these characters. But the plant has been bred to be sterile, and it is determined to propagate itself by any means necessary.

Little Joe is an exercise in brooding paranoia, and it feeds off some of the more colourful fears around the genetic modification of plants.

Kerry Fox plays Bella, whose disappointments and lack of kids seem to put her in the frame of mind to realise what these innocent-looking blooms are up to. “The ability to reproduce is what gives every living thing meaning!” she exclaims. Her colleagues might just be sceptical about this because she is an unhappy presence in the lab, or they may already have fallen under the sway of Little Joe’s psychoactive pollen.

Popular fears around GM – the sort that dominated newspapers and scuppered the industry’s experimental programmes in the mid-1990s – are nearly as old as the science of genetics itself.

At about the turn of the 20th century, agricultural scientists in the US combined inbred lines of maize and found that crop yields were radically increased. Farmers who bought the specially bred seed found that their yields tailed off in subsequent years, so it made sense to buy fresh seed yearly because the profits from bigger crops more than covered the cost of new seeds.

In the 2000s, Monsanto, a multinational agribusiness, added “terminator” genes to the seed it was developing to prevent farmers resowing the product of the previous year’s crop. This didn’t matter to most farmers, but the world’s poorest, who still rely on replanting last year’s seed, were vociferous in their complaints, and a global scandal loomed.

Monsanto chose not, in the end, to commercialise its terminator technologies, but found it had already created a monster: an urban myth of thwarted plant fecundity that provides Jessica Hausner’s Little Joe with its science fictional plot.

What does Little Joe’s pollen do to people? Is it a vegetal telepath, controlling the behaviour of its subjects? Or does it simply make the people who enjoy its scent happier, more sure of themselves, more capable of making healthy life choices? Would that be so terrible? As Karl says, “Who can prove the genuineness of feelings? Moreover, who cares?”

Well, we do, or we should. If, like Karl, we come to believe that the “soul” is nothing more than behaviour, then people could become zombies tomorrow and no one would notice.

Little Joe’s GM paranoia may set some New Scientist readers’ teeth on edge, but this isn’t, ultimately, what the movie is about. It is after bigger game: the nature of human freedom.

All fall down

Talking to Scott Grafton about his book Physical Intelligence (Pantheon), 10 March 2020.

“We didn’t emerge as a species sitting around.”

So says University of California neuroscientist Scott Grafton in the introduction to his provocative new book Physical Intelligence. In it, Grafton assembles and explores all the neurological abilities that we take for granted — “simple” skills that in truth can only be acquired with time, effort and practice. Perceiving the world in three dimensions is one such skill; so is steadily carrying a cup of tea.

At UCLA, Grafton began his career mapping brain activity using positron emission tomography, to see how the brain learns new motor skills and recovers from injury or neurodegeneration. After a career developing new scanning techniques, and a lifetime’s walking, wild camping and climbing, Grafton believes he’s able to trace the neural architectures behind so-called “goal-directed behavior” — the business of how we represent and act physically in the world.

Grafton is interested in all those situations where “smart talk, texting, virtual goggles, reading, and rationalizing won’t get the job done” — those moments when the body accomplishes a complex task without much, if any, conscious intervention. A good example might be bagging groceries. Suppose you are packing six different items into two bags. There are 720 possible ways to do this, and — assuming that like most people you want heavy items on the bottom, fragile items on the top, and cold items together — more than 700 of the possible solutions are wrong. And yet we almost always pack things so they don’t break or spoil, and we almost never have to agonise over the countless micro-decisions required to get the job done.
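(Grafton's 720, for the curious, is just the number of orders in which six distinct items can be packed — six factorial. A quick sketch in Python confirms it; the item names here are invented for illustration:)

```python
from itertools import permutations

# Hypothetical shopping: six distinct items, packed one at a time.
items = ["eggs", "bread", "milk", "tins", "grapes", "ice cream"]

# Every packing order is a permutation of the six items: 6! = 720 in all.
orderings = list(permutations(items))
print(len(orderings))  # 720
```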

The grocery-bagging example is trivial, but often, what’s at stake in a task is much more serious — crossing the road, for example — and sometimes the experience required to accomplish it is much harder to come by. A keen hiker and scrambler, Grafton studs his book with first-hand accounts, at one point recalling how someone peeled off the side of a snow bank in front of him, in what escalated rapidly into a ghastly climbing accident. “At the spot where he fell,” he writes, “all I could think was how senseless his mistake had been. It was a steep section but entirely manageable. Knowing just a little bit more about how to use his ice axe, he could have readily stopped himself.”

To acquire experience, we have to have experiences. To acquire life-saving skills, we have to risk our lives. The temptation, now that we live most of our lives in urban comfort, is to create a world safe enough that we don’t need to expose ourselves to such risks, or acquire such skills.

But this, Grafton tells me when we speak on the phone, would be a big mistake. “If all you ever are walking on is a smooth, nice sidewalk, the only thing you can be graceful on is that sidewalk, and nothing else,” he explains. “And that sets you up for a fall.”

He means this literally: “The number one reason people are in emergency rooms is from what emergency rooms call ‘ground-level falls’. I’ve seen statistics which show that more and more of us are falling over for no very good reason. Not because we’re dizzy. Not because we’re weak. But because we’re inept.”

For more than 1.3 million years of evolutionary time, hominids have lived without pavements or chairs, handling an uneven and often unpredictable environment. We evolved to handle a complex world, and a certain amount of constant risk. “Very enriched physical problem solving, which requires a lot of understanding of physical relationships, a lot of motor control, and some deftness in putting all those understandings together — all the while being constantly challenged by new situations — I believe this is really what drives brain networks towards better health,” Grafton says.

Our chat turns speculative. The more we remove risks and challenges from our everyday environment, Grafton suggests, the more we’re likely to want to complicate and add problems to the environment, to create challenges for ourselves that require the acquisition of unusual motor skills. Might this be a major driver behind cultural activities like music-making, craft and dance?

Speculation is one thing; serious findings are another. At the moment, Grafton is gathering medical and social data to support an anecdotal observation of his: that the experience of walking in the wild not only improves our motor abilities, but also promotes our mental health.

“A friend of mine runs a wilderness programme in the Sierra Nevada for at-risk teenagers,” he explains, “and one of the things he does is to teach them how to get by for a day or two in the wilderness, on their own. It’s life-transforming. They come out of there owning their choices and their behaviour. Essentially, they’ve grown up.”

Over-performing human

Talking to choreographer Alexander Whitley for the Financial Times, 28 February 2020

On a dim and empty stage, six masked black-clad dancers, half-visible, their limbs edged in light, run through attitude after attitude, emotion after emotion. Above the dancers, a long tube of white light slowly rises, falls, tips and circles, drawing the dancers’ limbs and faces towards itself like a magnet. Under its variable cold light, movements become more expressive, more laden with emotion, more violent.

Alexander Whitley, formerly of the Royal Ballet School and the Birmingham Royal Ballet, is six years into a project to expand the staging of dance with new media. He has collaborated with filmmakers, designers, digital artists and composers. Most of all, he has played games with light.

The experiments began with The Measures Taken, in 2014. Whitley used motion-tracking technology to project visuals that interacted with the performers’ movements. Then, dissatisfied with the way the projections obscured the dancers, in 2018 he used haze and narrowly focused bars of light to create, for Strange Stranger, a virtual “maze” in which his dancers found themselves alternately liberated and constrained.

At 70 minutes, Overflow, commissioned by Sadler’s Wells Theatre, represents a massive leap in ambition. With several long-time collaborators — in particular the Dutch artist-designers Children of the Light — Whitley has worked out how to reveal, to an audience sat just a few feet away, exactly what he wants them to see.

Whitley is busy nursing Overflow up to speed in time for its spring tour. The company begin with a night at the Lowry in Salford on 18 March, before performing at Sadler’s Wells on 17 and 18 April.

Overflow, nearly two years in the making, has consumed money as well as time. The company is performing at Stereolux in Nantes in April and will need more overseas bookings if it is to flourish. “There’s serious doubt about the status of the UK and UK touring companies now,” says Whitley (snapping at my cheaply dangled Brexit bait); “I hope there’s enough common will to build relationships in spite of the political situation.”

It is easy to talk politics with Whitley (he is very well read), but his dances are anything but mere vehicles for ideas. And while Overflow is a political piece by any measure — a survey of our spiritual condition under surveillance capitalism, for heaven’s sake — its effects are strikingly classical. It’s not just the tricksy lighting that has me thinking of the figures on ancient Greek vases. It’s the dancers themselves and their clean, elegant, tragedian’s gestures.

A dancer kneels, and takes hold of his head. He tilts it up into the light as it turns and tilts, inches from his face, and, in a shocking piece of trompe l’oeil — can he really be pulling his face apart?

Overflow is about our relationship to the machines that increasingly govern our lives. But there’s not a hint of regimentation here, or mechanisation. These dancers are not trying to perform machine. They’re trying to perform human.

Whitley laughs at this observation. “I guess, as far as that goes, they’re over-performing human. They’re caught up in the excitement and hyper-stimulation of their activity. Which is exactly how we interact with social media. We’re being hyperstimulated into excessive activity. Keep scrolling, keep consuming, keep engaging!”

It was an earlier piece, 2016’s Pattern Recognition, that set Whitley on the road to Overflow. “I’d decided to have the lights moving around the stage, to give us the sense of depth we’d struggled to achieve in The Measures Taken. But very few people I talked to afterwards realised or understood that our mobile stage lights were being driven by real-time tracking. They thought you could achieve what we’d achieved just through choreography. At which point a really obvious insight arrived: that interactivity is interesting, first and foremost, for the actor involved in the interaction.”

In Overflow, that the audience feels left out is no longer a technical problem: it’s the whole point of the piece. “We’re all watching things we shouldn’t be watching, somehow, through social media and the internet,” says Whitley. “That the world has become so revealed is unpleasant. It’s over-exposed us to elements of human nature that should perhaps remain private. But we’re all bound up in it. Even if we’re not doing it, we’re watching it.”

The movements of the ensemble in Overflow are the equivalent of emoji: “I was interested in how we could think of human emotions just as bits of data,” Whitley explains. In the 1980s a psychologist called Robert Plutchik stated that there were eight basic emotions: joy, trust, fear, surprise, sadness, anticipation, anger, and disgust. “We stuck pins at random into this wheel chart he invented, choosing an emotion at random, and from that creating an action that somehow embodied or represented it. And the incentive was to do so as quickly and concisely as possible, and as soon as it’s done, choose another one. So the dancers are literally jumping at random between all these different human emotions. It’s not real communication, just an outpouring of emotional information.”

The solos are built using material drawn from each dancer’s movement diary. “The dancers made diary entries, which I then filmed, based on how they were feeling each day. They’re movement diaries: personal documents of their emotional lives, which I then chopped up and jumbled around and gave back to them as a video to learn.”

In Whitley’s vision, the digital realm isn’t George Orwell’s Big Brother, dictating our every move from above. It’s more like the fox and the cat in the Pinocchio story, egging a naive child into the worst behaviours, all in the name of independence and free expression. “Social media encourage us to act more, to feel more, to express more, because the more we do that, the more capital they can generate from our data, and the more they can understand and predict what we’re likely to do next.”

This is where the politics comes in: the way “emotion, which incidentally is the real currency of dance, is now the major currency of the digital economy”.

It’s been a job of work, packing such cerebral content into an emotional form like dance. But Whitley says it’s what keeps him working: “that sheer impossibility of pinning down ideas that otherwise exist almost entirely in words. As soon as you scratch the surface, you realise there’s a huge amount of communication always at work through the body, and drawing ideas from a more cerebral world into the physical, into the emotional, is a constant fascination. There are lifetimes of enquiry here. It’s what keeps me coming back.”