Episode 3 of the Baillie Gifford Prize’s ReadSmart Podcast, out Friday 29 May, has me discussing trends in artificial intelligence with the mathematician Hannah Fry. Razia Iqbal is in the chair. Available from all high-class streaming emporia; this is the Spotify one spoti.fi/3bJa5v7
Is Michel Comte’s past celebrity a burden? “You carry it on your fucking back,” he says. “It took ten years for people to notice I was visiting Africa for months at a time. It took twenty years before people started listening to what I’ve been saying since my first gallery show.”
Heteropoda davidbowie is a species of huntsman spider. Though rare, it has been found in parts of Malaysia, Singapore, Indonesia and possibly Thailand. (The uncertainty arises because it’s often mistaken for a similar-looking species, Heteropoda javana.) In 2008 a German collector sent photos of his unusual-looking “pet” to Peter Jäger, an arachnologist at the Senckenberg Research Institute in Frankfurt. Consequently, and in common with most other living finds, David Bowie’s spider was discovered twice: once in the field, and once in the collection.
Bowie’s spider is famous, but not exceptional. Jäger has discovered more than 200 species of spider in the last decade, and names them after politicians, comedians and rock stars to highlight our ecological plight. Other researchers find more pointed ways to further the same cause. In the first month of Donald Trump’s administration, Iranian-Canadian entomologist Vazrick Nazari discovered a moth with a head crowned with large, blond comb-over scales. There’s more to Neopalpa donaldtrumpi than a striking physical resemblance: it lives in a federally protected area around where the border wall with Mexico is supposed to go. Cue headlines.
Species are becoming extinct 100 times faster than they did before modern humans arrived. This makes reading a book about the naming of species a curiously queasy affair. Nor is there much comfort to be had in evolutionary ecologist Stephen Heard’s observation that, having described 1.5 million species, we’ve (at very best) only recorded half of what’s out there. There is, you may recall, that devastating passage in Cormac McCarthy’s western novel Blood Meridian in which Judge Holden meticulously records a Native American artifact in his sketchbook — then destroys it. Given that to discover a species you must, by definition, invade its environment, Holden’s sketch-and-burn habit appears to be a painfully accurate metonym for what the human species is doing to the planet. Since the 1970s (when there were twice as many wild animals as there are now) we’ve been discovering and endangering new species in almost the same breath.
Richard Spruce, one of the Victorian era’s great botanical explorers, who spent 15 years exploring the Amazon from the Andes to its mouth, is a star of this short, charming book about how we have named and ordered the living world. No detail of his bravery, resilience and grace under pressure comes close to the eloquence of this passing quotation, however: “Whenever rains, swollen streams, and grumbling Indians combined to overwhelm me with chagrin,” he wrote in his account of his travels, “I found reason to thank heaven which had enabled me to forget for the moment all my troubles in the contemplation of a simple moss.”
Stephen Heard, an evolutionary ecologist based in Canada, explains how extraordinary amounts of curiosity have been codified to create a map of the living world. The legalistic-sounding codes by which species are named are, it turns out, admirably egalitarian, ensuring that the names amateurs give species are just as valid as those of professional scientists.
Formal names are necessary because of the difficulty we have in distinguishing between similar species. Common names run into this difficulty all the time. There are too many of them, so the same species gets different names in different languages. At the same time, there aren’t enough of them, so that, as Heard points out, “Darwin’s finches aren’t finches, African violets aren’t violets, and electric eels aren’t eels”; robins, blackbirds and badgers are entirely different animals in Europe and North America; and virtually every flower has at one time or another been called a daisy.
Names also tend, reasonably enough, to be descriptive. This is fine when you’re distinguishing between, say, five different types of fish. When there are 500 different fish to sort through, however, absurdity beckons. Heard lovingly transcribes the pre-Linnaean species name of the English whiting, formulated around 1738: “Gadus, dorso tripterygio, ore cirrato, longitudine ad latitudinem tripla, pinna ani prima officulorum trigiata”. So there.
It takes nothing away from the genius of Swedish physician Carl Linnaeus, who formulated the naming system we still use today, to say that he came along at the right time. By Linnaeus’s day, it was possible to look things up. Advances in printing and distribution had made reference works possible. Linnaeus’s innovation was to decouple names from descriptions. And this, as Heard reveals in anecdote after anecdote, is where the fun now slips in: the mythopoeic cool of the baboon Papio anubis, the mischievous smarts of the beetle Agra vation, the nerd celebrity of the lemur Avahi cleesi.
Heard’s taxonomy of taxonomies makes for somewhat thin reading; this is less a book than a dozen interesting magazine articles flying in close formation. But its close focus, bringing to life the minutiae of both the living world and the practice of science, is welcome.
I once met Michael Land, the neurobiologist who figured out how the lobster’s eye works. He told me that the trouble with big ideas is that they get in the way of the small ones. Heard’s lesson, delivered with such a light touch, is the same. The joy, and much of the accompanying wisdom, lies in the detail.
AROUND 4 per cent of humans are Special. Connor is one of them. Lightning shoots from his hands. His mother is Special, too. She freezes things, including – since a tumour began pressing on her brain – patches of her own skin. Connor needs money to save his mother. And, since Specials have been pushed to the social margins, this means he needs to rob a bank.
Code 8’s director, Jeff Chan, is a relative newcomer whose screenplays, co-written with producer Chris Pare, fold well-trodden movie ideas into interesting shapes. Grace: The Possession from 2014 was a retread of The Exorcist seen from the possessed girl’s point of view. Code 8, released to streaming services all over the world last December (but not, for some reason, in the UK until now), is a low-budget sci-fi crime thriller.
Connor, played by Robbie Amell, works in construction, wiring up houses with his bare hands. A nicely understated sequence sees his workmates walk past carrying concrete bollards under their arms, before a police raid on “illegals” drops robots from the sky that shoot a worker in the back.
After this, Connor decides he can’t take any more and ends up under the wing of Garrett (Stephen Amell, Robbie Amell’s cousin in real life), a thief whose professionalism is sorely tested by his boss, the telepathic drug lord Marcus (Greg Bryk).
Code 8 is a masterclass in how to wring a believable world out of unbelievably few dollars. This doesn’t come from its premise, which is so generic that it is hardly noticeable. Instead, what sets the film apart is the way it marries contemporary American crime fiction to sci-fi. This fusion is harder than it looks.
Since James M. Cain wrote The Postman Always Rings Twice in 1934, American crime fiction has primarily been an exercise in social realism. It’s about life at the bottom, steeped as it is in poverty, addiction, ignorance and marginalisation. The American crime genre tries to tell the truth about these things, and the best of it succeeds.
Science fiction, on the other hand, is a literature of ideas. Detective plots are tempting for science fiction writers. Put a detective in a made-up world and get them to ask the right questions, and they can show your audience how your made-up world operates.
But that, of course, is precisely the problem: it’s only a made-up world. We aren’t being told anything about the way the real world ticks. Inventive sci-fi can feel an awful lot like under-researched crime fiction.
Somehow, Code 8 manages to be both a cracking crime caper and a solid piece of science fiction. While spotting influences is a hazardous game, my guess is it is an homage to Michael Mann’s L.A. Takedown, a fabulous TV pilot from 1989 that provided the skeleton for Mann’s much more famous 1995 blockbuster Heat.
But it is Code 8’s science-fiction element that impressed me most: a cleverly underplayed cat’s-cradle of a plot, tangling superpowers, social prejudice, drug addiction and state prohibition so as to create a set of intractable social problems that are both strange and instantly familiar.
Robbie and Stephen Amell have championed the film and its ideas since working on the 2016 short film of the same name. Now a TV spin-off is in the works. I do hope Stephen, in particular, attaches his name to this. Anything to get him out from under his role as the DC Multiverse’s Green Arrow…
John Troyer, the director of the Centre for Death and Society at the University of Bath, has moves. You can find his interpretative dances punctuating a number of his lectures, which go by such arresting titles as ‘150 Years of the Human Corpse in American History in Under 15 Minutes with Jaunty Background Music’ and ‘Abusing the Corpse Even More: Understanding Necrophilia Laws in the USA — Now with more Necro! And more Philia!’ (Wisconsin and Ohio are, according to Troyer’s eccentric-looking and always fascinating website, ‘two states that just keep giving and giving when it comes to American necrophilia cases’.)
Troyer’s budding stand-up career has taken a couple of recent knocks. First was the ever more pressing need for him to crack on with his PhD (his dilatoriness was becoming a family joke). Technologies of the Human Corpse is yanked, not without injury, from that career-establishing academic work. Even as he assembled the present volume, however, there came another, far more personal, blow.
Late in July 2017 Troyer’s younger sister Julie was diagnosed with an aggressive brain cancer. Her condition deteriorated far more quickly than anyone expected, and on 29 July 2018 she died. This left Troyer — the engaging young American death scholar sprung from a family of funeral directors — having to square his erudite and cerebral thoughts on death and dead bodies with the fact he’d just kissed his sister goodbye. He interleaves poetical journal entries composed during Julie’s dying and her death, her funeral and her commemoration, between chapters written by a younger, jollier and of course shallower self.
To be brutal, the poems aren’t up to much, and on their own they wouldn’t add a great deal by way of nuance or tragedy. Happily for us, however, and to Troyer’s credit, he has transformed them into a deeply moving 30-page memoir that now serves as the book’s preface. This, then, is Troyer’s monster: a powerful essay about dying and bereavement; a set of poems written off the cuff and under great stress; and seven rather disconnected chapters about what’s befallen the human corpse in the past century or so.
Even as the book was going to print, Troyer explains in a hurried postscript, his father, a retired undertaker, lost consciousness following a cardiac arrest and was very obviously dying:
“And seeing my father suddenly fall into a comatose state so soon after watching my sister die is impossible to fully describe: I understand what is happening, yet I do not want to understand what is happening.”
This deceptively simple statement from Troyer the writer is streets ahead of anything Troyer the postgrad can pull off.
But to the meat of the book. The American civil war saw several thousand corpses embalmed and transported on new-fangled railway routes across the continent. The ability to preserve bodies, and even lend them a lifelike appearance months after death, created a new industry that, in various configurations and under several names, now goes by the daunting neologism of ‘deathcare provision’. In the future, this industry will be seen ‘transforming the mostly funeralisation side of the business into a much broader, human body parts and tissue distribution system’, as technical advances make increasing use of cadavers and processed cadaver parts.
So how much is a dead body worth? Between $30,000 and $50,000, says Troyer — five times as much for donors processed into medical implants, dermal implants and demineralised bone matrices. Funds and materials are exchanged through a network of body brokers who serve as middlemen between biomedical corporations such as Johnson & Johnson and the usual sources of human cadavers — medical schools, funeral homes and mortuaries. It is by no stretch an illegal trade, nor is it morally problematic in most instances; but it is rife with scandal. As one involved party remarks: ‘If you’re cremated, no one is ever going to know if you’re missing your shoulders or knees or your head.’
Troyer is out to show how various industries serve to turn our dead bodies into ‘an unfettered source of capital’. The ‘fluid men’ of Civil War America — who toured the battlefields showing keen students how to embalm a corpse (and almost always badly) — had no idea what a strange story they had started. Today, as the anatomist Gunther von Hagens poses human cadavers in sexual positions to pique and titillate worldwide audiences, we begin to get a measure of how far we have come. Hagens’s posthumous pornography reveals, says Troyer, ‘the ultimate taxonomic power over nature: we humans, or at least our bodies, can live forever because we pull ourselves from nature’.
Technologies of the Human Corpse is a bit of a mess, but I have a lot of time for Troyer. His insights are sound, and his recent travails may yet (and at high human cost — but it was ever thus) make him a writer of some force.
The revolt against scentlessness has been gathering for a while. Muchembled namechecks avant garde perfumes with names like Bat and Rhinoceros. A dear friend of mine favours Musc Kublai Khan for its faecal notes. Another spends a small fortune to smell like cat’s piss. Right now I’m wearing Andy Tauer’s Orange Star — don’t approach unless you like Quality Street orange cremes macerated in petrol…
Ask a passer-by in 2nd-century Rome where consciousness resided — in the heart or in the head — and he was sure to say, in the heart. The surgeon-philosopher Galen of Pergamon had other ideas. During one show he had someone press upon the exposed brain of a pig, which promptly (and mercifully) passed out. Letting go brought the pig back to consciousness.
Is the brain one organ, or many? Are our mental faculties localised in the brain? Some 1,600 years after Galen, a Parisian gentleman tried to blow his brains out with a pistol. Instead he shot away his frontal bone, leaving the anterior lobes of his brain bare but undamaged. He was rushed to the Hôpital St. Louis, where Ernest Aubertin spent a few vain hours trying to save his life. Aubertin discovered that if he pressed a spatula on the patient’s brain while he was speaking, his speech “was suddenly suspended; a word begun was cut in two. Speech returned as soon as pressure was removed,” Aubertin reported.
Does the brain contain all we are? Eighty years after Aubertin, Montreal neurosurgeon Wilder Penfield was carrying out hundreds of brain operations to relieve chronic temporal-lobe epilepsy. Using delicate electrodes, he would map the safest cuts to make — ones that would not excise vital brain functions. For the patient, the tiniest regions, when stimulated, accessed the strangest experiences. A piano being played. A telephone conversation between two family members. A man and a dog walking along a road. They weren’t memories, so much as dreamlike glimpses of another world.
Cobb’s history of brain science will fascinate readers quite as much as it occasionally horrifies. Cobb, a zoologist by training, has focused for much of his career on the sense of smell and the neurology of the humble fruit fly maggot. The Idea of the Brain sees him coming up for air, taking in the big picture before diving once again into the minutiae of his profession.
He makes a hell of a splash, too, explaining how the analogies we use to describe the brain both enrich our understanding of that mysterious organ, and hamstring our further progress. He shows how mechanical metaphors for brain function lasted well into the era of electricity. And he explains why computational metaphors, though unimaginably more fertile, are now throttling his science.
Study the brain as though it were a machine and in the end (and after much good work) you will run into three kinds of trouble.
First you will find that reverse engineering very complex systems is impossible. In 2017 two neuroscientists, Eric Jonas and Konrad Paul Kording, employed the techniques they normally used to analyse the brain to study the MOS 6507 processor — a chip found in computers from the late 1970s and early 1980s that enabled machines to run video games such as Donkey Kong, Space Invaders or Pitfall. Despite their powerful analytical armoury, and despite the fact that there is a clear explanation for how the chip works, they admitted that their study fell short of producing “a meaningful understanding”.
Another problem is the way the meanings of technical terms expand over time, warping the way we think about a subject. The French neuroscientist Romain Brette has a particular hatred for that staple of neuroscience, “coding”, a term first invoked by Adrian in the 1920s in a technical sense, in which there is a link between a stimulus and the activity of the neuron. Today almost everybody thinks of neural codes as representations of that stimulus, which is a real problem, because it implies that there must be an ideal observer or reader within the brain, watching and interpreting those representations. It may be better to think of the brain as constructing information, rather than simply representing it — only we have no idea (yet) how such an organ would function. For sure, it wouldn’t be a computer.
Which brings us neatly to our third and final obstacle to understanding the brain: we take far too much comfort and encouragement from our own metaphors. Do recent advances in AI bring us closer to understanding how our brains work? Cobb’s hollow laughter is all but audible. “My view is that it will probably take fifty years before we understand the maggot brain,” he writes.
One last history lesson. In the 1970s, twenty years after Penfield’s electrostimulation studies, Michael Gazzaniga, a cognitive neuroscientist at the University of California, Santa Barbara, studied the experiences of people whose brains had been split down the middle in a desperate effort to control their epilepsy. He discovered that each half of the brain was, on its own, sufficient to produce a mind, albeit with slightly different abilities and outlooks in each half. “From one mind, you had two,” Cobb remarks. “Try that with a computer.”
Hearing the news brought veteran psychologist William Estes to despair: “Great,” he snapped, “now we have two things we don’t understand.”
This year I’ll be helping judge the Baillie Gifford prize for non-fiction. Radio 4 presenter Martha Kearney will chair the panel. Fellow judges are professor and author Shahidha Bari, New Statesman writer Leo Robson, New York Times editor Max Strasser and journalist and author Bee Wilson. The winner will be announced on Thursday 19 November.
Early one morning in October 1874, a barge carrying three barrels of benzoline and five tons of gunpowder blew up in the Regent’s Canal, close to London Zoo. The crew of three were killed outright, scores of houses were badly damaged, the explosion could be heard 25 miles away, and “dead fish rained from the sky in the West End.”
This is a book about the weird, if obvious, intersection between firework manufacture and warfare. It is, ostensibly, the biography of a hero of the First World War, Frank Brock. And if it were the work of more ambitious literary hands, Brock would have been all you got. His heritage, his school adventures, his international career as a showman, his inventions, his war work, his violent death. Enough for a whole book, surely?
But Gunpowder and Glory is not a “literary” work, by which I mean it is neither self-conscious nor overwrought. Instead Henry Macrory (who anyway has already proved his literary chops with his 2018 biography of the swindler Whitaker Wright) has opted for what looks like a very light touch here, assembling and ordering the anecdotes and reflections of Frank Brock’s grandson Harry Smee about his family, their business as pyrotechnical artists, and, finally, about Frank, his illustrious forebear.
I suspect a lot of sweat went into such artlessness, and it’s paid off, creating a book that reads like fascinating dinner conversation. Reading its best passages, I felt I was discovering Brock the way Harry had as a child, looking into his mother’s “ancient oak chests filled with papers, medals, newspapers, books, photographs, an Intelligence-issue knuckleduster and pieces of Zeppelin and Zeppelin bomb shrapnel.”
For eight generations, the Brock family produced pyrotechnic spectaculars of a unique kind. Typical set piece displays in the eighteenth century included “Jupiter discharging lightning and thunder, Two gladiators combating with fire and sword, and Neptune finely carv’d seated in his chair, drawn by two sea horses on fire-wheels, spearing a dolphin.”
Come the twentieth century, Brock’s shows were a signature of Empire. It would take a writer like Thomas Pynchon to do full justice to “a sixty foot-high mechanical depiction of the Victorian music-hall performer, Lottie Collins, singing the chorus of her famous song ‘Ta-ra-ra-boom-de-ay’ and giving a spirited kick of an automated leg each time the word ‘boom’ rang out.”
Frank was a Dulwich College boy, and one of that generation lost to the slaughter of the Great War. A spy and an inventor — James Bond and Q in one — he applied his inherited chemical and pyrotechnical genius to the war effort — by making a chemical weapon. It wasn’t any good, though: Jellite, developed during the summer of 1915 and named after its jelly-like consistency during manufacture, proved insufficiently lethal.
On such turns of chance do reputations depend, since we remember Frank Brock for his many less problematic inventions. Dover flares burned for seven and a half minutes and lit up an area of three miles’ radius, as Winston Churchill put it, “as bright as Piccadilly”. U-boats, diving to avoid these lights, encountered mines. Frank’s artificial fogs, hardly bettered since, concealed whole British fleets, entire Allied battle lines.
Then there are his incendiary bullets.
At the time of the Great War a decent Zeppelin could climb to 20,000 feet, travel at 47 mph for more than 1,000 miles, and stay aloft for 36 hours. Smee and Macrory are well within their rights to call them “the stealth bombers of their time”.
Brock’s bullets tore them out of the sky. Sir William Pope, Brock’s advisor, and a professor of chemistry at Cambridge University, explained: “You need to imagine a bullet proceeding at several thousand feet a second, and firing as it passes through a piece of fabric which is no thicker than a pocket handkerchief.” All to rupture a gigantic sac of hydrogen sufficiently to make the gas explode. (Much less easy than you think; the Hindenburg only crashed because its entire outer envelope was set on fire.)
Frank died in an assault on the mole at Zeebrugge in 1918. He shouldn’t have been there. He should have been in a lab somewhere, cooking up another bullet, another light, another poison gas. Today, he surely would be suitably contained, his efforts efficiently channeled, his spirit carefully and surgically broken.
Frank lived at a time when it was possible — and men, at any rate, were encouraged — to be more than one thing. That this heroic idea overreached itself — that rugby field and school chemistry lab both dissolved seamlessly into the Somme — needs no rehearsing.
Still, we have lost something. When Frank went to school there was a bookstall near the station which sold “a magazine called Pluck, containing ‘the daring deeds of plucky sailors, plucky soldiers, plucky firemen, plucky explorers, plucky detectives, plucky railwaymen, plucky boys and plucky girls and all sorts of conditions of British heroes’.”
Frank was a boy moulded thus, and sneer as much as you want, we will not see his like again.
A king penguin raised on South Georgia, an island near Antarctica. A European wolf. A spotted hyena in the Ngorongoro crater in Tanzania. A north Atlantic humpback whale born near the Dominican Republic. What could these four animals have in common?
What if they were all experiencing the same life event? After all, all animals are born, and all of them die. We’re all hungry sometimes, for food, or a mate.
How far can we push this idea? Do non-human animals have mid-life crises, for example? (Don’t mock; there’s evidence that some primates experience the same happiness curves through their life-course as humans do.)
Barbara Natterson-Horowitz, an evolutionary biologist, and Kathryn Bowers, an animal behaviorist, have for some years been devising and teaching courses at Harvard and the University of California at Los Angeles, looking for “horizontal identities” across species boundaries.
The term comes from Andrew Solomon’s 2014 book Far from the Tree, which contrasts vertical identities (between you and your parents and grandparents, say) with horizontal identities, which are “those among peers with whom you share similar attributes but no family ties”. The authors of Wildhood have expanded Solomon’s concept to include other species; “we suggest that adolescents share a horizontal identity,” they write: “temporary membership in a planet-wide tribe of adolescents.”
The heroes of Wildhood — Ursula the penguin, Shrink the hyena, Salt the whale and Slavc the wolf — are all, loosely speaking, “teens”, and like teens everywhere, they have several mountains to climb at once. They must learn how to stay safe, how to navigate social hierarchies, how to communicate sexually, and how to care for themselves. They need to become experienced, and for that, they need to have experiences.
Well into the 1980s, researchers were discouraged from discussing the mental lives of animals. The change in scientific culture came largely thanks to the video camera. Suddenly it was possible for behavioral scientists to observe, not just closely, but repeatedly, and in slow motion. Soon discoveries arose that could not possibly have been arrived at with the naked eye alone. An animal’s supposedly rote, mechanical behaviours turned out to be the product of learning, experiment, and experience. Stereotyped calls and gestures were unpacked to reveal, not language in the human sense, but languages nonetheless, and many were of dizzying complexity. Animals that we thought were driven by instinct (a word you’ll never even hear used these days), turned out to be lively, engaged, conscious beings, scrabbling for purchase in a confusing and unpredictable world.
The four tales that make up the bulk of Wildhood are more than “Just So” stories. “Every detail,” the authors explain, “is based on and validated by data from GPS satellite or radio collar studies, peer-reviewed scientific literature, published reports and interviews with the investigators involved”.
In addition, each offers a different angle on a wealth of material about animal behaviour. Examples of animal friendship, bullying, nepotism, exploitation and even property inheritance arrive in such numbers and at such a level of detail, it takes an ordinary, baggy human word like “friendship” or “bullying” to contain them.
“Level playing fields don’t exist in nature”, the authors assert, and this is an important point, given the book’s central claim that by understanding the “wildhoods” of other animals, we can develop better approaches “for compassionately and skillfully guiding human adolescents toward adulthood.”
The point is not to use non-human behaviour as an excuse for human misbehaviour. Lots of animals kill and exploit each other, but that shouldn’t make exploitation or murder acceptable. The point is to know which battles to pick. Making young school playtimes boring by quashing the least sign of competitiveness makes little sense, given the amount of biological machinery dedicated to judging and ranking in every animal species from humans to lobsters. On the other hand, every young animal, when it returns to its parents, gets a well-earned break from the “playground” — but young humans don’t. They’re now tied 24-7 to social media that prolong, exaggerate and exacerbate the ranking process. Is the rise in mental health problems among the affluent young triggered by this added stress?
These are speculations and discussions for another book, for which Wildhood may prove a necessary and charming foundation. Introduced in the first couple of pages to California sea otters, swimming up to sharks one moment then fleeing from a plastic boat the next, the reader can’t help but recognise, in the animals’ overly bold and overly cautious behaviour, the gawkiness and tremor of their own adolescence.