An intellectual variant of whack-a-mole

Reading Joseph Mazur’s The Clock Mirage for The Spectator, 27 June 2020 

Some books elucidate their subject, mapping and sharpening its boundaries. The Clock Mirage, by the mathematician Joseph Mazur, is not one of them. Mazur is out to muddy time’s waters, dismantling the easy opposition between clock time and mental time, between physics and philosophy, between science and feeling.

That split made little sense even in 1922, when the philosopher Henri Bergson and the young physicist Albert Einstein (much against his better judgment) went head-to-head at the Société française de philosophie in Paris to discuss the meaning of relativity. (Or that was the idea. Actually they talked at complete cross-purposes.)

Einstein won. At the time, there was more novel insight to be got from physics than from psychological introspection. But time passes, knowledge accrues and fashions change. The inference (not Einstein’s, though people associate it with him) that time is a fourth dimension, commensurable with the three dimensions of space, is looking decidedly frayed. Meanwhile Bergson’s psychology of time has been pruned by neurologists and put out new shoots.

Our lives and perceptions are governed, to some extent, by circadian rhythms, but there is no internal clock by which we measure time in the abstract. Instead we construct events, and organise their relations, in space. Drivers, thinking they can make up time with speed, acquire tickets faster than they save seconds. Such errors are mathematically obvious, but spring from the irresistible association we make (poor vulnerable animals that we are) between speed and survival.
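
A back-of-the-envelope check (my numbers, not Mazur's) makes the point. With time given by $t = d/v$, pushing from 70 mph to 80 mph over a ten-mile stretch saves

$$\frac{10}{70} - \frac{10}{80} = \frac{1}{7} - \frac{1}{8}\ \text{hours} \approx 64\ \text{seconds}.$$

That is barely a minute, bought at considerably more than a minute's worth of risk.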

The more we understand about non-human minds, the more eccentric and sui generis our own time sense seems to be. Mazur ignores the welter of recent work on other animals’ sense of time — indeed, he winds the clock back several decades in his careless talk of animal ‘instincts’ (no one in animal behaviour uses the ‘I’ word any more). For this, though, I think he can be forgiven. He has put enough on his plate.

Mazur begins by rehearsing how the Earth turns, how clocks were developed, and how the idea of universal clock time came hot on the heels of the railway (mistimed passenger trains kept running into each other). His mind is engaged well enough throughout this long introduction, but around page 47 his heart beats noticeably faster. Mazur’s first love is theory, and he handles it well, using Zeno’s paradoxes to unpack the close relationship between psychology and mathematics.

In Zeno’s famous foot race, by the time fleet-footed Achilles catches up to the place where the plodding tortoise was, the tortoise has moved a little bit ahead. That keeps happening ad infinitum, or at least until Newton (or Leibniz, depending on who you think got to it first) pulls calculus out of his hat. Calculus is an algebraic way of handling (well, fudging) the continuity of the number line. It handles vectors and curves and smooth changes — the sorts of phenomena you can measure only if you’re prepared to stop counting.
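
To make the fudge concrete (my own illustration, not Mazur's): give Achilles ten times the tortoise's speed and a ten-metre deficit to make up, and the infinitely many catch-up stages sum to a finite distance,

$$10 + 1 + \frac{1}{10} + \frac{1}{100} + \cdots = \frac{10}{1 - \tfrac{1}{10}} = \frac{100}{9} \approx 11.1\ \text{metres},$$

at which point Achilles draws level, and every stride thereafter puts him ahead.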

But what if reality is granular after all, and time is quantised, arriving in discrete packets like the frames of a celluloid film stuttering through the gate of a projector? In this model of time, calculus is redundant and continuity is merely an illusion. Does it solve Zeno’s paradox? Perhaps it makes it 100 times more intractable. Just as motion needs time, time needs motion, and ‘we might wonder what happens to the existence of the world between those falling bits of time sand’.

This is all beautifully done, and Mazur, having hit his stride, maintains form throughout the rest of the book, though I suspect he has bitten off more than any reader could reasonably want to swallow. Rather than containing and spotlighting his subject, Mazur’s questions about time turn out (time and again, I’m tempted to say) to be about something completely different, as though we were playing an intellectual variant of whack-a-mole.

But this, I suppose, is the point. Mazur quotes Henri Poincaré:

Not only have we not direct intuition of the equality of two periods, but we have not even direct intuition of the simultaneity of two events occurring in two different places.

Our perception of time is so fractured, so much an ad hoc amalgam of the chatter of numerous, separately evolved systems (for the perception of motion; for the perception of daylight; for the perception of risk, and on and on — it’s a very long list), that it may in the end be easier to abandon talk of time altogether, and for the same reason that psychologists, talking shop among themselves, eschew vague terms such as ‘love’.

So much of what we mean by time, as we perceive it day to day, is really rhythm. So much of what physicists mean by time is really space. Time exists, as love exists, as a myth: real because contingent, real because constructed, a catch-all term for phenomena bigger, more numerous and far stranger than we can yet comprehend.

In praise of tiny backpacks

Reading The Bird Way by Jennifer Ackerman for New Scientist, 17 June 2020

Visit the Australian National Botanic Gardens in Canberra, and you may stumble upon an odd sight: a human figure, festooned with futuristic-looking monitoring gear. Is it a statue? No: when children poke it (this happens a lot) the statue blinks.

Meet Jessica McLachlan, a researcher at the Australian National University, at work studying the fine, never-before-detected details of bird behaviour. The gear she wears is what it takes to observe the world as birds themselves see it. Not to put too fine a point on it, birds are a lot faster than we are.

Bird brains are miracles of miniaturisation. Their neurons are smaller, more numerous, and more densely packed. They differ architecturally too, cramming more and faster processing power into a smaller space.

This means that in the bird world, things happen fast — sometimes too fast for us to see. (Unless you film blue-capped cordon-bleus at 300 frames per second, how could you possibly know that they tap-dance in time with their singing?)

Modern recording equipment enables us to study an otherwise invisible world. For example, by strapping tiny backpacks to seabirds, and correlating their flight with known ocean activities, we’ve discovered that birds have a quite extraordinary sense of smell, following krill across apparently featureless horizons that are, for them, “elaborate landscapes of eddying odor plumes.”

Science and nature writer Jennifer Ackerman’s fresh, re-invigorated account of the world of birds is arranged on traditional lines (sections cover her subjects’ singing, work, play, love and parenting), but it is informed, and transformed, by accounts of cybernetically enhanced field studies, forensic analyses and ambitious global collaboration. She has travelled far and talked to many, she is generous to a fault with her academic sources, and her descriptions of her visits, field trips and adventures are engaging, but never obtrusive.

Her account centres on the bird life of Australia. This is reasonable: bird song began there. Songbirds, parrots and pigeons evolved there. In Australia, birds fill more ecological niches, are smarter and live longer.

Like every ornithological writer before her, Ackerman is besotted by the sheer variety of her subjects. One of the most endearing passages in her book compares parrots and corvids. Both are intelligent, highly social birds, but, after 92 million years of evolutionary separation, the similarities stop there. Ravens are frightened of novelty. Keas lap it up. Keas, confronted by a new researcher, will abandon a task to go and play with the stranger. Ravens will simply wig out. Ravens have a strict feeding hierarchy. Curious male keas will good-naturedly stuff food down an unfamiliar youngster’s gullet to the point where the youngster has to fend them off.

Ackerman’s account is often jaw-dropping, and never more shocking than when she assembles the evidence for the cultural sophistication of bird song. Birds decode far more from sounds than we do and until recently we’ve been deaf to their acoustic complexity. Japanese tits use 11 different notes in their songs, and it’s the combination of notes that encodes information. Swap two notes around, and you elicit different responses. If this isn’t quite syntax, it’s something very like it. The drongo has absolute conscious control over its song, using up to 45 mimicked alarm calls to frighten other species, such as meerkats, into dropping their lunch — and it will target specific warning calls at individuals so they don’t twig what’s going on.

Meanwhile, many different species of bird worldwide, from Australia to Africa to the Himalayas, appear to have developed a universal, and universally comprehensible signal to warn of the approach of brood parasites (cuckoos and the like).

If the twentieth century was the golden age of laboratory study, the twenty-first is shaping up to become a renaissance for the sorts of field studies Charles Darwin would recognise. Now that cybernetically enhanced researchers like Jessica McLachlan can follow individuals, we have a chance to gauge and understand the intelligence exhibited by the animals around us. Intelligence en masse is, after all, really hard to spot. It doesn’t lend itself to tabulation or statistical analysis. It doesn’t make itself known from a distance. Intelligent behaviour is unusual. It’s novel. It takes a long time to spot.

If birds are as intelligent as so many of the stories in Ackerman’s eye-opening book suggest, then this, of course, may only be the start of our problems. In Sweden, corvid researchers Mathias and Helena Osvath have befriended a raven who turns up for their experiments, aces them, then flaps off again. “In the ethics section of grant applications,” says Mathias, “it’s difficult to explain…”

Humanity unleashed

Reading Rutger Bregman’s Humankind: A hopeful history for New Scientist, 10 June 2020

In 1651 the English philosopher Thomas Hobbes startled the world with Leviathan, an account of the good and evil lurking in human nature. Hobbes argued that people, left to their own devices, were naturally vicious. (He was writing in the aftermath of Europe’s cataclysmic Thirty Years’ War; the evidence was all around him.) Ultimately, though, Hobbes’s vision was positive. Humans are also naturally gregarious. Gathering in groups, eventually we become more together than we were apart. We become villages, societies, whole civilisations.

Hobbes’s argument can be taken in two ways. We can glory in what we have built over the course of generations. Or we can live in terror of that future moment when the thin veneer of our civilisation cracks and lets all the devils out.

Even as I was writing this review, I came across the following story. In April, at the height of the surge in coronavirus cases, Americans purchased more guns than at any other point since the FBI began collecting data over 20 years ago. I would contend that these are not stupid people. They are, I suspect, people who have embraced a negatively Hobbesian view of the world, and are expecting the apocalypse.

Belief in innate human badness is self-fulfilling. (Bregman borrows from medical jargon and calls it a nocebo: a negative expectation that makes one’s circumstances progressively worse.) And we do seem to try everything in our power to think the worst of ourselves. For instance, we give our schoolchildren William Golding’s 1954 novel Lord of the Flies to read (and fair enough — it’s a very good book), but nobody thinks to mention that the one time boys really were trapped on a desert island for over a year — on a rocky Polynesian atoll without fresh water, in 1965 — they survived, stayed fit and healthy, successfully set a lad’s broken leg, and formed friendships that have lasted a lifetime. Before Bregman came along you couldn’t even find this story on the internet.

From this anecdotal foundation, Bregman assembles his ferocious argument, demolishing one Hobbesian shibboleth after another. Once the settlers of Easter Island had chopped down all the trees on their island (the subject of the historian Jared Diamond’s bestselling 2005 book Collapse), their civilisation did not fall apart. It thrived — until European voyagers arrived, bringing diseases and the slave trade. When Catherine Susan Genovese was murdered in New York City on 13 March 1964 (the notorious incident that added the expression “bystander effect” to the psychological lexicon), her neighbours did not watch from their windows and do nothing. They called the police. One neighbour rushed out into the street and held her while she was dying.

Historians and reporters can’t be trusted; neither, alas, can scientists. Philip Zimbardo’s Stanford prison experiment, conducted in August 1971, was supposed to have spun out of control in less than two days, as students playing prison guards set about abusing and torturing fellow students cast in the role of prisoners. Almost everything that has been written about the experiment is not merely exaggerated, it’s wrong. When in 2001 the BBC restaged the experiment, the guards and the prisoners spent the whole time sitting around drinking tea together.

Bregman also discusses the classic “memory” experiment by Stanley Milgram, in which a volunteer is persuaded to electrocute a person nearly to death (an actor, in fact, and in on the wheeze). The problem here is less the experimental design and more the way the experiment was interpreted.

Early accounts took the experiment to mean that people are robots, obeying orders unthinkingly. Subsequent close study of the transcripts shows something rather different: that people are desperate to do the right thing, and their anxiety makes them frighteningly easy to manipulate.

If we’re all desperate to be good people, then we need a new realism when it comes to human nature. We can’t any longer assume that because we are good, those who oppose us must be bad. We must learn to give people a chance. We must learn to stop manipulating people the whole time. From schools to prisons, from police forces to political systems, Bregman visits projects around the world that, by behaving in ways that can seem surreally naive, have resolved conflicts, reformed felons, encouraged excellence and righted whole economies.

This isn’t an argument between left and right, between socialist and conservative; it’s about what we know about human nature and how we can build a better model of it into our lives. With Humankind Bregman moves from politics, his usual playground, into psychological, even spiritual territory. I am fascinated to know where his journey will lead.

 

Joy in the detail

Reading Charles Darwin’s Barnacle and David Bowie’s Spider by Stephen Heard for the Spectator, 16 May 2020

Heteropoda davidbowie is a species of huntsman spider. Though rare, it has been found in parts of Malaysia, Singapore, Indonesia and possibly Thailand. (The uncertainty arises because it’s often mistaken for a similar-looking species, Heteropoda javana.) In 2008 a German collector sent photos of his unusual-looking “pet” to Peter Jäger, an arachnologist at the Senckenberg Research Institute in Frankfurt. Consequently, and in common with many other new species, David Bowie’s spider was discovered twice: once in the field, and once in the collection.

Bowie’s spider is famous, but not exceptional. Jäger has discovered more than 200 species of spider in the last decade, and names them after politicians, comedians and rock stars to highlight our ecological plight. Other researchers find more pointed ways to further the same cause. In the first month of Donald Trump’s administration, Iranian-Canadian entomologist Vazrick Nazari discovered a moth with a head crowned with large, blond comb-over scales. There’s more to Neopalpa donaldtrumpi than a striking physical resemblance: it lives in a federally protected area around where the border wall with Mexico is supposed to go. Cue headlines.

Species are becoming extinct 100 times faster than they did before modern humans arrived. This makes reading a book about the naming of species a curiously queasy affair. Nor is there much comfort to be had in the evolutionary ecologist Stephen Heard’s observation that, having described 1.5 million species, we’ve (at the very best) only recorded half of what’s out there. There is, you may recall, that devastating passage in Cormac McCarthy’s western novel Blood Meridian in which Judge Holden meticulously records a Native American artefact in his sketchbook — then destroys it. Given that to discover a species you must, by definition, invade its environment, Holden’s sketch-and-burn habit appears to be a painfully accurate metonym for what the human species is doing to the planet. Since the 1970s (when there were twice as many wild animals as there are now) we’ve been discovering and endangering new species in almost the same breath.

Richard Spruce, one of the Victorian era’s great botanical explorers, who spent 15 years exploring the Amazon from the Andes to its mouth, is a star of this short, charming book about how we have named and ordered the living world. No detail of his bravery, resilience and grace under pressure comes close to the eloquence of this passing quotation, however: “Whenever rains, swollen streams, and grumbling Indians combined to overwhelm me with chagrin,” he wrote in his account of his travels, “I found reason to thank heaven which had enabled me to forget for the moment all my troubles in the contemplation of a simple moss.”

Stephen Heard, an evolutionary ecologist based in Canada, explains how extraordinary amounts of curiosity have been codified to create a map of the living world. The legalistic-sounding codes by which species are named are, it turns out, admirably egalitarian, ensuring that the names amateurs give species are just as valid as those of professional scientists.

Formal names are necessary because of the difficulty we have in distinguishing between similar species. Common names run into this difficulty all the time. There are too many of them, so the same species gets different names in different languages. At the same time, there aren’t enough of them, so that, as Heard points out, “Darwin’s finches aren’t finches, African violets aren’t violets, and electric eels aren’t eels”; robins, blackbirds and badgers are entirely different animals in Europe and North America; and virtually every flower has at one time or another been called a daisy.

Names also tend, reasonably enough, to be descriptive. This is fine when you’re distinguishing between, say, five different types of fish. When there are 500 different fish to sort through, however, absurdity beckons. Heard lovingly transcribes the pre-Linnaean species name of the English whiting, formulated around 1738: “Gadus, dorso tripterygio, ore cirrato, longitudine ad latitudinem tripla, pinna ani prima officulorum trigiata”. So there.

It takes nothing away from the genius of the Swedish physician Carl Linnaeus, who formulated the naming system we still use today, to say that he came along at the right time. By Linnaeus’s day, it was possible to look things up. Advances in printing and distribution had made reference works possible. Linnaeus’s innovation was to decouple names from descriptions. And this, as Heard reveals in anecdote after anecdote, is where the fun now slips in: the mythopoeic cool of the baboon Papio anubis, the mischievous smarts of the beetle Agra vation, the nerd celebrity of the lemur Avahi cleesei.

Heard’s taxonomy of taxonomies makes for somewhat thin reading; this is less a book than a dozen interesting magazine articles flying in close formation. But its close focus, bringing to life the minutiae of both the living world and the practice of science, is welcome.

I once met Michael Land, the neurobiologist who figured out how the lobster’s eye works. He told me that the trouble with big ideas is that they get in the way of the small ones. Heard’s lesson, delivered with such a light touch, is the same. The joy, and much of the accompanying wisdom, lies in the detail.

Goodbye to all that

Reading Technologies of the Human Corpse by John Troyer for the Spectator, 11 April 2020

John Troyer, the director of the Centre for Death and Society at the University of Bath, has moves. You can find his interpretative dances punctuating a number of his lectures, which go by such arresting titles as ‘150 Years of the Human Corpse in American History in Under 15 Minutes with Jaunty Background Music’ and ‘Abusing the Corpse Even More: Understanding Necrophilia Laws in the USA — Now with more Necro! And more Philia!’ (Wisconsin and Ohio are, according to Troyer’s eccentric-looking and always fascinating website, ‘two states that just keep giving and giving when it comes to American necrophilia cases’.)

Troyer’s budding stand-up career has taken a couple of recent knocks. First was the ever more pressing need for him to crack on with his PhD (his dilatoriness was becoming a family joke). Technologies of the Human Corpse is yanked, not without injury, from that career-establishing academic work. Even as he assembled the present volume, however, there came another, far more personal, blow.

Late in July 2017 Troyer’s younger sister Julie was diagnosed with an aggressive brain cancer. Her condition deteriorated far more quickly than anyone expected, and on 29 July 2018 she died. This left Troyer — the engaging young American death scholar sprung from a family of funeral directors — having to square his erudite and cerebral thoughts on death and dead bodies with the fact he’d just kissed his sister goodbye. He interleaves poetical journal entries composed during Julie’s dying and her death, her funeral and her commemoration, between chapters written by a younger, jollier and of course shallower self.

To be brutal, the poems aren’t up to much, and on their own they wouldn’t add a great deal by way of nuance or tragedy. Happily for us, however, and to Troyer’s credit, he has transformed them into a deeply moving 30-page memoir that now serves as the book’s preface. This, then, is Troyer’s monster: a powerful essay about dying and bereavement; a set of poems written off the cuff and under great stress; and seven rather disconnected chapters about what’s befallen the human corpse in the past century or so.

Even as the book was going to print, Troyer explains in a hurried postscript, his father, a retired undertaker, lost consciousness following a cardiac arrest and was very obviously dying:

“And seeing my father suddenly fall into a comatose state so soon after watching my sister die is impossible to fully describe: I understand what is happening, yet I do not want to understand what is happening.”

This deceptively simple statement from Troyer the writer is streets ahead of anything Troyer the postgrad can pull off.

But to the meat of the book. The American Civil War saw several thousand corpses embalmed and transported on new-fangled railway routes across the continent. The ability to preserve bodies, and even lend them a lifelike appearance months after death, created a new industry that, in various configurations and under several names, now goes by the daunting neologism of ‘deathcare provision’. In the future, this industry will be seen ‘transforming the mostly funeralisation side of the business into a much broader, human body parts and tissue distribution system’, as technical advances make increasing use of cadavers and processed cadaver parts.

So how much is a dead body worth? Between $30,000 and $50,000, says Troyer — five times as much for donors processed into medical implants, dermal implants and demineralised bone matrices. Funds and materials are exchanged through a network of body brokers who serve as middlemen between biomedical corporations such as Johnson & Johnson and the usual sources of human cadavers — medical schools, funeral homes and mortuaries. It is by no means an illegal trade, nor is it morally problematic in most instances; but it is rife with scandal. As one involved party remarks: ‘If you’re cremated, no one is ever going to know if you’re missing your shoulders or knees or your head.’

Troyer is out to show how various industries serve to turn our dead bodies into ‘an unfettered source of capital’. The ‘fluid men’ of Civil War America — who toured the battlefields showing keen students how to embalm a corpse (and almost always badly) — had no idea what a strange story they had started. Today, as the anatomist Gunther von Hagens poses human cadavers in sexual positions to pique and titillate worldwide audiences, we begin to get a measure of how far we have come. Hagens’s posthumous pornography reveals, says Troyer, ‘the ultimate taxonomic power over nature: we humans, or at least our bodies, can live forever because we pull ourselves from nature’.

Technologies of the Human Corpse is a bit of a mess, but I have a lot of time for Troyer. His insights are sound, and his recent travails may yet (and at high human cost — but it was ever thus) make him a writer of some force.

 

“Fat with smell, dissonant and dirty”

The revolt against scentlessness has been gathering for a while. Muchembled namechecks avant-garde perfumes with names like Bat and Rhinoceros. A dear friend of mine favours Musc Kublai Khan for its faecal notes. Another spends a small fortune to smell like cat’s piss. Right now I’m wearing Andy Tauer’s Orange Star — don’t approach unless you like Quality Street orange cremes macerated in petrol…

Reading Robert Muchembled’s Smells: A Cultural History of Odours in Early Modern Times and Isabel Bannerman’s Scent Magic: Notes from a Gardener for the Telegraph, 18 April 2020

Cog ergo sum

Reading Matthew Cobb’s The Idea of the Brain for New Scientist 15 April 2020

Ask a passer-by in 2nd-century Rome where consciousness resided — in the heart or in the head — and he would be sure to say: in the heart. The surgeon-philosopher Galen of Pergamon had other ideas. During one show he had someone press upon the exposed brain of a pig, which promptly (and mercifully) passed out. Letting go brought the pig back to consciousness.

Is the brain one organ, or many? Are our mental faculties localised in the brain? Sixteen hundred years after Galen, a Parisian gentleman tried to blow his brains out with a pistol. Instead he shot away his frontal bone, leaving the anterior lobes of his brain bare but undamaged. He was rushed to the Hôpital St. Louis, where Ernest Aubertin spent a few vain hours trying to save his life. Aubertin discovered that if he pressed a spatula on the patient’s brain while he was speaking, his speech “was suddenly suspended; a word begun was cut in two. Speech returned as soon as pressure was removed.”

Does the brain contain all we are? Eighty years after Aubertin, the Montreal neurosurgeon Wilder Penfield carried out hundreds of brain operations to relieve chronic temporal-lobe epilepsy. Using delicate electrodes, he would map the safest cuts to make — ones that would not excise vital brain functions. For the patient, stimulation of the tiniest regions summoned the strangest experiences. A piano being played. A telephone conversation between two family members. A man and a dog walking along a road. They weren’t memories so much as dreamlike glimpses of another world.

Cobb’s history of brain science will fascinate readers quite as much as it occasionally horrifies. Cobb, a zoologist by training, has focused for much of his career on the sense of smell and the neurology of the humble fruit fly maggot. The Idea of the Brain sees him coming up for air, taking in the big picture before diving once again into the minutiae of his profession.

He makes a hell of a splash, too, explaining how the analogies we use to describe the brain both enrich our understanding of that mysterious organ, and hamstring our further progress. He shows how mechanical metaphors for brain function lasted well into the era of electricity. And he explains why computational metaphors, though unimaginably more fertile, are now throttling his science.

Study the brain as though it were a machine and in the end (and after much good work) you will run into three kinds of trouble.

First, you will find that reverse engineering very complex systems is impossible. In 2017 two neuroscientists, Eric Jonas and Konrad Paul Kording, employed the techniques they normally used to analyse the brain to study the MOS 6507 processor — a chip found in computers from the late 1970s and early 1980s that enabled machines to run video games such as Donkey Kong, Space Invaders and Pitfall. Despite their powerful analytical armoury, and despite the fact that there is a clear explanation for how the chip works, they admitted that their study fell short of producing “a meaningful understanding”.

Another problem is the way the meanings of technical terms expand over time, warping the way we think about a subject. The French neuroscientist Romain Brette has a particular hatred for that staple of neuroscience, “coding”, a term first invoked by the physiologist Edgar Adrian in the 1920s in a technical sense, denoting a link between a stimulus and the activity of a neuron. Today almost everybody thinks of neural codes as representations of that stimulus, which is a real problem, because it implies that there must be an ideal observer or reader within the brain, watching and interpreting those representations. It may be better to think of the brain as constructing information, rather than simply representing it — only we have no idea (yet) how such an organ would function. For sure, it wouldn’t be a computer.

Which brings us neatly to our third and final obstacle to understanding the brain: we take far too much comfort and encouragement from our own metaphors. Do recent advances in AI bring us closer to understanding how our brains work? Cobb’s hollow laughter is all but audible. “My view is that it will probably take fifty years before we understand the maggot brain,” he writes.

One last history lesson. In the 1970s, twenty years after Penfield’s electrostimulation studies, Michael Gazzaniga, a cognitive neuroscientist at the University of California, Santa Barbara, studied the experiences of people whose brains had been split down the middle in a desperate effort to control their epilepsy. He discovered that each half of the brain was, on its own, sufficient to produce a mind, albeit with slightly different abilities and outlooks in each half. “From one mind, you had two,” Cobb remarks. “Try that with a computer.”

Hearing the news brought veteran psychologist William Estes to despair: “Great,” he snapped, “now we have two things we don’t understand.”

Pluck

Reading Gunpowder and Glory: The Explosive Life of Frank Brock OBE by Harry Smee and Henry Macrory for the Spectator, 21 March 2020

Early one morning in October 1874, a barge carrying three barrels of benzoline and five tons of gunpowder blew up in the Regent’s Canal, close to London Zoo. The crew of three were killed outright, scores of houses were badly damaged, the explosion could be heard 25 miles away, and “dead fish rained from the sky in the West End.”

This is a book about the weird, if obvious, intersection between firework manufacture and warfare. It is, ostensibly, the biography of a hero of the First World War, Frank Brock. And if it were the work of more ambitious literary hands, Brock would have been all you got. His heritage, his school adventures, his international career as a showman, his inventions, his war work, his violent death. Enough for a whole book, surely?

But Gunpowder and Glory is not a “literary” work, by which I mean it is neither self-conscious nor overwrought. Instead Henry Macrory (who anyway has already proved his literary chops with his 2018 biography of the swindler Whitaker Wright) has opted for what looks like a very light touch here, assembling and ordering the anecdotes and reflections of Frank Brock’s grandson Harry Smee about his family, their business as pyrotechnical artists, and, finally, about Frank, his illustrious forebear.

I suspect a lot of sweat went into such artlessness, and it’s paid off, creating a book that reads like fascinating dinner conversation. Reading its best passages, I felt I was discovering Brock the way Harry had as a child, looking into his mother’s “ancient oak chests filled with papers, medals, newspapers, books, photographs, an Intelligence-issue knuckleduster and pieces of Zeppelin and Zeppelin bomb shrapnel.”

For eight generations, the Brock family produced pyrotechnic spectaculars of a unique kind. Typical set piece displays in the eighteenth century included “Jupiter discharging lightning and thunder, Two gladiators combating with fire and sword, and Neptune finely carv’d seated in his chair, drawn by two sea horses on fire-wheels, spearing a dolphin.”

Come the twentieth century, Brock’s shows were a signature of Empire. It would take a writer like Thomas Pynchon to do full justice to “a sixty foot-high mechanical depiction of the Victorian music-hall performer, Lottie Collins, singing the chorus of her famous song ‘Ta-ra-ra-boom-de-ay’ and giving a spirited kick of an automated leg each time the word ‘boom’ rang out.”

Frank was a Dulwich College boy, and one of that generation lost to the slaughter of the Great War. A spy and an inventor — James Bond and Q in one — he applied his inherited chemical and pyrotechnical genius to the war effort — by making a chemical weapon. It wasn’t any good, though: Jellite, developed during the summer of 1915 and named after its jelly-like consistency during manufacture, proved insufficiently lethal.

On such turns of chance do reputations depend, since we remember Frank Brock for his many less problematic inventions. Dover flares burned for seven and a half minutes and lit up an area of three miles’ radius, as Winston Churchill put it, “as bright as Piccadilly”. U-boats, diving to avoid these lights, encountered mines. Frank’s artificial fogs, hardly bettered since, concealed whole British fleets, entire Allied battle lines.

Then there are his incendiary bullets.

At the time of the Great War a decent Zeppelin could climb to 20,000 feet, travel at 47 mph for more than 1,000 miles, and stay aloft for 36 hours. Smee and Macrory are well within their rights to call them “the stealth bombers of their time”.

Brock’s bullets tore them out of the sky. Sir William Pope, Brock’s adviser and a professor of chemistry at Cambridge, explained: “You need to imagine a bullet proceeding at several thousand feet a second, and firing as it passes through a piece of fabric which is no thicker than a pocket handkerchief.” All to rupture a gigantic sac of hydrogen sufficiently to make the gas explode. (Much less easy than you might think; the Hindenburg only crashed because its entire outer envelope was set on fire.)

Frank died in an assault on the mole at Zeebrugge in 1918. He shouldn’t have been there. He should have been in a lab somewhere, cooking up another bullet, another light, another poison gas. Today, he surely would be suitably contained, his efforts efficiently channeled, his spirit carefully and surgically broken.

Frank lived at a time when it was possible — and men, at any rate, were encouraged — to be more than one thing. That this heroic idea overreached itself — that rugby field and school chemistry lab both dissolved seamlessly into the Somme — needs no rehearsing.

Still, we have lost something. When Frank went to school there was a bookstall near the station which sold “a magazine called Pluck, containing ‘the daring deeds of plucky sailors, plucky soldiers, plucky firemen, plucky explorers, plucky detectives, plucky railwaymen, plucky boys and plucky girls and all sorts of conditions of British heroes’.”

Frank was a boy moulded thus, and sneer as much as you want, we will not see his like again.