Nothing happens without a reason

Reading Journey to the Edge of Reason: The Life of Kurt Gödel by Stephen Budiansky for the Spectator, 29 May 2021

The 20th-century Austrian mathematician Kurt Gödel did his level best to live in the world as his philosophical hero Gottfried Wilhelm Leibniz imagined it: a place of pre-established harmony, whose patterns are accessible to reason.

It’s an optimistic world, and a theological one: a universe presided over by a God who does not play dice. It’s most decidedly not a 20th-century world, but “in any case”, as Gödel himself once commented, “there is no reason to trust blindly in the spirit of the time.”

His fellow mathematician Paul Erdős was appalled: “You became a mathematician so that people should study you,” he complained, “not that you should study Leibniz.” But Gödel always did prefer study to self-expression, and this is chiefly why we know so little about him, and why the spectacular deterioration of his final years — a phantasmagoric tale of imagined conspiracies, strange vapours and shadowy intruders, ending in his self-starvation in 1978 — has come to stand for the whole of his life.

“Nothing, Gödel believed, happened without a reason,” says Stephen Budiansky. “It was at once an affirmation of ultrarationalism, and a recipe for utter paranoia.”

You need hindsight to see the paranoia waiting to pounce. But the ultrarationalism — that was always tripping him up. There was something worryingly non-stick about him. He didn’t so much resist the spirit of the time as blunder about totally oblivious of it. He barely noticed the Anschluss, barely escaped Vienna as the Nazis assumed control, and, once ensconced at the Institute for Advanced Study at Princeton, barely credited that tragedy was even possible, or that, say, a friend might die in a concentration camp (it took three letters for his mother to convince him).

Many believed that he’d blundered, in a way typical to him, into marriage with his life-long partner, a foot-care specialist and divorcée called Adele Nimbursky. Perhaps he did. But Budiansky does a spirited job of defending this “uneducated but determined” woman against the sneers of snobs. If anyone kept Gödel rooted to the facts of living, it was Adele. She once stuck a concrete flamingo, painted pink and black, in a flower bed right outside his study window. All evidence suggests he adored it.

Idealistic and dysfunctional, Gödel became, in mathematician Jordan Ellenberg’s phrase, “the romantic’s favourite mathematician”, a reputation cemented by the fact that we knew hardly anything about him. Key personal correspondence was destroyed at his death, while his journals and notebooks — written in Gabelsberger script, a German shorthand that had fallen into disuse by the mid-1920s — resisted all comers until Cheryl Dawson, wife of the man tasked with sorting through Gödel’s mountain of posthumous papers, learned how to transcribe it all.

Biographer Stephen Budiansky is the first to try to give this pile of new information a human shape, and my guess is it hasn’t been easy.

Budiansky handles the mathematics very well, capturing the air of scientific optimism that held sway over intellectual Vienna and induced Germany’s leading mathematician David Hilbert to declare that “in mathematics there is *nothing* unknowable!”

Solving Hilbert’s four “Problems of Laying Foundations for Mathematics” of 1928 was supposed to secure the foundations of mathematics for good, and Gödel, a 22-year-old former physics student, solved one of them. Unfortunately for Hilbert and his disciples, however, Gödel also proved the insolubility of the other three. So much for the idea that all mathematics could be derived from the propositions of logic: Gödel demonstrated that logic itself was flawed.

This discovery didn’t worry Gödel nearly so much as it did his contemporaries. As Budiansky explains, “Mathematical objects and a priori truth was as real to him as anything the senses could directly perceive.” If our reason failed, well, that was no reason to throw away the world: we would always be able to recognise some truths through intuition that could never be established through computation. That, for Gödel, was the whole point of being human.

It’s one thing to be a Platonist in a world dead set against Platonism, or an idealist in a world that’s gone all-in for materialism. It’s quite another to see acts of sabotage in the errors of TV listings magazines, or political conspiracy in the suicide of King Ludwig II of Bavaria. The Elysian calm and concentration afforded Gödel after the second world war at the Institute for Advanced Study probably did him more harm than good. “Gödel is too alone,” his friend Oskar Morgenstern fretted: “he should be given teaching duties; at least an hour a week.”

In the end, though, neither his friendships nor his marriage nor that ridiculous flamingo could tether to the Earth a man who had always preferred to write for his desk drawer, and Budiansky, for all his tremendous efforts and exhaustive interrogations of Gödel’s times and places, acquaintances and offices, can only leave us, at the end, with an immeasurably enriched version of Gödel the wise child. It’s an undeniably distracting and reductive picture. But — and this is the trouble — it’s not wrong.

To hell with the philosopause!

Reading Hawking Hawking: The Selling of a Scientific Celebrity by Charles Seife for the Spectator, 1 May 2021

I could never muster much enthusiasm for the theoretical physicist Stephen Hawking. His work, on the early universe and the nature of spacetime, was Nobel-worthy, but those of us outside his narrow community were horribly short-changed. His 1988 global best-seller A Brief History of Time was incomprehensible, not because it was difficult, but because it was bad.

Nobody, naturally, wanted to ascribe Hawking’s popular success to his rare form of Motor Neurone Disease, Hawking least of all. He afforded us no room for horror or, God forbid, pity. In 1990, asked a dumb question about how his condition might have shaped his work (because people who suffer ruinous, debilitating illnesses acquire compensating superpowers, right?) Hawking played along: “I haven’t had to lecture or teach undergraduates, and I haven’t had to sit on tedious and time-consuming committees. So I have been able to devote myself completely to research.”

The truth — that Hawking was one of the worst popular communicators of his day — is as evident as it is unsayable. A Brief History of Time was incomprehensible because after nearly five years’ superhuman effort, the author proved incapable of composing a whole book unaided. He couldn’t even do mathematics the way most people do it, by doodling, since he’d already lost the use of his hands. He could not jot notes. He could not manipulate equations. He had to turn every problem he encountered into a species of geometry, just to be able to think about it. He held his own in an impossibly rarefied profession for years, but the business of popular communication was beyond him. As was communication, in the end, according to Hawking’s late collaborator Andy Strominger: “You would talk about words per minute, and then it went to minutes per word, and then, you know, it just got slower and slower until it just sort of stopped.”

Hawking became, in the end, a computerised patchwork of hackneyed, pre-stored utterances and responses. Pull the string at his back and marvel. Charles Seife, a biographer braver than most, begins by staring down the puppet. His conceit is to tell Stephen Hawking’s story backwards, peeling back the layers of celebrity and incapacity to reveal the wounded human within.

It’s a tricksy idea that works so well, you wonder why no-one thought of it before (though ordering his material and his arguments in this way must have nearly killed the poor author).

Hawking’s greatest claim to fame is that he discovered things about black holes — still unobserved at that time — that set the two great schools of theoretical physics, quantum mechanics and relativity, at fresh and astonishingly creative loggerheads.

But a new golden era of astronomical observation dawned almost immediately after, and A Brief History was badly outdated before it even hit the shelves. It couldn’t even get the age of the universe right.

It used to be that genius that outlived its moment could reinvent itself. When new-fangled endocrine science threw Ivan Pavlov’s Nobel-winning physiology into doubt, he reinvented himself as a psychologist (and not a bad one at that).

Today’s era of narrow specialism makes such a move almost impossible but, by way of intellectual compensation, there is always philosophy — a perennially popular field more or less wholly abandoned by professional philosophers. Images of the middle-aged scientific genius indulging its philosopause in book after book about science and art, science and God, science and society and so on and so forth, may raise a wry smile, but work of real worth has come out of it.

Alas, even if Hawking had shown the slightest aptitude for philosophy (and he didn’t), he couldn’t possibly have composed it.

In our imaginations, Hawking is the cartoon embodiment of the scientific sage, effectively disembodied and above ordinary mortal concerns. In truth, life denied him a path to sagacity even as it steeped him in the spit and stew of physical being. Hawking’s libido never waned. So to hell with the philosopause! Bring on the dancing girls! Bring on the cheques, from Specsavers, BT, Jaguar, Paddy Power. (Hawking never had enough money: the care he needed was so intensive and difficult that a transatlantic flight could set him back around a quarter of a million pounds). Bring on the billionaires with their fat cheque books (naifs, the lot of them, but decent enough, and generous to a fault). Bring on the countless opportunities to bloviate about subjects he didn’t understand, a sort of Prince Charles, only without Charles’s efforts at warmth.

I find it impossible, having read Seife, not to see Hawking through the lens of Jacobean tragedy, warped and raging, unable even to stick a finger up at a world that could not — but, much worse, *chose* not to — understand him. Of course he was a monster, and years too late, and through a book that will anger many, I have come to love him for it.

Soaked in ink and paint

Reading Dutch Light: Christiaan Huygens and the making of science in Europe
by Hugh Aldersey-Williams for the Spectator, 19 December 2020

This book, soaked, like the Dutch Republic itself, “in ink and paint”, is enchanting to the point of escapism. The author calls it “an interior journey, into a world of luxury and leisure”. It is more than that. What he says of Huygens’s milieu is true also of his book: “Like a ‘Dutch interior’ painting, it turns out to contain everything.”

Hugh Aldersey-Williams says that Huygens was the first modern scientist. This is a delicate argument to make — the word “scientist” didn’t enter the English language before 1834. And he’s right to be sparing with such rhetoric, since a little of it goes a very long way. What inadvertent baggage comes attached, for instance, to the (not unreasonable) claim that the city of Middelburg, supported by the market for spectacles, became “a hotbed of optical innovation” at the end of the 16th century? As I read about the collaboration between Christiaan’s father Constantijn (“with his trim dark beard and sharp features”) and his lens-grinder Cornelis Drebbel (“strapping, ill-read… careless of social hierarchies”) I kept getting flashbacks to the Steve Jobs and Steve Wozniak double-act in Aaron Sorkin’s film.

This is the problem of popular history, made doubly difficult by the demands of explaining the science. Secretly, readers want the past to be either deeply exotic (so they don’t have to worry about it) or fundamentally familiar (so they, um, don’t have to worry about it).

Hugh Aldersey-Williams steeps us in neither fantasy for too long, and Dutch Light is, as a consequence, an oddly disturbing read: we see our present understanding of the world, and many of our current intellectual habits, emerging through the accidents and contingencies of history, through networks and relationships, friendships and fallings-out. Huygens’s world *is* distinctly modern — disturbingly so: the engine itself, the pipework and pistons, without any of the fancy fairings and decals of liberalism.

Trade begets technology begets science. The truth is out there but it costs money. Genius can only swim so far up the stream of social prejudice. Who your parents are matters.

Under Dutch light — clean, caustic, calvinistic — we see, not Enlightenment Europe emerging into the comforts of the modern, but a mirror in which we moderns are seen squatting a culture, full of flaws, that we’ve never managed to better.

One of the best things about Aldersey-Williams’s absorbing book (and how many 500-page biographies do you know that feel too short when you finish them?) is the interest he shows in everyone else. Christiaan arrives in the right place, at the right time, among the right people, to achieve wonders. His father, born in 1596, was a diplomat, architect, poet (he translated John Donne) and artist (he discovered Rembrandt). His longevity exasperated him: “Cease murderous years, and think no more of me” he wrote, on his 82nd birthday. He lived eight years more. But the space and energy Aldersey-Williams devotes to Constantijn and his four other children — “a network that stretched across Europe” — is anything but exasperating. It immeasurably enriches our idea of what Christiaan’s work meant, and what his achievements signified.

Huygens worked at the meeting point of maths and physics, at a time when some key physical aspects of reality still resisted mathematical description. Curves provide a couple of striking examples. The cycloid is the path made by a point on the circumference of a turning wheel. The catenary is the curve made by a chain or rope hanging under gravity. Huygens was the first to explain these curves mathematically, doing more than most to embed mathematics in the physical sciences. He tackled problems in geometry and probability, and had some fun in the process (“A man of 56 years marries a woman of 16 years, how long can they live together without one or the other dying?”) Using telescopes he designed and made himself, he discovered Saturn’s ring system and its largest moon, Titan. He was the first to describe the concept of centrifugal force. He invented the pendulum clock.

Most extraordinary of all, Huygens — though a committed follower of Descartes (who was once a family friend) — came up with a model of light as a wave, wholly consistent with everything then known about the nature of light apart from colour, and streets ahead of the “corpuscular” theory promulgated by Newton, which had light consisting of a stream of tiny particles.

Huygens’s radical conception of light seems even stranger, when you consider that, as much as his conscience would let him, Huygens stayed faithful to Descartes’ vision of physics as a science of bodies in collision. Newton’s work on gravity, relying as it did on an unseen force, felt like a retreat to Huygens — a step towards occultism.

Because we turn our great thinkers into fetishes, we allow only one per generation. Newton has shut out Huygens, as Galileo shut out Kepler. Huygens became an also-ran in Anglo-Saxon eyes; ridiculous busts of Newton, meanwhile, were knocked out to adorn the salons of Britain’s country estates, “available in marble, terracotta and plaster versions to suit all pockets.”

Aldersey-Williams insists that this competition between the elder Huygens and the enfant terrible Newton was never so cheap. Set aside their notorious dispute over the nature of light, and we find the two men in lively and, yes, friendly correspondence. Cooperation and collaboration were on the rise: “Gone,” Aldersey-Williams writes, “is the quickness to feel insulted and take umbrage that characterised so many exchanges — domestic as well as international — in the early days of the French and English academies of science.”

When Henry Oldenburg, the prime mover of the Royal Society, died suddenly in 1677, a link was broken between scientists everywhere, and particularly between Britain and the continent. The 20th century did not forge a culture of international scientific cooperation. It repaired the one Oldenburg and Huygens had built over decades of eager correspondence and clever diplomacy.

What else you got?

Reading Benjamin Labatut’s When We Cease to Understand the World for the Spectator, 14 November 2020

One day someone is going to have to write the definitive study of Wikipedia’s influence on letters. What, after all, are we supposed to make of all these wikinovels? I mean novels that leap from subject to subject, anecdote to anecdote, so that the reader feels as though they are toppling like Alice down a particularly erudite Wikipedia rabbit-hole.

The trouble with writing such a book, in an age of ready internet access, and particularly Wikipedia, is that, however effortless your erudition, no one is any longer going to be particularly impressed by it.

We can all be our own Don DeLillo now; our own W G Sebald. The model for this kind of literary escapade might not even be literary at all; does anyone here remember James Burke’s Connections, a 1978 BBC TV series which took an interdisciplinary approach to the history of science and invention, and demonstrated how various discoveries, scientific achievements, and historical world events were built from one another successively in an interconnected way?

And did anyone notice how I ripped the last 35 words from the show’s Wikipedia entry?

All right, I’m sneering, and I should make clear from the off that When We Cease… is a chilling, gripping, intelligent, deeply humane book. It’s about the limits of human knowledge, and the not-so-very-pleasant premises on which physical reality seems to be built. The author, a Chilean born in Rotterdam in 1980, writes in Spanish. Adrian Nathan West — himself a cracking essayist — fashioned this spiky, pitch-perfect English translation. The book consists, in the main, of four broadly biographical essays. The chemist Fritz Haber finds an industrial means of fixing nitrogen, enabling the revolution in food supply that sustains our world, while also pioneering modern chemical warfare. Karl Schwarzschild imagines the terrible uber-darkness at the heart of a black hole, dies in a toxic first world war and ushers in a thermonuclear second. Alexander Grothendieck is the first of a line of post-war mathematician-paranoiacs convinced they’ve uncovered a universal principle too terrible to discuss in public (and after Oppenheimer, really, who can blame them?). In the longest essay-cum-story, Erwin Schrödinger and Werner Heisenberg slug it out for dominance in a field — quantum physics — increasingly consumed by uncertainty and (as Labatut would have it) dread.

The problem here — if problem it is — is that no connection, in this book of artfully arranged connections, is more than a keypress away from the internet-savvy reader. Wikipedia, twenty years old next year, really has changed our approach to knowledge. There’s nothing aristocratic about erudition now. It is neither a sign of privilege, nor (and this is more disconcerting) is it necessarily a sign of industry. Erudition has become a register, like irony, like sarcasm, like melancholy. It’s become, not the fruit of reading, but a way of perceiving the world.

Literary attempts to harness this great power are sometimes laughable. But this has always been the case for literary innovation. Look at the gothic novel. Fifty-odd years before the peerless masterpiece that is Mary Shelley’s Frankenstein we got Horace Walpole’s The Castle of Otranto, which is jolly silly.

Now, a couple of hundred years after Frankenstein was published, “When We Cease to Understand the World” dutifully repeats the rumours (almost certainly put about by the local tourist industry) that the alchemist Johann Conrad Dippel, born outside Darmstadt in the original Burg Frankenstein in 1673, wielded an uncanny literary influence over our Mary. This is one of several dozen anecdotes which Labatut marshals to drive home the message that There Are Things In This World That We Are Not Supposed to Know. It’s artfully done, and chilling in its conviction. Modish, too, in the way it interlaces fact and fiction.

It’s also laughable, and for a couple of reasons. First, it seems a bit cheap of Labatut to treat all science and mathematics as one thing. If you want to build a book around the idea of humanity’s hubris, you can’t just point your finger at “boffins”.

The other problem is Labatut’s mixing of fact and fiction. He’s not out to cozen us. But here and there this reviewer was disconcerted enough to check his facts — and where else but on Wikipedia? I’m not saying Labatut used Wikipedia. (His bibliography lists a handful of third-tier sources including, I was amused to see, W G Sebald.) Nor am I saying that using Wikipedia is a bad thing.

I think, though, that we’re going to have to abandon our reflexive admiration for erudition. It’s always been desperately easy to fake. (John Fowles.) And today, thanks in large part to Wikipedia, it’s not beyond the wit of most of us to actually *acquire*.

All right, Benjamin, you’re erudite. We get it. What else you got?

An intellectual variant of whack-a-mole

Reading Joseph Mazur’s The Clock Mirage for The Spectator, 27 June 2020 

Some books elucidate their subject, mapping and sharpening its boundaries. The Clock Mirage, by the mathematician Joseph Mazur, is not one of them. Mazur is out to muddy time’s waters, dismantling the easy opposition between clock time and mental time, between physics and philosophy, between science and feeling.

That split made little sense even in 1922, when the philosopher Henri Bergson and the young physicist Albert Einstein (much against his better judgment) went head-to-head at the Société française de philosophie in Paris to discuss the meaning of relativity. (Or that was the idea. Actually they talked at complete cross-purposes.)

Einstein won. At the time, there was more novel insight to be got from physics than from psychological introspection. But time passes, knowledge accrues and fashions change. The inference (not Einstein’s, though people associate it with him) that time is a fourth dimension, commensurable with the three dimensions of space, is looking decidedly frayed. Meanwhile Bergson’s psychology of time has been pruned by neurologists and has put out new shoots.

Our lives and perceptions are governed, to some extent, by circadian rhythms, but there is no internal clock by which we measure time in the abstract. Instead we construct events, and organise their relations, in space. Drivers, thinking they can make up time with speed, acquire tickets faster than they save seconds. Such errors are mathematically obvious, but spring from the irresistible association we make (poor vulnerable animals that we are) between speed and survival.
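A back-of-the-envelope example (mine, not Mazur’s, with made-up numbers) shows how little speed actually buys. Over a ten-mile stretch:

$$
t_{70} = \frac{10\ \text{miles}}{70\ \text{mph}} \approx 8.6\ \text{minutes}, \qquad
t_{85} = \frac{10\ \text{miles}}{85\ \text{mph}} \approx 7.1\ \text{minutes}
$$

Pushing from 70 to 85 mph saves barely a minute and a half, at a considerably higher risk of a ticket.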

The more we understand about non-human minds, the more eccentric and sui generis our own time sense seems to be. Mazur ignores the welter of recent work on other animals’ sense of time — indeed, he winds the clock back several decades in his careless talk of animal ‘instincts’ (no one in animal behaviour uses the ‘I’ word any more). For this, though, I think he can be forgiven. He has put enough on his plate.

Mazur begins by rehearsing how the Earth turns, how clocks were developed, and how the idea of universal clock time came hot on the heels of the railway (mistimed passenger trains kept running into each other). His mind is engaged well enough throughout this long introduction, but around page 47 his heart beats noticeably faster. Mazur’s first love is theory, and he handles it well, using Zeno’s paradoxes to unpack the close relationship between psychology and mathematics.

In Zeno’s famous foot race, by the time fleet-footed Achilles catches up to the place where the plodding tortoise was, the tortoise has moved a little bit ahead. That keeps happening ad infinitum, or at least until Newton (or Leibniz, depending on who you think got to it first) pulls calculus out of his hat. Calculus is an algebraic way of handling (well, fudging) the continuity of the number line. It handles vectors and curves and smooth changes — the sorts of phenomena you can measure only if you’re prepared to stop counting.
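The resolution, in modern dress, is a convergent geometric series. A worked line (mine, not Mazur’s), for a tortoise given a 100-metre head start on an Achilles running ten times as fast:

$$
100 + 10 + 1 + 0.1 + \dots \;=\; \sum_{k=0}^{\infty} \frac{100}{10^{k}} \;=\; \frac{100}{1 - \tfrac{1}{10}} \;=\; 111.\overline{1}\ \text{metres}
$$

Infinitely many stages, but only a finite distance, and a finite time, in which to run them: the limit does quietly what Zeno insisted could not be done at all.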

But what if reality is granular after all, and time is quantised, arriving in discrete packets like the frames of a celluloid film stuttering through the gate of a projector? In this model of time, calculus is redundant and continuity is merely an illusion. Does it solve Zeno’s paradox? Perhaps it makes it 100 times more intractable. Just as motion needs time, time needs motion, and ‘we might wonder what happens to the existence of the world between those falling bits of time sand’.

This is all beautifully done, and Mazur, having hit his stride, maintains form throughout the rest of the book, though I suspect he has bitten off more than any reader could reasonably want to swallow. Rather than containing and spotlighting his subject, Mazur’s questions about time turn out (time and again, I’m tempted to say) to be about something completely different, as though we were playing an intellectual variant of whack-a-mole.

But this, I suppose, is the point. Mazur quotes Henri Poincaré:

Not only have we not direct intuition of the equality of two periods, but we have not even direct intuition of the simultaneity of two events occurring in two different places.

Our perception of time is so fractured, so much an ad hoc amalgam of the chatter of numerous, separately evolved systems (for the perception of motion; for the perception of daylight; for the perception of risk, and on and on — it’s a very long list), it may in the end be easier to abandon talk of time altogether, and for the same reason that psychologists, talking shop among themselves, eschew vague terms such as ‘love’.

So much of what we mean by time, as we perceive it day to day, is really rhythm. So much of what physicists mean by time is really space. Time exists, as love exists, as a myth: real because contingent, real because constructed, a catch-all term for phenomena bigger, more numerous and far stranger than we can yet comprehend.

Joy in the detail

Reading Charles Darwin’s Barnacle and David Bowie’s Spider by Stephen Heard for the Spectator, 16 May 2020

Heteropoda davidbowie is a species of huntsman spider. Though rare, it has been found in parts of Malaysia, Singapore, Indonesia and possibly Thailand. (The uncertainty arises because it’s often mistaken for a similar-looking species, the Heteropoda javana.) In 2008 a German collector sent photos of his unusual-looking “pet” to Peter Jäger, an arachnologist at the Senckenberg Research Institute in Frankfurt. Consequently, and in common with most other living finds, David Bowie’s spider was discovered twice: once in the field, and once in the collection.

Bowie’s spider is famous, but not exceptional. Jäger has discovered more than 200 species of spider in the last decade, and names them after politicians, comedians and rock stars to highlight our ecological plight. Other researchers find more pointed ways to further the same cause. In the first month of Donald Trump’s administration, Iranian-Canadian entomologist Vazrick Nazari discovered a moth with a head crowned with large, blond comb-over scales. There’s more to Neopalpa donaldtrumpi than a striking physical resemblance: it lives in a federally protected area around where the border wall with Mexico is supposed to go. Cue headlines.

Species are becoming extinct 100 times faster than they did before modern humans arrived. This makes reading a book about the naming of species a curiously queasy affair. Nor is there much comfort to be had in evolutionary ecologist Stephen Heard’s observation that, having described 1.5 million species, we’ve (at very best) only recorded half of what’s out there. There is, you may recall, that devastating passage in Cormac McCarthy’s western novel Blood Meridian in which Judge Holden meticulously records a Native American artifact in his sketchbook — then destroys it. Given that to discover a species you must, by definition, invade its environment, Holden’s sketch-and-burn habit appears to be a painfully accurate metonym for what the human species is doing to the planet. Since the 1970s (when there used to be twice as many wild animals as there are now) we’ve been discovering and endangering new species in almost the same breath.

Richard Spruce, one of the Victorian era’s great botanical explorers, who spent 15 years exploring the Amazon from the Andes to its mouth, is a star of this short, charming book about how we have named and ordered the living world. No detail of his bravery, resilience and grace under pressure comes close to the eloquence of this passing quotation, however: “Whenever rains, swollen streams, and grumbling Indians combined to overwhelm me with chagrin,” he wrote in his account of his travels, “I found reason to thank heaven which had enabled me to forget for the moment all my troubles in the contemplation of a simple moss.”

Stephen Heard, an evolutionary ecologist based in Canada, explains how extraordinary amounts of curiosity have been codified to create a map of the living world. The legalistic-sounding codes by which species are named are, it turns out, admirably egalitarian, ensuring that the names amateurs give species are just as valid as those of professional scientists.

Formal names are necessary because of the difficulty we have in distinguishing between similar species. Common names run into this difficulty all the time. There are too many of them, so the same species gets different names in different languages. At the same time, there aren’t enough of them, so that, as Heard points out, “Darwin’s finches aren’t finches, African violets aren’t violets, and electric eels aren’t eels”; robins, blackbirds and badgers are entirely different animals in Europe and North America; and virtually every flower has at one time or another been called a daisy.

Also names tend, reasonably enough, to be descriptive. This is fine when you’re distinguishing between, say, five different types of fish. When there are 500 different fish to sort through, however, absurdity beckons. Heard lovingly transcribes the pre-Linnaean species name of the English whiting, formulated around 1738: “Gadus, dorso tripterygio, ore cirrato, longitudine ad latitudinem tripla, pinna ani prima officulorum trigiata”. So there.

It takes nothing away from the genius of Swedish physician Carl Linnaeus, who formulated the naming system we still use today, to say that he came along at the right time. By Linnaeus’s day, it was possible to look things up. Advances in printing and distribution had made reference works possible. Linnaeus’s innovation was to decouple names from descriptions. And this, as Heard reveals in anecdote after anecdote, is where the fun now slips in: the mythopoeic cool of the baboon Papio anubis, the mischievous smarts of the beetle Agra vation, the nerd celebrity of the lemur Avahi cleesi.

Heard’s taxonomy of taxonomies makes for somewhat thin reading; this is less of a book, more like a dozen interesting magazine articles flying in close formation. But its close focus, bringing to life minutiae of both the living world and the practice of science, is welcome.

I once met Michael Land, the neurobiologist who figured out how the lobster’s eye works. He told me that the trouble with big ideas is that they get in the way of the small ones. Heard’s lesson, delivered with such a light touch, is the same. The joy, and much of the accompanying wisdom, lies in the detail.

Goodbye to all that

Reading Technologies of the Human Corpse by John Troyer for the Spectator, 11 April 2020

John Troyer, the director of the Centre for Death and Society at the University of Bath, has moves. You can find his interpretative dances punctuating a number of his lectures, which go by such arresting titles as ‘150 Years of the Human Corpse in American History in Under 15 Minutes with Jaunty Background Music’ and ‘Abusing the Corpse Even More: Understanding Necrophilia Laws in the USA — Now with more Necro! And more Philia!’ (Wisconsin and Ohio are, according to Troyer’s eccentric-looking and always fascinating website, ‘two states that just keep giving and giving when it comes to American necrophilia cases’.)

Troyer’s budding stand-up career has taken a couple of recent knocks. First was the ever more pressing need for him to crack on with his PhD (his dilatoriness was becoming a family joke). Technologies of the Human Corpse is yanked, not without injury, from that career-establishing academic work. Even as he assembled the present volume, however, there came another, far more personal, blow.

Late in July 2017 Troyer’s younger sister Julie was diagnosed with an aggressive brain cancer. Her condition deteriorated far more quickly than anyone expected, and on 29 July 2018 she died. This left Troyer — the engaging young American death scholar sprung from a family of funeral directors — having to square his erudite and cerebral thoughts on death and dead bodies with the fact he’d just kissed his sister goodbye. He interleaves poetical journal entries composed during Julie’s dying and her death, her funeral and her commemoration, between chapters written by a younger, jollier and of course shallower self.

To be brutal, the poems aren’t up to much, and on their own they wouldn’t add a great deal by way of nuance or tragedy. Happily for us, however, and to Troyer’s credit, he has transformed them into a deeply moving 30-page memoir that now serves as the book’s preface. This, then, is Troyer’s monster: a powerful essay about dying and bereavement; a set of poems written off the cuff and under great stress; and seven rather disconnected chapters about what’s befallen the human corpse in the past century or so.

Even as the book was going to print, Troyer explains in a hurried postscript, his father, a retired undertaker, lost consciousness following a cardiac arrest and was very obviously dying:

“And seeing my father suddenly fall into a comatose state so soon after watching my sister die is impossible to fully describe: I understand what is happening, yet I do not want to understand what is happening.”

This deceptively simple statement from Troyer the writer is streets ahead of anything Troyer the postgrad can pull off.

But to the meat of the book. The American civil war saw several thousand corpses embalmed and transported on new-fangled railway routes across the continent. The ability to preserve bodies, and even lend them a lifelike appearance months after death, created a new industry that, in various configurations and under several names, now goes by the daunting neologism of ‘deathcare provision’. In the future, this industry will be seen ‘transforming the mostly funeralisation side of the business into a much broader, human body parts and tissue distribution system’, as technical advances make increasing use of cadavers and processed cadaver parts.

So how much is a dead body worth? Between $30,000 and $50,000, says Troyer — five times as much for donors processed into medical implants, dermal implants and demineralised bone matrices. Funds and materials are exchanged through a network of body brokers who serve as middlemen between biomedical corporations such as Johnson & Johnson and the usual sources of human cadavers — medical schools, funeral homes and mortuaries. It is by no stretch an illegal trade, nor is it morally problematic in most instances; but it is rife with scandal. As one involved party remarks: ‘If you’re cremated, no one is ever going to know if you’re missing your shoulders or knees or your head.’

Troyer is out to show how various industries serve to turn our dead bodies into ‘an unfettered source of capital’. The ‘fluid men’ of Civil War America — who toured the battlefields showing keen students how to embalm a corpse (and almost always badly) — had no idea what a strange story they had started. Today, as the anatomist Gunther von Hagens poses human cadavers in sexual positions to pique and titillate worldwide audiences, we begin to get a measure of how far we have come. Hagens’s posthumous pornography reveals, says Troyer, ‘the ultimate taxonomic power over nature: we humans, or at least our bodies, can live forever because we pull ourselves from nature’.

Technologies of the Human Corpse is a bit of a mess, but I have a lot of time for Troyer. His insights are sound, and his recent travails may yet (and at high human cost — but it was ever thus) make him a writer of some force.

 

Pluck

Reading Gunpowder and Glory: The Explosive Life of Frank Brock OBE by Harry Smee and Henry Macrory for the Spectator, 21 March 2020

Early one morning in October 1874, a barge carrying three barrels of benzoline and five tons of gunpowder blew up in the Regent’s Canal, close to London Zoo. The crew of three were killed outright, scores of houses were badly damaged, the explosion could be heard 25 miles away, and “dead fish rained from the sky in the West End.”

This is a book about the weird, if obvious, intersection between firework manufacture and warfare. It is, ostensibly, the biography of a hero of the First World War, Frank Brock. And if it were the work of more ambitious literary hands, Brock would have been all you got. His heritage, his school adventures, his international career as a showman, his inventions, his war work, his violent death. Enough for a whole book, surely?

But Gunpowder and Glory is not a “literary” work, by which I mean it is neither self-conscious nor overwrought. Instead Henry Macrory (who anyway has already proved his literary chops with his 2018 biography of the swindler Whitaker Wright) has opted for what looks like a very light touch here, assembling and ordering the anecdotes and reflections of Frank Brock’s grandson Harry Smee about his family, their business as pyrotechnical artists, and, finally, about Frank, his illustrious forebear.

I suspect a lot of sweat went into such artlessness, and it’s paid off, creating a book that reads like fascinating dinner conversation. Reading its best passages, I felt I was discovering Brock the way Harry had as a child, looking into his mother’s “ancient oak chests filled with papers, medals, newspapers, books, photographs, an Intelligence-issue knuckleduster and pieces of Zeppelin and Zeppelin bomb shrapnel.”

For eight generations, the Brock family produced pyrotechnic spectaculars of a unique kind. Typical set piece displays in the eighteenth century included “Jupiter discharging lightning and thunder, Two gladiators combating with fire and sword, and Neptune finely carv’d seated in his chair, drawn by two sea horses on fire-wheels, spearing a dolphin.”

Come the twentieth century, Brock’s shows were a signature of Empire. It would take a writer like Thomas Pynchon to do full justice to “a sixty foot-high mechanical depiction of the Victorian music-hall performer, Lottie Collins, singing the chorus of her famous song ‘Ta-ra-ra-boom-de-ay’ and giving a spirited kick of an automated leg each time the word ‘boom’ rang out.”

Frank was a Dulwich College boy, and one of that generation lost to the slaughter of the Great War. A spy and an inventor — James Bond and Q in one — he applied his inherited chemical and pyrotechnical genius to the war effort — by making a chemical weapon. It wasn’t any good, though: Jellite, developed during the summer of 1915 and named after its jelly-like consistency during manufacture, proved insufficiently lethal.

On such turns of chance do reputations depend, since we remember Frank Brock for his many less problematic inventions. Dover flares burned for seven and a half minutes and lit up an area of three miles’ radius, as Winston Churchill put it, “as bright as Piccadilly”. U-boats, diving to avoid these lights, encountered mines. Frank’s artificial fogs, hardly bettered since, concealed whole British fleets, entire Allied battle lines.

Then there are his incendiary bullets.

At the time of the Great War a decent Zeppelin could climb to 20,000 feet, travel at 47 mph for more than 1,000 miles, and stay aloft for 36 hours. Smee and Macrory are well within their rights to call them “the stealth bombers of their time”.

Brock’s bullets tore them out of the sky. Sir William Pope, Brock’s advisor, and a professor of chemistry at Cambridge University, explained: “You need to imagine a bullet proceeding at several thousand feet a second, and firing as it passes through a piece of fabric which is no thicker than a pocket handkerchief.” All to rupture a gigantic sac of hydrogen sufficiently to make the gas explode. (Much less easy than you think; the Hindenburg only crashed because its entire outer envelope was set on fire.)

Frank died in an assault on the mole at Zeebrugge in 1918. He shouldn’t have been there. He should have been in a lab somewhere, cooking up another bullet, another light, another poison gas. Today, he surely would be suitably contained, his efforts efficiently channeled, his spirit carefully and surgically broken.

Frank lived at a time when it was possible — and men, at any rate, were encouraged — to be more than one thing. That this heroic idea overreached itself — that rugby field and school chemistry lab both dissolved seamlessly into the Somme — needs no rehearsing.

Still, we have lost something. When Frank went to school there was a bookstall near the station which sold “a magazine called Pluck, containing ‘the daring deeds of plucky sailors, plucky soldiers, plucky firemen, plucky explorers, plucky detectives, plucky railwaymen, plucky boys and plucky girls and all sorts of conditions of British heroes’.”

Frank was a boy moulded thus, and sneer as much as you want, we will not see his like again.

 

“So that’s how the negroes of Georgia live!”

Visiting W.E.B. Du Bois: Charting Black Lives, at the House of Illustration, London, for the Spectator, 25 January 2020

William Edward Burghardt Du Bois was born in Massachusetts in 1868, three years after the official end of slavery in the United States. He grew up among a small, tenacious business- and property-owning black middle class who had their own newspapers, their own schools and universities, their own elected officials.

After graduating with a PhD in history from Harvard University, Du Bois embarked on a sprawling study of African Americans living in Philadelphia. At the historically black Atlanta University in 1897, he established international credentials as a pioneer of the newfangled science of sociology. His students were decades ahead of their counterparts in the Chicago school.

In the spring of 1899, Du Bois’s son Burghardt died, succumbing to sewage pollution in the Atlanta water supply. ‘The child’s death tore our lives in two,’ Du Bois later wrote. His response: ‘I threw myself more completely into my work.’

A former pupil, the black lawyer Thomas Junius Calloway, thought that Du Bois was just the man to help him mount an exhibition to demonstrate the progress that had been made by African Americans. Funded by Congress and planned for the Paris Exposition of 1900, the project employed around a dozen clerks, students and former students to assemble and run ‘the great machinery of a special census’.

Two studies emerged. ‘The Georgia Negro’, comprising 32 handmade graphs and charts, captured a living community in numbers: how many black children were enrolled in public schools, how far family budgets extended, what people did for work, even the value of people’s kitchen furniture.

The other, a set of about 30 statistical graphics, was made by students at Atlanta University and considered the African American population of the whole of the United States. Du Bois was struck by the fact that the illiteracy of African Americans was ‘less than that of Russia, and only equal to that of Hungary’. A chart called ‘Conjugal Condition’ suggests that black Americans were more likely to be married than Germans.

The Exposition Universelle of 1900 brought all the world to the banks of the Seine. Assorted Africans, shipped over for the occasion, found themselves in model native villages performing bemused and largely made-up rituals for the visitors. (Some were given a truly lousy time by their bosses; others lived for the nightlife.) Meanwhile, in a theatre made of plaster and drapes, the Japanese geisha Sada Yacco, wise to this crowd from her recent US tour, staged a theatrical suicide for herself every couple of hours.

The expo also afforded visitors more serious windows on the world. Du Bois scraped together enough money to travel steerage to Paris to oversee his exhibition’s installation at the Palace of Social Economy.

He wasn’t overly impressed by the competition. ‘There is little here of the “science of society”,’ he remarked, and the organisers of the Exposition may well have agreed with him: they awarded him a gold medal for what Du Bois called, with justifiable pride, ‘an honest, straightforward exhibit of a small nation of people, picturing their life and development without apology or gloss, and above all made by themselves’.

At the House of Illustration in London you too can now follow the lines, bars and spirals that reveal how black wealth, literacy and land ownership expanded over the four decades since emancipation.

His exhibition also included what he called ‘the usual paraphernalia for catching the eye — photographs, models, industrial work, and pictures’, so why did Du Bois include so many charts, maps and diagrams?

The point about data is that it looks impersonal. It is a way of separating your argument from what people think of you, and this makes it a powerful weapon in the hands of those who find themselves mistrusted in politics and wider society. Du Bois and his community, let’s not forget, were besieged — by economic hardship, and especially by the Jim Crow laws that would outlive him by two years (he died in 1963).

Du Bois pioneered sociology, not statistics. Means of visualising data had entered academia more than a century before, through the biographical experiments of Joseph Priestley. His timeline charts of people’s lives and relative lifespans had proved popular, inspiring William Playfair’s invention of the bar chart. Playfair, an engineer and political economist, published his Commercial and Political Atlas in London in 1786. It was the first major work to contain statistical graphs. More to the point, it was the first time anyone had tried to visualise an entire nation’s economy.

Statistics and their graphic representation were quickly established as an essential, if specialised, component of modern government. There was no going back. Metrics are a self-fertilising phenomenon. Arguments over figures, and over the meaning of figures, can only generate more figures. The French civil engineer Charles Joseph Minard used charts in the 1840s to work out how to monetise freight on the newfangled railroads, then, in retirement, and for a hobby, used two colours and six dimensions of data to visualise Napoleon’s invasion and retreat during the Russian campaign of 1812.

And where society leads, science follows. John Snow founded modern epidemiology when his annotated map revealed the source of an outbreak of cholera in London’s Soho. English nurse Florence Nightingale used information graphics to persuade Queen Victoria to improve conditions in military hospitals.

Rightly, we care about how accurate or misleading infographics can be. But let’s not forget that they should be beautiful. The whole point of an infographic is, after all, to capture attention. Last year, the House of Illustration ran a tremendous exhibition of the work of Marie Neurath who, with her husband Otto, dreamt up a way of communicating, without language, by means of a system of universal symbols. ‘Words divide, pictures unite’ was the slogan over the door of their Viennese design institute. The couple’s aspirations were as high-minded as their output was charming. The Neurath stamp can be detected, not just in kids’ picture books, but across our entire designscape.

Infographics are prompts to the imagination. (One imagines at least some of the 50 million visitors to the Paris Expo remarking to each other, ‘So that’s how the negroes of Georgia live!’) They’re full of facts, but do they convey them more effectively than language? I doubt it. Where infographics excel is in eliciting curiosity and wonder. They can, indeed, be downright playful, as when Fritz Kahn, in the 1920s, used fast trains, street traffic, dancing couples and factory floors to describe, by visual analogy, the workings of the human body.

Du Bois’s infographics aren’t rivals to Kahn or the Neuraths. Rendered in ink, gouache, watercolour and pencil, they’re closer in spirit to the hand-drawn productions of Minard and Snow. They’re the meticulous, oh-so-objective statements of a proud, decent, politically besieged people. They are eloquent in their plainness, as much as in their ingenuity, and, given a little time and patience, they prove to be quite unbearably moving.

Cutting up the sky

Reading A Scheme of Heaven: Astrology and the Birth of Science by Alexander Boxer
for the Spectator, 18 January 2020

Look up at the sky on a clear night. This is not an astrological game. (Indeed, the experiment’s more impressive if you don’t know one zodiacal pattern from another, and rely solely on your wits.) In a matter of seconds, you will find patterns among the stars.

We can pretty much apprehend up to five objects (pennies, points of light, what-have-you) at a single glance. Totting up more than five objects, however, takes work. It means looking for groups, lines, patterns, symmetries, boundaries.

The ancients cut up the sky into figures, all those aeons ago, for the same reason we each cut up the sky within moments of gazing at it: because if we didn’t, we wouldn’t be able to comprehend the sky at all.

Our pattern-finding ability can get out of hand. During his Nobel lecture in 1973 the zoologist Konrad Lorenz recalled how he once “… mistook a mill for a sternwheel steamer. A vessel was anchored on the banks of the Danube near Budapest. It had a little smoking funnel and at its stern an enormous slowly-turning paddle-wheel.”

Some false patterns persist. Some even flourish. And the brighter and more intellectually ambitious you are, the likelier you are to be suckered. John Dee, Queen Elizabeth’s court philosopher, owned the country’s largest library (it dwarfed any you would find at Oxford or Cambridge). His attempt to tie up all that knowledge in a single divine system drove him into the arms of angels — or at any rate, into the arms of the “scrier” Edward Kelley, whose prodigious output of symbolic tables of course could be read in such a way as to reveal fragments of esoteric wisdom.

This, I suspect, is what most of us think about astrology: that it was a fanciful misconception about the world that flourished in times of widespread superstition and ignorance, and did not, could not, survive advances in mathematics and science.

Alexander Boxer is out to show how wrong that picture is, and A Scheme of Heaven will make you fall in love with astrology, even as it extinguishes any niggling suspicion that it might actually work.

Boxer, a physicist and historian, kindles our admiration for the earliest astronomers. My favourite among his many jaw-dropping stories is the discovery of the precession of the equinoxes. This is the process by which the sun, each mid-spring and mid-autumn, rises at a fractionally different spot in the sky each year. It takes 26,000 years to make a full revolution of the zodiac — a tiny motion first detected by Hipparchus around 130 BC. And of course Hipparchus, to make this observation at all, “had to rely on the accuracy of stargazers who would have seemed ancient even to him.”
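Some rough arithmetic (mine, not Boxer’s) shows just how tiny a motion this is:

$$
\frac{360^\circ}{26{,}000\ \text{years}} \approx 0.014^\circ \text{ per year} \approx 50\ \text{arcseconds per year}
$$

That is roughly a degree and a half per century, far too small to notice within a single lifetime, which is why those older records mattered.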

In short, he had a library card. And we know that such libraries existed because the “astronomical diaries” from the Assyrian library at Nineveh stretch from 652 BC to 61 BC, representing possibly the longest continuous research program ever undertaken in human history.

Which makes astrology not too shoddy, in my humble estimation. Boxer goes much further, dubbing it “the ancient world’s most ambitious applied mathematics problem.”

For as long as lives depend on the growth cycles of plants, the stars will, in a very general sense, dictate the destiny of our species. How far can we push this idea before it tips into absurdity? The answer is not immediately obvious, since pretty much any scheme we dream up will fit some conjunction or arrangement of the skies.

As civilisations become richer and more various, the number and variety of historical events increases, as does the chance that some event will coincide with some planetary conjunction. Around the year 1400, the French Catholic cardinal Pierre D’Ailly concluded his astrological history of the world with a warning that the Antichrist could be expected to arrive in the year 1789, which of course turned out to be the year of the French revolution.

But with every spooky correlation comes an even larger horde of absurdities and fatuities. Today, using a machine-learning algorithm, Boxer shows that “it’s possible to devise a model that perfectly mimics Bitcoin’s price history and that takes, as its input data, nothing more than the zodiac signs of the planets on any given day.”
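It’s easy to see why. Here is a minimal sketch (mine, not Boxer’s model) of the general trick: give an ordinary least-squares fit more astrology-flavoured parameters than there are days of data, and it will ‘explain’ even a structureless random price history almost exactly.

```python
# A toy demonstration (not Boxer's actual model): with more free parameters than
# data points, a linear fit will mimic almost any series from meaningless inputs.
import numpy as np

rng = np.random.default_rng(0)
n_days = 365

# Stand-in "price history": a random walk with no structure at all.
price = np.cumsum(rng.normal(size=n_days))

# Fake "zodiac" features: for each of 7 classical planets, a one-hot encoding of
# which of the 12 signs it occupies each day (assigned at random here).
blocks = [np.eye(12)[rng.integers(0, 12, size=n_days)] for _ in range(7)]
X = np.hstack(blocks)                          # shape (365, 84)

# Interact each indicator with low-order powers of time so the parameter count
# (84 * 5 = 420) exceeds the number of days (365).
t = (np.arange(n_days) / n_days)[:, None]
X = np.hstack([X * t**k for k in range(5)])    # shape (365, 420)

# Least squares now has enough freedom to track the series almost exactly.
coef, *_ = np.linalg.lstsq(X, price, rcond=None)
fit = X @ coef
print("max |fit - price|:", np.max(np.abs(fit - price)))  # typically vanishingly small
```

The fit looks perfect and is entirely worthless out of sample, which is precisely Boxer’s point: correlation is cheap, and machine learning makes it cheaper still.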

The Polish science fiction writer Stanislaw Lem explored this territory in his novel The Chain of Chance: “We now live in such a dense world of random chance,” he wrote in 1975, “in a molecular and chaotic gas whose ‘improbabilities’ are amazing only to the individual human atoms.” And this, I suppose, is why astrology eventually abandoned the business of describing whole cultures and nations (a task now handed over to economics, another largely ineffectual big-number narrative) and now, in its twilight, serves merely to gull individuals.

Astrology, to work at all, must assume that human affairs are predestined. It cannot, in the long run, survive the notion of free will. Christianity did for astrology, not because it defeated a superstition, but because it rendered moot astrology’s iron bonds of logic.

“Today,” writes Boxer, “there’s no need to root and rummage for incidental correlations. Modern machine-learning algorithms are correlation monsters. They can make pretty much any signal correlate with any other.”

We are bewitched by big data, and imagine it is something new. We are ever-indulgent towards economists who cannot even spot a global crash. We credulously conform to every algorithmically justified norm. Are we as credulous, then, as those who once took astrological advice as seriously as a medical diagnosis? Oh, for sure.

At least our forebears could say they were having to feel their way in the dark. The statistical tools you need to sort real correlations from pretty patterns weren’t developed until the late nineteenth century. What’s our excuse?

“Those of us who are enthusiastic about the promise of numerical data to unlock the secrets of ourselves and our world,” Boxer writes, “would do well simply to acknowledge that others have come this way before.”