A surprisingly narrow piano

Reading Richard Mainwaring’s Everybody Hertz for the Spectator, 30 April 2022.

Imagine that all the frequencies nature affords were laid out on an extended piano keyboard. Never mind that some waves are mechanical, propagated through air or some other fluid, and other waves are electromagnetic, and can pass through a vacuum. Lay them down all together, and what do you get?

The startling answer is: a surprisingly narrow piano. To play X-rays (whose waves cycle up to 30,000,000,000,000,000,000 times per second), our pianist would have to travel a mere nine metres to the right of middle C. Wandering nine and a half metres in the other direction, our pianist would then be able to sound the super-bass note generated by shockwaves rippling through the hot gas around a supermassive black hole in the Perseus cluster — a wave that cycles just once every 18.5 million years.
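
The arithmetic behind those distances is easy to check. Here is a minimal sketch in Python, assuming middle C at about 262 Hz and an octave occupying roughly 16.5 cm of keyboard (seven white keys); both are stock values, not figures taken from the book:

```python
import math

MIDDLE_C_HZ = 261.63    # assumed frequency of middle C
OCTAVE_WIDTH_M = 0.165  # assumed keyboard width of one octave (seven white keys)

def metres_from_middle_c(freq_hz: float) -> float:
    """Keyboard distance from middle C to the key that would sound freq_hz."""
    octaves = math.log2(freq_hz / MIDDLE_C_HZ)  # one octave per doubling of frequency
    return octaves * OCTAVE_WIDTH_M

# Hard X-rays: roughly 3 x 10^19 cycles per second
print(metres_from_middle_c(3e19))          # ~9.3 m to the right of middle C

# The Perseus-cluster "note": one cycle every 18.5 million years
perseus_hz = 1 / (18.5e6 * 365.25 * 24 * 3600)
print(metres_from_middle_c(perseus_hz))    # ~-9.4 m, i.e. about nine and a half metres to the left
```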

Closer to home, how big do you think that piano would have to be for it to play every note distinguishable by the human ear? You’d have to add barely a single octave to either side of a regular concert grand.

Readers of Richard Mainwaring’s wonderfully titled book will fall into two camps. Some will want to hear what this “infinite piano” conceit reveals about the natural world; about the (considerable) auditory abilities of spiders, say, or how 23 high-stepping fitness junkies caused a tremor that forced the evacuation of the 39-storey Techno Mart building in Seoul, South Korea.

Other readers, though entertained well enough by Mainwaring’s extraordinarily clear and concise science writing, won’t be able to get that infinite piano out of their heads. It’s a metaphor so engaging, so intuitive, it’s quite as exciting as anything else in the book (for all that the book features ghosts, whales, Neolithic chambered cairns and Nikola Tesla).

Mainwaring is a musician and a composer, and the business of music runs under even his most abstruse intellectual excursions. A Marsquake recorded on 6 April 2019, sped up by a factor of 60, sounds, he tells us, “not unlike someone blowing over the top of a half-full wine bottle in Westminster Abbey”. Fully concentrating on a task generates brainwaves of around 40 Hz or more: “it’s a wonder we can’t hear them humming, as they are at the same frequency as the opening bass note of Cypress Hill’s ‘Insane in the Brain’.”

This is infotainment at its most charming and lightweight; tonally, it’s of a piece with the musical stunts (for example, arranging a performance by massed tuning-forks) that Mainwaring has regularly staged for BBC1’s pre-watershed magazine programme The One Show. The glimpses Mainwaring gives us into the peculiar, fractured, distraction-filled business of modern music making are quite as fascinating as his tales of planetary resonance and the latest thinking about olfaction. He can also be tremendously catty, as when he pricks the vaingloriousness of virtuoso bass players (“Know your role, bassists – stay out of the way.”)

Like any ebullient teacher, he won’t be everybody’s cup of tea. There’s always one misery-guts at the back of the class whose teeth will be set on edge, and now and again Mainwaring’s humour is a little forced. This is usually because he’s hit on some neat metaphor and doesn’t know when to stop beating on it. We should set against this, though, his willingness to dive (and deeply, too) into any number of abstruse subjects, from religious experiences to Edwardian vibrators.

Throughout, Mainwaring keeps a sharp eye out for specious claims and pretensions. There is, he says, nothing magical about “the God-given, superhero ability of perfect pitch” — the ability to identify a note from its frequency. Indeed, before 1955, the year the ISO standardised “A” at 440 Hz, there was no such thing as perfect pitch. (Interestingly, though, speakers of Mandarin, a language dependent on tonal inflexion, are rather better at guessing notes than the rest of us.)

On the other hand there is, as Mainwaring ably demonstrates, an extraordinary spiritual power to music, particularly around the note A forty-seven white keys to the left of middle C. (That’s 19 cycles per second, or 19 hertz as we say now, in honour of Heinrich Rudolf Hertz, who proved the existence of electromagnetic waves.) This “A” can trigger cold sweats, fits of severe depression, and even sightings of dead people. Mainwaring traces the use of low notes and infrasound from the more inaccessible tunnels of French caves (where little ochre dots marked where prehistoric singers should stand to sound especially resonant and amplified) to Bach’s Toccata and Fugue in D Minor, which, on any decent organ, generates infrasonic byproducts by means of two chords and a low pedal D.

Though horribly abused and exploited by various New Age fads over the years, the old intuition still holds: vibrations reveal much about life, consciousness and the integrity of matter. Mainwaring’s clear-eyed forays into medicine, psychology and spirituality reflect as much.

It’s a commonplace of popular science that the world is looked at best through this or that funny-shaped window of the author’s choosing. But Mainwaring’s garrulous offering is the real deal.

Stone the Fool and others

Reading Stars and Spies: Intelligence Operations and the Entertainment Business
by Christopher Andrew and Julius Green for the Spectator, 18 December 2021

On 2 October 2020, when he became chief of the UK Secret Intelligence Service (MI6, if you prefer), Richard Moore tweeted (*tweeted!*):

#Bond or #Smiley need not apply. They’re (splendid) fiction but actually we’re #secretlyjustlikeyou.

The gesture’s novelty disguised, at the time, its appalling real-world implications: Bond was, after all, competent; and Smiley had integrity.

Stars and Spies, by veteran intelligence historian Christopher Andrew and theatre director and circus producer Julius Green, is a thoroughly entertaining read, but not at all a reassuring one. “The adoption of a fictional persona, the learning of scripts and the ability to improvise” are central to career progression in both theatre and espionage, the writers explain, “and undercover agents often find themselves engaged in what is effectively an exercise in long-form role play.”

It should, then, come as no surprise that this book boasts “no shortage of enthusiastic but inept entertainer-spies”.

There’s Aphra Behn (“Astrea”, Agent 160), the first woman employed as a secret agent by the British state during the Second Anglo-Dutch War in 1665: reaping no secret intelligence from her former lover, she made stuff up.

As, indeed, did “The Man Called Intrepid”, Sir William Stephenson, subject, in 1976, of the biggest-selling book ever on intelligence history. His recollections, spanning everything from organising wartime resistance in Europe to developing the Spitfire and the jet engine, work on the German Enigma code, and developing nuclear weapons, turned out to be the melancholy fabulations of a man suffering catastrophic memory loss.

The authors imagine that their subject — the intersection between spying and acting — is entertaining enough that they can simply start in the England of Good Queen Bess and Christopher Marlowe (recruited to spy for Walsingham while a student at Cambridge; also wrote a play or two), and end with the ludicrous antics (and — fair’s fair — brilliant acting) of US spy show Homeland.

And, by and large, they’re right. Begin at the beginning; end at the end. Why gild the lily with anything so arduous as an argument, when your anecdotes are this engaging? (Daniel Defoe’s terrifying plans for a surveillance state were scotched because the government’s intelligence budget was being siphoned off to keep Charles II’s mistresses quiet; and why were the British establishment so resistant to the charms of Soviet ballerinas?)

This approach does, however, leave the authors’ sense of proportion open to question. They’re not wrong to point out that “the most theatrical innovations pioneered by Stalinist intelligence were the show trials”, but in the context of so many Corenesque quasi-theatrical anecdotes, this observation can’t help but feel a bit cheap.

Once the parallels between spying and acting have been pointed out, the stories told here (many of them the fruit of fairly arduous primary research) sometimes come across as slightly fatuous. Why should the popular broadcaster Maxwell Knight not be a powerful recruiter of spies during the inter-war years? There’s nothing counter-intuitive here, if you think about the circles Knight must have moved in.

We are on surer ground when the authors measure the sharp contrast between fictional spies and their real-life counterparts. In the movies, honeypots abound, still rehashing the myths attaching to the courageous World War One French spy Mistinguett and the sadly deluded Margaretha Zelle (Mata Hari).

In truth, though, and for the longest while, women in this business have been more middle management than cat-suited loot. Recruited largely from Oxford’s women’s colleges and Cheltenham Ladies’ College, women played a more important part in the Security Service than in any other wartime government department, and for years, we are told, the service has been recruiting more women at officer and executive level than any other branch of government.

As for seduction and pillow-talk, even a fleeting acquaintance with men in their natural environment will tell us that, as Maxwell Knight put it, “Nothing is easier than for a woman to gain a man’s confidence by the showing and expression of a little sympathy… I am convinced,” he went on, “that more information has been obtained by women agents by keeping out of the arms of a man, than was ever obtained by willingly sinking into them.”

Fuelled by Erskine Childers’s peerless spy novel The Riddle of the Sands (1903), by Somerset Maugham’s Ashenden stories and by everything Fleming ever wrote, of course the audience for espionage drama hankers for real-life insight from writers “in the know”. And if the writer complains that the whole espionage industry is a thing of smoke and mirrors, well, we’ll find that fascinating too. (In Ben Jonson’s spy farce Volpone, Sir Pol, on being told of the death of Stone the Fool, claims that Stone actually ran a sophisticated spy ring which communicated by means of dead drops hidden in fruit and vegetables. Eat your heart out, Le Carré.)

Andrew and Green, who both at different times studied history at Corpus Christi, Christopher Marlowe’s old college, are not really giving us the inside track. I would go so far as to say that they are not really telling us anything new. But they marshal their rare facts splendidly, and use them to spin ripping yarns.

A cherry is a cherry is a cherry

Life is Simple: How Occam’s Razor Sets Science Free and Shapes the Universe
by Johnjoe McFadden, reviewed for the Spectator, 28 August 2021

Astonishing, where an idea can lead you. You start with something that, 800 years hence, will sound like it’s being taught at kindergarten: Fathers are fathers, not because they are filled with some “essence of fatherhood”, but because they have children.

Fast forward a few years, and the Pope is trying to have you killed.

Not only have you run roughshod over his beloved eucharist (justified, till then, by some very dodgy Aristotelian logic-chopping); you’re also saying there’s no “essence of kinghood”, neither. If kings are only kings because they have subjects, then, said William of Occam, “power should not be entrusted to anyone without the consent of all”. Heady stuff for 1334.

How this progression of thought birthed the very idea of modern science is the subject of what may be the most sheerly enjoyable history of science of recent years.

William was born around 1288 in the little town of Ockham in Surrey. He was probably an orphan; at any rate he was given to the Franciscan order around the age of eleven. He shone at Greyfriars in London, and around 1310 was dispatched to Oxford’s newfangled university.

All manner of intellectual, theological and political shenanigans followed, mostly to do with William’s efforts to demolish almost the entire edifice of medieval philosophy.

It needed demolishing, and that’s because it still held to Aristotle’s ideas about what an object is. Aristotle wondered how single objects and multiples can co-exist. His solution: categorise everything. A cherry is a cherry is a cherry, and all cherries have cherryness in common. Cherryness, the kind all cherries share, is a “universal”; the properties that might distinguish one cherry from another are “accidental”.

The trouble with Aristotle’s universals, though, is that they assume a one-to-one correspondence between word and thing, and posit a universe made up of a terrifying number of unique things — at least one for each noun or verb in the language.

And the problem with that is that it’s an engine for making mistakes.

Medieval philosophy relied largely on syllogistic reasoning, juggling things into logical-looking relations. “Socrates is a man, all men are mortal, so Socrates is mortal.”

So he is, but — and this is crucial — this conclusion is arrived at more by luck than good judgement. The statement isn’t “true” in any sense; it’s merely internally consistent.

Imagine we make a mistake. Imagine we spring from a society where beards are pretty much de rigueur (classical Athens, say, or Farringdon Road). Imagine we said, “Socrates is a man, all men have beards, therefore Socrates has a beard”?

Though one of its premises is wrong, the statement barrels ahead regardless; it’s internally consistent, and so, if you’re not paying attention, it creates the appearance of truth.

But there’s worse: the argument that gives Socrates a beard might actually be true. Some men do have beards. Socrates may be one of them. And if he is, that beard seems — again, if you’re not paying attention — to confirm a false assertion.

William of Occam understood that our relationship with the world is a lot looser, cloudier, and more indeterminate than syllogistic logic allows. That’s why, when a tavern owner hangs a barrel hoop outside his house, passing travellers know they can stop there for a drink. The moment words are decoupled from things, then they act as signs, negotiating flexibly with a world of blooming, buzzing confusion.

Once we take this idea to heart, then very quickly — and as a matter of taste more than anything — we discover how much more powerful straightforward explanations are than complicated ones. Occam came up with a number of versions of what even then was not an entirely new idea: “It is futile to do with more what can be done with less,” he once remarked. Subsequent formulations do little but gild this lily.

His idea proved so powerful that, three centuries later, the theologian Libert Froidmont coined the term “Occam’s razor” to describe how we arrive at good explanations by shaving away excess complexity. As McFadden shows, that razor’s still doing useful work.

Life is Simple is primarily a history of science, tracing William’s dangerous idea through astronomy, cosmology, physics and biology, from Copernicus to Brahe, Kepler to Newton, Darwin to Mendel, Einstein to Noether to Weyl. But McFadden never loses sight of William’s staggering, in some ways deplorable influence over the human psyche as a whole. For if words are independent of things, how do we know what’s true?

Thanks to William of Occam, we don’t. The universe, after Occam, is unknowable. Yes, we can come up with explanations of things, and test them against observation and experience; but from here on in, our only test of truth will be utility. Ptolemy’s 2nd-century Almagest, a truly florid description of the motions of the stars and planetary paths, is not and never will be *wrong*; the worst we can say is that it’s overcomplicated.

In the Coen brothers’ movie The Big Lebowski, an exasperated Dude turns on his friend: “You’re not *wrong*, Walter,” he cries, “you’re just an asshole.” William of Occam is our universal Walter, and the first prophet of our disenchantment. He’s the friend we wish we’d never listened to, when he told us Father Christmas was not real.

Nothing happens without a reason

Reading Journey to the Edge of Reason: The Life of Kurt Gödel by Stephen Budiansky for the Spectator, 29 May 2021

The 20th-century Austrian mathematician Kurt Gödel did his level best to live in the world as his philosophical hero Gottfried Wilhelm Leibniz imagined it: a place of pre-established harmony, whose patterns are accessible to reason.

It’s an optimistic world, and a theological one: a universe presided over by a God who does not play dice. It’s most decidedly not a 20th-century world, but “in any case”, as Gödel himself once commented, “there is no reason to trust blindly in the spirit of the time.”

His fellow mathematician Paul Erdős was appalled: “You became a mathematician so that people should study you,” he complained, “not that you should study Leibniz.” But Gödel always did prefer study to self-expression, and this is chiefly why we know so little about him, and why the spectacular deterioration of his final years — a phantasmagoric tale of imagined conspiracies, strange vapours and shadowy intruders, ending in his self-starvation in 1978 — has come to stand for the whole of his life.

“Nothing, Gödel believed, happened without a reason,” says Stephen Budiansky. “It was at once an affirmation of ultrarationalism, and a recipe for utter paranoia.”

You need hindsight to see the paranoia waiting to pounce. But the ultrarationalism — that was always tripping him up. There was something worryingly non-stick about him. He didn’t so much resist the spirit of the time as blunder about totally oblivious of it. He barely noticed the Anschluss, barely escaped Vienna as the Nazis assumed control, and, once ensconced at the Institute for Advanced Study at Princeton, barely credited that tragedy was even possible, or that, say, a friend might die in a concentration camp (it took three letters for his mother to convince him).

Many believed that he’d blundered, in a way typical to him, into marriage with his life-long partner, a foot-care specialist and divorcée called Adele Nimbursky. Perhaps he did. But Budiansky does a spirited job of defending this “uneducated but determined” woman against the sneers of snobs. If anyone kept Gödel rooted to the facts of living, it was Adele. She once stuck a concrete flamingo, painted pink and black, in a flower bed right outside his study window. All evidence suggests he adored it.

Idealistic and dysfunctional, Gödel became, in mathematician Jordan Ellenberg’s phrase, “the romantic’s favourite mathematician”, a reputation cemented by the fact that we knew hardly anything about him. Key personal correspondence was destroyed at his death, while his journals and notebooks — written in Gabelsberger script, a German shorthand that had fallen into disuse by the mid-1920s — resisted all comers until Cheryl Dawson, wife of the man tasked with sorting through Gödel’s mountain of posthumous papers, learned how to transcribe it all.

Biographer Stephen Budiansky is the first to try to give this pile of new information a human shape, and my guess is it hasn’t been easy.

Budiansky handles the mathematics very well, capturing the air of scientific optimism that held sway over intellectual Vienna and induced Germany’s leading mathematician David Hilbert to declare that “in mathematics there is *nothing* unknowable!”

Solving Hilbert’s four “Problems of Laying Foundations for Mathematics” of 1928 was supposed to secure the foundations of mathematics for good, and Gödel, a 22-year-old former physics student, solved one of them. Unfortunately for Hilbert and his disciples, however, Gödel also proved the insolubility of the other three. So much for the idea that all mathematics could be derived from the propositions of logic: Gödel demonstrated that logic itself was flawed.

This discovery didn’t worry Gödel nearly so much as it did his contemporaries. For Gödel, as Budiansky explains, “Mathematical objects and a priori truth was as real to him as anything the senses could directly perceive.” If our reason failed, well, that was no reason to throw away the world: we would always be able to recognise some truths through intuition that could never be established through computation. That, for Gödel, was the whole point of being human.

It’s one thing to be a Platonist in a world dead set against Platonism, or an idealist in a world that’s gone all-in with materialism. It’s quite another to see acts of sabotage in the errors of TV listings magazines, or political conspiracy in the suicide of King Ludwig II of Bavaria. The Elysian calm and concentration afforded Gödel after the second world war at the Institute for Advanced Study probably did him more harm than good. “Gödel is too alone,” his friend Oskar Morgenstern fretted: “he should be given teaching duties; at least an hour a week.”

In the end, though, neither his friendships nor his marriage nor that ridiculous flamingo could tether to the Earth a man who had always preferred to write for his desk drawer, and Budiansky, for all his tremendous efforts and exhaustive interrogations of Gödel’s times and places, acquaintances and offices, can only leave us, at the end, with an immeasurably enriched version of Gödel the wise child. It’s an undeniably distracting and reductive picture. But — and this is the trouble — it’s not wrong.

To hell with the philosopause!

Reading Hawking Hawking: The Selling of a Scientific Celebrity by Charles Seife for the Spectator, 1 May 2021

I could never muster much enthusiasm for the theoretical physicist Stephen Hawking. His work, on the early universe and the nature of spacetime, was Nobel-worthy, but those of us outside his narrow community were horribly short-changed. His 1988 global best-seller A Brief History of Time was incomprehensible, not because it was difficult, but because it was bad.

Nobody, naturally, wanted to ascribe Hawking’s popular success to his rare form of Motor Neurone Disease, Hawking least of all. He afforded us no room for horror or, God forbid, pity. In 1990, asked a dumb question about how his condition might have shaped his work (because people who suffer ruinous, debilitating illnesses acquire compensating superpowers, right?) Hawking played along: “I haven’t had to lecture or teach undergraduates, and I haven’t had to sit on tedious and time-consuming committees. So I have been able to devote myself completely to research.”

The truth — that Hawking was one of the worst popular communicators of his day — is as evident as it is unsayable. A Brief History of Time was incomprehensible because, after nearly five years’ superhuman effort, the author proved incapable of composing a whole book unaided. He couldn’t even do mathematics the way most people do it, by doodling, since he’d already lost the use of his hands. He could not jot notes. He could not manipulate equations. He had to turn every problem he encountered into a species of geometry, just to be able to think about it. He held his own in an impossibly rarefied profession for years, but the business of popular communication was beyond him. As, in the end, was communication itself, according to Andy Strominger, who collaborated with Hawking in his final years: “You would talk about words per minute, and then it went to minutes per word, and then, you know, it just got slower and slower until it just sort of stopped.”

Hawking became, in the end, a computerised patchwork of hackneyed, pre-stored utterances and responses. Pull the string at his back and marvel. Charles Seife, a biographer braver than most, begins by staring down the puppet. His conceit is to tell Stephen Hawking’s story backwards, peeling back the layers of celebrity and incapacity to reveal the wounded human within.

It’s a tricksy idea that works so well, you wonder why no-one thought of it before (though ordering his material and his arguments in this way must have nearly killed the poor author).

Hawking’s greatest claim to fame is that he discovered things about black holes — still unobserved at that time — that set the two great schools of theoretical physics, quantum mechanics and relativity, at fresh and astonishingly creative loggerheads.

But a new golden era of astronomical observation dawned almost immediately after, and A Brief History was badly outdated before it even hit the shelves. It couldn’t even get the age of the universe right.

It used to be that genius that outlived its moment could reinvent itself. When new-fangled endocrine science threw Ivan Pavlov’s Nobel-winning physiology into doubt, he reinvented himself as a psychologist (and not a bad one at that).

Today’s era of narrow specialism makes such a move almost impossible but, by way of intellectual compensation, there is always philosophy — a perennially popular field more or less wholly abandoned by professional philosophers. Images of the middle-aged scientific genius indulging its philosopause in book after book about science and art, science and God, science and society and so on and so forth, may raise a wry smile, but work of real worth has come out of it.

Alas, even if Hawking had shown the slightest aptitude for philosophy (and he didn’t), he couldn’t possibly have composed it.

In our imaginations, Hawking is the cartoon embodiment of the scientific sage, effectively disembodied and above ordinary mortal concerns. In truth, life denied him a path to sagacity even as it steeped him in the spit and stew of physical being. Hawking’s libido never waned. So to hell with the philosopause! Bring on the dancing girls! Bring on the cheques, from Specsavers, BT, Jaguar, Paddy Power. (Hawking never had enough money: the care he needed was so intensive and difficult that a transatlantic flight could set him back around a quarter of a million pounds.) Bring on the billionaires with their fat cheque books (naifs, the lot of them, but decent enough, and generous to a fault). Bring on the countless opportunities to bloviate about subjects he didn’t understand, a sort of Prince Charles, only without Charles’s efforts at warmth.

I find it impossible, having read Seife, not to see Hawking through the lens of Jacobean tragedy, warped and raging, unable even to stick a finger up at a world that could not — or, much worse, *chose* not to — understand him. Of course he was a monster, and years too late, and through a book that will anger many, I have come to love him for it.

Soaked in ink and paint

Reading Dutch Light: Christiaan Huygens and the making of science in Europe
by Hugh Aldersey-Williams for the Spectator, 19 December 2020

This book, soaked, like the Dutch Republic itself, “in ink and paint”, is enchanting to the point of escapism. The author calls it “an interior journey, into a world of luxury and leisure”. It is more than that. What he says of Huygens’s milieu is true also of his book: “Like a ‘Dutch interior’ painting, it turns out to contain everything.”

Hugh Aldersey-Williams says that Huygens was the first modern scientist. This is a delicate argument to make — the word “scientist” didn’t enter the English language before 1834. And he’s right to be sparing with such rhetoric, since a little of it goes a very long way. What inadvertent baggage comes attached, for instance, to the (not unreasonable) claim that the city of Middelburg, supported by the market for spectacles, became “a hotbed of optical innovation” at the end of the 16th century? As I read about the collaboration between Christiaan’s father Constantijn (“with his trim dark beard and sharp features”) and his lens-grinder Cornelis Drebbel (“strapping, ill-read… careless of social hierarchies”), I kept getting flashbacks to the Steve Jobs and Steve Wozniak double-act in Aaron Sorkin’s film.

This is the problem of popular history, compounded by the demands of explaining the science. Secretly, readers want the past to be either deeply exotic (so they don’t have to worry about it) or fundamentally familiar (so they, um, don’t have to worry about it).

Hugh Aldersey-Williams steeps us in neither fantasy for too long, and Dutch Light is, as a consequence, an oddly disturbing read: we see our present understanding of the world, and many of our current intellectual habits, emerging through the accidents and contingencies of history, through networks and relationships, friendships and fallings-out. Huygens’s world *is* distinctly modern — disturbingly so: the engine itself, the pipework and pistons, without any of the fancy fairings and decals of liberalism.

Trade begets technology begets science. The truth is out there but it costs money. Genius can only swim so far up the stream of social prejudice. Who your parents are matters.

Under Dutch light — clean, caustic, Calvinistic — we see, not Enlightenment Europe emerging into the comforts of the modern, but a mirror in which we moderns are seen squatting in a culture, full of flaws, that we’ve never managed to better.

One of the best things about Aldersey-Williams’s absorbing book (and how many 500-page biographies do you know that feel too short when you finish them?) is the interest he shows in everyone else. Christiaan arrives in the right place, at the right time, among the right people, to achieve wonders. His father, born in 1596, was a diplomat, architect, poet (he translated John Donne) and artist (he discovered Rembrandt). His longevity exasperated him: “Cease murderous years, and think no more of me,” he wrote, on his 82nd birthday. He lived eight years more. But the space and energy Aldersey-Williams devotes to Constantijn and his four other children — “a network that stretched across Europe” — is anything but exasperating. It immeasurably enriches our idea of what Christiaan’s work meant, and what his achievements signified.

Huygens worked at the meeting point of maths and physics, at a time when some key physical aspects of reality still resisted mathematical description. Curves provide a couple of striking examples. The cycloid is the path made by a point on the circumference of a turning wheel. The catenary is the curve made by a chain or rope hanging under gravity. Huygens was the first to explain these curves mathematically, doing more than most to embed mathematics in the physical sciences. He tackled problems in geometry and probability, and had some fun in the process (“A man of 56 years marries a woman of 16 years, how long can they live together without one or the other dying?”). Using telescopes he designed and made himself, he discovered Saturn’s ring system and its largest moon, Titan. He was the first to describe the concept of centrifugal force. He invented the pendulum clock.
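
For the curious, the two curves mentioned above have compact modern forms; these are the standard textbook expressions, not anything quoted from Aldersey-Williams:

```latex
% Cycloid: the path of a point on the rim of a wheel of radius r
% rolling along the x-axis, in parametric form
x(t) = r\,(t - \sin t), \qquad y(t) = r\,(1 - \cos t)

% Catenary: a uniform chain hanging under gravity, where a is the ratio of
% the horizontal tension at the lowest point to the weight per unit length
y(x) = a \cosh\!\left(\frac{x}{a}\right)
```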

Most extraordinary of all, Huygens — though a committed follower of Descartes (who was once a family friend) — came up with a model of light as a wave, wholly consistent with everything then known about the nature of light apart from colour, and streets ahead of the “corpuscular” theory promulgated by Newton, which had light consisting of a stream of tiny particles.

Huygens’s radical conception of light seems even stranger, when you consider that, as much as his conscience would let him, Huygens stayed faithful to Descartes’ vision of physics as a science of bodies in collision. Newton’s work on gravity, relying as it did on an unseen force, felt like a retreat to Huygens — a step towards occultism.

Because we turn our great thinkers into fetishes, we allow only one per generation. Newton has shut out Huygens, as Galileo shut out Kepler. Huygens became an also-ran in Anglo-Saxon eyes; ridiculous busts of Newton, meanwhile, were knocked out to adorn the salons of Britain’s country estates, “available in marble, terracotta and plaster versions to suit all pockets.”

Aldersey-Williams insists that this competition between the elder Huygens and the enfant terrible Newton was never so cheap. Set aside their notorious dispute over calculus, and we find the two men in lively and, yes, friendly correspondence. Cooperation and collaboration were on the rise: “Gone,” Aldersey-Williams writes, “is the quickness to feel insulted and take umbrage that characterised so many exchanges — domestic as well as international — in the early days of the French and English academies of science.”

When Henry Oldenburg, the primum mobile of the Royal Society, died suddenly in 1677, a link was broken between scientists everywhere, and particularly between Britain and the continent. The 20th century did not forge a culture of international scientific cooperation. It repaired the one Oldenburg and Huygens had built over decades of eager correspondence and clever diplomacy.

What else you got?

Reading Benjamin Labatut’s When We Cease to Understand the World for the Spectator, 14 November 2020

One day someone is going to have to write the definitive study of Wikipedia’s influence on letters. What, after all, are we supposed to make of all these wikinovels? I mean novels that leap from subject to subject, anecdote to anecdote, so that the reader feels as though they are toppling like Alice down a particularly erudite Wikipedia rabbit-hole.

The trouble with writing such a book, in an age of ready internet access, and particularly Wikipedia, is that, however effortless your erudition, no one is any longer going to be particularly impressed by it.

We can all be our own Don DeLillo now; our own W G Sebald. The model for this kind of literary escapade might not even be literary at all; does anyone here remember James Burke’s Connections, a 1978 BBC TV series which took an interdisciplinary approach to the history of science and invention, and demonstrated how various discoveries, scientific achievements, and historical world events were built from one another successively in an interconnected way?

And did anyone notice how I ripped the last 35 words from the show’s Wikipedia entry?

All right, I’m sneering, and I should make clear from the off that When We Cease… is a chilling, gripping, intelligent, deeply humane book. It’s about the limits of human knowledge, and the not-so-very-pleasant premises on which physical reality seems to be built. The author, a Chilean born in Rotterdam in 1980, writes in Spanish. Adrian Nathan West — himself a cracking essayist — fashioned this spiky, pitch-perfect English translation. The book consists, in the main, of four broadly biographical essays. The chemist Fritz Haber finds an industrial means of fixing nitrogen, enabling the revolution in food supply that sustains our world, while also pioneering modern chemical warfare. Karl Schwarzschild imagines the terrible uber-darkness at the heart of a black hole, dies in a toxic first world war and ushers in a thermonuclear second. Alexander Grothendieck is the first of a line of post-war mathematician-paranoiacs convinced they’ve uncovered a universal principle too terrible to discuss in public (and after Oppenheimer, really, who can blame them?). In the longest essay-cum-story, Erwin Schrödinger and Werner Heisenberg slug it out for dominance in a field — quantum physics — increasingly consumed by uncertainty and (as Labatut would have it) dread.

The problem here — if problem it is — is that no connection, in this book of artfully arranged connections, is more than a keypress away from the internet-savvy reader. Wikipedia, twenty years old next year, really has changed our approach to knowledge. There’s nothing aristocratic about erudition now. It is neither a sign of privilege, nor (and this is more disconcerting) is it necessarily a sign of industry. Erudition has become a register, like irony, like sarcasm, like melancholy. It’s become, not the fruit of reading, but a way of perceiving the world.

Literary attempts to harness this great power are sometimes laughable. But this has always been the case for literary innovation. Look at the gothic novel. Fifty-odd years before the peerless masterpiece that is Mary Shelley’s Frankenstein we got Horace Walpole’s The Castle of Otranto, which is jolly silly.

Now, a couple of hundred years after Frankenstein was published, “When We Cease to Understand the World” dutifully repeats the rumours (almost certainly put about by the local tourist industry) that the alchemist Johann Conrad Dippel, born outside Darmstadt in the original Burg Frankenstein in 1673, wielded an uncanny literary influence over our Mary. This is one of several dozen anecdotes which Labatut marshals to drive home the message that There Are Things In This World That We Are Not Supposed to Know. It’s artfully done, and chilling in its conviction. Modish, too, in the way it interlaces fact and fiction.

It’s also laughable, and for a couple of reasons. First, it seems a bit cheap of Labatut to treat all science and mathematics as one thing. If you want to build a book around the idea of humanity’s hubris, you can’t just point your finger at “boffins”.

The other problem is Labatut’s mixing of fact and fiction. He’s not out to cozen us. But here and there this reviewer was disconcerted enough to check his facts — and where else but on Wikipedia? I’m not saying Labatut used Wikipedia. (His bibliography lists a handful of third-tier sources including, I was amused to see, W G Sebald.) Nor am I saying that using Wikipedia is a bad thing.

I think, though, that we’re going to have to abandon our reflexive admiration for erudition. It’s always been desperately easy to fake. (John Fowles.) And today, thanks in large part to Wikipedia, it’s not beyond the wit of most of us to actually *acquire*.

All right, Benjamin, you’re erudite. We get it. What else you got?

An intellectual variant of whack-a-mole

Reading Joseph Mazur’s The Clock Mirage for The Spectator, 27 June 2020 

Some books elucidate their subject, mapping and sharpening its boundaries. The Clock Mirage, by the mathematician Joseph Mazur, is not one of them. Mazur is out to muddy time’s waters, dismantling the easy opposition between clock time and mental time, between physics and philosophy, between science and feeling.

That split made little sense even in 1922, when the philosopher Henri Bergson and the young physicist Albert Einstein (much against his better judgment) went head-to-head at the Société française de philosophie in Paris to discuss the meaning of relativity. (Or that was the idea. Actually they talked at complete cross-purposes.)

Einstein won. At the time, there was more novel insight to be got from physics than from psychological introspection. But time passes, knowledge accrues and fashions change. The inference (not Einstein’s, though people associate it with him) that time is a fourth dimension, commensurable with the three dimensions of space, is looking decidedly frayed. Meanwhile Bergson’s psychology of time has been pruned by neurologists and has put out new shoots.

Our lives and perceptions are governed, to some extent, by circadian rhythms, but there is no internal clock by which we measure time in the abstract. Instead we construct events, and organise their relations, in space. Drivers, thinking they can make up time with speed, acquire tickets faster than they save seconds. Such errors are mathematically obvious, but spring from the irresistible association we make (poor vulnerable animals that we are) between speed and survival.

The more we understand about non-human minds, the more eccentric and sui generis our own time sense seems to be. Mazur ignores the welter of recent work on other animals’ sense of time — indeed, he winds the clock back several decades in his careless talk of animal ‘instincts’ (no one in animal behaviour uses the ‘I’ word any more). For this, though, I think he can be forgiven. He has put enough on his plate.

Mazur begins by rehearsing how the Earth turns, how clocks were developed, and how the idea of universal clock time came hot on the heels of the railway (mistimed passenger trains kept running into each other). His mind is engaged well enough throughout this long introduction, but around page 47 his heart beats noticeably faster. Mazur’s first love is theory, and he handles it well, using Zeno’s paradoxes to unpack the close relationship between psychology and mathematics.

In Zeno’s famous foot race, by the time fleet-footed Achilles catches up to the place where the plodding tortoise was, the tortoise has moved a little bit ahead. That keeps happening ad infinitum, or at least until Newton (or Leibniz, depending on who you think got to it first) pulls calculus out of his hat. Calculus is an algebraic way of handling (well, fudging) the continuity of the number line. It handles vectors and curves and smooth changes — the sorts of phenomena you can measure only if you’re prepared to stop counting.
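
To see how the infinite regress closes, take some illustrative numbers (mine, not Mazur’s): let Achilles run ten times as fast as the tortoise, which starts ten metres ahead. The gaps Achilles must close form a geometric series with a finite sum:

```latex
10 + 1 + \tfrac{1}{10} + \tfrac{1}{100} + \cdots
  \;=\; \sum_{k=0}^{\infty} 10 \left(\tfrac{1}{10}\right)^{k}
  \;=\; \frac{10}{1 - \tfrac{1}{10}}
  \;=\; \frac{100}{9} \approx 11.1 \text{ metres}
% Infinitely many stages, but a finite distance (and, at constant speed, a finite
% time): exactly the kind of limit that calculus makes rigorous.
```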

But what if reality is granular after all, and time is quantised, arriving in discrete packets like the frames of a celluloid film stuttering through the gate of a projector? In this model of time, calculus is redundant and continuity is merely an illusion. Does it solve Zeno’s paradox? Perhaps it makes it 100 times more intractable. Just as motion needs time, time needs motion, and ‘we might wonder what happens to the existence of the world between those falling bits of time sand’.

This is all beautifully done, and Mazur, having hit his stride, maintains form throughout the rest of the book, though I suspect he has bitten off more than any reader could reasonably want to swallow. Rather than containing and spotlighting his subject, Mazur’s questions about time turn out (time and again, I’m tempted to say) to be about something completely different, as though we were playing an intellectual variant of whack-a-mole.

But this, I suppose, is the point. Mazur quotes Henri Poincaré:

Not only have we not direct intuition of the equality of two periods, but we have not even direct intuition of the simultaneity of two events occurring in two different places.

Our perception of time is so fractured, so much an ad hoc amalgam of the chatter of numerous, separately evolved systems (for the perception of motion; for the perception of daylight; for the perception of risk, and on and on — it’s a very long list), it may in the end be easier to abandon talk of time altogether, and for the same reason that psychologists, talking shop among themselves, eschew vague terms such as ‘love’.

So much of what we mean by time, as we perceive it day to day, is really rhythm. So much of what physicists mean by time is really space. Time exists, as love exists, as a myth: real because contingent, real because constructed, a catch-all term for phenomena bigger, more numerous and far stranger than we can yet comprehend.

Joy in the detail

Reading Charles Darwin’s Barnacle and David Bowie’s Spider by Stephen Heard for the Spectator, 16 May 2020

Heteropoda davidbowie is a species of huntsman spider. Though rare, it has been found in parts of Malaysia, Singapore, Indonesia and possibly Thailand. (The uncertainty arises because it’s often mistaken for a similar-looking species, the Heteropoda javana.) In 2008 a German collector sent photos of his unusual-looking “pet” to Peter Jäger, an arachnologist at the Senckenberg Research Institute in Frankfurt. Consequently, and in common with most other living finds, David Bowie’s spider was discovered twice: once in the field, and once in the collection.

Bowie’s spider is famous, but not exceptional. Jäger has discovered more than 200 species of spider in the last decade, and names them after politicians, comedians and rock stars to highlight our ecological plight. Other researchers find more pointed ways to further the same cause. In the first month of Donald Trump’s administration, Iranian-Canadian entomologist Vazrick Nazari discovered a moth with a head crowned with large, blond comb-over scales. There’s more to Neopalpa donaldtrumpi than a striking physical resemblance: it lives in a federally protected area around where the border wall with Mexico is supposed to go. Cue headlines.

Species are becoming extinct 100 times faster than they did before modern humans arrived. This makes reading a book about the naming of species a curiously queasy affair. Nor is there much comfort to be had in evolutionary ecologist Stephen Heard’s observation that, having described 1.5 million species, we’ve (at very best) only recorded half of what’s out there. There is, you may recall, that devastating passage in Cormac McCarthy’s western novel Blood Meridian in which Judge Holden meticulously records a Native American artifact in his sketchbook — then destroys it. Given that to discover a species you must, by definition, invade its environment, Holden’s sketch-and-burn habit appears to be a painfully accurate metonym for what the human species is doing to the planet. Since the 1970s (when there were twice as many wild animals as there are now) we’ve been discovering and endangering new species in almost the same breath.

Richard Spruce, one of the Victorian era’s great botanical explorers, who spent 15 years exploring the Amazon from the Andes to its mouth, is a star of this short, charming book about how we have named and ordered the living world. No detail of his bravery, resilience and grace under pressure comes close to the eloquence of this passing quotation, however: “Whenever rains, swollen streams, and grumbling Indians combined to overwhelm me with chagrin,” he wrote in his account of his travels, “I found reason to thank heaven which had enabled me to forget for the moment all my troubles in the contemplation of a simple moss.”

Stephen Heard, an evolutionary ecologist based in Canada, explains how extraordinary amounts of curiosity have been codified to create a map of the living world. The legalistic-sounding codes by which species are named are, it turns out, admirably egalitarian, ensuring that the names amateurs give species are just as valid as those of professional scientists.

Formal names are necessary because of the difficulty we have in distinguishing between similar species. Common names run into this difficulty all the time. There are too many of them, so the same species gets different names in different languages. At the same time, there aren’t enough of them, so that, as Heard points out, “Darwin’s finches aren’t finches, African violets aren’t violets, and electric eels aren’t eels”; robins, blackbirds and badgers are entirely different animals in Europe and North America; and virtually every flower has at one time or another been called a daisy.

Names also tend, reasonably enough, to be descriptive. This is fine when you’re distinguishing between, say, five different types of fish. When there are 500 different fish to sort through, however, absurdity beckons. Heard lovingly transcribes the pre-Linnaean species name of the English whiting, formulated around 1738: “Gadus, dorso tripterygio, ore cirrato, longitudine ad latitudinem tripla, pinna ani prima officulorum trigiata”. So there.

It takes nothing away from the genius of Swedish physician Carl Linnaeus, who formulated the naming system we still use today, to say that he came along at the right time. By Linnaeus’s day, it was possible to look things up. Advances in printing and distribution had made reference works possible. Linnaeus’s innovation was to decouple names from descriptions. And this, as Heard reveals in anecdote after anecdote, is where the fun now slips in: the mythopoeic cool of the baboon Papio anubis, the mischievous smarts of the beetle Agra vation, the nerd celebrity of the lemur Avahi cleesei.

Heard’s taxonomy of taxonomies makes for somewhat thin reading; this is less a book than a dozen interesting magazine articles flying in close formation. But its close focus, bringing to life the minutiae of both the living world and the practice of science, is welcome.

I once met Michael Land, the neurobiologist who figured out how the lobster’s eye works. He told me that the trouble with big ideas is that they get in the way of the small ones. Heard’s lesson, delivered with such a light touch, is the same. The joy, and much of the accompanying wisdom, lies in the detail.

Goodbye to all that

Reading Technologies of the Human Corpse by John Troyer for the Spectator, 11 April 2020

John Troyer, the director of the Centre for Death and Society at the University of Bath, has moves. You can find his interpretative dances punctuating a number of his lectures, which go by such arresting titles as ‘150 Years of the Human Corpse in American History in Under 15 Minutes with Jaunty Background Music’ and ‘Abusing the Corpse Even More: Understanding Necrophilia Laws in the USA — Now with more Necro! And more Philia!’ (Wisconsin and Ohio are, according to Troyer’s eccentric-looking and always fascinating website, ‘two states that just keep giving and giving when it comes to American necrophilia cases’.)

Troyer’s budding stand-up career has taken a couple of recent knocks. First was the ever more pressing need for him to crack on with his PhD (his dilatoriness was becoming a family joke). Technologies of the Human Corpse is yanked, not without injury, from that career-establishing academic work. Even as he assembled the present volume, however, there came another, far more personal, blow.

Late in July 2017 Troyer’s younger sister Julie was diagnosed with an aggressive brain cancer. Her condition deteriorated far more quickly than anyone expected, and on 29 July 2018 she died. This left Troyer — the engaging young American death scholar sprung from a family of funeral directors — having to square his erudite and cerebral thoughts on death and dead bodies with the fact he’d just kissed his sister goodbye. He interleaves poetical journal entries composed during Julie’s dying and her death, her funeral and her commemoration, between chapters written by a younger, jollier and of course shallower self.

To be brutal, the poems aren’t up to much, and on their own they wouldn’t add a great deal by way of nuance or tragedy. Happily for us, however, and to Troyer’s credit, he has transformed them into a deeply moving 30-page memoir that now serves as the book’s preface. This, then, is Troyer’s monster: a powerful essay about dying and bereavement; a set of poems written off the cuff and under great stress; and seven rather disconnected chapters about what’s befallen the human corpse in the past century or so.

Even as the book was going to print, Troyer explains in a hurried postscript, his father, a retired undertaker, lost consciousness following a cardiac arrest and was very obviously dying:

“And seeing my father suddenly fall into a comatose state so soon after watching my sister die is impossible to fully describe: I understand what is happening, yet I do not want to understand what is happening.”

This deceptively simple statement from Troyer the writer is streets ahead of anything Troyer the postgrad can pull off.

But to the meat of the book. The American Civil War saw several thousand corpses embalmed and transported on new-fangled railway routes across the continent. The ability to preserve bodies, and even lend them a lifelike appearance months after death, created a new industry that, in various configurations and under several names, now goes by the daunting neologism of ‘deathcare provision’. In the future, this industry will be seen ‘transforming the mostly funeralisation side of the business into a much broader, human body parts and tissue distribution system’, as technical advances make increasing use of cadavers and processed cadaver parts.

So how much is a dead body worth? Between $30,000 and $50,000, says Troyer — five times as much for donors processed into medical implants, dermal implants and demineralised bone matrices. Funds and materials are exchanged through a network of body brokers who serve as middlemen between biomedical corporations such as Johnson & Johnson and the usual sources of human cadavers — medical schools, funeral homes and mortuaries. It is by no stretch an illegal trade, nor is it morally problematic in most instances; but it is rife with scandal. As one involved party remarks: ‘If you’re cremated, no one is ever going to know if you’re missing your shoulders or knees or your head.’

Troyer is out to show how various industries serve to turn our dead bodies into ‘an unfettered source of capital’. The ‘fluid men’ of Civil War America — who toured the battlefields showing keen students how to embalm a corpse (and almost always badly) — had no idea what a strange story they had started. Today, as the anatomist Gunther von Hagens poses human cadavers in sexual positions to pique and titillate worldwide audiences, we begin to get a measure of how far we have come. Hagens’s posthumous pornography reveals, says Troyer, ‘the ultimate taxonomic power over nature: we humans, or at least our bodies, can live forever because we pull ourselves from nature’.

Technologies of the Human Corpse is a bit of a mess, but I have a lot of time for Troyer. His insights are sound, and his recent travails may yet (and at high human cost — but it was ever thus) make him a writer of some force.