“Democratic leaders are more difficult to decipher”

Reading The Madman in the White House: Sigmund Freud, Ambassador Bullitt and the Lost Psychobiography of Woodrow Wilson by Patrick Weil, for the Spectator, 29 July 2023

It was a vision US president Woodrow Wilson could not resist, and it was instrumental in bringing the US into the Great War.

The treaty at Versailles and the League of Nations founded during the negotiations were meant, not just to end the current war, but to end all future wars, by ensuring that a country taking up arms against one signatory would be treated as a belligerent by all the others.

Wilson took his advisor Edward “Colonel” House’s vision of a new world order and careered off with it.

Against advice, he attended Versailles in person, let none of his staff in with him during negotiations, was quickly overwhelmed, saw his principled “fourteen points” deluged by special provisions and horse-trading, and returned home, convinced, first, that his closest colleagues had betrayed him (which they hadn’t); second, that the League of Nations alone could mend what the Treaty of Versailles had left broken or made worse (which it didn’t); third, that he was the vessel of divine will, and that what the world needed from him at this crucial hour was a show of principle and Christ-like sacrifice. “The stage is set,” he declared to a dumbfounded and sceptical Senate, “the destiny is disclosed. It has come about by no plan of our conceiving, but by the hand of God who has led us into this way. We cannot turn back.”

Winston Churchill had Wilson’s number: “Peace and Goodwill among all Nations abroad, but no truck with the Republican party at home. That was his ticket and that was his ruin and the ruin of much else as well.”

When it was clear he would not get everything he wanted, Wilson scuppered ratification utterly, needlessly ending US involvement in the League before it had even begun.

Several developments followed ineluctably from this. Another world war, of course; also a small but vibrant cottage industry in books explaining just what in hell Wilson had thought he was playing at. Patrick Weil’s new book captures the anger and confusion of the period. His evenness of manner sets the hysteria of his subjects into high relief; take, for example, Sigmund Freud, who wrote of Wilson: “As far as a single person can be responsible for the misery of this part of the world, he surely is.”

Anger and a certain literary curiosity drove Freud to collaborate with William C. Bullitt, Wilson’s erstwhile advisor, on a psychobiography of the man. His daughter Anna hated the final book, but one has to assume she was dealing with her own daddy issues.

Delayed by decades so as not to derail Bullitt’s political career, Thomas Woodrow Wilson: a psychological study was published in bowdlerised form, and to no very positive fanfare, in 1967. Bullitt, by then a veteran anti-communist, was chary of handing ammunition to the enemy, and suppressed his book’s most sensational interpretations, involving Wilson’s suppressed homosexuality, his overbearing father, and his Christ complex.

In 2014, Weil, a political scientist based in Paris, happened upon the original 1932 manuscript.

Are the revelations contained there enough on their own to sustain a new book? Weil is circumspect, enriching his account with a quite detailed and acute psychological biography of his own — of William Bullitt. Bullitt was a democratic idealist and political insider who found himself pushed into increasingly hawkish postures by his all-too-clear appreciation of the threat posed by Stalin’s Soviet Union. He made strange bedfellows over the years: on his deathbed he received a friendly note from Richard Nixon, “Congratulations incidentally on driving the liberal establishment out of their minds with your Wilson.”

Those readers who’ve come for bread and circuses will find that John Maynard Keynes’s 1919 book “The Economic Consequences of the Peace” chopped up Wilson nicely long ago, tracing “the disintegration of the president’s moral position and the clouding of his mind.” Then there’s the 1922 Wilson psychobiography that Freud did not endorse, though he wrote privately to its author, William Bayard Hale, that his linguistic analysis of Wilson’s prolix pronouncements had “a true spirit of psychoanalysis in it.” Then there’s Alexander and Juliette George’s Woodrow Wilson and Colonel House, a psychological portrait from 1956 that argues that Wilson’s father represented a superego whom Wilson could never satisfy. And there are several others, if you go looking.

So, is Madman a conscientious but unnecessary book about Wilson? Or an insightful but rather oddly proportioned book about Bullitt? The answer is both, nor is this necessarily a drawback. Bullitt’s growing disillusion with Wilson, his hurt, and ultimately his contempt for the man, shaped him as surely as his curiously unhappy childhood and a formative political debacle at Princeton shaped Woodrow Wilson.

“Dictators are easy to read,” Weil writes. “Democratic leaders are more difficult to decipher. However, they can be just as unbalanced as dictators and can play a truly destructive role in our history.”

This is well put, but I think Weil’s portrait of Bullitt demonstrates something broader and rather more hopeful: that politics — even realpolitik — is best understood as an affair of the heart.

The monster comes from outside

Reading To Battersea Park by Philip Hensher for The Spectator, 1 April 2023

We never quite make it to Battersea Park. By the time the narrator and his husband reach its gates, it’s time for them, and us, to return home.

The narrator is a writer, living just that little bit too far away from Battersea Park, inspired by the eeriness of the Covid lockdown regime but also horribly blocked. All kinds of approaches to fiction beckon to him in his plight, and we are treated to not a few of them here.

Each section of this short novel embodies a literary device. We begin, maddeningly, in “The Iterative Mood” (“I would have”, “She would normally have”, “They used to…”) and we end in “Entrelacement”, with its overlapping stories offering strange resolutions to this polyphonous, increasingly surreal account of the lockdown uncanny. Every technique the narrator employs is an attempt to witness strange times using ordinary words.

Hensher didn’t just pluck this idea out of the void. Fiction has a nasty habit of pratfalling again and again at the feet of a contemporary crisis. Elizabeth Bowen’s The Heat of the Day (the Blitz) dribbles away into an underpowered spy thriller; Don DeLillo’s Falling Man (the September 11 attacks) only gets going in the last few dozen pages, when the protagonist quits New York for the poker-tournament circuit. Mind you, indirection may prove to be a winning strategy of itself. The most sheerly enjoyable section of To Battersea Park is a “hero’s journey” set in post-apocalyptic Whitstable. Hensher nails perfectly the way we distance ourselves from a crisis by romanticising it.

Milan Kundera wrote about this — about how “the monster comes from outside and is called History” — impersonal, uncontrollable, incalculable, incomprehensible and above all inescapable.

In To Battersea Park, Hensher speaks to the same idea, and ends up writing the kind of book Kundera wrote: one that appeals, first of all — almost, I would say, exclusively — to other writers.

In the middle of the book there’s a short scene in which a journalist interviews a novelist called Henry Ricks Bailey, and Bailey says:

“When people talk about novels, if they talk at all, they talk about the subject of those novels, or they talk about the life of the person who wrote it. This is a wonderful book, they say. It’s about a couple who fall in love during the Rwandan Genocide, they say… It’s as if all one had to do to write a novel is pick up a big box of stuff in one room and move it into the next.”

This (of course, and by design) borders on the infantile: the writer boo-hooing because the reader has had the temerity to beg a moral.

Hensher is more circumspect: he understands that the more you do right by events — the endless “and-then”-ness of everything — the less you’re going to be able to interest a reader, who has after all paid good money to bathe in causes and consequences, in “becauses” and “buts”.

To Battersea Park reveals all the ways we try to comprehend a world that isn’t good or fair, or causal, or even comprehensible. It’s about how we reduce the otherwise ungraspable world using conventions, often of our own devising. An elderly man fills half his house with a model railway. A dangerously brittle paterfamilias pumps the air out of his marriage. A blocked writer experiments with a set of literary devices. A horrified child sets sail in an imaginary boat. It’s a revelation: a comedy of suburban manners slowed to the point of nightmare.

That said, I get nervous around art that’s so directly addressed to the practitioners of that art. It’s a novel that teaches, more than it inspires, and a small triumph, in a world that I can’t help but feel is gasping for big ones.


A surprisingly narrow piano

Reading Richard Mainwaring’s Everybody Hertz for the Spectator, 30 April 2022.

Imagine that all the frequencies nature affords were laid out on an extended piano keyboard. Never mind that some waves are mechanical, propagated through air or some other fluid, and other waves are electromagnetic, and can pass through a vacuum. Lay them down all together, and what do you get?

The startling answer is: a surprisingly narrow piano. To play X-rays (whose waves cycle up to 30,000,000,000,000,000,000 times per second), our pianist would have to travel a mere nine metres to the right of middle C. Wandering nine and a half metres in the other direction, our pianist would then be able to sound the super-bass note generated by shockwaves rippling through the hot gas around a supermassive black hole in the Perseus cluster — a wave that cycles just once every 18.5 million years.

Closer to home, how big do you think that piano would have to be for it to play every note distinguishable by the human ear? You’d have to add barely a single octave to either side of a regular concert grand.
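Mainwaring’s keyboard arithmetic is easy to reproduce. Here is a back-of-envelope sketch in Python; the frequencies are the book’s, while the octave width of roughly 16.5 cm (seven white keys) is my own assumption about a standard keyboard:

```python
import math

MIDDLE_C_HZ = 261.63     # frequency of middle C at concert pitch (A4 = 440 Hz)
OCTAVE_WIDTH_M = 0.165   # assumed: an octave spans seven white keys, ~16.5 cm

def keyboard_distance_m(freq_hz: float) -> float:
    """Metres from middle C on the 'infinite piano'; positive means to the right."""
    octaves = math.log2(freq_hz / MIDDLE_C_HZ)
    return octaves * OCTAVE_WIDTH_M

# X-rays: up to ~30,000,000,000,000,000,000 cycles per second
print(round(keyboard_distance_m(3e19), 1))   # a little over nine metres right

# The Perseus-cluster note: one cycle every 18.5 million years
seconds_per_year = 3.156e7
perseus_hz = 1 / (18.5e6 * seconds_per_year)
print(round(keyboard_distance_m(perseus_hz), 1))  # roughly nine and a half metres left
```

Run as-is, the two figures land within a few centimetres of the nine and nine-and-a-half metres quoted above — reassuringly tidy for a metaphor this tidy.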

Readers of Richard Mainwaring’s wonderfully titled book will fall into two camps. Some will want to hear what this “infinite piano” conceit reveals about the natural world; about the (considerable) auditory abilities of spiders, say, or how 23 high-stepping fitness junkies caused a tremor that evacuated the 39-storey Techno Mart building in Seoul, South Korea.

Other readers, though entertained well enough by Mainwaring’s extraordinarily clear and concise science writing, won’t be able to get that infinite piano out of their heads. It’s a metaphor so engaging, so intuitive, it’s quite as exciting as anything else in the book (for all that the book features ghosts, whales, Neolithic chambered cairns and Nikola Tesla).

Mainwaring is a musician and a composer, and the business of music runs under even his most abstruse intellectual excursions. A Marsquake recorded on 6 April 2019, sped up by a factor of 60, sounds, he tells us, “not unlike someone blowing over the top of a half-full wine bottle in Westminster Abbey”. Fully concentrating on a task generates brainwaves of around 40 Hz or more: ”it’s a wonder we can’t hear them humming, as they are at the same frequency as the opening bass note of Cypress Hill’s ‘Insane in the Brain’.”

This is infotainment at its most charming and lightweight; tonally, it’s of a piece with the musical stunts (for example, arranging a performance by massed tuning-forks) that Mainwaring has regularly staged for BBC1’s pre-watershed magazine programme The One Show. The glimpses Mainwaring gives us into the peculiar, fractured, distraction-filled business of modern music making are quite as fascinating as his tales of planetary resonance and the latest thinking about olfaction. He can also be tremendously catty, as when he pricks the vaingloriousness of virtuoso bass players (“Know your role, bassists – stay out of the way.”)

Like any ebullient teacher, he won’t be everybody’s cup of tea. There’s always one misery-guts at the back of the class whose teeth will be set on edge, and now and again Mainwaring’s humour is a little forced. This is usually because he’s hit on some neat metaphor and doesn’t know when to stop beating on it. We should set against this, though, his willingness to dive (and deeply, too) into any number of abstruse subjects, from religious experiences to Edwardian vibrators.

Throughout, Mainwaring keeps a sharp eye out for specious claims and pretensions. There is, he says, nothing magical about “the God-given, superhero ability of perfect pitch” — the ability to identify a note from its frequency. Indeed, before 1955, the year the ISO standardised “A” at 440 Hz, there was no such thing as perfect pitch. (Interestingly, though, speakers of Mandarin, a language dependent on tonal inflexion, are rather better at guessing notes than the rest of us.)

On the other hand there is, as Mainwaring ably demonstrates, an extraordinary spiritual power to music, particularly around the note A forty-seven white keys to the left of middle C. (That’s 19 cycles per second, or 19 Hertz, we say now, in honour of Heinrich Rudolf Hertz, who proved the existence of electromagnetic waves). This “A” can trigger cold sweats, fits of severe depression, and even sightings of dead people. Mainwaring traces the use of low notes and infrasound from the more inaccessible tunnels of French caves (where little ochre dots marked where prehistoric singers should stand to sound especially resonant and amplified) to Bach’s Toccata and Fugue in D Minor which, on any decent organ, generates infrasonic byproducts by means of two chords and a low pedal D.

Though horribly abused and exploited by various New Age fads over the years, the old intuition still holds: vibrations reveal much about life, consciousness and the integrity of matter. Mainwaring’s clear-eyed forays into medicine, psychology and spirituality reflect as much.

It’s a commonplace of popular science that the world is looked at best through this or that funny-shaped window of the author’s choosing. But Mainwaring’s garrulous offering is the real deal.

Stone the Fool and others

Reading Stars and Spies: Intelligence Operations and the Entertainment Business
by Christopher Andrew and Julius Green for the Spectator, 18 December 2021

On 2 October 2020, when he became chief of the UK Secret Intelligence Service (MI6, if you prefer), Richard Moore tweeted (*tweeted!*)

#Bond or #Smiley need not apply. They’re (splendid) fiction but actually we’re #secretlyjustlikeyou.

The gesture’s novelty disguised, at the time, its appalling real-world implications: Bond was, after all, competent; and Smiley had integrity.

Stars and Spies, by veteran intelligence historian Christopher Andrew and theatre director and circus producer Julius Green, is a thoroughly entertaining read, but not at all a reassuring one. “The adoption of a fictional persona, the learning of scripts and the ability to improvise” are central to career progression in both theatre and espionage, the writers explain, “and undercover agents often find themselves engaged in what is effectively an exercise in long-form role play.”

It should, then, come as no surprise that this book boasts “no shortage of enthusiastic but inept entertainer-spies”.

There’s Aphra Behn, codenamed “Astrea, Agent 160”, the first woman employed as a secret agent by the British state during the Second Anglo-Dutch War in 1665: reaping no secret intelligence from her former lover, she made stuff up.

As, indeed, did Sir William Stephenson, “the man called Intrepid”, subject, in 1976, of the biggest-selling book ever on intelligence history. His recollections, spanning everything from organising wartime resistance in Europe to developing the Spitfire and the jet engine, work on the German Enigma code, and developing nuclear weapons, turned out to be the melancholy fabulations of a man suffering catastrophic memory loss.

The authors imagine that their subject — the intersection between spying and acting — is entertaining enough that they can simply start in the England of Good Queen Bess and Christopher Marlowe (recruited to spy for Walsingham while a student at Cambridge; also wrote a play or two), and end with the ludicrous antics (and — fair’s fair — brilliant acting) of US spy show Homeland.

And, by and large, they’re right. Begin at the beginning; end at the end. Why gild the lily with anything so arduous as an argument, when your anecdotes are this engaging? (Daniel Defoe’s terrifying plans for a surveillance state were scotched because the government’s intelligence budget was being siphoned off to keep Charles II’s mistresses quiet; and why were the British establishment so resistant to the charms of Soviet ballerinas?)

This approach does, however, leave the authors’ sense of proportion open to question. They’re not wrong to point out that “the most theatrical innovations pioneered by Stalinist intelligence were the show trials”, but in the context of so many Corenesque quasi-theatrical anecdotes, this observation can’t help but feel a bit cheap.

Once the parallels between spying and acting have been pointed out, the stories told here (many of them the fruit of fairly arduous primary research) sometimes come across as slightly fatuous. Why should the popular broadcaster Maxwell Knight not be a powerful recruiter of spies during the inter-war years? There’s nothing counter-intuitive here, if you think about the circles Knight must have moved in.

We are on surer ground when the authors measure the sharp contrast between fictional spies and their real-life counterparts. In the movies, honeypots abound, still rehashing the myths attaching to the courageous World War One French spy Mistinguett and the sadly deluded Margaretha Zelle (Mata Hari).

In truth, though, and for the longest while, women in this business have been more middle management than cat-suited loot. Recruited largely from Oxford’s women’s colleges and Cheltenham Ladies’ College, women played a more important part in the Security Service than in any other wartime government department, and for years, we are told, the service has been recruiting more women at officer and executive level than any other branch of government.

As for seduction and pillow-talk, even a fleeting acquaintance with men in their natural environment will tell us that, as Maxwell Knight put it, “Nothing is easier than for a woman to gain a man’s confidence by the showing and expression of a little sympathy… I am convinced,” he went on, “that more information has been obtained by women agents by keeping out of the arms of a man, than was ever obtained by willingly sinking into them.”

Fuelled by Erskine Childers’s peerless spy novel The Riddle of the Sands (1903), by Somerset Maugham’s Ashenden stories and by everything Fleming ever wrote, of course the audience for espionage drama hankers for real-life insight from writers “in the know”. And if the writer complains that the whole espionage industry is a thing of smoke and mirrors, well, we’ll find that fascinating too. (In Ben Jonson’s spy farce Volpone Sir Pol, on being told of the death of Stone the Fool, claims that Stone actually ran a sophisticated spy ring which communicated by means of dead drops hidden in fruit and vegetables. Eat your heart out, Le Carré.)

Andrew and Green, who both at different times studied history at Corpus Christi, Christopher Marlowe’s old college, are not really giving us the inside track. I would go so far as to say that they are not really telling us anything new. But they marshal their rare facts splendidly, and use them to spin ripping yarns.

A cherry is a cherry is a cherry

Life is Simple: How Occam’s Razor Sets Science Free and Shapes the Universe
by Johnjoe McFadden, reviewed for the Spectator, 28 August 2021

Astonishing, where an idea can lead you. You start with something that, 800 years hence, will sound like it’s being taught at kindergarten: Fathers are fathers, not because they are filled with some “essence of fatherhood”, but because they have children.

Fast forward a few years, and the Pope is trying to have you killed.

Not only have you run roughshod over his beloved eucharist (justified, till then, by some very dodgy Aristotelian logic-chopping); you’re also saying there’s no “essence of kinghood”, neither. If kings are only kings because they have subjects, then, said William of Occam, “power should not be entrusted to anyone without the consent of all”. Heady stuff for 1334.

How this progression of thought birthed the very idea of modern science is the subject of what may be the most sheerly enjoyable history of science of recent years.

William was born around 1288 in the little town of Ockham in Surrey. He was probably an orphan; at any rate he was given to the Franciscan order around the age of eleven. He shone at Greyfriars in London, and around 1310 was dispatched to Oxford’s newfangled university.

All manner of intellectual, theological and political shenanigans followed, mostly to do with William’s efforts to demolish almost the entire edifice of medieval philosophy.

It needed demolishing, and that’s because it still held to Aristotle’s ideas about what an object is. Aristotle wondered how single objects and multiples can co-exist. His solution: categorise everything. A cherry is a cherry is a cherry, and all cherries have cherryness in common. That shared cherryness is a “universal”; the properties that might distinguish one cherry from another are “accidental”.

The trouble with Aristotle’s universals, though, is that they assume a one-to-one correspondence between word and thing, and posit a universe made up of a terrifying number of unique things — at least one for each noun or verb in the language.

And the problem with that is that it’s an engine for making mistakes.

Medieval philosophy relied largely on syllogistic reasoning, juggling things into logical-looking relations. “Socrates is a man, all men are mortal, so Socrates is mortal.”

So he is, but — and this is crucial — this conclusion is arrived at more by luck than good judgement. The statement isn’t “true” in any sense; it’s merely internally consistent.

Imagine we make a mistake. Imagine we spring from a society where beards are pretty much de rigueur (classical Athens, say, or Farringdon Road). Imagine we say, “Socrates is a man, all men have beards, therefore Socrates has a beard”?

Though one of its premises is wrong, the statement barrels ahead regardless; it’s internally consistent, and so, if you’re not paying attention, it creates the appearance of truth.

But there’s worse: the argument that gives Socrates a beard might actually be true. Some men do have beards. Socrates may be one of them. And if he is, that beard seems — again, if you’re not paying attention — to confirm a false assertion.

William of Occam understood that our relationship with the world is a lot looser, cloudier, and more indeterminate than syllogistic logic allows. That’s why, when a tavern owner hangs a barrel hoop outside his house, passing travellers know they can stop there for a drink. The moment words are decoupled from things, then they act as signs, negotiating flexibly with a world of blooming, buzzing confusion.

Once we take this idea to heart, then very quickly — and as a matter of taste more than anything — we discover how much more powerful straightforward explanations are than complicated ones. Occam came up with a number of versions of what even then was not an entirely new idea: “It is futile to do with more what can be done with less,” he once remarked. Subsequent formulations do little but gild this lily.

His idea proved so powerful, three centuries later the French theologian Libert Froidmont coined the term “Occam’s razor”, to describe how we arrive at good explanations by shaving away excess complexity. As McFadden shows, that razor’s still doing useful work.

Life is Simple is primarily a history of science, tracing William’s dangerous idea through astronomy, cosmology, physics and biology, from Copernicus to Brahe, Kepler to Newton, Darwin to Mendel, Einstein to Noether to Weyl. But McFadden never loses sight of William’s staggering, in some ways deplorable influence over the human psyche as a whole. For if words are independent of things, how do we know what’s true?

Thanks to William of Occam, we don’t. The universe, after Occam, is unknowable. Yes, we can come up with explanations of things, and test them against observation and experience; but from here on in, our only test of truth will be utility. Ptolemy’s 2nd-century Almagest, a truly florid description of the motions of the stars and planetary paths, is not and never will be *wrong*; the worst we can say is that it’s overcomplicated.

In the Coen brothers’ movie The Big Lebowski, an exasperated Dude turns on his friend: “You’re not *wrong*, Walter” he cries, “you’re just an asshole.” William of Occam is our universal Walter, and the first prophet of our disenchantment. He’s the friend we wish we’d never listened to, when he told us Father Christmas was not real.

Nothing happens without a reason

Reading Journey to the Edge of Reason: The Life of Kurt Gödel by Stephen Budiansky for the Spectator, 29 May 2021

The 20th-century Austrian mathematician Kurt Gödel did his level best to live in the world as his philosophical hero Gottfried Wilhelm Leibniz imagined it: a place of pre-established harmony, whose patterns are accessible to reason.

It’s an optimistic world, and a theological one: a universe presided over by a God who does not play dice. It’s most decidedly not a 20th-century world, but “in any case”, as Gödel himself once commented, “there is no reason to trust blindly in the spirit of the time.”

His fellow mathematician Paul Erdős was appalled: “You became a mathematician so that people should study you,” he complained, “not that you should study Leibniz.” But Gödel always did prefer study to self-expression, and this is chiefly why we know so little about him, and why the spectacular deterioration of his final years — a phantasmagoric tale of imagined conspiracies, strange vapours and shadowy intruders, ending in his self-starvation in 1978 — has come to stand for the whole of his life.

“Nothing, Gödel believed, happened without a reason,” says Stephen Budiansky. “It was at once an affirmation of ultrarationalism, and a recipe for utter paranoia.”

You need hindsight to see the paranoia waiting to pounce. But the ultrarationalism — that was always tripping him up. There was something worryingly non-stick about him. He didn’t so much resist the spirit of the time as blunder about totally oblivious of it. He barely noticed the Anschluss, barely escaped Vienna as the Nazis assumed control, and, once ensconced at the Institute for Advanced Study at Princeton, barely credited that tragedy was even possible, or that, say, a friend might die in a concentration camp (it took three letters for his mother to convince him).

Many believed that he’d blundered, in a way typical to him, into marriage with his life-long partner, a foot-care specialist and divorcée called Adele Nimbursky. Perhaps he did. But Budiansky does a spirited job of defending this “uneducated but determined” woman against the sneers of snobs. If anyone kept Gödel rooted to the facts of living, it was Adele. She once stuck a concrete flamingo, painted pink and black, in a flower bed right outside his study window. All evidence suggests he adored it.

Idealistic and dysfunctional, Gödel became, in mathematician Jordan Ellenberg’s phrase, “the romantic’s favourite mathematician”, a reputation cemented by the fact that we knew hardly anything about him. Key personal correspondence was destroyed at his death, while his journals and notebooks — written in Gabelsberger script, a German shorthand that had fallen into disuse by the mid-1920s — resisted all-comers until Cheryl Dawson, wife of the man tasked with sorting through Gödel’s mountain of posthumous papers, learned how to transcribe it all.

Biographer Stephen Budiansky is the first to try to give this pile of new information a human shape, and my guess is it hasn’t been easy.

Budiansky handles the mathematics very well, capturing the air of scientific optimism that held sway over intellectual Vienna and induced Germany’s leading mathematician David Hilbert to declare that “in mathematics there is *nothing* unknowable!”

Solving Hilbert’s four “Problems of Laying Foundations for Mathematics” of 1928 was supposed to secure the foundations of mathematics for good, and Gödel, a 22-year-old former physics student, solved one of them. Unfortunately for Hilbert and his disciples, however, Gödel also proved the insolubility of the other three. So much for the idea that all mathematics could be derived from the propositions of logic: Gödel demonstrated that logic itself was flawed.

This discovery didn’t worry Gödel nearly so much as it did his contemporaries. For Gödel, as Budiansky explains, “Mathematical objects and a priori truth was as real to him as anything the senses could directly perceive.” If our reason failed, well, that was no reason to throw away the world: we would always be able to recognise some truths through intuition that could never be established through computation. That, for Gödel, was the whole point of being human.

It’s one thing to be a Platonist in a world dead set against Platonism, or an idealist in a world that’s gone all-in with materialism. It’s quite another to see acts of sabotage in the errors of TV listings magazines, or political conspiracy in the suicide of King Ludwig II of Bavaria. The Elysian calm and concentration afforded Gödel after the second world war at the Institute for Advanced Study probably did him more harm than good. “Gödel is too alone,” his friend Oskar Morgenstern fretted: “he should be given teaching duties; at least an hour a week.”

In the end, though, neither his friendships nor his marriage nor that ridiculous flamingo could tether to the Earth a man who had always preferred to write for his desk drawer, and Budiansky, for all his tremendous efforts and exhaustive interrogations of Gödel’s times and places, acquaintances and offices, can only leave us, at the end, with an immeasurably enriched version of Gödel the wise child. It’s an undeniably distracting and reductive picture. But — and this is the trouble — it’s not wrong.

To hell with the philosopause!

Reading Hawking Hawking: The Selling of a Scientific Celebrity by Charles Seife for the Spectator, 1 May 2021

I could never muster much enthusiasm for the theoretical physicist Stephen Hawking. His work, on the early universe and the nature of spacetime, was Nobel-worthy, but those of us outside his narrow community were horribly short-changed. His 1988 global best-seller A Brief History of Time was incomprehensible, not because it was difficult, but because it was bad.

Nobody, naturally, wanted to ascribe Hawking’s popular success to his rare form of Motor Neurone Disease, Hawking least of all. He afforded us no room for horror or, God forbid, pity. In 1990, asked a dumb question about how his condition might have shaped his work (because people who suffer ruinous, debilitating illnesses acquire compensating superpowers, right?) Hawking played along: “I haven’t had to lecture or teach undergraduates, and I haven’t had to sit on tedious and time-consuming committees. So I have been able to devote myself completely to research.”

The truth — that Hawking was one of the worst popular communicators of his day — is as evident as it is unsayable. A Brief History of Time was incomprehensible because after nearly five years’ superhuman effort, the author proved incapable of composing a whole book unaided. He couldn’t even do mathematics the way most people do it, by doodling, since he’d already lost the use of his hands. He could not jot notes. He could not manipulate equations. He had to turn every problem he encountered into a species of geometry, just to be able to think about it. He held his own in an impossibly rarified profession for years, but the business of popular communication was beyond him. As was communication, in the end, according to Hawking’s late collaborator Andy Strominger: “You would talk about words per minute, and then it went to minutes per word, and then, you know, it just got slower and slower until it just sort of stopped.”

Hawking became, in the end, a computerised patchwork of hackneyed, pre-stored utterances and responses. Pull the string at his back and marvel. Charles Seife, a biographer braver than most, begins by staring down the puppet. His conceit is to tell Stephen Hawking’s story backwards, peeling back the layers of celebrity and incapacity to reveal the wounded human within.

It’s a tricksy idea that works so well, you wonder why no-one thought of it before (though ordering his material and his arguments in this way must have nearly killed the poor author).

Hawking’s greatest claim to fame is that he discovered things about black holes — still unobserved at that time — that set the two great schools of theoretical physics, quantum mechanics and relativity, at fresh and astonishingly creative loggerheads.

But a new golden era of astronomical observation dawned almost immediately after, and A Brief History was badly outdated before it even hit the shelves. It couldn’t even get the age of the universe right.

It used to be that genius that outlived its moment could reinvent itself. When new-fangled endocrine science threw Ivan Pavlov’s Nobel-winning physiology into doubt, he reinvented himself as a psychologist (and not a bad one at that).

Today’s era of narrow specialism makes such a move almost impossible but, by way of intellectual compensation, there is always philosophy — a perennially popular field more or less wholly abandoned by professional philosophers. Images of the middle-aged scientific genius indulging its philosopause in book after book about science and art, science and God, science and society and so on and so forth, may raise a wry smile, but work of real worth has come out of it.

Alas, even if Hawking had shown the slightest aptitude for philosophy (and he didn’t), he couldn’t possibly have composed it.

In our imaginations, Hawking is the cartoon embodiment of the scientific sage, effectively disembodied and above ordinary mortal concerns. In truth, life denied him a path to sagacity even as it steeped him in the spit and stew of physical being. Hawking’s libido never waned. So to hell with the philosopause! Bring on the dancing girls! Bring on the cheques, from Specsavers, BT, Jaguar, Paddy Power. (Hawking never had enough money: the care he needed was so intensive and difficult, a transatlantic flight could set him back around a quarter of a million pounds.) Bring on the billionaires with their fat cheque books (naifs, the lot of them, but decent enough, and generous to a fault). Bring on the countless opportunities to bloviate about subjects he didn’t understand, a sort of Prince Charles only without Charles’s efforts at warmth.

I find it impossible, having read Seife, not to see Hawking through the lens of Jacobean tragedy, warped and raging, unable even to stick a finger up at a world that could not — but, much worse, *chose* not to — understand him. Of course he was a monster, and years too late, and through a book that will anger many, I have come to love him for it.

Soaked in ink and paint

Reading Dutch Light: Christiaan Huygens and the Making of Science in Europe by Hugh Aldersey-Williams for the Spectator, 19 December 2020

This book, soaked, like the Dutch Republic itself, “in ink and paint”, is enchanting to the point of escapism. The author calls it “an interior journey, into a world of luxury and leisure”. It is more than that. What he says of Huygens’s milieu is true also of his book: “Like a ‘Dutch interior’ painting, it turns out to contain everything.”

Hugh Aldersey-Williams says that Huygens was the first modern scientist. This is a delicate argument to make — the word “scientist” didn’t enter the English language before 1834. And he’s right to be sparing with such rhetoric, since a little of it goes a very long way. What inadvertent baggage comes attached, for instance, to the (not unreasonable) claim that the city of Middelburg, supported by the market for spectacles, became “a hotbed of optical innovation” at the end of the 16th century? As I read about the collaboration between Christiaan’s father Constantijn (“with his trim dark beard and sharp features”) and his lens-grinder Cornelis Drebbel (“strapping, ill-read… careless of social hierarchies”) I kept getting flashbacks to the Steve Jobs and Steve Wozniak double-act in Aaron Sorkin’s film.

This is the problem of popular history, made double by the demands of explaining the science. Secretly, readers want the past to be either deeply exotic (so they don’t have to worry about it) or fundamentally familiar (so they, um, don’t have to worry about it).

Hugh Aldersey-Williams steeps us in neither fantasy for too long, and Dutch Light is, as a consequence, an oddly disturbing read: we see our present understanding of the world, and many of our current intellectual habits, emerging through the accidents and contingencies of history, through networks and relationships, friendships and fallings-out. Huygens’s world *is* distinctly modern — disturbingly so: the engine itself, the pipework and pistons, without any of the fancy fairings and decals of liberalism.

Trade begets technology begets science. The truth is out there but it costs money. Genius can only swim so far up the stream of social prejudice. Who your parents are matters.

Under Dutch light — clean, caustic, Calvinistic — we see, not Enlightenment Europe emerging into the comforts of the modern, but a mirror in which we moderns are seen squatting a culture, full of flaws, that we’ve never managed to better.

One of the best things about Aldersey-Williams’s absorbing book (and how many 500-page biographies do you know that feel too short when you finish them?) is the interest he shows in everyone else. Christiaan arrives in the right place, at the right time, among the right people, to achieve wonders. His father, born in 1596, was a diplomat, architect, poet (he translated John Donne) and artist (he discovered Rembrandt). His longevity exasperated him: “Cease murderous years, and think no more of me” he wrote, on his 82nd birthday. He lived eight years more. But the space and energy Aldersey-Williams devotes to Constantijn and his four other children — “a network that stretched across Europe” — is anything but exasperating. It immeasurably enriches our idea of what Christiaan’s work meant, and what his achievements signified.

Huygens worked at the meeting point of maths and physics, at a time when some key physical aspects of reality still resisted mathematical description. Curves provide a couple of striking examples. The cycloid is the path made by a point on the circumference of a turning wheel. The catenary is the curve made by a chain or rope hanging under gravity. Huygens was the first to explain these curves mathematically, doing more than most to embed mathematics in the physical sciences. He tackled problems in geometry and probability, and had some fun in the process (“A man of 56 years marries a woman of 16 years, how long can they live together without one or the other dying?”) Using telescopes he designed and made himself, he discovered Saturn’s ring system and its largest moon, Titan. He was the first to describe the concept of centrifugal force. He invented the pendulum clock.
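For readers who want the curves themselves, both now have compact textbook forms (the notation here — rolling radius r, parameter θ, scale a — is mine, not the book’s):

```latex
% Cycloid: traced by a point on the rim of a wheel of radius r
% rolling along a line, parametrised by the turning angle theta.
\[
  x = r(\theta - \sin\theta), \qquad y = r(1 - \cos\theta)
\]
% Catenary: a uniform chain hanging under gravity, where the
% constant a sets how tautly the chain is stretched.
\[
  y = a \cosh\!\left(\frac{x}{a}\right)
\]
```

It was precisely descriptions of this kind — physical shapes captured exactly by equations — that Huygens’s generation was the first to supply.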

Most extraordinary of all, Huygens — though a committed follower of Descartes (who was once a family friend) — came up with a model of light as a wave, wholly consistent with everything then known about the nature of light apart from colour, and streets ahead of the “corpuscular” theory promulgated by Newton, which had light consisting of a stream of tiny particles.

Huygens’s radical conception of light seems even stranger, when you consider that, as much as his conscience would let him, Huygens stayed faithful to Descartes’ vision of physics as a science of bodies in collision. Newton’s work on gravity, relying as it did on an unseen force, felt like a retreat to Huygens — a step towards occultism.

Because we turn our great thinkers into fetishes, we allow only one per generation. Newton has shut out Huygens, as Galileo shut out Kepler. Huygens became an also-ran in Anglo-Saxon eyes; ridiculous busts of Newton, meanwhile, were knocked out to adorn the salons of Britain’s country estates, “available in marble, terracotta and plaster versions to suit all pockets.”

Aldersey-Williams insists that this competition between the elder Huygens and the enfant terrible Newton was never so cheap. Set aside their notorious dispute over calculus, and we find the two men in lively and, yes, friendly correspondence. Cooperation and collaboration were on the rise: “Gone,” Aldersey-Williams writes, “is the quickness to feel insulted and take umbrage that characterised so many exchanges — domestic as well as international — in the early days of the French and English academies of science.”

When Henry Oldenburg, the primum mobile of the Royal Society, died suddenly in 1677, a link was broken between scientists everywhere, and particularly between Britain and the continent. The 20th century did not forge a culture of international scientific cooperation. It repaired the one Oldenburg and Huygens had built over decades of eager correspondence and clever diplomacy.

What else you got?

Reading Benjamin Labatut’s When We Cease to Understand the World for the Spectator, 14 November 2020

One day someone is going to have to write the definitive study of Wikipedia’s influence on letters. What, after all, are we supposed to make of all these wikinovels? I mean novels that leap from subject to subject, anecdote to anecdote, so that the reader feels as though they are toppling like Alice down a particularly erudite Wikipedia rabbit-hole.

The trouble with writing such a book, in an age of ready internet access, and particularly Wikipedia, is that, however effortless your erudition, no one is any longer going to be particularly impressed by it.

We can all be our own Don DeLillo now; our own W G Sebald. The model for this kind of literary escapade might not even be literary at all; does anyone here remember James Burke’s Connections, a 1978 BBC TV series which took an interdisciplinary approach to the history of science and invention, and demonstrated how various discoveries, scientific achievements, and historical world events were built from one another successively in an interconnected way?

And did anyone notice how I ripped the last 35 words from the show’s Wikipedia entry?

All right, I’m sneering, and I should make clear from the off that When We Cease… is a chilling, gripping, intelligent, deeply humane book. It’s about the limits of human knowledge, and the not-so-very-pleasant premises on which physical reality seems to be built. The author, a Chilean born in Rotterdam in 1980, writes in Spanish. Adrian Nathan West — himself a cracking essayist — fashioned this spiky, pitch-perfect English translation. The book consists, in the main, of four broadly biographical essays. The chemist Fritz Haber finds an industrial means of fixing nitrogen, enabling the revolution in food supply that sustains our world, while also pioneering modern chemical warfare. Karl Schwarzschild imagines the terrible uber-darkness at the heart of a black hole, dies in a toxic first world war and ushers in a thermonuclear second. Alexander Grothendieck is the first of a line of post-war mathematician-paranoiacs convinced they’ve uncovered a universal principle too terrible to discuss in public (and after Oppenheimer, really, who can blame them?) In the longest essay-cum-story, Erwin Schrödinger and Werner Heisenberg slug it out for dominance in a field — quantum physics — increasingly consumed by uncertainty and (as Labatut would have it) dread.

The problem here — if problem it is — is that no connection, in this book of artfully arranged connections, is more than a keypress away from the internet-savvy reader. Wikipedia, twenty years old next year, really has changed our approach to knowledge. There’s nothing aristocratic about erudition now. It is neither a sign of privilege, nor (and this is more disconcerting) is it necessarily a sign of industry. Erudition has become a register, like irony, like sarcasm, like melancholy. It’s become, not the fruit of reading, but a way of perceiving the world.

Literary attempts to harness this great power are sometimes laughable. But this has always been the case for literary innovation. Look at the gothic novel. Fifty-odd years before the peerless masterpiece that is Mary Shelley’s Frankenstein we got Horace Walpole’s The Castle of Otranto, which is jolly silly.

Now, a couple of hundred years after Frankenstein was published, When We Cease to Understand the World dutifully repeats the rumours (almost certainly put about by the local tourist industry) that the alchemist Johann Conrad Dippel, born outside Darmstadt in the original Burg Frankenstein in 1673, wielded an uncanny literary influence over our Mary. This is one of several dozen anecdotes which Labatut marshals to drive home the message that There Are Things In This World That We Are Not Supposed to Know. It’s artfully done, and chilling in its conviction. Modish, too, in the way it interlaces fact and fiction.

It’s also laughable, and for a couple of reasons. First, it seems a bit cheap of Labatut to treat all science and mathematics as one thing. If you want to build a book around the idea of humanity’s hubris, you can’t just point your finger at “boffins”.

The other problem is Labatut’s mixing of fact and fiction. He’s not out to cozen us. But here and there this reviewer was disconcerted enough to check his facts — and where else but on Wikipedia? I’m not saying Labatut used Wikipedia. (His bibliography lists a handful of third-tier sources including, I was amused to see, W G Sebald.) Nor am I saying that using Wikipedia is a bad thing.

I think, though, that we’re going to have to abandon our reflexive admiration for erudition. It’s always been desperately easy to fake. (John Fowles.) And today, thanks in large part to Wikipedia, it’s not beyond the wit of most of us to actually *acquire*.

All right, Benjamin, you’re erudite. We get it. What else you got?

An intellectual variant of whack-a-mole

Reading Joseph Mazur’s The Clock Mirage for The Spectator, 27 June 2020 

Some books elucidate their subject, mapping and sharpening its boundaries. The Clock Mirage, by the mathematician Joseph Mazur, is not one of them. Mazur is out to muddy time’s waters, dismantling the easy opposition between clock time and mental time, between physics and philosophy, between science and feeling.

That split made little sense even in 1922, when the philosopher Henri Bergson and the young physicist Albert Einstein (much against his better judgment) went head-to-head at the Société française de philosophie in Paris to discuss the meaning of relativity. (Or that was the idea. Actually they talked at complete cross-purposes.)

Einstein won. At the time, there was more novel insight to be got from physics than from psychological introspection. But time passes, knowledge accrues and fashions change. The inference (not Einstein’s, though people associate it with him) that time is a fourth dimension, commensurable with the three dimensions of space, is looking decidedly frayed. Meanwhile Bergson’s psychology of time has been pruned by neurologists and has put out new shoots.

Our lives and perceptions are governed, to some extent, by circadian rhythms, but there is no internal clock by which we measure time in the abstract. Instead we construct events, and organise their relations, in space. Drivers, thinking they can make up time with speed, acquire tickets faster than they save seconds. Such errors are mathematically obvious, but spring from the irresistible association we make (poor vulnerable animals that we are) between speed and survival.

The more we understand about non-human minds, the more eccentric and sui generis our own time sense seems to be. Mazur ignores the welter of recent work on other animals’ sense of time — indeed, he winds the clock back several decades in his careless talk of animal ‘instincts’ (no one in animal behaviour uses the ‘I’ word any more). For this, though, I think he can be forgiven. He has put enough on his plate.

Mazur begins by rehearsing how the Earth turns, how clocks were developed, and how the idea of universal clock time came hot on the heels of the railway (mistimed passenger trains kept running into each other). His mind is engaged well enough throughout this long introduction, but around page 47 his heart beats noticeably faster. Mazur’s first love is theory, and he handles it well, using Zeno’s paradoxes to unpack the close relationship between psychology and mathematics.

In Zeno’s famous foot race, by the time fleet-footed Achilles catches up to the place where the plodding tortoise was, the tortoise has moved a little bit ahead. That keeps happening ad infinitum, or at least until Newton (or Leibniz, depending on who you think got to it first) pulls calculus out of his hat. Calculus is an algebraic way of handling (well, fudging) the continuity of the number line. It handles vectors and curves and smooth changes — the sorts of phenomena you can measure only if you’re prepared to stop counting.
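The sleight of hand is easy to state. Suppose (my numbers, not Mazur’s) Achilles runs at speed v, the tortoise at speed u < v, with a head start of distance d. Each of Zeno’s infinitely many stages takes a shrinking slice of time, and the slices sum to something finite:

```latex
% Time for Achilles to close the gap: the first stage takes d/v,
% and each subsequent stage shrinks by the ratio u/v, so the
% infinitely many stages form a convergent geometric series.
\[
  t \;=\; \frac{d}{v}\sum_{n=0}^{\infty}\left(\frac{u}{v}\right)^{n}
    \;=\; \frac{d}{v}\cdot\frac{1}{1 - u/v}
    \;=\; \frac{d}{v - u}
\]
```

An infinity of stages, a perfectly ordinary catching-up time — which is exactly the continuity-taming trick that calculus systematised.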

But what if reality is granular after all, and time is quantised, arriving in discrete packets like the frames of a celluloid film stuttering through the gate of a projector? In this model of time, calculus is redundant and continuity is merely an illusion. Does it solve Zeno’s paradox? Perhaps it makes it 100 times more intractable. Just as motion needs time, time needs motion, and ‘we might wonder what happens to the existence of the world between those falling bits of time sand’.

This is all beautifully done, and Mazur, having hit his stride, maintains form throughout the rest of the book, though I suspect he has bitten off more than any reader could reasonably want to swallow. Rather than containing and spotlighting his subject, Mazur’s questions about time turn out (time and again, I’m tempted to say) to be about something completely different, as though we were playing an intellectual variant of whack-a-mole.

But this, I suppose, is the point. Mazur quotes Henri Poincaré:

Not only have we not direct intuition of the equality of two periods, but we have not even direct intuition of the simultaneity of two events occurring in two different places.

Our perception of time is so fractured, so much an ad hoc amalgam of the chatter of numerous, separately evolved systems (for the perception of motion; for the perception of daylight; for the perception of risk, and on and on — it’s a very long list), it may in the end be easier to abandon talk of time altogether, and for the same reason that psychologists, talking shop among themselves, eschew vague terms such as ‘love’.

So much of what we mean by time, as we perceive it day to day, is really rhythm. So much of what physicists mean by time is really space. Time exists, as love exists, as a myth: real because contingent, real because constructed, a catch-all term for phenomena bigger, more numerous and far stranger than we can yet comprehend.