“A failure by the British state”

Reading The Poison Line by Cara McGoogan for the Telegraph, 17 September 2023

Lord Mayor Treloar College, founded in 1907 for the education and care of physically disabled children, was more than just a school for Ade Goodyear. The teaching and medical staff were more like an extended family. Dr Anthony Aronstam, director of Treloar’s haemophilia centre, used to invite Ade and his schoolfriends over to his house where they drank lemonade and swam in the pool.

One afternoon in the summer of 1984 Ade found Aronstam bent over his desk, trembling. “‘We’ve fucked up,’ Aronstam said. ‘We’ve messed up, boys. I’ve messed up. It’s all gone wrong.’”

In the face of a gathering global calamity, Aronstam had been assessing Ade, without his knowledge, for signs of AIDS. Two of Ade’s schoolfriends were already diagnosed. One, Richard Campbell, had already died. By 1986 Aronstam had forty-three patients who were HIV positive. He wrote in a report, “There are gloomier predictions about, which suggest that up to 100 per cent of the infected haemophiliac population will eventually succumb to the virus.”

The Poison Line is the first book by journalist Cara McGoogan. It began life as a couple of features written for this paper in the opening week of the Infected Blood Inquiry in 2019.
It may seem thin praise to single out the way McGoogan has arranged her material here, but truly the effort has been superhuman. This is the story of a global medical scandal, implicating health services, pharmaceutical companies and whole governments, and unfolding slowly enough, and meeting obstacles enough, that many of its victims died before they ever saw justice, never mind compensation. It is told, for the most part, through the recollections of the victims, their families, their doctors, their legal and political representatives. That so many individual stories here burn their way into the reader’s skull is testament to the strength of the source material, of course, but there were so many plates McGoogan could have dropped here and didn’t, so many stories to leave hanging and implications to leave unexplored, that there ought to be some sort of award for literary juggling established in her name.

Treloar College is just the most familiar domestic emblem of a crisis that played out across the US, UK, mainland Europe, and south-east Asia. It began when a new, much quicker, more convenient and more comfortable way was found of administering blood clotting factors to haemophiliacs. Factor VIII, a freeze-dried powder derived from blood, was infected with hepatitis B, but since this infection was common among haemophiliacs anyway, and went away in time, the issue was ignored. Consequently, other agents infecting Factor VIII went undetected, including HIV and hepatitis C.

Institution after institution doubled down on their original error in allowing and promoting a tainted product. In the UK, ministers themselves come out of this account surprisingly well, as McGoogan traces their appalled investigations into decades of deliberate cover-up. It was left to Jeremy Hunt, “the epitome of the establishment politician”, to sum up the disaster as “a failure by the British state. I don’t think there’s any other way to describe it.”

The second half of Poison Line, about the victims’ courtroom battles, reveals the economic drivers of the scandal. By the 1990s plasma was more valuable than gold and oil. Most Factor VIII was produced in the US, and American blood bankers, who are allowed to pay donors for plasma, were gathering blood from wherever they could: outside nightclubs, from inside prisons, and from a centre in Nicaragua nicknamed the ‘House of Vampires’, which collected plasma from up to a thousand people a day. Then there was the way Factor VIII was made: any one injection could contain the blood of twenty-five thousand people.

As McGoogan’s account gathers pace and scale, the issues become ever more existential. At what point does a corporation countenance the death of its customers? In what institutional setting will a doctor think it reasonable to tell an AIDS-infected mother that “Women like you should be sterilized”? What level of conformism does it take for the mother of a seventeen-year-old, infected with HIV from a haemophilia treatment, to tell him that he’s brought shame on her, and throw him out of the house?

By the closing pages, we seem to have left the news pages behind entirely, and to be wrestling with something that looks very like the tragedy of the human condition.

New flavours of intelligence

Reading about AI and governance for New Scientist, 13 September 2023

A sorcerer’s apprentice decides to use magic to help clean his master’s castle. The broom he enchants works well, dousing the floors with pails full of water. When the work is finished, the apprentice tries to stop the broom. Then, he tries to smash the broom. But the broom simply splits and regrows, working twice as hard as before, four times as hard, eight times as hard… until the rooms are awash and the apprentice all but drowns.

I wonder if Johann Wolfgang von Goethe’s 1797 poem sprang to mind as Mustafa Suleyman (co-founder of AI pioneers DeepMind, now CEO of Inflection AI) composed his new book, The Coming Wave? Or perhaps the shade of Robert Oppenheimer darkened Suleyman’s descriptions of artificial intelligence, and his own not insignificant role in its rise? “Decades after their invention,” he muses, “the architects of the atomic bomb could no more stop a nuclear war than Henry Ford could stop a car accident.”

Suleyman and his peers, having launched artificially intelligent systems upon the world, are right to tremble. At one point Suleyman compares AI to “an evolutionary burst like the Cambrian explosion, the most intense eruption of new species in the Earth’s history.”

The Coming Wave is mostly about the destabilising effects of new technologies. It describes a wildly asymmetric world where a single quantum computer can render the world’s entire encryption infrastructure redundant, and an AI mapping new drugs can be repurposed to look for toxins at the press of a return key.

Extreme futures beckon: would you prefer subjection under an authoritarian surveillance state, or radical self-reliance in a world where “an array of assistants… when asked to create a school, a hospital, or an army, can make it happen in a realistic timeframe”?

The predatory city states dominating this latter, neo-Renaissance future may seem attractive to some. Suleyman is not so sure: “Renaissance would be great,” he writes; “unceasing war with tomorrow’s military technology, not so much.”

A third future possibility is infocalypse, “where the information ecosystem grounding knowledge, trust, and social cohesion… falls apart.”

We’ll come back to this.

As we navigate between these futures, we should stay focused on current challenges. “I’ve gone to countless meetings trying to raise questions about synthetic media and misinformation, or privacy, or lethal autonomous weapons,” Suleyman complains, “and instead spent the time answering esoteric questions from otherwise intelligent people about consciousness, the Singularity, and other matters irrelevant to our world right now.”

Historian David Runciman makes an analogous point in The Handover, an impressive (and impressively concise) history of the limited liability company and the modern nation state. The emergence of both “artificial agents” at the end of the 18th Century was, Runciman argues, “the first Singularity”, when we tied our individual fates to two distinct but compatible autonomous computational systems.

“These bodies and institutions have a lot more in common with robots than we might think,” argues Runciman. Our political systems are already radically artificial and autonomous, and if we fail to appreciate this, we won’t understand what to do, or what to fear, when they acquire new flavours of intelligence.

Long-lived, sustainable, dynamic states — ones with a healthy balance between political power and civil society — won’t keel over under the onslaught of helpful AI, Runciman predicts. They’ll embrace it, and grow increasingly automated and disconnected from human affairs. How will we ever escape this burgeoning machine utopia?

Well, human freedom may still be a force to reckon with, according to Igor Tulchinsky. Writing with Christopher Mason in The Age of Prediction, Tulchinsky explores why the more predictable world ushered in by AI may not necessarily turn out to be a safer one. Humans evolved to take risks, and weird incentives emerge whenever predictability increases and risk appears to decline.

Tulchinsky, a quant who analyzes the data flows in financial markets, and Mason, a geneticist who maps dynamics across human and microbial genomes, make odd bedfellows. Mason, reasonably enough, welcomes any advance that makes medicine more reliable. Tulchinsky fears that perfect prediction in the markets will render humans as docile and demoralised as cattle. The authors’ spirited dialogue illuminates their detailed survey of what predictive technologies actually do, in theatres from warfare to recruitment, policing to politics.

Let’s say Tulchinsky and Mason are right, and that individual free will survives governance by all-seeing machines. It does not follow at all that human societies will survive their paternalistic attentions.

This was the unexpected sting in the tail delivered by Edward Geist in Deterrence under Uncertainty, a heavyweight but unexpectedly gripping examination of AI’s role in nuclear warfare.

Geist, steeped in the history and tradecraft of deception, reckons the smartest agent — be it meat or machine — can be rendered self-destructively stupid by an elegant bit of subterfuge. Fakery is so cheap, easy and effective that Geist envisions a future in which artificially intelligent “fog-of-war machines” create a world that favours neither belligerents nor conciliators, but deceivers: “those who seek to confound and mislead their rivals.”

In Geist’s hands, Suleyman’s “infocalypse” becomes a weapon, far cleaner and cheaper than any mere bomb. Imagine future wars fought entirely through mind games. In this world of shifting appearances, littered with bloody accidents and mutual misconstruals, people are persuaded that their adversary does not want to hurt them. Rather than living in fear of retaliation, they come to realise the adversary’s values are, and always have been, better than their own.
Depending on your interests, your politics, and your sensitivity to disinformation, you may well suspect that this particular infocalyptic future is already upon us.

And, says Geist, at his most Machiavellian (he is the most difficult of the writers here; also the most enjoyable): “would it not be much more preferable for one’s adversaries to decide one had been right all along, and welcome one’s triumph?”


“Trees and bushes and hills. Or houses and streets. Or rooms and furniture.”

Reading The Wolves of Eternity by Karl Ove Knausgaard for the Telegraph, 12 September 2023

1986: nineteen-year-old Syvert Løyning returns home from navy service to find his widowed mother chain-smoking. If she’s not dying of lung cancer, she’s making a good stab at it. Her declining health leaves Syvert spending most of his evenings looking after his younger brother Joar. His efforts in this direction are sincere but often ineffective.

This is Knausgaard’s long, compelling prequel to 2021’s The Morning Star, a novel which saw time collapse and brought the realms of the living and the dead into collision. After this metaphysical malarkey, The Wolves… offers the unprepared reader the coldest of cold openers. What should we make of Syvert’s slow, ineluctable decline amid the edgelands of southern Norway (woods and heaths and lakes, football fields and filling stations)? What of his equally slow recovery, as he acquires a girlfriend (Lisa), a job (at an undertaker’s) and a purpose — tracking down his dead father’s secret second family in the former Soviet Union?

Halfway through, The Wolves… breaks off and moves to present-day Russia to follow Syvert’s lost relation Alevtina on her way to her step-dad’s 80th birthday party. Alevtina describes herself self-deprecatingly (and accurately) as “a kind of hippy biologist who talked to trees”. Knausgaard makes a show of giving her the same close attention he devoted to Syvert, but there’s a sight more handwaving going on now: a sophomoric piling-up of cultural capital about Death, Hell and various species of Russian messianism and biophilia. Never trust a protagonist who’s a college lecturer.

Both halves of The Wolves… have their strengths, of course. Alevtina’s half is flashier by far; consider this time-reversed vision of some woods: “If you filmed that, an injured or sick animal that crept away to hide, died and rotted, and then ran that film backwards, the soil would pull apart to become an animal that rose up and slunk away.”

But there’s a greater power, I reckon, to be gleaned in the ordinariness of things. The pity we feel as Syvert sorts out his father’s old boxes, “not to get closer to him, more the opposite, to remove him from me, put him back in his boxes, back with his things”. And terror, when it occurs to Syvert that the World (which The Wolves… sees to its end) is only “what the eyes could see… A sort of duct in front of and behind us, with various things in it. Trees and bushes and hills. Or houses and streets. Or rooms and furniture.”

Knausgaard should resist the siren call of his library card, and go on writing very big books about nothing. The less The Wolves… is about, the more it has to say.

Taking in the garbage

Reading Interstellar by Avi Loeb for New Scientist, 30 August 2023

On 8 January 2014, a meteor exploded above the Pacific just north of Papua New Guinea’s Manus Island.

Five years later Amir Siraj, then a research assistant for Harvard astronomer Avi Loeb, spotted it in an online catalogue at the Center for Near-Earth Object Studies, part of NASA’s Jet Propulsion Laboratory.

Partway through Interstellar, Loeb explains why he thinks the meteor comes from outside the solar system. This would make it one of only three objects so identified. The first was ‘Oumuamua, detected in 2017: a football-field-sized, pancake-shaped anomaly and the subject of Loeb’s book Extraterrestrial, to which Interstellar is a repetitive, frenetic, grandiose extension.

Since Interstellar was sent to press, Loeb’s team have gathered particles from the crash site and packed them off to labs at Harvard University, the University of California, Berkeley, and the Bruker Corporation in Germany for further analysis. Metallic spherules from outside our solar system would be a considerable find in themselves.

Meanwhile Loeb is publicly airing a hypothesis which, thanks to an opinion piece on 10 February 2023, is already familiar to readers of New Scientist. He reckons this meteor might turn out to have been manufactured by extraterrestrials.

Already there has been some bad-tempered push-back, but Loeb does not care. He’s inoculated against other people’s opinions, he says in Interstellar, not least because “my first mentor in astrophysics… had a professional rival, and when my mentor died it was his rival that was asked to write his obituary in a prestigious journal.”

Loeb, who has spent a career writing about black holes, dark matter and the deep time of the universe, does not waste time arguing for the existence of spacefaring extraterrestrials. Rather, he argues that we should be looking for spacefaring extraterrestrials, or at any rate for their gear. Among the possible scenarios for First Contact, “a human-alien handshake in front of the White House” is the least likely. It’s far more likely we’ll run into some garbage or a probe of some sort, and only then, says Loeb, because we’ve taken the trouble to seek it out.

Until very recently, no astronomical instrument was built for such a purpose. But this is changing, says Loeb, who cites NASA’s Unidentified Aerial Phenomena study, launched in December 2022, and the Legacy Survey of Space and Time — a 10-year-long high-resolution record of the entire southern sky, to be conducted at the brand-new Vera C. Rubin Observatory in Chile. Then there’s Loeb’s own brainchild, the Galileo Project, meant to bring the search for extraterrestrial technological signatures “from accidental or anecdotal observations and legends to the mainstream of transparent, validated and systematic scientific research.” The roof of the Harvard College Observatory boasts the project’s first sky-scanning apparatus.

There’s more than a whiff of Quixote about this project, but Loeb’s well within his rights to say that unless we go looking for extraterrestrials, we’re never going to find them. Loeb’s dating metaphor felt painfully hokey at first, but it grew on me: are we to be cosmic wallflowers, standing around on the off-chance that some stranger comes along? Or are we going to go looking for things we’ll never spot without a bit of effort?

Readers of grand speculations by the likes of Freeman Dyson and Stanislaw Lem will find nothing in Interstellar to make them blink, aside maybe from a rather cantankerous prose style. Can we be reassured by Loeb’s promise that he and his team work only with scientific data openly available for peer review, and that they will share their findings freely, releasing results only through traditional, scientifically accepted channels of publication?

I’m inclined to say yes, we can be. Arguments from incredulity are always a bad idea, and sneering is never a good look.

A truth told backwards

Reading Disputed Inheritance by Gregory Radick for New Scientist, 23 August 2023

In 1865 Gregor Mendel, working to understand the mechanisms of hybridisation, discovered exquisitely simple and reliable patterns of inheritance in varieties of garden pea. Rediscovered in 1900, the patterns of inheritance described in his work revealed the existence of hereditary “particles”. Today, we call these particles “genes”.

Well, of course, there’s more to the story than this. In his ambitious and spirited history of the genetic idea, Leeds-based geneticist Gregory Radick accounts for our much more nuanced, sophisticated ideas of what genetics actually is, and in so doing he asks a deceptively simple question: why, knowing what we know now, do we still bother with Mendel? Why, when we explain genetics to people, do we reach for experiments conducted by a man who had no interest in heritability, never mind evolution, and whose conclusions about heritability (in as much as he ever made any) were quite spectacularly contradicted in experiments by Darwin’s favourite plantsman Thomas Laxton in 1866? (“Where Mendel’s pea hybrids always showed just the one parental character in colour and shape, Laxton’s,” says Radick, “were sometimes blended, sometimes wholly like the one parent, sometimes wholly like the other, and sometimes mosaically like both.”)

The evidence against Mendelian genetics began accumulating almost immediately after its 20th-century rediscovery. The “genetics” we talk about today isn’t Mendelian, it’s molecular, and it arose out of other sciences: microbiology, biochemistry, X-ray crystallography, and later a whole host of data-rich sequencing technologies. “Today’s genome,” Radick explains, “the post-genomic genome, looks more like… a device for regulating the production of specific proteins in response to the constantly changing signals [the cell] receives from its environment.”

The point is not that Mendel was “wrong”. (That would be silly, like saying Newton was “wrong” for not coming up with special relativity.) The point is that we have no real need to be thinking in Mendelian terms at all any more. Couching almost the whole of modern genetics as exceptions to Mendelism’s specious “rules” means constantly having to explain everything backwards.

Radick explains why this has happened, and what we can do about it. The seed of trouble was first sown in the battle (at first collegiate, then increasingly cantankerous) between the Cambridge-based William Bateson, who made it his mission to reshape biology in the image of Mendel’s experiments, and Oxford-based Walter Frank Raphael Weldon, who saw that Mendel (whose interest was hybrids, not heredity) had removed from his experiments as many ordinary sources of variability as he could. Real pea seeds are not always just yellow or green, or just round or wrinkled, and Weldon argued that actual variability should not be just idealised away.

“It seems to me that every character is at once inherited and acquired,” Weldon wrote, and of course he was right. The difficulty was what to do with that insight. “It is easy to say Mendelism does not happen,” he remarked to his friend Karl Pearson in March 1903, “but what the deuce does happen is harder every day!”

What Weldon needed, and what he pursued, was an alternative theory of heredity, but the book manuscript setting out his alternative vision was left unfinished at his death, from pneumonia, in April 1906.

Radick’s book champions the underdog, Weldon, over the victorious Bateson. Whether his account smacks of special pleading will depend on the reader’s education and interests. (Less than a century ago, geneticists in the Soviet Union faced ruin and even persecution as they defended the Mendelian idea, insufficient as that idea may seem to us now; temperatures in this field run high.)

This is not the first attempt to lay history’s ghosts to rest, and reset our ideas about genetics. That said, I can’t think of one that’s better argued, more fair-minded or more sheerly enjoyable.

Radiant with triumphant calamity

Reading Fear by Robert Peckham for the Telegraph, 25 August 2023 

Remember the UK intelligence claim that Saddam Hussein could strike the UK with a ballistic missile within 45 minutes? The story goes that this was spun out of a two-year-old conversation with a taxi driver on the Iraq-Jordan border. One thing’s for sure: fear breeds rumour breeds more fear.

Robert Peckham lives in fear, and claims we’re all of us entering “an era of insidious, mediatised fear”. This may be a case of misery seeking company. And you can see why: in 1988 this British historian of science (author of several well-received books about epidemics) narrowly missed getting blown up in a terrorist attack on the funeral of Abdul Ghaffar Khan in Jalalabad. More recently, in the summer of 2021, he quit his job at the University of Hong Kong where, he writes, “fear was palpable… friends were being hounded by the authorities, news agencies shut down and opposition leaders jailed.”

With the spread of Covid-19, Peckham’s political and medical interests dovetailed in Hong Kong in grim fashion. “A pandemic turned out to be the ultimate anti-protest weapon,” he writes, “one that the city’s chief executive, Carrie Lam, deployed ruthlessly to stifle opposition.”

Fear is the story of how, over the last 700 years or so, power has managed and manipulated its subjects through dread: of natural disasters, pandemics, revolutions, technologies, financial crashes, wars and, of course, of the government itself.

We see how the Catholic Church tried and failed to canalise the horrors of the Black Death into sacral terror and obedience; how instead that fear powered the Reformation. There’s a revealing section about Shakespeare’s Hamlet, a play both steeped in fear and about it, in which fear is shown sometimes engendering violence, sometimes acting as a moral brake on it. Through the bloody medium of the French revolution, we enter the modern era painfully aware that reason has proved more than capable of buttressing terror, and that the post-Enlightenment period is “radiant with triumphant calamity”.

Peckham’s history is as encyclopaedic as it is mirthless. After a striking and distressing chapter about the slave trade (every book should have one), Peckham even wonders whether “perhaps slavery has been so thoroughly embedded in free market capitalism that it can’t be dislodged, at least not without the collapse of the entire system”.

At this point, the reader is entitled to tug on the reins and double-check some figures on the United Nations website. And sure enough: in the 21st century alone global life expectancy has risen seven years, literacy has risen by nine per cent (to 91 per cent) and extreme poverty is about a third of what it was at the beginning of this century.

Allow Peckham’s argument that the Machiavellian weaponisation of fear had a hand in all this: dare one suggest this was a price worth paying?

Of course this is far from the whole of Peckham’s argument. He says at the outset he wants to explore the role fear plays in promoting reform, as well as its use in repressing dissent. “What,” he asks, “would happen to all the public-spirited interventions that rely on the strategic use of fear to influence our behaviour? Don’t we need fear to take our problems seriously?”

It’s an interesting project. Too often, though, the focus on fear acts to dampen our responses, rather than enrich them. For instance, Peckham depicts Versailles as “a policed society” where “prescriptions on how to eat, talk, walk and dance kept courtiers in line, with the ever-present threat that they might be stripped of their privileges if rules of comportment were infringed”. This is at once self-evident and woefully incomplete, excluding as it does any talk of political aspiration, personal vanity, love of play, the temptations of gossip and the lure of luxe. This isn’t an insight into Versailles; it’s a gloomy version of Versailles.

There is a difference, it is true, between the trenches of Verdun, and the fear felt in those trenches, just as there is a difference between the NKVD knocking on your door, and your fear of the knock. But — and here’s the nub of the matter — is it a useful difference? Or is it merely a restatement of the obvious?

In the end, having failed to glean the riches he had hoped for, Peckham is left floundering: “Fear is always intersectional,” he writes, “an unnerving confluence of past, present and future, a convergence of the here and there.”

To which this reader replied, with some exasperation, “Oh, pull the other one!”

“Democratic leaders are more difficult to decipher”

Reading The Madman in the White House: Sigmund Freud, Ambassador Bullitt and the Lost Psychobiography of Woodrow Wilson by Patrick Weil, for the Spectator, 29 July 2023

It was a vision US president Woodrow Wilson could not resist, and it was instrumental in bringing the US into the Great War.

The treaty at Versailles and the League of Nations founded during the negotiations were meant, not just to end the current war, but to end all future wars, by ensuring that a country taking up arms against one signatory would be treated as a belligerent by all the others.

Wilson took his advisor Edward “Colonel” House’s vision of a new world order and careered off with it.

Against advice, he attended Versailles in person, let none of his staff in with him during negotiations, was quickly overwhelmed, saw his principled “fourteen points” deluged by special provisions and horse-trading, and returned home, convinced, first, that his closest colleagues had betrayed him (which they hadn’t); second, that the League of Nations alone could mend what the Treaty of Versailles had left broken or made worse (which it didn’t); third, that he was the vessel of divine will, and that what the world needed from him at this crucial hour was a show of principle and Christ-like sacrifice. “The stage is set,” he declared to a dumbfounded and sceptical Senate, “the destiny is disclosed. It has come about by no plan of our conceiving, but by the hand of God who has led us into this way. We cannot turn back.”

Winston Churchill had Wilson’s number: “Peace and Goodwill among all Nations abroad, but no truck with the Republican party at home. That was his ticket and that was his ruin and the ruin of much else as well.”

When it was clear he would not get everything he wanted, Wilson destroyed the bill utterly, needlessly ending US involvement in the League before it had even begun.

Several developments followed ineluctably from this. Another world war, of course; also a small but vibrant cottage industry in books explaining just what in hell Wilson had thought he was playing at. Patrick Weil’s new book captures the anger and confusion of the period. His evenness of manner sets the hysteria of his subjects into high relief; take, for example, Sigmund Freud, who wrote of Wilson: “As far as a single person can be responsible for the misery of this part of the world, he surely is.”

Anger and a certain literary curiosity drove Freud to collaborate with William C. Bullitt, Wilson’s erstwhile advisor, on a psychobiography of the man. Freud’s daughter Anna hated the final book, but one has to assume she was dealing with her own daddy issues.

Delayed by decades so as not to derail Bullitt’s political career, Thomas Woodrow Wilson: a psychological study was published in bowdlerised form, and to no very positive fanfare, in 1967. Bullitt, by then a veteran anti-communist, was chary of handing ammunition to the enemy, and suppressed his book’s most sensational interpretations, involving Wilson’s suppressed homosexuality, his overbearing father, and his Christ complex.

In 2014, Weil, a political scientist based in Paris, happened upon the original 1932 manuscript.

Are the revelations contained there enough on their own to sustain a new book? Weil is circumspect, enriching his account with a quite detailed and acute psychological biography of his own — of William Bullitt. Bullitt was a democratic idealist and political insider who found himself pushed into increasingly hawkish postures by his all-too-clear appreciation of the threat posed by Stalin’s Soviet Union. He made strange bedfellows over the years: on his deathbed he received a friendly note from Richard Nixon, “Congratulations incidentally on driving the liberal establishment out of their minds with your Wilson.”

Those readers who’ve come for bread and circuses will find that John Maynard Keynes’s 1919 book “The Economic Consequences of the Peace” chopped up Wilson nicely long ago, tracing “the disintegration of the president’s moral position and the clouding of his mind.” Then there’s the 1922 Wilson psychobiography that Freud did not endorse, though he wrote privately to its author, William Bayard Hale, that his linguistic analysis of Wilson’s prolix pronouncements had “a true spirit of psychoanalysis in it.” Then there’s Alexander and Juliette George’s Woodrow Wilson and Colonel House, a psychological portrait from 1956 that argues that Wilson’s father represented a superego whom Wilson could never satisfy. And there are several others, if you go looking.

So, is Madman a conscientious but unnecessary book about Wilson? Or an insightful but rather oddly proportioned book about Bullitt? The answer is both; nor is this necessarily a drawback. Bullitt’s growing disillusion with Wilson, his hurt, and ultimately his contempt for the man, shaped him as surely as his curiously unhappy childhood and a formative political debacle at Princeton shaped Woodrow Wilson.

“Dictators are easy to read,” Weil writes. “Democratic leaders are more difficult to decipher. However, they can be just as unbalanced as dictators and can play a truly destructive role in our history.”

This is well put, but I think Weil’s portrait of Bullitt demonstrates something broader and rather more hopeful: that politics — even realpolitik — is best understood as an affair of the heart.

Saltbushed, rabbitbrushed and tumbleweeded

Reading Dust by Jay Owens for the Telegraph, 17 July 2023

Here’s a lesson from optics that historians of science seem to have taken in with their mother’s milk: the narrower the aperture, the more focused the image. Pick a narrow something, research its story till it squeaks, and you might just end up with a twisted-but-true vision of the world as a whole. To Jared Diamond’s Guns, Germs and Steel, to Mark Kurlansky’s Salt, and Laura Martin’s Tea, can we now add geographer Jay Owens’ Dust?

Owens’ pursuit of dust (defined very broadly as particles of a certain size, however generated) sends her tripping through many fascinating and rewarding realms, but this can sometimes be at the expense of her main subject. (For instance, an awful lot of this book is less about dust than about the absence of water.) “Dust,” Owens writes, “is matter at the very limit-point of formlessness, the closest ‘stuff’ gets to nothing.” This is nicely put, but what it boils down to is: dust is slippery stuff to hang a book upon.

Owens’ view of dust is mostly minatory, though some dust is vital to natural ecological processes (rainfall being not the least of them). Approximately 140 million tonnes of dust fall every year across the tropical Atlantic Ocean, providing nutrients to marine ecosystems. Still, dust also brings disease: “In the Caribbean,” Owens tells us, “the Saharan winds carry spores of the fungus Aspergillus, making corals and sea fans sicken and die.”

Increasing the amount of dust in the atmosphere has led and still leads to sickness and death. In Ford County, Kansas, at the very bottom of the Dust Bowl, one-third of all deaths in 1935 were from pneumonia. Today, lead and arsenic hitchhike on soot particles formed by combustion, driving some into hay-feverish discomfort, others into acute respiratory failure.

The direct health effects of dust are arresting, but Owens’ abiding interest in dust developed when she began tracing its ubiquity and systemic pervasiveness: how, for instance, electric cars, being heavier, generate extra road dust, which is rich in microplastic particles, and how these transport other environmental contaminants including 6PPD-quinone, “an antioxidant added to tyre rubber that researchers have found is producing mass die-offs of coho salmon in the Pacific Northwest.”

Setting aside the temptation to run screaming into the hills, we have two ways to confront a world revealed to be this intagliated and insoluble. The first is to embrace ever vaguer suitcase language to contain its wicked problems. When Owens started talking about the “anthropocene” — a putative new geological era triggered by [insert arbitrary technological advance here] — my heart sank. Attempts to conciliate between the social sciences and geology are at best silly and at worst pompous.

The second tactic is to hold your nerve, get out of your chair and go look at stuff; observe the world as keenly as you can, and write as honestly as possible about what you see. And Owens’ success here is such as to nudge aside all earlier quibbles.

Owens is a superb travel writer, delivering with aplomb on her own idea of what geographers should be doing: “Paying attention to tangible, material realities to ground our theoretical models in the world.”

With Owens, we travel from saltbushed, rabbitbrushed and tumbleweeded Owens Lake in California to Aralsk in Kazakhstan, and the toxic remains of the Aral Sea, once the fourth largest lake in the world. We visit ice core researchers in Greenland, and catch a glimpse of their “cold, arduous, multi-year detective work”. We discover through vicarious experience, and not just through rhetoric, why we can’t just admire the fruits of modernity, “the iPhones, the Teslas, the staggering abundance of consumer entertainment – but should follow that tree down to its roots.”

Dust’s journeys, interviews, and historical insights serve Owens’ purpose better than the terms of art she has brought across from social anthropology. I admit I was quite taken with the idea of “Discard Studies”, which interrogates the world through its trash; but a glimpse of Owens Lake’s current condition — a sort of cyborg woodland in place of the old lake, and a place more altered than restored — says more about our ever-more dust-choked world than a thousand modish gestures ever could.

“Crude to the point of vulgarity, judgmental in the extreme, and bitterly punitive”

Reading The Age of Guilt by Mark Edmundson for New Scientist, 5 July 2023

In his Freudian analysis of what we might loosely term “cancel culture”, Mark Edmundson wisely chooses not to get into facile debates about which of the pioneering psychoanalyst’s ideas have or have not been “proved right”. What would that even mean? Psychology is not so much science as it is engineering, applying ideas and evidence to a purpose. Edmundson, an author and literary scholar, simply wants to suggest that Freud’s ideas might help us better understand our current cultural moment.

At the centre of Freud’s model of the personality sits the ego, the conscious bit of ourselves, the bit that thinks, and therefore is. Bracketing the ego are two components of the personality that are inaccessible to conscious awareness: the id, and the super-ego. The id is the name Freud gives to all those drives that promote immediate individual well-being. Fancy a sandwich? A roll in the hay? A chance to clout your rival? That’s your id talking.

Much later, in an attempt to understand why so many of his clients gave themselves such a hard time (beating themselves up over trivia, calling themselves names, self-harming) Freud conceived the super-ego. This is the bit of us that warns us against misbehaviour, and promotes conformity to social norms. Anyone who’s spent time watching chimpanzees will understand why such machinery might evolve in an animal as ultra-social as Homo sapiens.

Casual descriptions of Freud’s personality model often characterise the super-ego as a sort of wise uncle, paternalistically ushering the cadet ego out of trouble.

But this, Edmundson says, is a big mistake. A power that, in each of us, watches, discovers and criticizes all our intentions, is not a power to be taken lightly.

Edmundson argues that key cultural institutions evolved not just to regulate our appetites; they also provide direction and structure for the super-ego. A priest might raise an eyebrow at your gluttony; but that same priest will relieve you of your self-hatred by offering you a simple atonement: performing it wipes your slate clean. Edmundson wonders what, in the absence of faith, can corral and direct the fulminations of our super-ego — which in this account is not so much a fount of idealism, and more a petulant, unrelenting and potentially life-threatening martinet, “crude to the point of vulgarity, judgmental in the extreme, and bitterly punitive.”

The result of unmet super-ego demands is sickness. “The super-ego punishes the ego and turns it into an anxious, frightened creature, a debilitatingly depressed creature, or both by turns,” Edmundson explains, and quotes a Pew Research study showing that, from 2007 to 2017, the percentage of 12-to-17-year-olds who had experienced a major depressive episode in the past year rose from 8 per cent to 13 per cent. Are these severely depressed teenagers “in some measure victims of the wholesale cultural repudiation of Freud”?

Arguments from intuition need a fairly hefty health warning slapped on them, but I defy you not to find yourself nodding along to more than a few of Edmundson’s philippics: for instance, how the internet became our culture’s chief manifestation of the super-ego, its loudest users bearing all the signs of possession, “immune to irony, void of humour, unforgiving, prone to demand harsh punishments.”

Half a century ago, the anthropologist Ernest Becker wrote a book, The Denial of Death, that hypothesised all manner of connections between society, behaviour and consciousness. Its informed and closely argued speculations inspired a handful of young researchers to test his ideas, and thereby revolutionise the field of experimental psychology. (An excellent book from 2015, The Worm at the Core, tells their story.)

In a culture that’s growing so pathologically judgmental, condemnatory, and punitive, I wonder if The Age of Guilt can perform the same very valuable trick? I do hope so.

Ideas are like boomerangs

Reading In a Flight of Starlings: The Wonder of Complex Systems by Giorgio Parisi for The Telegraph, 1 July 2023

“Researchers,” writes Giorgio Parisi, recipient of the 2021 Nobel Prize in Physics, “often pass by great discoveries without being able to grasp them.” A friend’s grandfather identified and then ignored a mould that killed bacteria, and so missed out on the discovery of penicillin. This story was told to Parisi in an attempt to comfort him for the morning in 1970 he’d spent with another hot-shot physicist, Gerard ’t Hooft, dancing around what in hindsight was a perfectly obvious application of some particle accelerator findings. Having teetered on the edges of quantum chromodynamics, they walked on by; decades would pass before either man got another stab at the Nobel. “Ideas are often like boomerangs,” Parisi explains, and you can hear the sigh in his voice: “they start out moving in one direction but end up going in another.”

In a Flight of Starlings is the latest addition to an evergreen genre: the scientific confessional. Read this, and you will get at least a frisson of what a top-flight career in physics might feel like.

There’s much here that is charming and comfortable: an eminent man sharing tales of a bygone era. Parisi began his first year of undergraduate physics in November 1966 at Sapienza University in Rome, when computer analysis involved lugging about (and sometimes dropping) metre-long drawers of punched cards.

The book’s title refers to Parisi’s efforts to compute the murmurations of starlings. Recently he’s been trying to work out how many solid spheres of different sizes will fit into a box. There’s a goofiness to these pet projects that belies their significance. The techniques developed to follow thousands of starlings through three dimensions of space and one of time bear a close resemblance to those used to solve statistical physics problems. And fitting marbles in a box? That’s a classic problem in information theory.

The implications of Parisi’s work emerge slowly. The reader, who might, in all honesty, be touched now and again by boredom, sits up straighter once the threads begin to braid.

Physics for the longest time could not handle complexity. Galileo’s model of the physical world did not include friction, not because friction was any sort of mystery, but because the mathematics of his day couldn’t handle it.

Armed with better mathematics and computational tools, physics can now study phenomena that Galileo could never have imagined would be part of physics. For instance, friction. For instance, the melting of ice, and the boiling of water: phenomena that, from the point of view of physics, are very strange indeed. Coming up with models that explain the phase transitions of more complex and disordered materials, such as glass and pitch, is something Parisi has been working on, on and off, since the middle of the 1990s.

Efforts to model more and more of the world are nothing new, but once rare successes now tumble in upon the field at a dizzying rate; almost as though physics has undergone its own phase transition. This, Parisi says, is because once two systems in different fields of physics can be described by the same mathematical structure, “a rapid advancement of knowledge takes place in which the two fields cross-fertilize.”

This has clearly happened in Parisi’s own specialism. The mathematics of disorder apply whether you’re describing why some particles try to spin in opposite directions, or why certain people sell shares that others are buying, or what happens when some dinner guests want to sit as far away from other guests as possible.

Phase transitions eloquently connect the visible and quantum worlds. Not that such connections are particularly hard to make. Once you know the physics, quantum phenomena are easy to spot. Ever wondered at a rainbow?

“Much becomes obvious in hindsight,” Parisi writes. “Yet it is striking how in both physics and mathematics there is a lack of proportion between the effort needed to understand something for the first time and the simplicity and naturalness of the solution once all the required stages have been completed.”

The striking “murmurations” of airborne starlings are created when each bird in the flock pays attention to the movements of its nearest neighbour. Obvious, no?

But as Parisi in his charming way makes clear, whenever something in this world seems obvious to us, it is likely because we are perched, knowingly or not, on the shoulders of giants.