Taking in the garbage

Reading Interstellar by Avi Loeb for New Scientist, 30 August 2023

On 8 January 2014, a meteor exploded above the Pacific just north of Papua New Guinea’s Manus Island.

Five years later Amir Siraj, then a research assistant for Harvard astronomer Avi Loeb, spotted it in an online catalogue at the Center for Near-Earth Object Studies, part of NASA’s Jet Propulsion Laboratory.

Partway through Interstellar, Loeb explains why he thinks the meteor comes from outside the solar system. This would make it one of only three objects so identified. The first was ‘Oumuamua, detected in 2017: a football-field-sized, pancake-shaped anomaly and the subject of Loeb’s book Extraterrestrial, to which Interstellar is a repetitive, frenetic, grandiose extension.

Since Interstellar was sent to press, Loeb’s team have gathered particles from the crash site and packed them off to labs at Harvard University, the University of California, Berkeley, and the Bruker Corporation in Germany for further analysis. Metallic spherules from outside our solar system would be a considerable find in themselves.

Meanwhile Loeb is publicly airing a hypothesis which, thanks to an opinion piece on 10 February 2023, is already familiar to readers of New Scientist. He reckons this meteor might turn out to have been manufactured by extraterrestrials.

Already there has been some bad-tempered push-back, but Loeb does not care. He’s inoculated against other people’s opinions, he says in Interstellar, not least because “my first mentor in astrophysics… had a professional rival, and when my mentor died it was his rival who was asked to write his obituary in a prestigious journal.”

Loeb, who has spent a career writing about black holes, dark matter and the deep time of the universe, does not waste time arguing for the existence of spacefaring extraterrestrials. Rather, he argues that we should be looking for spacefaring extraterrestrials, or at any rate for their gear. Among the possible scenarios for First Contact, “a human-alien handshake in front of the White House” is the least likely. It’s far more likely we’ll run into some garbage or a probe of some sort, and only then, says Loeb, because we’ve taken the trouble to seek it out.

Until very recently, no astronomical instrument was built for such a purpose. But this is changing, says Loeb, who cites NASA’s Unidentified Aerial Phenomena study, launched in December 2022, and the Legacy Survey of Space and Time — a 10-year-long high-resolution record of the entire southern sky, to be conducted at the brand-new Vera C. Rubin Observatory in Chile. Then there’s Loeb’s own brainchild, The Galileo Project, meant to bring the search for extraterrestrial technological signatures “from accidental or anecdotal observations and legends to the mainstream of transparent, validated and systematic scientific research.” The roof of the Harvard College Observatory boasts the project’s first sky-scanning apparatus.

There’s more than a whiff of Quixote about this project, but Loeb’s well within his rights to say that unless we go looking for extraterrestrials, we’re never going to find them. Loeb’s dating metaphor felt painfully hokey at first, but it grew on me: are we to be cosmic wallflowers, standing around on the off-chance that some stranger comes along? Or are we going to go looking for things we’ll never spot without a bit of effort?

Readers of grand speculations by the likes of Freeman Dyson and Stanislaw Lem will find nothing in Interstellar to make them blink, aside maybe from a rather cantankerous prose style. Should we be reassured by Loeb’s promise that he and his team work only with scientific data openly available for peer review, that they share their findings freely, and that they will release results only through accepted channels of scientific publication?

I’m inclined to say yes, we should. Arguments from incredulity are always a bad idea, and sneering is never a good look.

A truth told backwards

Reading Disputed Inheritance by Gregory Radick for New Scientist, 23 August 2023

In 1865 Gregor Mendel, working to understand the mechanisms of hybridisation, discovered exquisitely simple and reliable patterns of inheritance in varieties of garden pea. Rediscovered in 1900, the patterns of inheritance described in his work revealed the existence of hereditary “particles”. Today, we call these particles “genes”.

Well, of course, there’s more to the story than this. In his ambitious and spirited history of the genetic idea, Leeds-based historian of science Gregory Radick accounts for our much more nuanced, sophisticated ideas of what genetics actually is, and in so doing he asks a deceptively simple question: why, knowing what we know now, do we still bother with Mendel? Why, when we explain genetics to people, do we reach for experiments conducted by a man who had no interest in heritability, never mind evolution, and whose conclusions about heritability (in so far as he ever made any) were quite spectacularly contradicted in experiments by Darwin’s favourite plantsman Thomas Laxton in 1866? (“Where Mendel’s pea hybrids always showed just the one parental character in colour and shape, Laxton’s,” says Radick, “were sometimes blended, sometimes wholly like the one parent, sometimes wholly like the other, and sometimes mosaically like both.”)

The evidence against Mendelian genetics began accumulating almost immediately after its 20th-century rediscovery. The “genetics” we talk about today isn’t Mendelian, it’s molecular, and it arose out of other sciences: microbiology, biochemistry, X-ray crystallography, and later a whole host of data-rich sequencing technologies. “Today’s genome,” Radick explains, “the post-genomic genome, looks more like… a device for regulating the production of specific proteins in response to the constantly changing signals [the cell] receives from its environment.”

The point is not that Mendel was “wrong”. (That would be silly, like saying Newton was “wrong” for not coming up with special relativity.) The point is that we have no real need to be thinking in Mendelian terms at all any more. Couching almost the whole of modern genetics as exceptions to Mendel’s specious “rules” means constantly having to explain everything backwards.

Radick explains why this has happened, and what we can do about it. The seed of trouble was first sown in the battle (at first collegial, then increasingly cantankerous) between the Cambridge-based William Bateson, who made it his mission to reshape biology in the image of Mendel’s experiments, and Oxford-based Walter Frank Raphael Weldon, who saw that Mendel (whose interest was hybrids, not heredity) had removed from his experiments as many ordinary sources of variability as he could. Real pea seeds are not always just yellow or green, or just round or wrinkled, and Weldon argued that actual variability should not simply be idealised away.

“It seems to me that every character is at once inherited and acquired,” Weldon wrote, and of course he was right. The difficulty was what to do with that insight. “It is easy to say Mendelism does not happen,” he remarked to his friend Karl Pearson in March 1903, “but what the deuce does happen is harder every day!”

What Weldon needed, and what he pursued, was an alternative theory of heredity, but the book manuscript setting out his alternative vision was left unfinished at his death, from pneumonia, in April 1906.

Radick’s book champions the underdog, Weldon, over the victorious Bateson. Whether his account smacks of special pleading will depend on the reader’s education and interests. (Less than a century ago, geneticists in the Soviet Union faced ruin and even persecution as they defended the Mendelian idea, insufficient as that idea may seem to us now; temperatures in this field run high.)

This is not the first attempt to lay history’s ghosts to rest, and reset our ideas about genetics. That said, I can’t think of one that’s better argued, more fair-minded or more sheerly enjoyable.

Radiant with triumphant calamity

Reading Fear by Robert Peckham for the Telegraph, 25 August 2023 

Remember the UK intelligence claim that Saddam Hussein could strike the UK with a ballistic missile within 45 minutes? The story goes that this was spun out of a two-year-old conversation with a taxi driver on the Iraq-Jordan border. One thing’s for sure: fear breeds rumour breeds more fear.

Robert Peckham lives in fear, and claims we’re all of us entering “an era of insidious, mediatised fear”. This may be a case of misery seeking company. And you can see why: in 1988 this British historian of science (author of several well-received books about epidemics) narrowly missed getting blown up in a terrorist attack on the funeral of Abdul Ghaffar Khan in Jalalabad. More recently, in the summer of 2021, he quit his job at the University of Hong Kong where, he writes, “fear was palpable… friends were being hounded by the authorities, news agencies shut down and opposition leaders jailed.”

With the spread of Covid-19, Peckham’s political and medical interests dovetailed in Hong Kong in grim fashion. “A pandemic turned out to be the ultimate anti-protest weapon,” he writes, “one that the city’s chief executive, Carrie Lam, deployed ruthlessly to stifle opposition.”

Fear is the story of how, over the last 700 years or so, power has managed and manipulated its subjects through dread: of natural disasters, pandemics, revolutions, technologies, financial crashes, wars and, of course, of government itself.

We see how the Catholic Church tried and failed to canalise the horrors of the Black Death into sacral terror and obedience; how instead that fear powered the Reformation. There’s a revealing section about Shakespeare’s Hamlet, a play both steeped in fear and about it, and about how fear is shown sometimes engendering violence, sometimes acting as a moral brake on it. Through the bloody medium of the French revolution, we enter the modern era painfully aware that reason has proved more than capable of buttressing terror, and that the post-Enlightenment period is “radiant with triumphant calamity”.

Peckham’s history is as encyclopaedic as it is mirthless. After a striking and distressing chapter about the slave trade (every book should have one), Peckham even wonders whether “perhaps slavery has been so thoroughly embedded in free market capitalism that it can’t be dislodged, at least not without the collapse of the entire system”.

At this point, the reader is entitled to tug on the reins and double-check some figures on the United Nations website. And sure enough: in the 21st century alone, global life expectancy has risen seven years, literacy has risen by nine per cent (to 91 per cent) and extreme poverty is about a third of what it was at the beginning of this century.

Allow Peckham’s argument that the Machiavellian weaponisation of fear had a hand in all this: dare one suggest this was a price worth paying?

Of course this is far from the whole of Peckham’s argument. He says at the outset he wants to explore the role fear plays in promoting reform, as well as its use in repressing dissent. “What,” he asks, “would happen to all the public-spirited interventions that rely on the strategic use of fear to influence our behaviour? Don’t we need fear to take our problems seriously?”

It’s an interesting project. Too often, though, the focus on fear acts to dampen our responses, rather than enrich them. For instance, Peckham depicts Versailles as “a policed society” where “prescriptions on how to eat, talk, walk and dance kept courtiers in line, with the ever-present threat that they might be stripped of their privileges if rules of comportment were infringed”. This is at once self-evident and woefully incomplete, excluding as it does any talk of political aspiration, personal vanity, love of play, the temptations of gossip and the lure of luxe. This isn’t an insight into Versailles; it’s a gloomy version of Versailles.

There is a difference, it is true, between the trenches of Verdun, and the fear felt in those trenches, just as there is a difference between the NKVD knocking on your door, and your fear of the knock. But — and here’s the nub of the matter — is it a useful difference? Or is it merely a restatement of the obvious?

In the end, having failed to glean the riches he had hoped for, Peckham is left floundering: “Fear is always intersectional,” he writes, “an unnerving confluence of past, present and future, a convergence of the here and there.”

To which this reader replied, with some exasperation, “Oh, pull the other one!”

“Democratic leaders are more difficult to decipher”

Reading The Madman in the White House: Sigmund Freud, Ambassador Bullitt and the Lost Psychobiography of Woodrow Wilson by Patrick Weil, for the Spectator, 29 July 2023

It was a vision US president Woodrow Wilson could not resist, and it was instrumental in bringing the US into the Great War.

The treaty at Versailles and the League of Nations founded during the negotiations were meant, not just to end the current war, but to end all future wars, by ensuring that a country taking up arms against one signatory would be treated as a belligerent by all the others.

Wilson took his advisor Edward “Colonel” House’s vision of a new world order and careered off with it.

Against advice, he attended Versailles in person, let none of his staff in with him during negotiations, was quickly overwhelmed, saw his principled “fourteen points” deluged by special provisions and horse-trading, and returned home, convinced, first, that his closest colleagues had betrayed him (which they hadn’t); second, that the League of Nations alone could mend what the Treaty of Versailles had left broken or made worse (which it didn’t); third, that he was the vessel of divine will, and that what the world needed from him at this crucial hour was a show of principle and Christ-like sacrifice. “The stage is set,” he declared to a dumbfounded and sceptical Senate, “the destiny is disclosed. It has come about by no plan of our conceiving, but by the hand of God who has led us into this way. We cannot turn back.”

Winston Churchill had Wilson’s number: “Peace and Goodwill among all Nations abroad, but no truck with the Republican party at home. That was his ticket and that was his ruin and the ruin of much else as well.”

When it was clear he would not get everything he wanted, Wilson destroyed the bill utterly, needlessly ending US involvement in the League before it had even begun.

Several developments followed ineluctably from this. Another world war, of course; also a small but vibrant cottage industry in books explaining just what in hell Wilson had thought he was playing at. Patrick Weil’s new book captures the anger and confusion of the period. His evenness of manner sets the hysteria of his subjects into high relief; take, for example, Sigmund Freud, who wrote of Wilson: “As far as a single person can be responsible for the misery of this part of the world, he surely is.”

Anger and a certain literary curiosity drove Freud to collaborate with William C. Bullitt, Wilson’s erstwhile advisor, on a psychobiography of the man. Freud’s daughter Anna hated the final book, but one has to assume she was dealing with her own daddy issues.

Delayed by decades so as not to derail Bullitt’s political career, Thomas Woodrow Wilson: a psychological study was published in bowdlerised form, and to no very positive fanfare, in 1967. Bullitt, by then a veteran anti-communist, was chary of handing ammunition to the enemy, and suppressed his book’s most sensational interpretations, involving Wilson’s suppressed homosexuality, his overbearing father, and his Christ complex.

In 2014, Weil, a political scientist based in Paris, happened upon the original 1932 manuscript.

Are the revelations contained there enough on their own to sustain a new book? Weil is circumspect, enriching his account with a quite detailed and acute psychological biography of his own — of William Bullitt. Bullitt was a democratic idealist and political insider who found himself pushed into increasingly hawkish postures by his all-too-clear appreciation of the threat posed by Stalin’s Soviet Union. He made strange bedfellows over the years: on his deathbed he received a friendly note from Richard Nixon, “Congratulations incidentally on driving the liberal establishment out of their minds with your Wilson.”

Those readers who’ve come for bread and circuses will find that John Maynard Keynes’s 1919 book “The Economic Consequences of the Peace” chopped up Wilson nicely long ago, tracing “the disintegration of the president’s moral position and the clouding of his mind.” Then there’s the 1922 Wilson psychobiography that Freud did not endorse, though he wrote privately to its author, William Bayard Hale, that his linguistic analysis of Wilson’s prolix pronouncements had “a true spirit of psychoanalysis in it.” Then there’s Alexander and Juliet George’s Woodrow Wilson and Colonel House, a psychological portrait from 1956 that argues that Wilson’s father represented a superego whom Wilson could never satisfy. And there are several others, if you go looking.

So, is Madman a conscientious but unnecessary book about Wilson? Or an insightful but rather oddly proportioned book about Bullitt? The answer is both, and this is not necessarily a drawback. Bullitt’s growing disillusion with Wilson, his hurt, and ultimately his contempt for the man, shaped him as surely as his curiously unhappy childhood and a formative political debacle at Princeton shaped Woodrow Wilson.

“Dictators are easy to read,” Weil writes. “Democratic leaders are more difficult to decipher. However, they can be just as unbalanced as dictators and can play a truly destructive role in our history.”

This is well put, but I think Weil’s portrait of Bullitt demonstrates something broader and rather more hopeful: that politics — even realpolitik — is best understood as an affair of the heart.

The press of a single red button

Watching Christopher Nolan’s Oppenheimer for New Scientist, 19 July 2023

At 05.29 and 45 seconds on 16 July 1945, an electrical circuit clicks shut and thirty-two detonators fire, compressing a core of plutonium to criticality. The plutonium fissions, each atom splitting into lighter elements, a blast of gamma radiation and two or three more neutrons, which hurtle forth, triggering further reactions. A new world order is born: one in which the human species has the capacity to all but wipe itself from the face of the planet; a world in which the terror of annihilation helps avert global conflict, unevenly, at great cost, and by no means necessarily for ever.

J Robert Oppenheimer directed atomic bomb development at Los Alamos in New Mexico, and then spent many subsequent years arguing for international arms control, and against US development of the even more powerful fusion bomb. Not only did he midwife this new Cold War world into being; he gave us the vocabulary with which to talk about it, agonise over it, and fear it.

It is possible to miss the point of Christopher Nolan’s superb biopic of Oppenheimer. One and a half hours of screen time follow the successful Trinity test of an atomic device. If all that interests you is how Nolan, a filmmaker famously wedded to analogue production and real (70mm IMAX) film, conveys the scale of an atomic explosion, you’re in for a long haul.

Oppenheimer is about the war in its hero’s head. It reflects the world in which Oppenheimer actually operated. It’s a film set in lecture rooms and laboratories, in living rooms and kitchens, shacks and bunkers. (The horror of Hiroshima is conveyed quite simply: Oppenheimer, sat in front of footage of the aftermath, cannot stand to watch, and looks away.)

Following America’s use of two atomic bombs on Japan at the end of the second world war, walls shake, exposures wobble, continuity stutters and different film stocks are muddled together to convey Oppenheimer’s increasingly nightmarish experience of the new reality. Were Nolan’s story (drawn from Kai Bird and Martin Sherwin’s biography American Prometheus) not so grippingly told, the final film, with its unvarying pace, portentous, minimalist musical score and abiding humourlessness, would, I suspect, prove unwatchable: like 2020’s Tenet, a film easier to read than to watch, a three-hour-long promo video.

What transforms Oppenheimer — and makes it, for my money at any rate, Nolan’s best film since 2006’s The Prestige — is the sheer craft evident in the script.

The film orbits around two official hearings, both of which took place in the 1950s: Oppenheimer’s appeal against the revocation of his security clearance with the Atomic Energy Commission; and former AEC commissioner Lewis Strauss’s Senate confirmation hearing as he tilted for the post of US Commerce Secretary. Those who know Strauss’s fraught attitudes towards Oppenheimer will relish Robert Downey Jr’s scenery-chewing performance as the multifaceted Strauss. Those coming to the material fresh have a cracking twist in store, as the pair’s relationship comes to vivid life in the final act of the film.

Fragments of Oppenheimer’s odyssey — from theoretical astrophysicist to father of the atomic bomb — orbit these two centres of gravity. The narrative surface that results is as complex as anything Nolan has achieved before, but less confusing. Oppenheimer covers a staggering amount of intellectual, historical and biographical ground, with nary a trace of galumphing exposition. The script finds room to give Russian physicists their due, and conveys very sensitively the internationalist sentiment that dominated research at Los Alamos.

Of course, the physicists and engineers at Los Alamos could think what they liked. There was a war on, and a Cold War to follow. Oppenheimer’s largely fruitless tilts at geopolitical realities after the war was over became emblematic of the plight of the conscience-stricken government scientist. His damaging run-ins with officialdom during the anti-communist scares of the 1950s only confirmed his status as a modern Prometheus, punished for handing atomic fire to humanity.

Strauss had little time for the idea of Oppenheimer-the-tragic-overreacher, and Nolan, funnily enough, seems to agree. At any rate, he finds no use for Oppenheimer’s own self-dramatising. (Oppenheimer, quoting the Bhagavad-Gita, used to notoriously bang on about becoming Death, Destroyer of Worlds; this dark flourish is got rid of early on.) Nolan is much more interested in Oppenheimer’s impossible bind: an intelligent man, by no means naive or “unpolitical”, whose background in academia and theory un-fits him for the world he helps create. Emily Blunt’s performance as Kitty, Oppenheimer’s increasingly embittered and partisan wife, is crucial, if almost wordless. Other big names flourish in supporting roles that allow them unusual freedom. Matt Damon is positively gruff as Leslie Groves, the general in charge of the Los Alamos project; Dane DeHaan relishes a gratingly unsympathetic portrait of Kenneth Nichols, director of US Army R&D; Benny Safdie makes even the peaceniks among us fall in love with Edward Teller, hawkish father of the fusion bomb, a straight-shooting adversary Oppenheimer can’t help but shake by the hand, to Kitty’s lip-curling disgust.

Even before he starts acting, Cillian Murphy’s resting demeanour drips a sort of divine cluelessness that makes him a shoo-in for the role of Robert Oppenheimer. He goes on to deliver a shuddering performance that, more than any finely wrought dialogue, conveys the impossible moral bind of scientists recruited into government service.

To know the world is to change it. On 16 July 1945, knowledge and deed were separated by the press of a single red button. Oppenheimer takes three hours to explain why this moment matters, and there’s not a second of screentime wasted. It’s a rich, strange, compelling film. A tragedy, yes — and a triumph.

Saltbushed, rabbitbrushed and tumbleweeded

Reading Dust by Jay Owens for the Telegraph, 17 July 2023

Here’s a lesson from optics that historians of science seem to have taken in with their mother’s milk: the narrower the aperture, the more focused the image. Pick a narrow something, research its story till it squeaks, and you might just end up with a twisted-but-true vision of the world as a whole. To Jared Diamond’s Guns, Germs and Steel, Mark Kurlansky’s Salt and Laura Martin’s Tea, can we now add geographer Jay Owens’ Dust?

Owens’ pursuit of dust (defined very broadly as particles of a certain size, however generated) sends her tripping through many fascinating and rewarding realms, but this can sometimes be at the expense of her main subject. (For instance, an awful lot of this book is less about dust than about the absence of water.) “Dust,” Owens writes, “is matter at the very limit-point of formlessness, the closest ‘stuff’ gets to nothing.” This is nicely put, but what it boils down to is: dust is slippery stuff to hang a book upon.

Owens’ view of dust is minatory. Some dust is vital to natural ecological processes (rainfall being not the least of them). Approximately 140 million tonnes of dust fall every year across the tropical Atlantic Ocean, providing nutrients to marine ecosystems. Still, dust also brings disease: “In the Caribbean,” Owens tells us, “the Saharan winds carry spores of the fungus Aspergillus, making corals and sea fans sicken and die.”

Increasing the amount of dust in the atmosphere has led and still leads to sickness and death. In Ford County, Kansas, at the very bottom of the Dust Bowl, one-third of all deaths in 1935 were from pneumonia. Today, lead and arsenic hitchhike on soot particles formed by combustion, driving some into hay-feverish discomfort, others into acute respiratory failure.

The direct health effects of dust are arresting, but Owens’ abiding interest in dust developed when she began tracing its ubiquity and systemic pervasiveness: how, for instance, electric cars, being heavier, generate extra road dust, which is rich in microplastic particles, and how these transport other environmental contaminants including 6PPD-quinone, “an antioxidant added to tyre rubber that researchers have found is producing mass die-offs of coho salmon in the Pacific Northwest.”

Setting aside the temptation to run screaming into the hills, we have two ways to confront a world revealed to be this intagliated and insoluble. The first is to embrace ever vaguer suitcase language to contain its wicked problems. When Owens started talking about the “anthropocene”, a putative new geological era triggered by [insert arbitrary technological advance here], my heart sank. Attempts to conciliate between the social sciences and geology are at best silly and at worst pompous.

The second tactic is to hold your nerve, get out of your chair and go look at stuff; observe the world as keenly as you can, and write as honestly as possible about what you see. And Owens’ success here is such as to nudge aside all earlier quibbles.

Owens is a superb travel writer, delivering with aplomb on her own idea of what geographers should be doing: “Paying attention to tangible, material realities to ground our theoretical models in the world.” (Owens, p. 326)

With Owens, we travel from saltbushed, rabbitbrushed and tumbleweeded Owens Lake in California to Aralsk in Kazakhstan, and the toxic remains of what was once the fourth largest lake in the world. We visit ice core researchers in Greenland, and catch a glimpse of their “cold, arduous, multi-year detective work”. We discover through vicarious experience, and not just through rhetoric, why we can’t just admire the fruits of modernity, “the iPhones, the Teslas, the staggering abundance of consumer entertainment – but should follow that tree down to its roots.”

Dust’s journeys, interviews, and historical insights serve Owens’ purpose better than the terms of art she has brought across from social anthropology. I admit I was quite taken with the idea of “Discard Studies”, which interrogates the world through its trash; but a glimpse of Owens Lake’s current condition — a sort of cyborg woodland in place of the old lake, and a place more altered than restored — says more about our ever-more dust-choked world than a thousand modish gestures ever could.

“I want you to laugh openly at it”

Watching Sebastien Blanc’s Cerebrum for New Scientist, 12 July 2023

A year after the car he was driving spun off the road and into a tree, William is shown into an all but empty room. There’s a camp bed. A TV. It’s not his old bedroom — it might not even be his house, it’s so anonymous — but it’ll have to do. William’s still learning to walk again, and the stairs will be too much for him. This is a shame, because he wants to see his mother, who never comes downstairs, never visits him, and is, it seems, constantly “under the weather”.

William scribbles a message to Richard, the man who brought him here: “Is she angry?” and Richard protests just that little bit too much. Already we feel we shouldn’t be watching, not because there’s anything bad going on, but because the script, by first-time feature director Sebastien Blanc, absolutely refuses to acknowledge our presence.

The camera work is no guide, either. Shot in the flat, pseudo-factual style of a British soap opera, Cerebrum views everything that happens with the same dispassion. No jump scares. No plangent chords. We’re going to have to figure all this out for ourselves.

And so we do. Richard is William’s adoptive father. The house, for all that it is virtually empty, is indeed — or was — their family home. Dad is killing and burying women in the garden. And Mum is — or jolly well ought to be — dead, killed in the accident for which William (rightly, as it turns out) blames himself.

“You have no idea what I am doing to fix what you have done,” says Richard, in a rare moment of lost temper, and hands the astute viewer pretty much the entire plot.

It’s a gutsy, deliberate move, placing suspense over surprise. We know our Frankenstein. We know what happens to the mad professor in the attic. For one hour and 37 minutes we watch, with growing excitement and gathering horror, as the expected denouement approaches, and Ramona Von Pusch, playing William’s mother, gets the briefest of brief moments in the limelight.

Tobi King Bakare’s more or less mute turn as William, damaged in both body and mind, is visceral to a fault. Best of all, he never plays for sympathy: William hates himself so much, we rather hate him too, at least at first.

Steve Oram, who plays Richard, is a ubiquitous presence on British TV, but nothing prepared us for this. It’s impossible to keep in mind that the man is acting. Richard is a terrifying creation: a quiet, unimaginative man building his very own road to Hell.

When the floodgates finally crack, and Richard sits William down for a spot of family therapy, things take a very dark emotional turn. “I want you to visualise what is troubling you,” says Richard, “and then I want you to laugh openly at it” — at which point half of me wanted to cheer at the scriptwriter’s chutzpah, the other half to run screaming from the living room.

Cerebrum is not an important movie. It’s a no-budget labour of love that gives writer-director Blanc something to talk about in pitch meetings. Structured entirely around suspense, the film can’t help but leave us feeling disappointed in the final reel, though I suspect any extra twists would have felt tacked-on. The script, which gives a black twentysomething white adoptive parents, and then hands everyone plenty of conversational rope with which to hang themselves, suggests Jordan Peele’s superbly queasy 2017 debut Get Out — but the threads here aren’t gathered nearly so tightly or so cleverly.

Watch Cerebrum for its performances, for its chillingly spare script, and for the trust it puts in its audience. Don’t expect miracles. Richard did, and look what happens to him…

“Crude to the point of vulgarity, judgmental in the extreme, and bitterly punitive”

Reading The Age of Guilt by Mark Edmundson for New Scientist, 5 July 2023

In his Freudian analysis of what we might loosely term “cancel culture”, Mark Edmundson wisely chooses not to get into facile debates about which of the pioneering psychoanalyst’s ideas have or have not been “proved right”. What would that even mean? Psychology is not so much science as it is engineering, applying ideas and evidence to a purpose. Edmundson, an author and literary scholar, simply wants to suggest that Freud’s ideas might help us better understand our current cultural moment.

In the centre of Freud’s model of the personality sits the ego, the conscious bit of ourselves, the bit that thinks, and therefore is. Bracketing the ego are two components of the personality that are inaccessible to conscious awareness: the id, and the super-ego. The id is the name Freud gives to all those drives that promote immediate individual well-being. Fancy a sandwich? A roll in the hay? A chance to clout your rival? That’s your id talking.

Much later, in an attempt to understand why so many of his clients gave themselves such a hard time (beating themselves up over trivia, calling themselves names, self-harming) Freud conceived the super-ego. This is the bit of us that warns us against misbehaviour, and promotes conformity to social norms. Anyone who’s spent time watching chimpanzees will understand why such machinery might evolve in an animal as ultra-social as Homo sapiens.

Casual descriptions of Freud’s personality model often characterise the super-ego as a sort of wise uncle, paternalistically ushering the cadet ego out of trouble.

But this, Edmundson says, is a big mistake. A power that, in each of us, watches, discovers and criticizes all our intentions, is not a power to be taken lightly.

Edmundson argues that key cultural institutions evolved not just to regulate our appetites; they also provide direction and structure for the super-ego. A priest might raise an eyebrow at your gluttony; but that same priest will relieve you of your self-hatred by offering you a simple atonement: performing it wipes your slate clean. Edmundson wonders what, in the absence of faith, can corral and direct the fulminations of our super-ego — which in this account is not so much a fount of idealism as a petulant, unrelenting and potentially life-threatening martinet, “crude to the point of vulgarity, judgmental in the extreme, and bitterly punitive.”

The result of unmet super-ego demands is sickness. “The super-ego punishes the ego and turns it into an anxious, frightened creature, a debilitatingly depressed creature, or both by turns,” Edmundson explains, and quotes a Pew Research study showing that, from 2007 to 2017, the percentage of 12-to-17 year olds who have experienced a major depressive episode in the past year rose from 8 percent to 13 percent. Are these severely depressed teenagers “in some measure victims of the wholesale cultural repudiation of Freud”?

Arguments from intuition need a fairly hefty health warning slapped on them, but I defy you not to find yourself nodding along to more than a few of Edmundson’s philippics: for instance, how the internet became our culture’s chief manifestation of the super-ego, its loudest users bearing all the signs of possession, “immune to irony, void of humour, unforgiving, prone to demand harsh punishments.”

Half a century ago, the anthropologist Ernest Becker wrote a book, The Denial of Death, that hypothesised all manner of connections between society, behaviour and consciousness. Its informed and closely argued speculations inspired a handful of young researchers to test his ideas, and thereby revolutionise the field of experimental psychology. (An excellent book from 2015, The Worm at the Core, tells their story.)

In a culture that’s growing so pathologically judgmental, condemnatory, and punitive, I wonder whether The Age of Guilt can perform the same very valuable trick. I do hope so.

Ideas are like boomerangs

Reading In a Flight of Starlings: The Wonder of Complex Systems by Giorgio Parisi for The Telegraph, 1 July 2023

“Researchers,” writes Giorgio Parisi, recipient of the 2021 Nobel Prize in Physics, “often pass by great discoveries without being able to grasp them.” A friend’s grandfather identified and then ignored a mould that killed bacteria, and so missed out on the discovery of penicillin. This story was told to Parisi in an attempt to comfort him for the morning in 1970 he’d spent with another hot-shot physicist, Gerard ‘t Hooft, dancing around what in hindsight was a perfectly obvious application of some particle accelerator findings. Having teetered on the edges of quantum chromodynamics, they walked on by; decades would pass before either man got another stab at the Nobel. “Ideas are often like boomerangs,” Parisi explains, and you can hear the sigh in his voice; “they start out moving in one direction but end up going in another.”

In a Flight of Starlings is the latest addition to an evergreen genre: the scientific confessional. Read this, and you will get at least a frisson of what a top-flight career in physics might feel like.

There’s much here that is charming and comfortable: an eminent man sharing tales of a bygone era. Parisi began his first year of undergraduate physics in November 1966 at Sapienza University in Rome, when computer analysis involved lugging about (and sometimes dropping) metre-long drawers of punched cards.

The book’s title refers to Parisi’s efforts to compute the murmurations of starlings. Recently he’s been trying to work out how many solid spheres of different sizes will fit into a box. There’s a goofiness to these pet projects that belies their significance. The techniques developed to follow thousands of starlings through three dimensions of space and one of time bear a close resemblance to those used to solve statistical physics problems. And fitting marbles in a box? That’s a classic problem in information theory.

The implications of Parisi’s work emerge slowly. The reader, who might, in all honesty, be touched now and again by boredom, sits up straighter once the threads begin to braid.

Physics for the longest time could not handle complexity. Galileo’s model of the physical world did not include friction, not because friction was any sort of mystery, but because the mathematics of his day couldn’t handle it.

Armed with better mathematics and computational tools, physics can now study phenomena that Galileo could never have imagined would be part of physics. For instance, friction. For instance, the melting of ice, and the boiling of water: phenomena that, from the point of view of physics, are very strange indeed. Coming up with models that explain the phase transitions of more complex and disordered materials, such as glass and pitch, is something Parisi has been working on, on and off, since the middle of the 1990s.

Efforts to model more and more of the world are nothing new, but once rare successes now tumble in upon the field at a dizzying rate; almost as though physics has undergone its own phase transition. This, Parisi says, is because once two systems in different fields of physics can be described by the same mathematical structure, “a rapid advancement of knowledge takes place in which the two fields cross-fertilize.”

This has clearly happened in Parisi’s own specialism. The mathematics of disorder apply whether you’re describing why some particles try to spin in opposite directions, or why certain people sell shares that others are buying, or what happens when some dinner guests want to sit as far away from other guests as possible.

Phase transitions eloquently connect the visible and quantum worlds. Not that such connections are particularly hard to make. Once you know the physics, quantum phenomena are easy to spot. Ever wondered at a rainbow?

“Much becomes obvious in hindsight,” Parisi writes. “Yet it is striking how in both physics and mathematics there is a lack of proportion between the effort needed to understand something for the first time and the simplicity and naturalness of the solution once all the required stages have been completed.”

The striking “murmurations” of airborne starlings are created when each bird in the flock pays attention to the movements of its nearest neighbours. Obvious, no?

But as Parisi in his charming way makes clear, whenever something in this world seems obvious to us, it is likely because we are perched, knowingly or not, on the shoulders of giants.

Cute but not beautiful

Reading Silk: A history in three metamorphoses by Aarathi Prasad for the Financial Times, 27 June 2023

In 1766, two years after his arrival in Buenos Aires, Father Ramon Maria Termeyer, of the Society of Jesus, wandered on horseback through a carob forest and into a maze of spiders’ webs so strong, “they got in the way of me and my horse and made my hat fall from my head, unless I took care to break them with a rod.” Glancing around, he realised with a thrill that he was surrounded by cocoons quite as large as the spiders watching him from every branch; and it was a thrill, one might add, not of horror, but of mercantile possibility: what if these spiders could be forcibly silked?

Indeed they could: one of the stranger pictures in this strangest of histories is a contemporary diagram of a 1900 machine made to “milk” related Golden Orb spiders in Madagascar.

Readers coming to this globe-trotting and species-leaping volume expecting vignette after genteel vignette of 5000-odd years of Chinese silk manufacture are in for a very nasty shock indeed. Here be spiders, and not just spiders, but metre-long Mediterranean clams, never mind countless moth species spinning their silks everywhere from Singapore to Suriname. As the entomologist René-Antoine Ferchault de Réaumur observed in 1711, “nature does not limit itself to a few examples, even of its most singular productions.”

This sets Aarathi Prasad quite a challenge: billed as “a cultural and biological history”, Silk must flit from China, Indonesia and India to South America and Madagascar, and from there to the Mediterranean to examine Procopius of Caesarea’s “cloak made of wool, not such as produced by sheep, but gathered from the sea”.

The Chinese silkworm, Bombyx mori, necessarily holds centre stage, since it has played a leading role in our understanding of the natural world. The 17th-century naturalist and globetrotter Maria Sibylla Merian traced its lifecycle to scotch the idea that small organisms arose spontaneously out of decomposing matter. In 1807 the Italian Agostino Bassi showed how infection was transmitted from a sick caterpillar to a healthy one, in a paper that Louis Pasteur read 60 years later, as he formulated the germ theory of disease.

Then there’s the author’s own experience of these strange creatures, “cute but not beautiful”, and unable, in their adult form, to eat or defecate — “nor do they do much at all as moths, except to mate and die.” As a child Prasad used to feed her larvae with mulberry leaf paste, then watched as they spun “cradles of their own making, swaddled in kilometres of pure white silk.”

Chinese silk domestication, which began around the Yellow River, sometime in the Neolithic period, between 7,500 and 5,000 years ago, bequeathed us helpless, flightless grubs that require human intervention just to survive. Other domestication strategies were followed in India, where Antheraea paphia, still winged, still brightly coloured, spins extravagant cocoons of rough, rugged, golden tasar silk and hangs them on a “stalk” from its favourite trees. Roman writers, seeing their branches so thickly festooned, thought these silk farmers were harvesting fruit.

Weaving between these natural wonders are the human stories: of Marcello Malpighi, whose dissection of Bombyx mori for the Royal Society in the 1660s took an entire year; of Georg Eberhard Rumpf, whose survey of cocoon-producing moths in Indonesia triggered a quite surreal string of personal disasters; of Thomas Wardle, whose Midlands factory hands finally worked out how to bleach and print on tasar silk, triggering a boom in India’s silk exports.

The effort to personalise and dramatise such a wealth of unfamiliar and often downright peculiar information sometimes empurples Prasad’s prose, as when, “with her luggage ready, her younger child and a magnifying glass close at hand, Merian departed over the slick, white-capped waves of Amsterdam’s harbour on that busy seaway out of the Dutch Republic: first to the wind-bound sea channels of the island of Texel, past the Isle of Wight, the storm-worn point of Portland from whose glowing stones London’s St Paul’s Cathedral was still being built, and out of the English Channel into the treacherous, sloping swells of the Bay of Biscay.”

But flourishes like these have their charm, and did after all gift us Termeyer’s spooky, unforgettable ride among the webs of Aranea latro.

Those readers less enamoured of Prasad’s bravura scene-setting will discover more sardonic pleasures. The Austrian archduke Franz Ferdinand, it seems, owned quite the latest in personal body protection: a bullet-proof vest made of silk. On 28 June 1914 he forgot to put it on.

Scientific accounts of silk traditionally end in Arthur C Clarke territory, rather breathlessly describing how Earth-tethered space elevators made of synthetic silk will propel future astronauts into orbit. Prasad’s not holding her breath over that one, though she does treat us to a futuristic vision of “flexible and biodegradable implantable electronics that record our brain signals” and “edible sensors we could safely consume to track our fitness or the nutritional quality of our food”.

Mostly, though, Prasad is happy to admit that “more often, scientific progress is just tiresomely incremental.” Technological wonders will follow our continuing investigations, but they will do so in their own good time. One especially gratifying lesson to be drawn from this charming and absorbing book is that silks will sustain their mystery and surprise and glamour for a while yet.