“The working conditions one has to put up with here!”

Watching Peter Fleischmann’s Hard to be a God (1989) for New Scientist, 19 January 2022

The scramble for dominance in streaming video continues to heat up. As I write this, Paramount has decided to pull season 4 of Star Trek: Discovery from Netflix and screen it instead on its own platform, forcing die-hard fans to shell out for yet another streaming subscription. HBO has cancelled one Game of Thrones spin-off to concentrate on another, House of the Dragon, writing off $30,000,000 in the process. And Amazon’s Lord of the Rings prequel, set millennia before the events of The Hobbit, is reputed to be costing $450,000,000 per series — that’s five times as much to produce as Game of Thrones.

All this febrile activity has one unsung benefit for viewers; while the wheels of production slowly turn, channel programmers are being tasked with finding historical material to feed our insatiable demand for epic sci-fi and fantasy. David Lynch’s curious and compelling 1984-vintage Dune is streaming on every major service. And on Amazon Prime, you can (and absolutely should) find Peter Fleischmann’s 1989 Hard to be a God, a West German-Soviet-French-Swiss co-production based on the best known of the “Noon Universe” novels by Soviet sf writers Arkady and Boris Strugatsky.

In the Noon future (named after the brothers’ sci-fi novel Noon: 22nd Century), humankind has evolved beyond money, crime and warfare to achieve an anarchist techno-utopia. Self-appointed “Progressors” cross interstellar space with ease to guide the fate of other, less sophisticated humanoid civilisations.

It’s when earnest, dashing progressors land on these benighted and backward worlds that their troubles begin and the ethical dilemmas start to pile up.

In Hard to be a God Anton, an agent of Earth’s Institute of Experimental History, is sent to spy on the city of Arkanar, which is even now falling under the sway of Reba, the kingdom’s reactionary first minister. Palace coups, mass executions and a peasant war drive Anton from his position of professional indifference, first to depression, drunkenness and despair, and finally to a fiery and controversial commitment to Arkanar’s weakling revolution.

So far, so schematic: but the novel has a fairly sizeable sting in its tale, and this is admirably brought to the fore in Fleischmann’s screenplay (co-written with Jean-Claude Carrière, best known for his work with Luis Buñuel).

Yes, progressors like Anton have evolved past their propensity for violence; but in consequence, they have lost the knack of ordinary human sympathy. “The working conditions one has to put up with here!” complains Anton’s handler, fighting with a collapsible chair while, on the surveillance screen behind him, Reba’s inquisition beats a street entertainer nearly to death.

Anton — in an appalled and impassioned performance by the dashing Polish actor Edward Zentara — comes at last to understand his advanced civilisation’s dilemma: “We were able to see everything that was happening in the world,” he tells a companion from Arkanar, breaking his own cover as he does so. “We saw all the misery, but couldn’t feel sympathy any more. We had our meals while seeing pictures of starving people in front of us.”

Anton’s terrible experiences in strife-torn Arkanar (where every street boasts a dangling corpse) do not go unremarked. Earth’s other progressors, watching Anton from orbit, do their best to overcome their limitations. But the message here is a serious one: virtue is something we have to strive for in our lives, not a state we can attain as some sort of birthright.

Comparable to Lynch’s Dune in its ambition, and far more articulate than that cruelly cut-about effort, Fleischmann’s upbeat but moving Hard to be a God reminds us that cinema in the 1980s set later sci-fi movie makers a very high bar indeed. We can only hope that this year’s TV epics and cinema sequels take as much serious effort over their stories as they are taking over their production design.

Whatever happened to Mohammedan Hindus?

Reading Anna Della Subin’s Accidental Gods: On Men Unwittingly Turned Divine for the Telegraph, 8 January 2022

He is a prince of Greece – but he is not Greek. He is a man of Danish, German and Russian blood, but he springs from none of those places. Who is he? Prince Philip, of blessed memory, consort of Queen Elizabeth II? Or is he – as a handful of her subjects, half a world away, would have it – the son of Vanuatu’s volcano god Kalbaben?

Essayist Anna Della Subin wants you to understand why you might mistake a man for a god; why this happens more often than you’d think; and what this says about power, and identity, and about colonialism in particular.

Early proofs of Accidental Gods arrived on my doormat on Tuesday 2 November, the same day QAnon believers gathered in Dallas’s Dealey Plaza to await the resurrection of JFK’s son John (dead these 20 years). So: don’t sneer. This kind of thing can happen to anyone. It can happen now. It can happen here.

Men have been made divine by all manner of people, all over the world. Ranging widely across time and space, Accidental Gods is a treat for the adventurous armchair traveller, though a disconcerting one. We are reminded, with some force, that even the most sophisticated-seeming culture exists, by and large, to contain ordinary human panic in the face of an uncaring cosmos.

After the second world war, during the Allied occupation, ordinary Japanese folk plied American General Douglas MacArthur with lotus roots and dried persimmons, red beans, rice cakes, bonsai trees, walking sticks, samurai swords, deerskins, a kimono, and much else besides. These were offerings, explicitly made to a newcomer God. Now, we more often talk about them as acts of gratitude and respect. This is just ordinary decency — why would one poke fun at a land one has already nuked, defeated, and occupied? Japan’s written historical record lets us focus on the Meiji dynasty’s politics while drawing a veil over its frankly embarrassing theology.

But not everyone has such a rich political account of themselves to hide behind. In the early 1920s Hauka mediums in Niger, West Africa, were possessed by the spirits of their European conquerors. Their zombified antics were considered superstitious and backward. But were they? They managed, after all, to send up the entire French administration. (“In the absence of a pith helmet,” we are told, “they could fashion one out of a gourd”.) In the Congolese town of Kabinda, meanwhile, the wives of shamanic adepts found themselves channelling the spirits of Belgian settler wives. Their faces chalked and with bunches of feathers under their arms (“possibly to represent a purse”) they went around shrilly demanding bananas and hens.

Western eye-witnesses of these events weren’t at all dismissive; they were disturbed. One visitor, reporting to parliament in London in or before 1886, said these people were being driven mad by the experience of colonial subjection. Offerings made to a deified British soldier in Travancore, at India’s southernmost point, were, according to this traveller, “an illustration of the horror in which the English were held by the natives.”

But what if the prevailing motive for the white man’s deification was “not horror or dislike, but pity for his melancholy end, dying as he did in a desert, far away from friends”? That was the contrary opinion of a visiting missionary, and he may have had a point: across the subcontinent, “the practice of deifying humans who had died in premature or tragic ways was age-old,” Subin tells us.

Might the “spirit possessed” just have been having a laugh? Again: it’s possible. In 1864, during a Māori uprising against the British, Captain P. W. J. Lloyd was killed, and his severed head became the divine conduit for the angel Gabriel, who, among other fulminations, had not one good word to say about the Church of England.

Subin shows how, by creating and worshipping powerful outsiders, subject peoples have found a way to contend with an overwhelming invading force. The deified outsider, be he a British Prince or a US general, Ethiopian emperor Haile Selassie or octogenarian poet Nathaniel Tarn, “appears on every continent on the map, at times of colonial invasion, nationalist struggle and political unrest.”

This story is as much about the colonisers as the conquered, as much about the present as the past, showing how the religious and the political shade into each other so that “politics is ever a continuation of the sacred under a new name”. Perhaps this is why Subin, while no enthusiast of Empire, takes aim less at the soldiers and settlers and missionaries – who at least took some personal risk and kept their eyes open – than at the academics back home in Europe, and in particular the intellectual followers and cultural descendants of German philologist Friedrich Max Müller, founder of the science of comparative religion. Their theories imposed, on wholly unrelated belief systems, a set of Protestant standards that, among other things, insisted on the insuperable gulf between the human and the divine. (Outside of Christian Europe, this divide hardly exists, and even Catholics have their saints.)

So Europe’s new-fangled science of religion “invented what it purported to describe”, ascribing “belief” to all manner of nuanced behaviours that expressed everything from contempt for the overlord to respect for the dead, to simple human charity. Subin quotes contemporary philosopher Bruno Latour: “A Modern is someone who believes that others believe.”

Subin sings a funeral hymn to religions that ossified. Writing about the catastrophic Partition of India along religious lines, she writes, “There was no place within this modern taxonomy for the hundreds of thousands who labeled themselves ‘Mohammedan Hindus’ on a 1911 census, or for those who worshipped the prophet Muhammad as an avatar of Vishnu.”

Accidental Gods is a playful, ironic, and ambiguous book about religion, at a time when religion – outside of Dealey Plaza – has grown as solemn as an owl. It’s no small achievement for Subin to have written something that, even as it explores the mostly grim religious dimensions of the colonial experience, does not reduce religion to politics but, to the contrary, leaves us hankering, like QAnon’s unlovely faithful, for a wider, wilder pantheon.

How to live an extra life

Reading Sidarta Ribeiro’s The Oracle of Night: The History and Science of Dreams for the Times, 2 January 2022

Early in January 1995 Sidarta Ribeiro, a Brazilian student of neuroscience, arrived in New York City to study for his doctorate at Rockefeller University. He rushed enthusiastically into his first meeting — only to discover he could not understand a word people were saying. He had, in that minute, completely forgotten the English language.

It did not return. He would turn up for work, struggle to make sense of what was going on, and wake up, hours later, on his supervisor’s couch. The colder and snowier the season became, the more impossible life got until, “when February came around, in the deep silence of the snow, I gave in completely and was swallowed up into the world of Morpheus.”

Ribeiro struggled into lectures so he didn’t get kicked out; otherwise he spent the entire winter in bed, sleeping; dozing; above all, dreaming.

April brought a sudden and extraordinary recovery. Ribeiro woke up understanding English again, and found he could speak it more fluently than ever before. He befriended colleagues easily, drove research, and, in time, announced the first molecular evidence of Freud’s “day residue” hypothesis, in which dreams exist to process memories of the previous day.

Ribeiro’s rich dream life that winter convinced him that it was the dreams themselves — and not just the napping — that had wrought a cognitive transformation in him. Yet dreams, it turned out, had fallen almost entirely off the scientific radar.

The last dream researcher to enter public consciousness was probably Sigmund Freud. Freud at least seemed to draw coherent meaning from dreams — dreams that had been focused to a fine point by fin de siècle Vienna’s intense milieu of sexual repression.

But Freud’s “royal road to the unconscious” has been eroded since by a revolution in our style of living. Our great-grandparents could remember a world without artificial light. Now we play on our phones until bedtime, then get up early, already focused on a day that is, when push comes to shove, more or less identical to yesterday. We neither plan our days before we sleep, nor do we interrogate our dreams when we wake. Is it any wonder, then, that our dreams are no longer able to inspire us? When US philosopher Owen Flanagan says that “dreams are the spandrels of sleep”, he speaks for almost all of us.

Ribeiro’s distillation of his life’s work offers a fascinating corrective to this reductionist view. His experiments have made Freudian dream analysis and other elements of psychoanalytic theory definitively testable for the first time — and the results are astonishing. There is material evidence, now, for the connection Freud made between dreaming and desire: both involve the selective release of the brain chemical dopamine.

The middle chapters of The Oracle of Night focus on the neuroscience, capturing, with rare candour, all the frustrations, controversies, alliances, ambiguities and accidents that make up a working scientist’s life.

To study dreams, Ribeiro explains, is to study memories: how they are received in the hippocampus, then migrate out through surrounding cortical tissue, “burrowing further and further in as life goes on, ever more extensive and resistant to disturbances”. This is why some memories can survive, even for more than a hundred years, in a brain radically altered by the years.

Ribeiro is an excellent communicator of detail, and this is important, given the size and significance of his claims. “At their best,” he writes, “dreams are the actual source of our future. The unconscious is the sum of all our memories and of all their possible combinations. It comprises, therefore, much more than what we have been — it comprises all that we can be.”

To make such a large statement stick, Ribeiro is going to need more than laboratory evidence, and so his scientific account is generously bookended with well-evidenced anthropological and archaeological speculation. Dinosaurs enjoyed REM sleep, apparently — a delightfully fiendish piece of deduction. And was the Bronze Age Collapse, around 1200 BC, triggered by a qualitative shift in how we interpreted dreams?

These are sizeable bread slices around an already generous Christmas-lunch sandwich. On page 114, when Ribeiro declares that “determining a point of departure for sleep requires that we go back 4.5 billion years and imagine the conditions in which the first self-replicating molecules appeared,” the poor reader’s heart may quail and their courage falter.

A more serious obstacle — and one quite out of Ribeiro’s control — is that friend (we all have one) who, feet up on the couch and both hands wrapped around the tea, baffs on about what their dreams are telling them. How do you talk about a phenomenon that’s become the preserve of people one would happily emigrate to avoid?

And yet, by taking dreams seriously, Ribeiro must also talk seriously about shamanism, oracles, prediction and mysticism. This is only reasonable, if you think about it: dreams were the source of shamanism (one of humanity’s first social specialisations), and shamanism in its turn gave us medicine, philosophy and religion.

When lives were socially simple and threats immediate, the relevance of dreams was not just apparent; it was impelling. Even a stopped watch is correct twice a day. With a limited palette of dream materials to draw from, was it really so surprising that Rome’s first emperor Augustus found his rise to power predicted by dreams — at least according to his biographer Suetonius? “By simulating objects of desire and aversion,” Ribeiro argues, “the dream occasionally came to represent what would in fact happen”.

Growing social complexity enriches dream life, but it also fragments it (which may explain all those complaints, found in texts dated between 1200 and 800 BC, that the gods have fallen silent). The dreams typical of our time, says Ribeiro, are “a blend of meanings, a kaleidoscope of wants, fragmented by the multiplicity of desires of our age”.

The trouble with a book of this size and scale is that the reader, feeling somewhat punch-drunk, can’t help but wish that two or three better books had been spun from the same material. Why naps are good for us, why sleep improves our creativity, how we handle grief — these are instrumentalist concerns that might, under separate covers, have greatly entertained us. In the end, though, I reckon Ribeiro made the right choice. Such books give us narrow, discrete glimpses into the power of dreams, but leave us ignorant of their real nature. Ribeiro’s brick of a book shatters our complacency entirely, and for good.

Dreaming is a kind of thinking. Treating dreams as spandrels — as so much psychic “junk code” — is not only culturally illiterate; it runs against everything current science is telling us. You are a dreaming animal, says Ribeiro, for whom “dreams are like stars: they are always there, but we can only see them at night”.

Keep a dream diary, Ribeiro insists. So I did. And as I write this, a fortnight on, I am living an extra life.

“A perfect storm of cognitive degradation”

Reading Johann Hari’s Stolen Focus: Why you can’t pay attention for the Telegraph, 2 January 2022

Drop a frog into boiling water, and it will leap from the pot. Drop it into tepid water, brought slowly to the boil, and the frog will happily let itself be cooked to death.

Just because this story is nonsense, doesn’t mean it’s not true — true of people, I mean, and their tendency to acquiesce to poorer conditions, just so long as these conditions are introduced slowly enough. (Remind yourself of this next time you check out your own groceries at the supermarket.)

Stolen Focus is about how our environment is set up to fracture our attention. It starts with our inability to set the notifications correctly on our mobile phones, and ends with climate change. Johann Hari thinks a huge number of pressing problems are fundamentally related, and that the human mind is on the receiving end of what amounts to a denial-of-service attack. One of Hari’s many interviewees is Earl Miller from MIT, who talks about “a perfect storm of cognitive degradation, as a result of distraction”; to which Hari adds the following devastating gloss: “We are becoming less rational, less intelligent, less focused.”

To make such a large argument stick, though, Hari must ape the wicked problem he’s addressing: he must bring the reader to a slow boil.

Stolen Focus begins with an extended grumble about how we don’t read as many books as we used to, or buy as many newspapers, and how we are becoming increasingly enslaved to our digital devices. Why we should listen to Hari in particular, admittedly a latecomer to the “smartphones bad, books good” campaign, is not immediately apparent. His account of his own months-long digital detox — idly beachcombing the shores of Provincetown at the northern tip of Cape Cod, War and Peace tucked snugly into his satchel — is positively maddening.

What keeps the reader engaged are the hints (very well justified, it turns out) that Hari is deliberately winding us up.

He knows perfectly well that most of us have more or less lost the right to silence and privacy — that there will be no Cape Cod for you and me, in our financial precarity.

He also knows, from bitter experience, that digital detoxes don’t work. He presents himself as hardly less of a workaholic news-freak than he was before taking off to Massachusetts.

The first half of Stolen Focus got me to sort out my phone’s notification centre, and that’s not nothing; but it is, in the greater scheme of Hari’s project, hardly more than a parody of the by now very familiar “digital diet book” — the sort of book that, as Hari eventually points out, can no more address the problems filling this book than a diet book can address epidemic obesity.

Many of the things we need to do to recover our attention and focus “are so obvious they are banal,” Hari writes: “slow down, do one thing at a time, sleep more… Why can’t we do the obvious things that would improve our attention? What forces are stopping us?”

So, having had his fun with us, Hari begins to sketch in the high sides of the pot in which he finds us being coddled.

The whole of the digital economy is powered by breaks in our attention. The finest minds in the digital business are being paid to create ever-more-addictive experiences. According to former Google engineer Tristan Harris, “we shape more than eleven billion interruptions to people’s lives every day.” Aza Raskin, co-founder of the Center for Humane Technology, calls the big tech companies “the biggest perpetrators of non-mindfulness in the world.”

Social media is particularly insidious, promoting outrage among its users because outrage is wildly more addictive than real news. Social media also promotes loneliness. Why? Because lonely people will self-medicate with still more social media. (That’s why Facebook never tells you which of your friends are nearby and up for a coffee: Facebook can’t make money from that.)

We respond to the anger and fear a digital diet instils with hypervigilance, which wrecks our attention even further and damages our memory to boot. If we have children, we’ll keep them trapped at home “for their own safety”, though our outdoor spaces are safer than they have ever been. And when that carceral upbringing shatters our children’s attention (as it surely will), we stuff them with drugs, treating what is essentially an environmental problem. And on and on.

And on. The problem is not that Stolen Focus is unfocused, but that it is relentless: an unfeasibly well-supported undergraduate rant that swells — as the hands of the clock above the bar turn round and the beers slide down — to encompass virtually every ill on the planet, from rubbish parenting to climate change.

“If the ozone layer was threatened today,” writes Hari, “the scientists warning about it would find themselves being shouted down by bigoted viral stories claiming the threat was all invented by the billionaire George Soros, or that there’s no such thing as the ozone layer anyway, or that the holes were really being made by Jewish space lasers.”

The public campaign Hari wants Stolen Focus to kick-start (there’s an appendix; there’s a weblink; there’s a newsletter) involves, among other things, a citizen’s wage, outdoor play, limits on light pollution, public ownership of social media, changes in the food supply, and a four-day week. I find it hard to disagree with any of it, but at the same time I can’t rid myself of the image of how, spiritually refreshed by War and Peace, consumed in just a few sittings in a Provincetown coffee shop, Hari must (to quote Stephen Leacock) have “flung himself from the room, flung himself upon his horse and rode madly off in all directions”.

If you read just one book about how the modern world is driving us crazy, read this one. But why would you read just one?

Stone the Fool and others

Reading Stars and Spies: Intelligence Operations and the Entertainment Business by Christopher Andrew and Julius Green for the Spectator, 18 December 2021

On 2 October 2020, when he became chief of the UK Secret Intelligence Service (MI6, if you prefer), Richard Moore tweeted (*tweeted!*):

#Bond or #Smiley need not apply. They’re (splendid) fiction but actually we’re #secretlyjustlikeyou.

The gesture’s novelty disguised, at the time, its appalling real-world implications: Bond was, after all, competent; and Smiley had integrity.

Stars and Spies, by veteran intelligence historian Christopher Andrew and theatre director and circus producer Julius Green, is a thoroughly entertaining read, but not at all a reassuring one. “The adoption of a fictional persona, the learning of scripts and the ability to improvise” are central to career progression in both theatre and espionage, the writers explain, “and undercover agents often find themselves engaged in what is effectively an exercise in long-form role play.”

It should, then, come as no surprise that this book boasts “no shortage of enthusiastic but inept entertainer-spies”.

There’s Aphra Behn (“Astrea”, Agent 160), the first woman employed as a secret agent by the British state during the Second Anglo-Dutch War in 1665: reaping no secret intelligence from her former lover, she made stuff up.

As, indeed, did “The Man Called Intrepid”, Sir William Stephenson, subject, in 1976, of the biggest-selling book ever on intelligence history. His recollections, spanning everything from organising wartime resistance in Europe to developing the Spitfire and the jet engine, work on the German Enigma code, and developing nuclear weapons, turned out to be the melancholy fabulations of a man suffering catastrophic memory loss.

The authors imagine that their subject — the intersection between spying and acting — is entertaining enough that they can simply start in the England of Good Queen Bess and Christopher Marlowe (recruited to spy for Walsingham while a student at Cambridge; also wrote a play or two), and end with the ludicrous antics (and — fair’s fair — brilliant acting) of US spy show Homeland.

And, by and large, they’re right. Begin at the beginning; end at the end. Why gild the lily with anything so arduous as an argument, when your anecdotes are this engaging? (Daniel Defoe’s terrifying plans for a surveillance state were scotched because the government’s intelligence budget was being siphoned off to keep Charles II’s mistresses quiet; and why were the British establishment so resistant to the charms of Soviet ballerinas?)

This approach does, however, leave the authors’ sense of proportion open to question. They’re not wrong to point out that “the most theatrical innovations pioneered by Stalinist intelligence were the show trials”, but in the context of so many Corenesque quasi-theatrical anecdotes, this observation can’t help but feel a bit cheap.

Once the parallels between spying and acting have been pointed out, the stories told here (many of them the fruit of fairly arduous primary research) sometimes come across as slightly fatuous. Why should the popular broadcaster Maxwell Knight not be a powerful recruiter of spies during the inter-war years? There’s nothing counter-intuitive here, if you think about the circles Knight must have moved in.

We are on surer ground when the authors measure the sharp contrast between fictional spies and their real-life counterparts. In the movies, honeypots abound, still rehashing the myths attaching to the courageous World War One French spy Mistinguett and the sadly deluded Margaretha Zelle (Mata Hari).

In truth, though, and for the longest while, women in this business have been more middle management than cat-suited loot. Recruited largely from Oxford’s women’s colleges and Cheltenham Ladies’ College, women played a more important part in the Security Service than in any other wartime government department, and for years, we are told, the service has been recruiting more women at officer and executive level than any other branch of government.

As for seduction and pillow-talk, even a fleeting acquaintance with men in their natural environment will tell us that, as Maxwell Knight put it, “Nothing is easier than for a woman to gain a man’s confidence by the showing and expression of a little sympathy… I am convinced,” he went on, “that more information has been obtained by women agents by keeping out of the arms of a man, than was ever obtained by willingly sinking into them.”

Fuelled by Erskine Childers’s peerless spy novel The Riddle of the Sands (1903), by Somerset Maugham’s Ashenden stories and by everything Fleming ever wrote, of course the audience for espionage drama hankers for real-life insight from writers “in the know”. And if the writer complains that the whole espionage industry is a thing of smoke and mirrors, well, we’ll find that fascinating too. (In Ben Jonson’s spy farce Volpone, Sir Pol, on being told of the death of Stone the Fool, claims that Stone actually ran a sophisticated spy ring which communicated by means of dead drops hidden in fruit and vegetables. Eat your heart out, Le Carré.)

Andrew and Green, who both at different times studied history at Corpus Christi, Christopher Marlowe’s old college, are not really giving us the inside track. I would go so far as to say that they are not really telling us anything new. But they marshal their rare facts splendidly, and use them to spin ripping yarns.

“Von Neumann proves what he wants”

Reading Ananyo Bhattacharya’s The Man from the Future for The Telegraph, 7 November 2021

Neumann János Lajos, born in Budapest in 1903 to a wealthy Jewish family, negotiated some of the most lethal traps set by the twentieth century, and did so with breathtaking grace. Not even a painful divorce could dent his reputation for charm, reliability and kindness.

A mathematician with a vice-like memory, he survived the rise of Nazism, and saved others from it. He left Europe and joined Princeton’s Institute for Advanced Study when he was just 29. He worked on ballistics and the atom bomb during the Second World War, and on the hydrogen bomb during the Cold War. Disturbed yet undaunted by the prospect of nuclear armageddon, he still found time to develop game theory, to rubbish economics, and to establish artificial intelligence as a legitimate discipline.

He died plain ‘Johnny von Neumann’, in 1957, at the Walter Reed Army Medical Center in Washington, surrounded by heavy security in case, in his final delirium, he spilled any state secrets.

Following John von Neumann’s life is rather like playing chess against a computer: he has all the best moves already figured out. ‘A time traveller,’ Ananyo Bhattacharya calls him, ‘quietly seeding ideas that he knew would be needed to shape the Earth’s future.’ Mathematician Rózsa Péter’s assessment of von Neumann’s powers is even more unsettling: ‘Other mathematicians prove what they can,’ she declared; ‘von Neumann proves what he wants.’

Von Neumann had the knack (if we can use so casual a word) of reducing a dizzying variety of seemingly intractable technical dilemmas to problems in logic. In Göttingen he learned from David Hilbert how to think systematically about mathematics, using step-by-step, mechanical procedures. Later he used that insight to play midwife to the computer. In between he rendered the new-fangled quantum theory halfway comprehensible (by explaining how Heisenberg’s and Schrödinger’s wildly different quantum models said the same thing); then, at Los Alamos, he helped perfect the atom bomb and co-invented the unimaginably more powerful H-bomb.

He isn’t even dull! The worst you can point to is some mild OCD: Johnny fiddles a bit too long with the light switches. Otherwise — what? He enjoys a drink. He enjoys fast cars. He’s jolly. You can imagine having a drink with him. He’d certainly make you feel comfortable. Here’s Edward Teller in 1966: ‘Von Neumann would carry on a conversation with my three-year-old son, and the two of them would talk as equals, and I sometimes wondered if he used the same principle when he talked to the rest of us.’

In embarking on his biography of von Neumann, then, Bhattacharya sets himself a considerable challenge: writing about a man who, through crisis after crisis, through stormy intellectual disagreements and amid political controversy, contrived always, for his own sake and others’, to avoid unnecessary drama.

What’s a biographer to do, when part of his subject’s genius is his ability to blend in with his friends, and lead a good life? How to dramatise a man without flaws, who skates through life without any of the personal turmoil that makes for gripping storytelling?

If some lives resist the storyteller’s art, Ananyo Bhattacharya does a cracking job of hiding the fact. He sensibly, and very ably, moves the biographical goal-posts, making this not so much the story of a flesh-and-blood man, more the story of how an intellect evolves, moving as intellects often do (though rarely so spectacularly) from theoretical concerns to applications to philosophy. ‘As he moved from pure mathematics to physics to economics to engineering,’ observed former colleague Freeman Dyson, ‘[Von Neumann] became steadily less deep and steadily more important.’

Von Neumann did not really trust humanity to live up, morally, to its technical capacities. ‘What we are creating now,’ he told his wife, after a sleepless night contemplating an H bomb design, ‘is a monster whose influence is going to change history, provided there is any history left.’ He was a quintessentially European pessimist, forged by years that saw the world he had grown up in being utterly destroyed. It is no fanciful ‘man from the future’, and no mere cynic, who writes, ‘We will be able to go into space way beyond the moon if only people could keep pace with what they create.’

Bhattacharya’s agile, intelligent, intellectually enraptured account of John von Neumann’s life reveals, after all, not “a man from the future”, not a one-dimensional cold-war warrior and for sure not Dr Strangelove (though Peter Sellers nicked his accent). Bhattacharya argues convincingly that von Neumann was a man in whose extraordinarily fertile head the pre-war world found an all-too-temporary lifeboat.

“A moist and feminine sucking”

Reading Susan Wedlich’s Slime: A natural history for the Times, 6 November 2021

For over two thousand years, says science writer Susan Wedlich, quoting German historian Richard Hennig, maritime history has been haunted by mention of a “congealed sea”. Ships, it is said, have been caught fast and even foundered in waters turned to slime.

Slime stalks the febrile dreams of landlubbers, too: Jean-Paul Sartre succumbed to its “soft, yielding action, a moist and feminine sucking”, in a passage, lovingly quoted here, that had this reader instinctively scrabbling for the detergent.

We’ve learned to fear slime, in a way that would have seemed quite alien to the farmers of ancient Egypt, who supposed slime and mud were the base materials of life itself. So, funnily enough, did German zoologist Ernst Haeckel, a champion of Charles Darwin, who saw primordial potential in the gelatinous lumps being trawled from the sea floor by various oceanographic expeditions. (This turned out to be calcium sulphate, precipitated by the chemical reaction between deep-sea mud and the alcohol used for the preservation of aquatic specimens. Haeckel never quite got over his disappointment.)

For Susan Wedlich, it is not enough that we should learn about slime; nor even that we should be entertained by it (though we jolly well are). Wedlich wants us to care deeply about slime, and musters all the rhetorical resources at her disposal to achieve her goal. “Does even the word ‘slime’ have to elicit gagging histrionics?” she exclaims, berating us for our phobia: “if we neither recognize nor truly know slime, how are we supposed to appreciate it or use it for our own ends?”

This is overdone. Nor do we necessarily know enough about slime to start shouting about it. To take one example, using slime to read our ecological future turns out to be a vexed business. There’s a scum of nutrients held together by slime floating on top of the oceans. A fraction of a millimetre thick, it’s called the “sea-surface micro-layer”. Global warming might be thinning it, or thickening it, and doing either might be increasing the chemical transport taking place between air and ocean — or retarding it — to unknown effect. So there: yet another thing to worry about.

For sure, slime holds the world together. Slimes, rather: there are any number of ways to stiffen water so that it acts as a lubricant, a glue, or a barrier. Whatever its origins, it is most conspicuous when it disappears — as when overtilling of America’s Great Plains caused the Dust Bowl of the 1930s, or when the gluey glycan coating of one’s blood vessels starts to mysteriously shear away during surgery.

There was a moment, in the 1920s, when slime shed its icky materiality and became almost cool. Artists both borrowed from and inspired Haeckel’s exquisite drawings of delicate maritime invertebrates. And biologists, looking for the mechanisms underpinning memory and heredity, would have liked nothing more than to find that the newly-identified protoplasm within our every cell was recording, like an Edison drum, the tremblings of a ubiquitous, information-rich aether. (Sounds crazy now, but the era was, after all, bathing in X-rays and other newly-discovered radiations.)

But slime’s moment of modishness passed. Now it’s the unlovely poster-child of environmental degradation: the stuff that will fill our soon-to-be-empty oceans, “home only to jellyfish, algae and microbial mats”, if we don’t do something sharpish to change our ecological ways.

Hand in hand with such millennial anxieties, of course, come the usual power fantasies: that we might harness all this unlovely slime — nothing more than water held in a cage of a few long-chain polymers — to transform our world, providing the base for new materials and soft robots, “transparent, stretchable, locomotive, biocompatible, remote-controlled, weavable, wearable, self-healing and shape-morphing, 3D-printed or improved by different ingredients”.

Wedlich’s enthusiasm is by no means misplaced. Slime is not just a largely untapped wonder material. It is also — really, truly — the source of life, and a key enabler of complex forms. We used to think the machinery of the first cells must have arisen in clay hydrogels — a rather complicated and unlikely genesis — but it turns out that nucleic acids like DNA and RNA can sometimes form slimes on their own. Life does not need a substrate on which to arise. It is its own sticky home.

Slime’s effective barrier to pathogens may then have enabled complex tissues to differentiate and develop, slickly sequestered from a disease-ridden outside world. Wedlich’s tour of the human gut, and its multiple slime layers (some lubricant, some gluey, and many armed with extraordinary electrostatic and molecular traps for one pathogen or another), is a tour de force of clear and gripping explanation.

Slime being, in essence, nothing more than stiffened water, there are more ways to make it than the poor reader could ever bear to hear about. So Wedlich very sensibly approaches her subject from the other direction, introducing slimes through their uses. Snails combine gluey and lubricating slimes to travel over dry ground one moment, cling to the underside of a leaf the next. Hagfish deter predators by jellifying the waters around them, shooting polymers from their skin like so many thousands of microscopic harpoons. Some squid, when threatened, add slime to their ink to create pseudomorphs — fake squidoids that hold together just long enough to distract a predator. Some squid pump out whole legions of such doppelgangers.

Wedlich’s own strategy, in writing Slime, is not dissimilar. She’s deliberately elusive. The reader never really feels they’ve got hold of the matter of her book; rather, they’re being provoked into punching through layer after dizzying layer, through masterpieces of fin de siècle glass-blowing into theories about the spontaneous generation of life, through the lifecycles of carnivorous plants into the tactics of Japanese balloon-bomb designers in the second world war, until, dizzy and gasping, they reach the end of Wedlich’s extraordinary mystery tour, not with a handle on slime exactly, but with an elemental and exultant new vision of what life may be: that which arises when the boundaries of earth, air and water are stirred in sunlight’s fire. It’s a vision that, for all its weight of well-marshalled modern detail, is one Aristotle would have recognised.

Citizen of nowhere

Watching Son of Monarchs for New Scientist, 3 November 2021

“This is you!” says Bob, Mendel’s boss at a genetics laboratory in New York City. He holds the journal out for his young colleague to see: on its cover there’s a close-up of the wing of a monarch butterfly. The cover-line announces the lab’s achievement: they have shown how the evolution and development of butterfly colour and iridescence are controlled by a single master regulatory gene.

Bob (William Mapother) sees something is wrong. Softer now: “This is you. Own it.”

But Mendel, Bob’s talented Mexican post-doc (played by Tenoch Huerta, familiar from the Netflix series Narcos: Mexico), is near to tears.

Something has gone badly wrong in Mendel’s life. And he’s no more comfortable back home, in the butterfly forests of Michoacán, than he was in Manhattan. In some ways things are worse. Even at their grandmother’s funeral, his brother Simon (Noé Hernández) won’t give him an inch. At least the lab was friendly.

Bit by bit, through touching flashbacks, some disposable dream sequences and one rather overwrought row, we learn the story: how, when Mendel and Simon were children, a mining accident drowned their parents; how their grandmother took them in, but things were never the same; how Simon went to work for the predatory company responsible for the accident, and has ever since felt judged by his high-flying, science-whizz, citizen-of-nowhere brother.

When Son of Monarchs premiered at this year’s Sundance Film Festival, critics picked up on its themes of borders and belonging, the harm walls do and all the ways nature undermines them. Mendel grew up in a forest alive with clouds of monarch butterflies. (In the film the area, a national reserve, is threatened by mining; these days, tourism is arguably the bigger threat.) Sarah, Mendel’s New York girlfriend (Alexia Rasmussen; note-perfect but somewhat under-used) is an amateur trapeze artist. The point — that airborne creatures know no frontiers — is clear enough; just in case you missed it, a flashback shows young Mendel and young Simon in happier days, discussing humanity’s airborne future.

In a strongly scripted film, such gestures would have been painfully heavy-handed. Here, though, they’re pretty much all the viewer has to go on in this sometimes painfully indirect film.

The plot does come together, though, through the character of Mendel’s old friend Vicente (a stand-out performance by the relative unknown Gabino Rodríguez). While muddling along like everyone else in the village of Angangueo (the real-life site, in 2010, of some horrific mine-related mudslides), Vicente has been developing peculiar animistic rituals. His unique brand of masked howling seems jolly silly at first glance — just a backwoodsman’s high spirits — but as the film advances, we realise that these rituals are just what Mendel needs.

For a man trapped between worlds, Vicente’s rituals offer a genuine way out: a way to re-engage imaginatively with the living world.

So, yes, Son of Monarchs is, on one level, about identity, about how a cosmopolitan high-flier learns to be a good son of Angangueo. But more than that, it’s about personality: about how Mendel learns to live both as a scientist, and as a man lost among butterflies.

French-Venezuelan filmmaker Alexis Gambis is himself a biologist and founded the Imagine Science Film Festival. While Son of Monarchs is steeped in colour, and full of cinematographer Alejandro Mejía’s mouth-watering (occasionally stomach-churning) macro-photography of butterflies and their pupae, ultimately this is a film, not about the findings of science, but about science as a vocation.

Gambis’s previous feature, The Fly Room (2014) was about the inspiration a 10-year-old girl draws from visits to T H Morgan’s famous (and famously cramped) “Fly Room” drosophila laboratory. Son of Monarchs asks what can be done if inspiration dries up. It is a hopeful film and, on more than the visual level, a beautiful one.

Chemistry off the leash

Reading Sharon Ruston’s The Science of Life and Death in Frankenstein for New Scientist, 27 October 2021

In 1817, in a book entitled Experiments on Life and its Basic Forces, the German natural philosopher Carl August Weinhold explained how he had removed the brain from a living kitten, and then inserted a mixture of zinc and silver into the empty skull. The animal “raised its head, opened its eyes, looked straight ahead with a glazed expression, tried to creep, collapsed several times, got up again, with obvious effort, hobbled about, and then fell down exhausted.”

The following year, Mary Shelley’s Frankenstein captivated a public not at all startled by its themes, but hungry for horripilating thrills and avid for the author’s take on arguably the most pressing scientific issue of the day. What was the nature of this strange zone that had opened up between the worlds of the living and the dead?

Three developments had muddied this once obvious and clear divide: in revolutionary France, the flickers of life exhibited by freshly guillotined heads; in Edinburgh, the black market in fresh (and therefore dissectable) corpses; and on the banks of busy British rivers, attempts (encouraged by the Royal Humane Society) to breathe life into the recently drowned.

Ruston covers this familiar territory well, then goes much further, revealing Mary Shelley’s iron grip on the scientific issues of her day. Frankenstein was written just as the idea of life’s material basis was emerging. Properties once considered unique to living things were turning out to be common to all matter, both living and unliving. Ideas about electricity offer a startling example.

For more than a decade, from 1780 to the early 1790s, it had seemed to researchers that animal life was driven by a newly discovered life source, dubbed ‘animal electricity’. This was a notion cooked up by the Bologna-born physician Luigi Galvani to explain a discovery he had made in 1780 with his wife Lucia. They had found that the muscles of dead frogs’ legs twitch when struck by an electrical spark. Galvani concluded that living animals possessed their own kind of electricity. The distinction between ‘animal electricity’ and metallic electricity didn’t hold for long, however. By placing discs of different metals on his tongue, and feeling the jolt, Alessandro Volta showed that electricity flows between two metals through biological tissue.

Galvani’s nephew, Giovanni Aldini, took these experiments further in spectacular, theatrical events in which corpses of hanged murderers attempted to stand or sit up, opened their eyes, clenched their fists, raised their arms and beat their hands violently against the table.

As Ruston points out, Frankenstein’s anguished description of the moment his Creature awakes “sounds very like the description of Aldini’s attempts to resuscitate 26-year-old George Forster”, hanged for the murder of his wife and child in January 1803.

Frankenstein cleverly clouds the issue of exactly what form of electricity animates the creature’s corpse. Indeed, the book (unlike the films) is much more interested in the Creature’s chemical composition than in its animation by a spark.

There are, Ruston shows, many echoes of Humphry Davy’s 1802 Course of Chemistry in Frankenstein. It’s not for nothing that Frankenstein’s tutor Professor Waldman tells him that chemists “have acquired new and almost unlimited powers”.

An even more intriguing contemporary development was the ongoing debate between the surgeon John Abernethy and his student William Lawrence at the Royal College of Surgeons. Abernethy claimed that electricity was the “vital principle” underpinning the behaviour of organic matter. Nonsense, said Lawrence, who saw in living things a principle of organisation. Lawrence was an early materialist, and his patent atheism horrified many. The Shelleys were friendly with Lawrence, and helped him weather the scandal engulfing him.

The Science of Life and Death is both an excellent introduction and a serious contribution to understanding Frankenstein. Through Ruston’s eyes, we see how the first science fiction novel captured the imagination of its public.

Life dies at the end

Reading Henry Gee’s A (Very) Short History of Life on Earth for the Times, 23 October 2021

The story of life on Earth is around 4.6 billion years long. We’re here to witness the most interesting bit (of course we are; our presence makes it interesting) and once we’re gone (wiped out in an eyeblink, or maybe, just maybe, speciated out of all recognition) the story will run on, and run down, for about another billion years, before the Sun incinerates the Earth.

It’s an epic story, and like most epic stories, it cries out for a good editor. In Henry Gee, a British palaeontologist and senior editor of the scientific journal Nature, it has found one. But Gee has his work cut out. The story doesn’t really get going until the end. The first two thirds are about slime. And once there are living things worth looking at, they keep keeling over. All the interesting species burn up and vanish like candles lit at both ends. Humans (the only animal we know of that’s even aware that this story exists) will last no time at all. And the five extinction events this planet has so far undergone might make you seriously wonder why life bothered in the first place.

We are told, for example, how two magma plumes in the late Permian killed this story just as it got going, wiping out nineteen out of every twenty species in the sea, and one out of every ten on land. It would take humans another 500 years of doing exactly what they’ve been doing since the Industrial Revolution to cause anything like that kind of damage.

A word about this: we have form in wiping things out and then regretting their loss (mammoths, dodos, passenger pigeons). And we really must stop mucking about with the chemistry of the air. But we’re not planet-killers. “It is not the Sixth Extinction,” Henry Gee reassures us. “At least, not yet.”

It’s perhaps a little bit belittling to cast Gee’s achievement here as mere “editing”. Gee’s a marvellously engaging writer, juggling humour, precision, polemic and poetry to enrich his impossibly telescoped account. His description of the lycopod forests that are the source of nearly all our coal — and whose trees grew only to reproduce, exploding into a crown of spore-bearing branches — brings to mind a battlefield of the First World War, a “craterscape of hollow stumps, filled with a refuse of water and death… rising from a mire of decay.” A little later a Lystrosaurus (a distant ancestor of mammals, and the most successful land animal ever) is sketched as having “the body of a pig, the uncompromising attitude toward food of a golden retriever, and the head of an electric can opener”.

Gee’s book is full of such dazzling walk-on parts, but most impressive are the elegant numbers he traces across evolutionary time. Here’s one: dinosaurs, unlike mammals, evolved a highly efficient one-way system for breathing that involved passing spent air through sacs distributed inside their bodies. They were air-cooled, which meant they could get very big without cooking themselves. They were lighter than they looked, literally full of hot air, and these advantages — lightweight structure, fast-running metabolism, air cooling — made their evolution into birds possible.

Here’s another tale: the make-up of our teeth — enamel over dentine over bone — is the same as you’d find in the armoured skin of the earliest fishes.

To braid such interconnected wonders into a book the size of a modest novel is essentially an exercise in precis, and a bravura demonstration of the editor’s art. Though the book (whose virtue is its brevity) is not illustrated, there are six timelines to guide us through the scalar shifts necessary to comprehend the staggering longueurs involved in bringing a planet to life. Life was entirely stationary and mostly slimy until only about 600 million years ago. Just ten million years ago, grasses evolved, and with them, grazing animals and their predators, some of whom, the primates, were on their way to making us. The earliest Sapiens appeared just over half a million years ago. Only when sea levels fell, around 120,000 years ago, did Sapiens get to migrate around the planet.

As one reads Gee’s “(very) short history”, one feels time slowing down and growing more granular. This deceleration gives Gee the space he needs to depict the burgeoning complexity of life as it spreads and evolves. It’s a scalar game that’s reminiscent of Charles and Ray Eames’s 1977 film *Powers of Ten*, which depicted the relative scale of the Universe by zooming in (through the atom) and out (through the cosmos) at logarithmic speed. It’s a dizzying and exhilarating technique which, for all that, makes clear sense out of very complex narratives.

Eventually — and long after we are gone — life will retreat beneath the earth as the swelling sun makes conditions on the planet’s surface impossible. The distinctions between things will fall away as life, struggling to live, becomes colossal, colonial and homogenous. Imagine vast subterranean figs, populated by evolved, worm-like insects…

Then, your mind reeling, try and work out what on earth people mean when they say that humans have conquered and/or despoiled the planet.

Our planet deserves our care, for sure, because we have to live here. But the planet has yet to register our existence, and probably never will. We are, Gee explains, just two and a half million years into a series of ice ages that will last for tens of millions of years more. Our species’ story extends not much beyond one of these hundreds of cycles. The human-induced injection of carbon dioxide “will set back the date of the next glacial advance” — and that is all. 250 million years hence, any future prospectors (and they won’t be human), armed with equipment “of the most refined sensitivity”, might — just might — be able to detect that, a short way through the Cenozoic Ice Age, *something happened*, “but they might be unable to say precisely what.”

It takes a long time to bring complex life to a planet, and complex life, once it runs out of wriggle room, collapses in an instant. Humans already labour under a considerable “extinction debt” since they have made their habitat (“nothing less than the entire Earth”) progressively less habitable. Most everything that ever went extinct fell into the same trap. What makes our case tragic is that we’re conscious of what we’ve done; we’re trying to do something about it; and we know that, in the long run, it will never be enough.

Gee’s final masterstroke as editor is to make human sense, and real tragedy, from his unwieldy story’s glaring spoiler: that Life dies at the end.