“Von Neumann proves what he wants”

Reading Ananyo Bhattacharya’s The Man from the Future for The Telegraph, 7 November 2021

Neumann János Lajos, born in Budapest in 1903 to a wealthy Jewish family, negotiated some of the most lethal traps set by the twentieth century, and did so with breathtaking grace. Not even a painful divorce could dent his reputation for charm, reliability and kindness.

A mathematician with a vice-like memory, he survived the rise of Nazism, and saved others from it. He left Europe and joined Princeton’s Institute for Advanced Study when he was just 29. He worked on ballistics in the Second World War, and on atom and hydrogen bombs in the Cold War. Disturbed yet undaunted by the prospect of nuclear armageddon, he still found time to develop game theory, to rubbish economics, and to establish artificial intelligence as a legitimate discipline.

He died plain ‘Johnny von Neumann’, in 1957, at the Walter Reed Army Medical Center in Washington, surrounded by heavy security in case, in his final delirium, he spilled any state secrets.

Following John von Neumann’s life is rather like playing chess against a computer: he has all the best moves already figured out. ‘A time traveller,’ Ananyo Bhattacharya calls him, ‘quietly seeding ideas that he knew would be needed to shape the Earth’s future.’ Mathematician Rózsa Péter’s assessment of von Neumann’s powers is even more unsettling: ‘Other mathematicians prove what they can,’ she declared; ‘von Neumann proves what he wants.’

Von Neumann had the knack (if we can use so casual a word) of reducing a dizzying variety of seemingly intractable technical dilemmas to problems in logic. In Göttingen he learned from David Hilbert how to think systematically about mathematics, using step-by-step, mechanical procedures. Later he used that insight to play midwife to the computer. In between he rendered the new-fangled quantum theory halfway comprehensible (by explaining how Heisenberg’s and Schrödinger’s wildly different quantum models said the same thing); then, at Los Alamos, he helped perfect the atom bomb and co-invented the unimaginably more powerful H-bomb.

He isn’t even dull! The worst you can point to is some mild OCD: Johnny fiddles a bit too long with the light switches. Otherwise — what? He enjoys a drink. He enjoys fast cars. He’s jolly. You can imagine having a drink with him. He’d certainly make you feel comfortable. Here’s Edward Teller in 1966: ‘Von Neumann would carry on a conversation with my three-year-old son, and the two of them would talk as equals, and I sometimes wondered if he used the same principle when he talked to the rest of us.’

In embarking on his biography of von Neumann, then, Bhattacharya sets himself a considerable challenge: writing about a man who, through crisis after crisis, through stormy intellectual disagreements and amid political controversy, contrived always, for his own sake and others’, to avoid unnecessary drama.

What’s a biographer to do, when part of his subject’s genius is his ability to blend in with his friends, and lead a good life? How to dramatise a man without flaws, who skates through life without any of the personal turmoil that makes for gripping storytelling?

If some lives resist the storyteller’s art, Ananyo Bhattacharya does a cracking job of hiding the fact. He sensibly, and very ably, moves the biographical goal-posts, making this not so much the story of a flesh-and-blood man, more the story of how an intellect evolves, moving as intellects often do (though rarely so spectacularly) from theoretical concerns to applications to philosophy. ‘As he moved from pure mathematics to physics to economics to engineering,’ observed former colleague Freeman Dyson, ‘[von Neumann] became steadily less deep and steadily more important.’

Von Neumann did not really trust humanity to live up, morally, to its technical capacities. ‘What we are creating now,’ he told his wife, after a sleepless night contemplating an H bomb design, ‘is a monster whose influence is going to change history, provided there is any history left.’ He was a quintessentially European pessimist, forged by years that saw the world he had grown up in being utterly destroyed. It is no fanciful ‘man from the future’, and no mere cynic, who writes, ‘We will be able to go into space way beyond the moon if only people could keep pace with what they create.’

Bhattacharya’s agile, intelligent, intellectually enraptured account of John von Neumann’s life reveals, after all, not “a man from the future”, not a one-dimensional cold-war warrior, and for sure not Dr Strangelove (though Peter Sellers nicked his accent). Bhattacharya argues convincingly that von Neumann was a man in whose extraordinarily fertile head the pre-war world found an all-too-temporary lifeboat.

“A moist and feminine sucking”

Reading Susan Wedlich’s Slime: A natural history for the Times, 6 November 2021

For over two thousand years, says science writer Susan Wedlich, quoting German historian Richard Hennig, maritime history has been haunted by mention of a “congealed sea”. Ships, it is said, have been caught fast and even foundered in waters turned to slime.

Slime stalks the febrile dreams of landlubbers, too: Jean-Paul Sartre succumbed to its “soft, yielding action, a moist and feminine sucking”, in a passage, lovingly quoted here, that had this reader instinctively scrabbling for the detergent.

We’ve learned to fear slime, in a way that would have seemed quite alien to the farmers of ancient Egypt, who supposed slime and mud were the base materials of life itself. So, funnily enough, did German zoologist Ernst Haeckel, a champion of Charles Darwin, who saw primordial potential in the gellid lumps being trawled from the sea floor by various oceanographic expeditions. (This turned out to be calcium sulphate, precipitated by the chemical reaction between deep-sea mud and alcohol used for the preservation of aquatic specimens. Haeckel never quite got over his disappointment.)

For Susan Wedlich, it is not enough that we should learn about slime; nor even that we should be entertained by it (though we jolly well are). Wedlich wants us to care deeply about slime, and musters all the rhetorical resources at her disposal to achieve her goal. “Does even the word ‘slime’ have to elicit gagging histrionics?” she exclaims, berating us for our phobia: “if we neither recognize nor truly know slime, how are we supposed to appreciate it or use it for our own ends?”

This is overdone. Nor do we necessarily know enough about slime to start shouting about it. To take one example, using slime to read our ecological future turns out to be a vexed business. There’s a scum of nutrients held together by slime floating on top of the oceans. A fraction of a millimetre thick, it’s called the “sea-surface micro-layer”. Global warming might be thinning it, or thickening it, and doing either might be increasing the chemical transport taking place between air and ocean — or retarding it — to unknown effect. So there: yet another thing to worry about.

For sure, slime holds the world together. Slimes, rather: there are any number of ways to stiffen water so that it acts as a lubricant, a glue, or a barrier. Whatever its origins, it is most conspicuous when it disappears — as when overtilling of America’s Great Plains caused the Dust Bowl in 1933, or when the gluey glycan coating of one’s blood vessels starts to mysteriously shear away during surgery.

There was a moment, in the 1920s, when slime shed its icky materiality and became almost cool. Artists both borrowed from and inspired Haeckel’s exquisite drawings of delicate maritime invertebrates. And biologists, looking for the mechanisms underpinning memory and heredity, would have liked nothing more than to find that the newly-identified protoplasm within our every cell was recording, like an Edison drum, the tremblings of a ubiquitous, information-rich aether. (Sounds crazy now, but the era was, after all, bathing in X-rays and other newly-discovered radiations.)

But slime’s moment of modishness passed. Now it’s the unlovely poster-child of environmental degradation: the stuff that will fill our soon-to-be-empty oceans, “home only to jellyfish, algae and microbial mats”, if we don’t do something sharpish to change our ecological ways.

Hand in hand with such millennial anxieties, of course, come the usual power fantasies: that we might harness all this unlovely slime — nothing more than water held in a cage of a few long-chain polymers — to transform our world, providing the base for new materials and soft robots, “transparent, stretchable, locomotive, biocompatible, remote-controlled, weavable, wearable, self-healing and shape-morphing, 3D-printed or improved by different ingredients”.

Wedlich’s enthusiasm is by no means misplaced. Slime is not just a largely untapped wonder material. It is also — really, truly — the source of life, and a key enabler of complex forms. We used to think the machinery of the first cells must have arisen in clay hydrogels — a rather complicated and unlikely genesis — but it turns out that nucleic acids like DNA and RNA can sometimes form slimes on their own. Life, in other words, does not need a substrate on which to arise. It is its own sticky home.

Slime’s effective barrier to pathogens may then have enabled complex tissues to differentiate and develop, slickly sequestered from a disease-ridden outside world. Wedlich’s tour of the human gut and its multiple slime layers (some lubricant, some gluey, and many armed with extraordinary electrostatic and molecular traps for one pathogen or another) is a tour de force of clear and gripping explanation.

Slime being, in essence, nothing more than stiffened water, there are more ways to make it than the poor reader could ever bear to hear about. So Wedlich very sensibly approaches her subject from the other direction, introducing slimes through their uses. Snails combine gluey and lubricating slimes to travel over dry ground one moment, cling to the underside of a leaf the next. Hagfish deter predators by jellifying the waters around them, shooting polymers from their skin like so many thousands of microscopic harpoons. Some squid, when threatened, add slime to their ink to create pseudomorphs — fake squidoids that hold together just long enough to distract a predator. Some squid pump out whole legions of such doppelgangers.

Wedlich’s own strategy, in writing Slime, is not dissimilar. She’s deliberately elusive. The reader never really feels they’ve got hold of the matter of her book; rather, they’re being provoked into punching through layer after dizzying layer, through masterpieces of fin de siècle glass-blowing into theories about the spontaneous generation of life, through the lifecycles of carnivorous plants into the tactics of Japanese balloon-bomb designers in the Second World War, until, dizzy and gasping, they reach the end of Wedlich’s extraordinary mystery tour, not with a handle on slime exactly, but with an elemental and exultant new vision of what life may be: that which arises when the boundaries of earth, air and water are stirred in sunlight’s fire. It’s a vision that, for all its weight of well-marshalled modern detail, is one Aristotle would have recognised.

Chemistry off the leash

Reading Sharon Ruston’s The Science of Life and Death in Frankenstein for New Scientist, 27 October 2021

In 1817, in a book entitled Experiments on Life and its Basic Forces, the German natural philosopher Carl August Weinhold explained how he had removed the brain from a living kitten, and then inserted a mixture of zinc and silver into the empty skull. The animal “raised its head, opened its eyes, looked straight ahead with a glazed expression, tried to creep, collapsed several times, got up again, with obvious effort, hobbled about, and then fell down exhausted.”

The following year, Mary Shelley’s Frankenstein captivated a public not at all startled by its themes, but hungry for horripilating thrills and avid for the author’s take on arguably the most pressing scientific issue of the day. What was the nature of this strange zone that had opened up between the worlds of the living and the dead?

Three developments had muddied this once obvious and clear divide: in revolutionary France, the flickers of life exhibited by freshly guillotined heads; in Edinburgh, the black market in fresh (and therefore dissectable) corpses; and on the banks of busy British rivers, attempts (encouraged by the Royal Humane Society) to breathe life into the recently drowned.

Ruston covers this familiar territory well, then goes much further, revealing Mary Shelley’s superb and iron grip on the scientific issues of her day. Frankenstein was written just as life’s material basis was emerging. Properties once considered unique to living things were turning out to be common to all matter, both living and unliving. Ideas about electricity offer a startling example.

For more than a decade, from 1780 to the early 1790s, it had seemed to researchers that animal life was driven by a newly discovered life source, dubbed ‘animal electricity’. This was a notion cooked up by the Bologna-born physician Luigi Galvani to explain a discovery he had made in 1780 with his wife Lucia. They had found that the muscles of dead frogs’ legs twitch when struck by an electrical spark. Galvani concluded that living animals possessed their own kind of electricity. The distinction between ‘animal electricity’ and metallic electricity didn’t hold for long, however. By placing discs of different metals on his tongue, and feeling the jolt, Alessandro Volta showed that electricity flows between two metals through biological tissue.

Galvani’s nephew, Giovanni Aldini, took these experiments further in spectacular, theatrical events in which corpses of hanged murderers attempted to stand or sit up, opened their eyes, clenched their fists, raised their arms and beat their hands violently against the table.

As Ruston points out, Frankenstein’s anguished description of the moment his Creature awakes “sounds very like the description of Aldini’s attempts to resuscitate 26-year-old George Forster”, hanged for the murder of his wife and child in January 1803.

Frankenstein cleverly clouds the issue of exactly what form of electricity animates the creature’s corpse. Indeed, the book (unlike the films) is much more interested in the Creature’s chemical composition than in its animation by a spark.

There are, Ruston shows, many echoes of Humphry Davy’s 1802 Course of Chemistry in Frankenstein. It’s not for nothing that Frankenstein’s tutor Professor Waldman tells him that chemists “have acquired new and almost unlimited powers”.

An even more intriguing contemporary development was the ongoing debate between the surgeon John Abernethy and his student William Lawrence in the Royal College of Surgeons. Abernethy claimed that electricity was the “vital principle” underpinning the behaviour of organic matter. Nonsense, said Lawrence, who saw in living things a principle of organisation. Lawrence was an early materialist, and his patent atheism horrified many. The Shelleys were friendly with Lawrence, and helped him weather the scandal engulfing him.

The Science of Life and Death is both an excellent introduction and a serious contribution to understanding Frankenstein. Through Ruston’s eyes, we see how the first science fiction novel captured the imagination of its public.

 

 

Life dies at the end

Reading Henry Gee’s A (Very) Short History of Life on Earth for the Times, 23 October 2021

The story of life on Earth is around 4.6 billion years long. We’re here to witness the most interesting bit (of course we are; our presence makes it interesting) and once we’re gone (wiped out in an eyeblink, or maybe, just maybe, speciated out of all recognition) the story will run on, and run down, for about another billion years, before the Sun incinerates the Earth.

It’s an epic story, and like most epic stories, it cries out for a good editor. In Henry Gee, a British palaeontologist and senior editor of the scientific journal Nature, it has found one. But Gee has his work cut out. The story doesn’t really get going until the end. The first two thirds are about slime. And once there are living things worth looking at, they keep keeling over. All the interesting species burn up and vanish like candles lit at both ends. Humans (the only animal we know of that’s even aware that this story exists) will last no time at all. And the five extinction events this planet has so far undergone might make you seriously wonder why life bothered in the first place.

We are told, for example, how two magma plumes in the late Permian killed this story just as it got going, wiping out nineteen out of every twenty species in the sea, and seven out of every ten on land. It would take humans another 500 years of doing exactly what they’ve been doing since the Industrial Revolution to cause anything like that kind of damage.

A word about this: we have form in wiping things out and then regretting their loss (mammoths, dodos, passenger pigeons). And we really must stop mucking about with the chemistry of the air. But we’re not planet-killers. “It is not the Sixth Extinction,” Henry Gee reassures us. “At least, not yet.”

It’s perhaps a little bit belittling to cast Gee’s achievement here as mere “editing”. Gee’s a marvellously engaging writer, juggling humour, precision, polemic and poetry to enrich his impossibly telescoped account. His description of the lycopod forests that are the source of nearly all our coal — and whose trees grew only to reproduce, exploding into a crown of spore-bearing branches — brings to mind a battlefield of the First World War, a “craterscape of hollow stumps, filled with a refuse of water and death… rising from a mire of decay.” A little later a Lystrosaurus (a distant ancestor of mammals, and the most successful land animal ever) is sketched as having “the body of a pig, the uncompromising attitude toward food of a golden retriever, and the head of an electric can opener”.

Gee’s book is full of such dazzling walk-on parts, but most impressive are the elegant numbers he traces across evolutionary time. Here’s one: dinosaurs, unlike mammals, evolved a highly efficient one-way system for breathing that involved passing spent air through sacs distributed inside their bodies. They were air-cooled, which meant they could get very big without cooking themselves. They were lighter than they looked, literally full of hot air, and these advantages — lightweight structure, fast-running metabolism, air cooling — made their evolution into birds possible.

Here’s another tale: the make-up of our teeth — enamel over dentine over bone — is the same as you’d find in the armoured skin of the earliest fishes.

To braid such interconnected wonders into a book the size of a modest novel is essentially an exercise in precis, and a bravura demonstration of the editor’s art. Though the book (whose virtue is its brevity) is not illustrated, there are six timelines to guide us through the scalar shifts necessary to comprehend the staggering longueurs involved in bringing a planet to life. Life was entirely stationary and mostly slimy until only about 600 million years ago. Just ten million years ago, grasses evolved, and with them, grazing animals and their predators, some of whom, the primates, were on their way to making us. The earliest Sapiens appeared just over half a million years ago. Only when sea levels fell, around 120,000 years ago, did Sapiens get to migrate around the planet.

As one reads Gee’s “(very) short history”, one feels time slowing down and growing more granular. This deceleration gives Gee the space he needs to depict the burgeoning complexity of life as it spreads and evolves. It’s a scalar game that’s reminiscent of Charles and Ray Eames’s 1977 film *Powers of Ten*, which depicted the relative scale of the Universe by zooming in (through the atom) and out (through the cosmos) at logarithmic speed. It’s a dizzying and exhilarating technique which, for all that, makes clear sense out of very complex narratives.

Eventually — and long after we are gone — life will retreat beneath the earth as the swelling sun makes conditions on the planet’s surface impossible. The distinctions between things will fall away as life, struggling to live, becomes colossal, colonial and homogeneous. Imagine vast subterranean figs, populated by evolved, worm-like insects…

Then, your mind reeling, try and work out what on earth people mean when they say that humans have conquered and/or despoiled the planet.

Our planet deserves our care, for sure, because we have to live here. But the planet has yet to register our existence, and probably never will. We are, Gee explains, just two and a half million years into a series of ice ages that will last for tens of millions of years more. Our species’ story extends not much beyond one of these hundreds of cycles. The human-induced injection of carbon dioxide “will set back the date of the next glacial advance” — and that is all. 250 million years hence, any future prospectors (and they won’t be human), armed with equipment “of the most refined sensitivity”, might — just might — be able to detect that, a short way through the Cenozoic Ice Age, *something happened*, “but they might be unable to say precisely what.”

It takes a long time to bring complex life to a planet, and complex life, once it runs out of wriggle room, collapses in an instant. Humans already labour under a considerable “extinction debt” since they have made their habitat (“nothing less than the entire Earth”) progressively less habitable. Most everything that ever went extinct fell into the same trap. What makes our case tragic is that we’re conscious of what we’ve done; we’re trying to do something about it; and we know that, in the long run, it will never be enough.

Gee’s final masterstroke as editor is to make human sense, and real tragedy, from his unwieldy story’s glaring spoiler: that Life dies at the end.

A lousy container for thought

A critical survey of climate fiction for The Bookseller, 18 October 2021

A genre to contain our imaginative responses to climate change was only ever going to be a house built of straw. Climate change is not like any other problem — the nuclear threat, say, or the hole in the ozone layer, or big tobacco, or big pharma. We are used to satirising, vilifying and sometimes even explaining and humanising our societal mistakes. But our species’ role in the earth’s average temperature rise isn’t, in any meaningful sense, either an accident or an oversight; nor is it a deliberate act of malevolence or of willful blindness. It’s a wicked problem, embracing generations living and dead and still to be born, implicating everyone on earth, and calling into question every stab we’ve ever made at progress.

Calling attention to the problem has been the easy bit. We had tools we could use. Science fiction, in particular, had over half a century’s experience exploring risks of all kinds, not all of them goofy or existential, when George Turner’s The Sea and Summer (1987) turned a generation of SF readers on to the firestorms to come.

The work we these days most easily classify as “climate fiction” has hardly got beyond that early, siren-sounding stage. That sounds like a problem, but I’m not sure it is one. Promoting a genre means giving it clear lines and simple definitions. Of course work labelled “cli-fi” remains wedded to an essentially dystopic view of the future: those are the rules we set for it. Nor, come to that, is there anything wrong with informing emerging generations of the problems they must face. Marcus Sedgwick and Paolo Bacigalupi and their peers have built worthwhile careers on this minatory effort. There’s no point complaining about how “cli-fi” aestheticises the disasters it depicts. Artists communicate through beauty, not disgust. (Go look at Goya, if you don’t believe me.)

And there has been progress. The genre’s brief has widened. The Swan Book by Australian Aboriginal author and land rights activist Alexis Wright (2013) highlights how the colonial abuse of peoples goes hand in hand with the land’s exhaustion. Cherie Dimaline’s The Marrow Thieves (2017) would have us stop treating the world as a series of problems to be solved.

This approach, it is true, has less appeal for those of us who have more of our lives to look back on than to look forward to. For us, solutions to the End Times can’t come quickly enough.

Bruce Sterling’s Viridian Design Movement and Neal Stephenson’s Hieroglyph Project promised to amass fictional thought experiments to solve our difficult future. Though their calls for submissions of socially useful fiction were directed at younger writers from diverse backgrounds, they were, in essence, activities aimed at old western men in a hurry. These projects fell short of their promise, but not, I think, because they tried to be positive, and not because they came from an unfashionable corner of the culture. They failed *because fiction is the wrong tool for that kind of work*. Fiction is — let’s be frank here — a lousy container for thought.

One of the most successful climate-engaged books of the last couple of years, Richard Powers’s The Overstory, is, dramatically speaking, also one of the most underwhelming: little more than an animated Wikipedia trawl through contemporary abuses in the forestry sector. James Bradley contracts a milder version of the same disease in his recent novel Ghost Species (2021): a thinly fictionalised series of opinion pieces on the role of synthetic biology in addressing biome loss.

These books are symptoms of our moment, not shapers of it, and that’s because fiction undermines received opinion far more effectively than it establishes it. It’s a solvent, not a glue. Far more influential on the cli-fi scene are David Mitchell’s Cloud Atlas (2004) and Margaret Atwood’s MaddAddam trilogy — works that, significantly, leave us questioning the very idea that climate solutions are possible.

Writers! Let’s leave dealing with climate change to the grown-ups, and go back to our proper job: teasing out what it feels like to be in a climate crisis.

Yun Ko-eun’s 2020 satire The Disaster Tourist describes an economy geared to our appetite for disaster. Gathering Evidence by Martin MacInnes and The Rain Heron by Robbie Arnott (both 2020) spin twisted eco-fables thick with guilt and dripping with cognitive dissonance. Jeff VanderMeer dons the motley of a private eye to solve the murder of the Earth in Hummingbird Salamander (2021); and Laura Jean McKay gives our doomed biome a voice in The Animals in That Country (2020).

These books and others are at last addressing the subjective experience of climate change. That’s vital psychological work, and socially useful with it: you can throw facts at our heads all day long, but people will deny and avoid that which they cannot feel.

Meanwhile, within science fiction and the high-concept thriller, and from out of the recent glut of “dystopian” fiction, a much keener, cleverer, more properly fictional approach is emerging to address the climate crisis.

It’s the creeping uncanny of the coming apocalypse that’s engaging the current wave of climate-engaged writers. Under the Blue by Oana Aristide and This Fragile Earth by Susannah Wise both exploit the fact that anything on the scale of a climate disaster is going to be slow. Civilisation will collapse, but the shelves won’t empty overnight, and the flood insurance won’t bankrupt you just yet. Wise goes even further, projecting beyond climate disaster towards a workable new world. All around us, the city’s 3D-printed buildings are spiralling into being, while a few hardy saplings in the derelict neighbourhood park are “evidence of the blight’s end”. Whether Wise’s heroine will ever be able to wrap her pre-apocalyptic head around this post-apocalyptic future is, however, uncertain.

In the Goldsmiths-winning The Sunken Land Begins to Rise Again (2020), to take an even more powerful example, M John Harrison leads his readers around the archaeological leavings of Ironbridge, picking his way between the foetuses of discarded genetic experiments spilling from the back of an aquarium shop. This is a world where current tools and technology can find no purchase in a reality that’s already wedded to the future.

This new crop of climate fiction won’t, after all, help us save tomorrow — but it will, and for the first time, help us picture it. For as long as they were wedded to what might happen in the future, writers of climate fiction could only amount to a bunch of Cassandras, trumpeting their own importance. Now they are feeding on much richer meat. Cli-fi has stuck its teeth into the present.

Dogs (a love story)

Reading Pat Shipman’s Our Oldest Companions: The story of the first dogs for New Scientist, 13 October 2021

Sometimes, when Spanish and other European forces entered lands occupied by the indigenous peoples of South America, they used dogs to massacre the indigenous human population. Occasionally their mastiffs, trained to chase and kill, actually fed upon the bodies of their victims.

The locals’ response was, to say the least, surprising: they fell in love. These beasts were marvellous novelties, loyal and intelligent, and a trade in domesticated dogs spread across a harrowed continent.

What is it about the dog that makes it so irresistible?

Anthropologist Pat Shipman is out to describe the earliest chapters in our species’ relationship with dogs. From a welter of archaeological and paleo-genetic detail, Shipman has fashioned an unnerving little epic of love and loyalty, hunting and killing.

There was, in Shipman’s account, nothing inevitable, nothing destined, about the relationship that turned the grey wolf into a dog. Yes, early Homo sapiens hunted with early “wolf-dogs” in a symbiotic relationship that let humans borrow a wolf’s superior speed and senses, while allowing wolves to share in a human’s magical ability to kill prey at a distance with spears or arrows. But why, in the pursuit of more food, would humans take in, feed, nurture, and breed another meat-eating animal? Shipman puts it more forcefully: “Who would select such a ferocious and formidable predator as a wolf for an ally and companion?”

To find the answer, says Shipman, forget intentionality. Forget the old tale in which someone captures a baby animal, tames it, raises it, selects a mate for it, and brings up the friendliest babies.

Instead, it was the particular ecology of Europe about 50,000 years ago that drove grey wolves and human interlopers from Mesopotamia into close cooperation, thereby seeing off most large game and Europe’s own indigenous Neanderthals.

This story was explored in Shipman’s 2015 The Invaders. Our Oldest Companions develops that argument to explain why dogs and humans did not co-evolve similar behaviours elsewhere.

Australia provides Shipman with her most striking example. Homo sapiens first arrived in Australia without dogs, because back then (around 35,000 years ago, possibly earlier) there were no such things. (The first undisputed dog remains belong to the Bonn-Oberkassel dog, buried beside humans 14,200 years ago.)

The ancestors of today’s dingoes were brought to Australia by ship only around 3000 years ago, where their behaviour and charisma immediately earned them a central place in Native Australian folklore.

Yet, in a land very different to Europe, less densely populated by large animals and boasting only two large-sized mammalian predators, the Tasmanian tiger and the marsupial lion (both now extinct), there was never any mutual advantage to be had in dingoes and humans working, hunting, feasting and camping together. Consequently dingoes, though they’re eminently tameable, remain undomesticated.

The story of humans and dogs in Asia remains somewhat unclear, and some researchers still argue that the bond between wolf and man was first established here. Shipman, who’s having none of it, points to a crucial piece of non-evidence: if dogs first arose in Asia, then where are the ancient dog burials?

“Deliberate burial,” Shipman writes, “is just about the gold standard in terms of evidence that an animal was domesticated.” There are no such ancient graves in Asia. It’s near Bonn, on the right bank of the Rhine, that the earliest remains of a clearly domesticated dog were discovered in 1914, tucked between two human skeletons, their grave decorated with works of art made of bones and antlers.

Domesticated dogs now comprise more than 300 breeds, though overbreeding has left hardly any that are capable of carrying out their intended functions of hunting, guarding, or herding.

Shipman passes no comment, but I can’t help but think this a sad and trivial end to a story that began so heroically, among mammoth and tiger and lion.

 

“Grotesque, awkward, and disagreeable”

Reading Stanislaw Lem’s Dialogues for the Times, 5 October 2021

Some writers follow you through life. Some writers follow you beyond the grave. I was seven when Andrei Tarkovsky filmed Lem’s satirical sci-fi novel Solaris, thirty-seven when Steven Soderbergh’s very different (and hugely underrated) Solaris came out, forty when Lem died. Since then, a whole other Stanislaw Lem has arisen, reflected in philosophical work that, while widely available elsewhere, had to wait half a century or more for an English translation. In life I have nursed many regrets: that I didn’t learn Polish is not the least of them.

The point about Lem is that he writes about the future, predicting the way humanity’s inveterate tinkering will enable, pervert and frustrate its ordinary wants and desires. This isn’t “the future of technology” or “the future of the western world” or “the future of the environment”. It’s neither “the future as the author would like it to be”, nor “the future if the present moment outstayed its welcome”. Lem knows a frightening amount of science, and even more about technology, but what really matters is what he knows about people. His writing is not just surprisingly prescient; it’s timeless.

Dialogues is about cybernetics, the science of systems. A system is any material arrangement that responds to environmental feedback. A steam engine is a mere mechanism, until you add the governor that controls its internal pressure. Then it becomes a system. When Lem was writing, systems thinking was meant to transform everything, conciliating between the physical sciences and the humanities to usher in a technocratic Utopia.

Enthusiastic as 1957-vintage Lem was, there is something deliciously levelling about how he introduces the cybernetic idea. We can bloviate all we like about using data and algorithms to create a better society; what drives Philonous and Hylas’s interest in these eight dialogues (modelled on Berkeley’s Three Dialogues of 1713) is Hylas’s desperate desire to elude Death. This new-fangled science of systems reimagines the world as information, and the thing about information is that it can be transmitted, stored and (best of all) copied. Why then can’t it transmit, store and copy poor Death-haunted Hylas?

Well, of course, that’s certainly do-able, Philonous agrees — though Hylas might find cybernetic immortality “grotesque, awkward, and disagreeable”. Sure enough, Hylas baulks at Philonous’s culminating vision of humanity immortalised in serried ranks of humming metal cabinets.

This image certainly was prescient: Cybernetics was supposed to be a philosophy, one that would profoundly change our understanding of the animate and inanimate world. The philosophy failed to catch on, but its insights created something utterly unexpected: the computer.

Dialogues is important now because it describes (or described, rather, more than half a century ago — you can almost hear Lem’s slow hand-clapping from the Beyond) all the ways we do not comprehend the world we have made.

Cybernetics teaches us that systems are animate. It doesn’t matter what a system is made from. Workers in an office, ones and zeroes clouding a chip, proteins folding and refolding in a living cell, string and pulleys in a playground: all are good building materials for systems, and once a system is up and running, it is no longer reducible to its parts. It’s a distinct, unified whole, shaped by its past history and actively coexisting with its environment, and exhibiting behaviour that cannot be precisely predicted from its structure. “If you insist on calling this new system a mechanism,” Lem remarks, drily, “then you must apply that term to living beings as well.”

We’ve yet to grasp this nettle: that between the living and non-living worlds sits a world of systems, unalive yet animate. No wonder, lacking this insight, we spend half our lives sneering at the mechanisms we do understand (“Alexa, stop calling my Mum!”) and the other half on our knees, worshipping the mechanisms we don’t. (“It says here on Facebook…”) The very words we use — “artificial intelligence” indeed! — reveal the paucity of our understanding.

Lem understood, as no-one then or since has understood, how undeserving of worship are the systems (be they military, industrial or social) that are already strong enough to determine our fate. A couple of years ago, around the time Hong Kong protesters were destroying facial recognition towers, a London pedestrian was fined £90 for hiding his face from an experimental Met camera. The consumer credit reporting company Experian uses machine learning to decide the financial trustworthiness of over a billion people. China’s Social Credit System (actually the least digitised of China’s surveillance systems) operates under multiple, often contradictory legal codes.

The point about Lem is not that he was terrifyingly smart (though he was that); it’s that he had skin in the game. He was largely self-taught, because he had to quit university after writing satirical pieces about Soviet poster-boy Trofim Lysenko (who denied the existence of genes). Before that, he was dodging Nazis in Lviv (and mending their staff cars so that they would break down). In his essay “Applied Cybernetics: An Example from Sociology”, Lem uses the new-fangled science of systems to anatomise the Soviet thinking of his day, and from there, to explain how totalitarianism is conceived, spread and performed. Worth the price of the book in itself, this little essay is a tour de force of human sympathy and forensic fury, shorter than Solzhenitsyn, and much, much funnier than Hannah Arendt.

Peter Butko’s translations of the Dialogues, and the revisionist essays Lem added to the 1971 second edition, are as witty and playful as Lem’s allusive Polish prose demands. His endnotes are practically a book in themselves (and an entertaining one, too).

Translated so well, Lem needs no explanation, no contextualisation, no excuse-making. Lem’s expertise lay in technology, but his loyalty lay with people, in all their maddening tolerance for bad systems. “There is nothing easier than to create a state in which everyone claims to be completely satisfied,” he wrote; “being stretched on the bed, people would still insist — with sincerity — that their life is perfectly fine, and if there was any discomfort, the fault lay in their own bodies or in their nearest neighbor.”

 

If this is Wednesday then this must be Thai red curry with prawns

Reading Dan Saladino’s Eating to Extinction for the Telegraph, 26 September 2021

Within five minutes of my desk: an Italian delicatessen, a Vietnamese pho house, a pizzeria, two Chinese, a Thai, and an Indian “with a contemporary twist” (don’t knock it till you’ve tried it). Can such bounty be extended over the Earth?

Yes, it can. It’s already happening. And in what amounts to a distillation of a life’s work writing about food, and sporting a few predictable limitations (he’s a journalist; he puts stories in logical order, imagining this makes an argument), Dan Saladino’s Eating to Extinction explains just what price we’ll pay for this extraordinary achievement, which promises not only to end world hunger by 2030 (a much-touted UN goal), but to make California rolls available everywhere from Kamchatka to Karachi.

The problem with my varied diet (if this is Wednesday then this must be Thai red curry with prawns) is that it’s also your varied diet, and your neighbour’s; it’s rapidly becoming the same varied diet across the whole world. You think your experience of world cuisine reflects global diversity? Humanity used to sustain itself (admittedly, not too well) on 6,000 species of plant. Now, for over three quarters of our calories, we gorge on just nine: rice, wheat and maize, potato, barley, palm oil and soy, sugar from beets and sugar from cane. The same narrowing can be found in our consumption of animals and seafood. What looks to us like the world on a plate is in fact the sum total of what’s available world-wide, now that we’ve learned to grow ever greater quantities of ever fewer foods.

Saladino is in the anecdote business; he travels the Earth to meet his pantheon of food heroes, each of whom is seen saving a rare food for our table – a red pea, a goaty cheese, a flat oyster. So far, so very Sunday supplement. Nor is there anything to snipe at in the adventures of, say, Woldemar Mammel who, searching in the attics of old farmhouses and in barns, rescued the apparently extinct Swabian “alb” lentil; nor in former chef Karlos Baca’s dedication to rehabilitating an almost wholly forgotten native American cuisine.
That said, it takes Saladino 450 pages (which is surely a good 100 pages too many) to explain why the Mammels and Bacas of this world are needed so desperately to save a food system that, far from breaking down, is feeding more and more food to more and more people.

The thing is, this system rests on two foundations: nitrogen fertiliser, and monocropping. The technology by which we fix nitrogen from the air by an industrial process is sustainable enough, or can be made so. Monocropping, on the other hand, was a dangerous strategy from the start.

In the 1910s and 1920s the Soviet agronomist Nikolai Vavilov championed the worldwide uptake of productive strains, with every plant a clone of its neighbour. How else, but by monocropping, do you feed the world? By the 1930s though, he was assembling the world’s first seed banks in a desperate effort to save the genetic diversity of our crops — species that monocropping was otherwise driving to extinction.

Preserving heritage strains matters. They were bred over thousands of years to resist all manner of local environmental pressures, from drought to deluge to disease. Letting them die out is the genetic equivalent of burning the library at Alexandria.

But seed banks can’t hold everything (there is, as Saladino remarks, no Svalbard seed vault for chickens) and are anyway a desperate measure. Saladino’s tale of how, come the Allied invasion, the holdings of Iraq’s national seed bank at Abu Ghraib were bundled off to Tel Hadya in Syria, only then to be frantically transferred to Lebanon, itself an increasingly unstable state, sounds a lot more Blade Runner 2049 than Agronomy 101.

Better to create a food system that, while not necessarily promoting rare foods (fancy some Faroese air-fermented sheep meat? — thought not) will at least not drive such foods to extinction.

The argument is a little bit woolly here, as what the Faroe islanders get up to with their sheep is unlikely to have global consequences for the world’s food supply. Letting a crucial drought-resistant strain of wheat go extinct in a forgotten corner of Afghanistan, on the other hand, could have unimaginably dire consequences for us in the future.
Saladino’s grail is a food system with enough diversity in it to adapt to environmental change and withstand the onslaught of disease.

Is such a future attainable? Only to a point. Some wild foods are done for already because the high prices they command incentivize their destruction. If you want some of Baca’s prized and pungent bear root, native to a corner of Colorado, you’d better buy it now (but please, please don’t).

Rare cultivated foods stand a better chance. The British Middle White pig is rarer than the Himalayan snow leopard, says Saladino, but the stocks are sustainable enough that it is now being bred for the table.

Attempting to encompass the Sixth Extinction on the one hand, and the antics of slow-foodies like Mammel and Baca on the other is a recipe for cognitive dissonance. In the end, though, Saladino succeeds in mapping the enormity of what human appetite has done to the planet.

Saladino says we need to preserve rare and forgotten foods, partly because they are part of our cultural heritage, but also, and more hard-headedly, so that we can study and understand them, crossing them with existing lines to shore up and enrich our dangerously over-simplified food system. He’s nostalgic for our lost food past (and who doesn’t miss apples that taste of apples?) but he doesn’t expect us to delete Deliveroo and spend our time grubbing around for roots and berries.

Unless of course it’s all too late. It would not take many wheat blights or avian flu outbreaks before slow food is all that’s left to eat.

 

The tools at our disposal

Reading Index, A History of the, by Dennis Duncan, for New Scientist, 15 September 2021

Every once in a while a book comes along to remind us that the internet isn’t new. Authors like Siegfried Zielinski and Jussi Parikka write handsomely about their adventures in “media archaeology”, revealing all kinds of arcane delights: the eighteenth-century electrical tele-writing machine of Joseph Mazzolari; Melvil Dewey’s Decimal System of book classification of 1873.

It’s a charming business, to discover the past in this way, but it does have its risks. It’s all too easy to fall into complacency, congratulating the thinkers of past ages for having caught a whiff, a trace, a spark, of our oh-so-shiny present perfection. Paul Otlet builds a media-agnostic City of Knowledge in Brussels in 1919? Lewis Fry Richardson conceives a mathematical Weather Forecasting Factory in 1922? Well, I never!

So it’s always welcome when an academic writer — in this case London-based English lecturer Dennis Duncan — takes the time and trouble to tell this story straight, beginning at the beginning, ending at the end. Index, A History of the is his story of textual search, told through charming portrayals of some of the most sophisticated minds of their era, from monks and scholars shivering among the cloisters of 13th-century Europe to server-farm administrators sweltering behind the glass walls of Silicon Valley.

It’s about the unspoken and always collegiate rivalry between two kinds of search: the subject index (a humanistic exercise, largely un-automatable, requiring close reading, independent knowledge, imagination, and even wit) and the concordance (an eminently automatable listing of words in a text and their locations).

Hugh of St Cher is the father of the concordance: his list of every word in the Bible and its location, begun in 1230, was a miracle of miniaturisation, smaller than a modern paperback. It and its successors were useful, too, for clerics who knew their Bibles almost by heart.

But the subject index is a superior guide when the content is unfamiliar, and it’s Robert Grosseteste (born in Suffolk around 1175) who we should thank for turning the medieval distinctio (an associative list of concepts, handy for sermon-builders), into something like a modern back-of-book index.

Reaching the present day, we find that with the arrival of digital search, the concordance is once again ascendant (the search function, Ctrl-F, whatever you want to call it, is an automated concordance), while the subject index, and its poorly recompensed makers, are struggling to keep up in an age of reflowable screen text. (Sewing embedded active indexes through a digital text is an excellent idea which, exasperatingly, has yet to catch on.)

Running under this story is a deeper debate, between people who want to access their information quickly, and people (especially authors) who want people to read books from beginning to end.

This argument about how to read has been raging literally for millennia, and with good reason. There is clear sense in Socrates’ argument against reading itself, as recorded in Plato’s Phaedrus (370 BCE): “You have invented an elixir not of memory, but of reminding,” his mythical King Thamus complains. Plato knew a thing or two about the psychology of reading, too: people who just look up what they need “are for the most part ignorant,” says Thamus, “and hard to get along with, since they are not wise, but only appear wise.”

Anyone who spends too many hours a day on social media will recognise that portrait — if they have not already come to resemble it.

Duncan’s arbitration of this argument is a wry one. Scholarship, rather than being timeless and immutable, “is shifting and contingent,” he says, and the questions we ask of our texts “have a lot to do with the tools at our disposal.”

“This stretch-induced feeling of awe activates our brain’s spiritual zones”

Reading Angus Fletcher’s Wonderworks: Literary invention and the science of stories for New Scientist, 1 September 2021

Can science explain art?

Certainly: in 1999 the British neurobiologist Semir Zeki published Inner Vision, an illuminating account of how, through trial and error and intuition, different schools of art have succeeded in mapping the neurological architectures of human vision. (Put crudely, Rembrandt tickles one corner of the brain, Piet Mondrian another.)

Eight years later, Oliver Sacks contributed to an already crowded music psychology shelf with Musicophilia, a collection of true tales in which neurological injuries and diseases are successfully treated with music.

Angus Fletcher believes the time has come for drama, fiction and literature generally to succumb to neurological explanation. Over the past decade, neuroscientists have been using pulse monitors, eye-trackers, brain scanners “and other gadgets” to look inside our heads as we consume novels, poems, films, and comic books. They must have come up with some insights by now.

Fletcher’s hypothesis is that story is a technology, which he defines as “any human-made thing that helps to solve a problem”.

This technology has evolved, over at least the last 4000 years, to help us negotiate the human condition, by which Fletcher means our awareness of our own mortality, and the creeping sense of futility it engenders. Story is “an invention for overcoming the doubt and the pain of just being us”.

Wonderworks is a scientific history of literature; each of its 25 chapters identifies a narrative “tool” which triggers a different, traceable, evidenced neurological outcome. Each tool comes with a goofy label: here you will encounter Butterfly Immersers and Stress Transformers, Humanity Connectors and Gratitude Multipliers.

Don’t sneer: these tools have been proven “to alleviate depression, reduce anxiety, sharpen intelligence, increase mental energy, kindle creativity, inspire confidence, and enrich our days with myriad other psychological benefits.”

Now, you may well object that, just as area V1 of the visual cortex did not evolve so we could appreciate the paintings of Piet Mondrian, so our capacity for horror and pity didn’t arise just so we could appreciate Shakespeare. So if story is merely “holding a mirror up to nature”, then Fletcher’s long, engrossing book wouldn’t really be saying anything.

As any writer will tell you, of course, a story isn’t merely a mirror. The problem comes when you try and make this perfectly legitimate point using neuroscience.

Too often for comfort, and as the demands of concision exceed all human bounds, the reader will encounter passages like: “This stretch-induced feeling of awe activates our brain’s spiritual zones, enriching our consciousness with the sensation of meanings beyond.”

Hitting sentences like this, I normally shut the book, with some force. I stayed my hand on this occasion because, by the time this horror came to light, two things were apparent. First, Fletcher — a neuroscientist turned story analyst — actually does know his neurobiology. Second, he really does know his literature, making Wonderworks a profound and useful guide to reading for pleasure.

Wonderworks fails as popular science because of the extreme parsimony of Fletcher’s explanations; fixing this problem would, however, have involved composing a multi-part work, and lost him his general audience.

The first person through the door is the one who invariably gets shot. Wonderworks is in many respects a pug-ugly book. But it’s also the first of its kind: an intelligent, engaged, erudite attempt to tackle, neurologically, not just some abstract and simplified “story”, but some of the world’s greatest literature, from the Iliad to The Dream of the Red Chamber, from Disney’s Up to the novels of Elena Ferrante.

It is easy to get annoyed with this book. But those who stay calm will reap a rich harvest.