Now and again they kill people

Reading Andrew Scull’s Desperate Remedies for the Telegraph, 3 April 2022

Are mental illnesses real?

Well, says Andrew Scull, they hurt; they blight lives; now and again they kill people. So there’s that.

But are they illnesses in any recognisable sense? They can’t be cured. Some people, after years of suffering, experience complete remission for no reason. The search for reliable genetic markers for schizophrenia and major depression has proved a snark-hunt. And so on: Desperate Remedies is the story of what happens when the world stubbornly refuses to reward our efforts at rational understanding.

There are two traditions in psychiatry. The first, greatly shaped by our experience with syphilis, assumes that mental illness is an organic failing, perhaps the result of an infection. Henry Cotton is the unlovely poster child of this tendency, a man whose fin de siècle war on “focal infection” involved the surgical removal of teeth and tonsils first of all, then colons and cervixes, and then just about anything his knife could reach — and killed very nearly half his clientele.

The other tradition, mindful especially of those traumatised by war, assumes mental illness is grounded in individual experience. At its psychoanalytic height, in the twenty years following the second world war, it could blame just about everything on the parents. The Hungarian-American psychoanalyst Franz Alexander believed that “the asthmatic wheeze was the ‘suppressed cry’ of a patient suffocated by an over-attentive mother.” The current crop of trauma therapies — springing from the roots of 1960s-era PTSD like mushrooms after a spring rain — is the latest lurid flowering of this tradition.

Meanwhile psychiatrists — the poor bloody footsoldiers in this intellectual conflict — have been treating ordinary people in oversubscribed, underfunded institutions (or in the absence of those institutions, where “care in the community” holds sway). It’s their “desperate remedies” — from shock therapies to lobotomies — that form the core of this book.

Andrew Scull’s erudite, precise, blisteringly critical history of 200 years of psychiatry spends many pages explaining what happens when overambitious clinicians meet clients deprived of their rights. (Not everyone in the profession is a Nurse Ratched, but it’s worth remembering that One Flew Over the Cuckoo’s Nest was drawn from personal experience.)

In spite of everything, Scull still holds out the narrow possibility that psychiatry has a future, if it would only calm down and own up to its limitations. In the psychopharmacological present, for instance, much that we’re told works, doesn’t work. Or doesn’t work for very long. Or is accompanied by so many side effects that many feel they would be better off if it didn’t work. What actually works doesn’t work nearly as well as the press says it works. And — the cherry on the cake — we don’t know why it works. (Any piece of folk wisdom you may have picked up about “dopamine imbalances” or “serotonin levels” is almost certainly wrong.)

The opioid crisis in the United States is a public health scandal that’s been waiting to happen since the early 1940s, when Arthur Sackler, among others, worked out how to couch drug advertisements as clinical information. In its wake, the efficacy of countless drugs is being reassessed. Old trials are being picked over, old claims re-examined. The result? “GlaxoSmithKline has all but closed its psychiatric laboratories,” Scull remarks, surveying the ruins left by this latest “paradigm shift” in psychiatry; “AstraZeneca has essentially dropped internal research on psychopharmacology, and Pfizer has dramatically reduced its spending in the psychiatric arena.”

Were all their efforts quackery? Of course not. It is easy (and cheap) to cherry-pick horror stories from Scull’s impassioned history. But his far more worrying point is that plenty of the effort expended over the last 200 years was intelligent, sincere, and honestly conducted — and that, too, has brought only marginal and temporary relief to the suffering mind.


Clay moulded by time

Reading Thomas Halliday’s Otherlands: A world in the making for the Telegraph, 5 February 2022

Earlier books have painted tableaux of life at other epochs, but few ever got the scale right. Thomas Halliday’s visions are monstrous.

Halliday is a paleoecologist. That’s a branch of biology, which in turn has become a troublesome cousin of physics, borrowing its technology as it penetrates the living machineries of heritability and development. “My own scientific work,” writes Birmingham-based researcher Thomas Halliday, “has mostly happened in basement museum collections and within computer algorithms, using shared anatomical features to try and work out the relationships among the mammals that lived in the aftermath of the last mass extinction.”

But Halliday is also a child of Rannoch — that glacier-scoured landscape of extinct volcanoes that dominates Scotland’s central highlands. And anyone familiar with that region will see instinctively how it underpins this epic near-hallucinatory natural history of the living earth.

Otherlands works backwards through the history of life, past the icebound Pleistocene 20,000 years ago and the Chicxulub asteroid strike 66 million years ago, past the deeply weird Triassic and the lush Devonian, all the way back to the first stirrings of multicellular life in the Ediacaran, 550 million years ago.

Many readers will come for the jump-scares. The Paleocene Mesodma, which looks like a rodent until it opens its mouth, revealing a terrifying notched tooth, as though a buzzsaw were buried in its gum. The Gigatitan, a Triassic forerunner of the grasshopper, whose stridulations generate a bullfrog-like baritone song. The Tully Monster, the herring of the Carboniferous, with a segmented torpedo body, two rippling squid-like tail fins and at the front, “something like the hose of a vacuum cleaner, with a tiny tooth-filled grabbing claw at its end”.

Halliday weaves these snapshots of individual plants and animals into a vision of how carbon-based life continually adapts to its shifting, spinning home. It’s a story that becomes increasingly uncanny as it develops, as how could it not? In the normal course of things, we only ever get to see a single snapshot from this story, which is governed by rules that only start to make sense in geological time.

Anyone who’s looked at a crab feeding — a wriggling mass of legs that are tongues that are teeth — will not be surprised to learn that arthropods are the Swiss Army knives of the animal world, “with each segment containing a flexible, jointed appendage that can be adapted to a huge variety of functions.” But arthropods are weird-looking to begin with.

It’s when the cuddly end of nature starts to morph that the flesh begins to creep. In Gargano, which was once an island in the Mediterranean, home to dwarf elephants, giant swans and metre-long barn owls, we learn that everything on an island tends towards one particular size.

In the cold, though, this process goes into reverse. Getting big means your bulk can keep you warm for longer. Getting small means you can hibernate. Seymour Island in Antarctica, in the Eocene, boasted much wildlife, but nothing in size between a rabbit and a sheep.

So far, so Alice-like. More unnerving are the ways in which living things, quite unrelated, converge to exploit similar settings. The pangolins of Africa and South Asia are more closely related to humans than they are to South American armadilloes. At first glance, though, you’d be hard pressed to tell these two animals apart.

If nature fitted together neatly, this sort of thing might not seem so disquieting. But things don’t fit together. There is no balance, just a torrent of constant change, and plants and animals lost in its eddies. When threatened, the slow loris raises its arms up behind its head, shivers and hisses. Why? Because it’s trying to look like a cobra, though the ranges of slow loris and cobra haven’t overlapped in tens of thousands of years.

Slowly, but very surely, the six-metre-long sea serpents of the Triassic come to seem almost benign next to the Giant Dormouse or Terrible Moon-Rat which, in their uncanny resemblance to familiar animals, remind us that we humans, too, are clay moulded by time.

In the story of life on Earth, the activities of Homo sapiens are an eyeblink. We’re a sudden, short-lived, energetic thing, like a volcanic eruption, like a rock from space. It doesn’t really matter what such things are, or whether they take a split-second, a couple of hundred years, or a few thousand years to wreak their havoc. Sudden energetic things destroy.

But rather than fall into the contemporary cliché and attempt to conjure up some specious agency for us all — “you too are a governor of the planet!” — Halliday engages us as Victorian writers once did, filling us with wonder, not anxiety — and now with added nuance, and better evidence to hand. The chapter “Deluge”, on the filling of the Mediterranean basin in the Miocene, and the chapter “Cycles”, describing Eocene polar rainforests, were personal favourites, dazzling me to the point where I wondered, dizzily, how accurate these visions might be.

I don’t mean that Halliday has taken short cuts. On the contrary: the story he tells is handsomely evidenced. I mean only that his story will eventually date. Not far from where I live, Benjamin Waterhouse Hawkins sculpted the first full-scale representations of dinosaurs, setting them in a landscape designed by Joseph Paxton, for the delectation of visitors to the Crystal Palace, once it relocated near Penge.

People now point and titter at Hawkins’s inaccuracies — the horn on one beast turns out to be the thumb belonging to another and so on — but what’s really staggering is how accurate his models are, given the evidence available at the time.

Halliday stands on the shoulders of Hawkins and other giants. He knows about dinosaur coloration, and dinosaur vision (they could see into the ultra-violet). He can trace the wax on fossilised leaves, and tell you how the seasons worked, and about the prevailing winds. He can trace our ancient insectivorous past through genes we all carry that code for digesting chitin. Picking among countless recent discoveries, he can even tell you how four-limbed flying pterosaurs came in to land (“hind-feet first, a jump, skip and hop to a stop”).

I wonder what Halliday cannot yet know.

As the author says, “Nothing provokes debate quite like the hunting of monsters.”

Whatever happened to Mohammedan Hindus?

Reading Anna Della Subin’s Accidental Gods: On Men Unwittingly Turned Divine for the Telegraph, 8 January 2022

He is a prince of Greece – but he is not Greek. He is a man of Danish, German and Russian blood, but he springs from none of those places. Who is he? Prince Philip, of blessed memory, consort of Queen Elizabeth II? Or is he – as a handful of her subjects, half a world away, would have it – the son of Vanuatu’s volcano god Kalbaben?

Essayist Anna Della Subin wants you to understand why you might mistake a man for a god; why this happens more often than you’d think; and what this says about power, and identity, and about colonialism in particular.

Early proofs of Accidental Gods arrived on my doormat on Tuesday 2 November, the same day QAnon believers gathered in Dallas’s Dealey Plaza to await the resurrection of JFK’s son John (dead these 20 years). So: don’t sneer. This kind of thing can happen to anyone. It can happen now. It can happen here.

Men have been made divine by all manner of people, all over the world. Ranging widely across time and space, Accidental Gods is a treat for the adventurous armchair traveller, though a disconcerting one. We are reminded, with some force, that even the most sophisticated-seeming culture exists, by and large, to contain ordinary human panic in the face of an uncaring cosmos.

After the second world war, during the Allied occupation, ordinary Japanese folk plied American General Douglas MacArthur with lotus roots and dried persimmons, red beans, rice cakes, bonsai trees, walking sticks, samurai swords, deerskins, a kimono, and much else besides. These were offerings, explicitly made to a newcomer God. Now, we more often talk about them as acts of gratitude and respect. This is just ordinary decency — why would one poke fun at a land one has already nuked, defeated, and occupied? Japan’s written historical record lets us focus on the Meiji dynasty’s politics while drawing a veil over its frankly embarrassing theology.

But not everyone has such a rich political account of themselves to hide behind. In the early 1920s Hauka mediums in Niger, West Africa, were possessed by the spirits of their European conquerors. Their zombified antics were considered superstitious and backward. But were they? They managed, after all, to send up the entire French administration. (“In the absence of a pith helmet,” we are told, “they could fashion one out of a gourd”.) In the Congolese town of Kabinda, meanwhile, the wives of shamanic adepts found themselves channelling the spirits of Belgian settler wives. Their faces chalked and with bunches of feathers under their arms (“possibly to represent a purse”) they went around shrilly demanding bananas and hens.

Western eye-witnesses of these events weren’t at all dismissive; they were disturbed. One visitor, reporting to parliament in London in or before 1886, said these people were being driven mad by the experience of colonial subjection. Offerings made to a deified British soldier in Travancore, at India’s southernmost point were, according to this traveller, “an illustration of the horror in which the English were held by the natives.”

But what if the prevailing motive for the white man’s deification was “not horror or dislike, but pity for his melancholy end, dying as he did in a desert, far away from friends”? That was the contrary opinion of a visiting missionary, and he may have had a point: across the subcontinent, “the practice of deifying humans who had died in premature or tragic ways was age-old,” Subin tells us.

Might the “spirit possessed” just have been having a laugh? Again: it’s possible. In 1864, during a Māori uprising against the British, Captain P. W. J. Lloyd was killed, and his severed head became the divine conduit for the angel Gabriel, who, among other fulminations, had not one good word to say about the Church of England.

Subin shows how, by creating and worshipping powerful outsiders, subject peoples have found a way to contend with an overwhelming invading force. The deified outsider, be he a British Prince or a US general, Ethiopian emperor Haile Selassie or octogenarian poet Nathaniel Tarn, “appears on every continent on the map, at times of colonial invasion, nationalist struggle and political unrest.”

This story is as much about the colonisers as the conquered, as much about the present as the past, showing how the religious and the political shade into each other so that “politics is ever a continuation of the sacred under a new name”. Perhaps this is why Subin, while no enthusiast of Empire, takes aim less at the soldiers and settlers and missionaries – who at least took some personal risk and kept their eyes open – than at the academics back home in Europe, and in particular the intellectual followers and cultural descendants of German philologist Friedrich Max Müller, founder of the science of comparative religion. Their theories imposed, on wholly unrelated belief systems, a set of Protestant standards that, among other things, insisted on the insuperable gulf between the human and the divine. (Outside of Christian Europe, this divide hardly exists, and even Catholics have their saints.)

So Europe’s new-fangled science of religion “invented what it purported to describe”, ascribing “belief” to all manner of nuanced behaviours that expressed everything from contempt for the overlord to respect for the dead, to simple human charity. Subin quotes contemporary philosopher Bruno Latour: “A Modern is someone who believes that others believe.”

Subin sings a funeral hymn to religions that ossified. Writing about the catastrophic Partition of India along religious lines, she writes, “There was no place within this modern taxonomy for the hundreds of thousands who labeled themselves ‘Mohammedan Hindus’ on a 1911 census, or for those who worshipped the prophet Muhammad as an avatar of Vishnu.”

Accidental Gods is a playful, ironic, and ambiguous book about religion, at a time when religion – outside of Dealey Plaza – has grown as solemn as an owl. It’s no small achievement for Subin to have written something that, even as it explores the mostly grim religious dimensions of the colonial experience, does not reduce religion to politics but, to the contrary, leaves us hankering, like QAnon’s unlovely faithful, for a wider, wilder pantheon.

“A perfect storm of cognitive degradation”

Reading Johann Hari’s Stolen Focus: Why you can’t pay attention for the Telegraph, 2 January 2022

Drop a frog into boiling water, and it will leap from the pot. Drop it into tepid water, brought slowly to the boil, and the frog will happily let itself be cooked to death.

Just because this story is nonsense, doesn’t mean it’s not true — true of people, I mean, and their tendency to acquiesce to poorer conditions, just so long as these conditions are introduced slowly enough. (Remind yourself of this next time you check out your own groceries at the supermarket.)

Stolen Focus is about how our environment is set up to fracture our attention. It starts with our inability to set the notifications correctly on our mobile phones, and ends with climate change. Johann Hari thinks a huge number of pressing problems are fundamentally related, and that the human mind is on the receiving end of what amounts to a denial-of-service attack. One of Hari’s many interviewees is Earl Miller from MIT, who talks about “a perfect storm of cognitive degradation, as a result of distraction”; to which Hari adds the following, devastating gloss: “We are becoming less rational, less intelligent, less focused.”

To make such a large argument stick, though, Hari must ape the wicked problem he’s addressing: he must bring the reader to a slow boil.

Stolen Focus begins with an extended grumble about how we don’t read as many books as we used to, or buy as many newspapers, and how we are becoming increasingly enslaved to our digital devices. Why we should listen to Hari in particular, admittedly a latecomer to the “smartphones bad, books good” campaign, is not immediately apparent. His account of his own months-long digital detox — idly beachcombing the shores of Provincetown at the northern tip of Cape Cod, War and Peace tucked snugly into his satchel — is positively maddening.

What keeps the reader engaged are the hints (very well justified, it turns out) that Hari is deliberately winding us up.

He knows perfectly well that most of us have more or less lost the right to silence and privacy — that there will be no Cape Cod for you and me, in our financial precarity.

He also knows, from bitter experience, that digital detoxes don’t work. He presents himself as hardly less of a workaholic news-freak than he was before taking off to Massachusetts.

The first half of Stolen Focus got me to sort out my phone’s notification centre, and that’s not nothing; but it is, in the greater scheme of Hari’s project, hardly more than a parody of the by now very familiar “digital diet book” — the sort of book that, as Hari eventually points out, can no more address the problems filling this book than a diet book can address epidemic obesity.

Many of the things we need to do to recover our attention and focus “are so obvious they are banal,” Hari writes: “slow down, do one thing at a time, sleep more… Why can’t we do the obvious things that would improve our attention? What forces are stopping us?”

So, having had his fun with us, Hari begins to sketch in the high sides of the pot in which he finds us being coddled.

The whole of the digital economy is powered by breaks in our attention. The finest minds in the digital business are being paid to create ever-more-addictive experiences. According to former Google engineer Tristan Harris, “we shape more than eleven billion interruptions to people’s lives every day.” Aza Raskin, co-founder of the Center for Humane Technology, calls the big tech companies “the biggest perpetrators of non-mindfulness in the world.”

Social media is particularly insidious, promoting outrage among its users because outrage is wildly more addictive than real news. Social media also promotes loneliness. Why? Because lonely people will self-medicate with still more social media. (That’s why Facebook never tells you which of your friends are nearby and up for a coffee: Facebook can’t make money from that.)

We respond to the anger and fear a digital diet instils with hypervigilance, which wrecks our attention even further and damages our memory to boot. If we have children, we’ll keep them trapped at home “for their own safety”, though our outdoor spaces are safer than they have ever been. And when that carceral upbringing shatters our children’s attention (as it surely will), we stuff them with drugs, treating what is essentially an environmental problem. And on and on.

And on. The problem is not that Stolen Focus is unfocused, but that it is relentless: an unfeasibly well-supported undergraduate rant that swells — as the hands of the clock above the bar turn round and the beers slide down — to encompass virtually every ill on the planet, from rubbish parenting to climate change.

“If the ozone layer was threatened today,” writes Hari, “the scientists warning about it would find themselves being shouted down by bigoted viral stories claiming the threat was all invented by the billionaire George Soros, or that there’s no such thing as the ozone layer anyway, or that the holes were really being made by Jewish space lasers.”

The public campaign Hari wants Stolen Focus to kick-start (there’s an appendix; there’s a weblink; there’s a newsletter) involves, among other things, a citizen’s wage, outdoor play, limits on light pollution, public ownership of social media, changes in the food supply, and a four-day week. I find it hard to disagree with any of it, but at the same time I can’t rid myself of the image of how, spiritually refreshed by War and Peace, consumed in just a few sittings in a Provincetown coffee shop, Hari must (to quote Stephen Leacock) have “flung himself from the room, flung himself upon his horse and rode madly off in all directions”.

If you read just one book about how the modern world is driving us crazy, read this one. But why would you read just one?

82.8 per cent perfect

Visiting Amazonia at London’s Science Museum for the Telegraph, 13 October 2021

The much-garlanded Brazilian photographer Sebastião Salgado is at London’s Science Museum to launch a seven-plus-years-in-the-making exhibition of photographs from Amazônia — and, not coincidentally, there’s barely a fortnight to go before the 26th United Nations Climate Change Conference convenes in Glasgow.

Salgado speaks to the urgency of the moment. We must save the Amazon rainforest for many reasons, but chiefly because the world’s rainfall patterns depend on it. We should stop buying Amazonian wood; we should stop buying beef fed on Amazonian soya; we should stop investing in companies who have interests in Amazonian mining.

There are only so many ways to say these things, and only so many times a poor mortal can hear them. On the face of it, Salgado’s enormous exhibition, set to an immersive soundscape by Seventies new-age pioneer Jean-Michel Jarre, sounds more impressive than impactful. Salgado is everyone’s idea of an engaged artist — his photographs of workers at the Serra Pelada gold mine in Brazil are world-famous — but is it even in us, now, to feel more concerned about the rainforest?

Turns out that it is. Jarre’s music plays a significant part in this show, curated and designed by Sebastião’s wife Lélia Wanick Salgado. Assembled from audio archives in Geneva, it manages to be both politely ambient and often quite frightening in its dizzying assemblage of elemental roars (touches of Jóhann Jóhannsson, there), bird calls, forest sounds and human voices. And Salgado’s epic visions of the Amazon more than earn such Sturm und Drang.

This is not an exhibition about the 17.2 per cent of the rainforest that is already lost to us. It’s not about logging companies or soy farms, gold mines or cattle ranches. It’s about what’s left. Ecologically the region’s losses are catastrophic; but there’s still plenty to save and, for a photographer, plenty to see.

Here, rendered in Salgado’s exquisitely detailed, thumpingly immediate monochrome, is Anavilhanas, the world’s largest freshwater archipelago, a wetland so complex and mutable, no-one has ever been able to settle there. There are mountains, “inselbergs”, rising out of the forest like volcanic islands in some fantastical South China Sea. There are bravura performances of the developer’s art: rivers turned to tin-foil, and leaves turned to photographic grain, and rainstorms turned to atom-bomb explosions, and clouds caught at angles that reveal what they truly are: airborne rivers. As they spill over the edge of Brazil, they dump more moisture into the Atlantic than the mighty Amazon itself.

Dotted about the exhibition space are oval “forest shelters”: dwellings for intimate portraits of twelve different forest peoples. Salgado acknowledges this anthropological effort merely scratches the surface: its 192 distinct groups make Amazonia the most culturally and linguistically diverse region on the planet. Capturing and communicating that diversity conveys the scale of the region even better than those cloud shots.

The Ashaninka used to trade with the Incas. When the Spanish came, their supreme god Pawa turned all the wise men into animals to keep the region’s secrets. The highland Korubo (handy with a war club) became known as mud people, lathering themselves with the stuff against mosquitoes whenever they came down off their hill. The Zo’é place nuts in the mouths of the wild pigs they have killed so the meal can join in with its own feast. The Suruwahá quite happily consume the deadly spear-tip toxin timbó, figuring it’s better to die young and healthy (and many do).

The more we explore, the more we find it’s the profound and sometimes disturbing differences between these peoples that matter; not their surface exoticism. In the end, faced with such extraordinary diversity, we can only look in the mirror and admit our own oddness, and with it our kinship. We, too — this is the show’s deepest lesson — are, in every possible regard, like the playful, charming, touching, sometimes terrifying subjects of Salgado’s portraits, quite impossibly strange.

If this is Wednesday then this must be Thai red curry with prawns

Reading Dan Saladino’s Eating to Extinction for the Telegraph, 26 September 2021

Within five minutes of my desk: an Italian delicatessen, a Vietnamese pho house, a pizzeria, two Chinese, a Thai, and an Indian “with a contemporary twist” (don’t knock it till you’ve tried it). Can such bounty be extended over the Earth?

Yes, it can. It’s already happening. And in what amounts to a distillation of a life’s work writing about food, and sporting a few predictable limitations (he’s a journalist; he puts stories in logical order, imagining this makes an argument) Dan Saladino’s Eating to Extinction explains just what price we’ll pay for this extraordinary achievement which promises, not only to end world hunger by 2030 (a much-touted UN goal), but to make California rolls available everywhere from Kamchatka to Karachi.

The problem with my varied diet (if this is Wednesday then this must be Thai red curry with prawns) is that it’s also your varied diet, and your neighbour’s; it’s rapidly becoming the same varied diet across the whole world. You think your experience of world cuisine reflects global diversity? Humanity used to sustain itself (admittedly, not too well) on 6,000 species of plant. Now, for over three quarters of our calories, we gorge on just nine: rice, wheat and maize, potato, barley, palm oil and soy, sugar from beets and sugar from cane. The same narrowing can be found in our consumption of animals and seafood. What looks to us like the world on a plate is in fact the sum total of what’s available world-wide, now that we’ve learned to grow ever greater quantities of ever fewer foods.

Saladino is in the anecdote business; he travels the Earth to meet his pantheon of food heroes, each of whom is seen saving a rare food for our table – a red pea, a goaty cheese, a flat oyster. So far, so very Sunday supplement. Nor is there anything to snipe at in the adventures of, say, Woldemar Mammel who, searching in the attics of old farmhouses and in barns, rescued the apparently extinct Swabian “alb” lentil; nor in former chef Karlos Baca’s dedication to rehabilitating an almost wholly forgotten Native American cuisine.

That said, it takes Saladino 450 pages (which is surely a good 100 pages too many) to explain why the Mammels and Bacas of this world are needed so desperately to save a food system that, far from breaking down, is feeding more and more food to more and more people.

The thing is, this system rests on two foundations: nitrogen fertiliser, and monocropping. The technology by which we fix nitrogen from the air by an industrial process is sustainable enough, or can be made so. Monocropping, on the other hand, was a dangerous strategy from the start.

In the 1910s and 1920s the Soviet agronomist Nikolai Vavilov championed the worldwide uptake of productive strains, with every plant a clone of its neighbour. How else, but by monocropping, do you feed the world? By the 1930s though, he was assembling the world’s first seed banks in a desperate effort to save the genetic diversity of our crops — species that monocropping was otherwise driving to extinction.

Preserving heritage strains matters. They were bred over thousands of years to resist all manner of local environmental pressures, from drought to deluge to disease. Letting them die out is the genetic equivalent of burning the library at Alexandria.

But seed banks can’t hold everything (there is, as Saladino remarks, no Svalbard seed vault for chickens) and are anyway a desperate measure. Saladino’s tale of how, come the Allied invasion, the holdings of Iraq’s national seed bank at Abu Ghraib were bundled off to Tel Hadya in Syria, only then to be frantically transferred to Lebanon, itself an increasingly unstable state, sounds a lot more Blade Runner 2049 than Agronomy 101.

Better to create a food system that, while not necessarily promoting rare foods (fancy some Faroese air-fermented sheep meat? — thought not) will at least not drive such foods to extinction.

The argument is a little bit woolly here, as what the Faroe islanders get up to with their sheep is unlikely to have global consequences for the world’s food supply. Letting a crucial drought-resistant strain of wheat go extinct in a forgotten corner of Afghanistan, on the other hand, could have unimaginably dire consequences for us in the future.

Saladino’s grail is a food system with enough diversity in it to adapt to environmental change and withstand the onslaught of disease.

Is such a future attainable? Only to a point. Some wild foods are done for already because the high prices they command incentivize their destruction. If you want some of Baca’s prized and pungent bear root, native to a corner of Colorado, you’d better buy it now (but please, please don’t).

Rare cultivated foods stand a better chance. The British Middle White pig is rarer than the Himalayan snow leopard, says Saladino, but the stocks are sustainable enough that it is now being bred for the table.

Attempting to encompass the Sixth Extinction on the one hand, and the antics of slow-foodies like Mammel and Baca on the other, is a recipe for cognitive dissonance. In the end, though, Saladino succeeds in mapping the enormity of what human appetite has done to the planet.

Saladino says we need to preserve rare and forgotten foods, partly because they are part of our cultural heritage, but also, and more hard-headedly, so that we can study and understand them, crossing them with existing lines to shore up and enrich our dangerously over-simplified food system. He’s nostalgic for our lost food past (and who doesn’t miss apples that taste of apples?) but he doesn’t expect us to delete Deliveroo and spend our time grubbing around for roots and berries.

Unless of course it’s all too late. It would not take many wheat blights or avian flu outbreaks before slow food is all that’s left to eat.


The old heave-ho

The Story of Work: A New History of Humankind by Jan Lucassen, reviewed for the Telegraph 14 August 2021

“How,” asks Dutch social historian Jan Lucassen, “could people accept that the work of one person was rewarded less than that of another, that one might even be able to force the other to do certain work?”

The Story of Work is just that: a history of work (paid or otherwise, ritual or for a wage, in the home or out of it) from peasant farming in the first agrarian societies to gig-work in the post-Covid ruins of the high street, and spanning the historical experiences of working people on all five inhabited continents. The writing is, on the whole, much better than the sentence you just read, but no less exhausting. At worst, it put me in mind of the work of English social historian David Kynaston: super-precise prose stitched together to create an unreadably compacted narrative.

For all its abstractions, contractions and signposting, however, The Story of Work is full of colour, surprise and human warmth. What other social history do you know that writes off the Industrial Revolution as a net loss to music? “Just think of the noise from rattling machines that made it impossible to talk,” Lucassen writes, “in contrast to small workplaces or among larger troupes of workers who mollified work in the open air by singing shanties and other work songs.”

For 98 per cent of our species’ history we lived lives of reciprocal altruism in hunting-and-gathering clan groups. With the advent of farming and the formation of the first towns came surpluses and, for the first time, the feasibility of distributing resources unequally.

At first, conspicuous generosity ameliorated the unfairnesses. As the sixteenth-century French judge Étienne de la Boétie wrote: “theatres, games, plays, spectacles, marvellous beasts, medals, tableaux, and other such drugs were for the people of antiquity the allurements of serfdom, the price of their freedom, the tools of tyranny.” (The Story of Work is full of riches of this sort: strip off the narrative, and there’s a cracking miscellany still to enjoy.)

Lucassen diverges from the popular narrative (in which the invention of agriculture is the fount of all our ills) on several points. First, agricultural societies do not inevitably become marketplaces. Bantu-speaking agriculturalists spread across central, eastern and southern Africa between 3500 BCE and 500 CE, while maintaining perfect equality. “Agriculture and egalitarianism are compatible,” says Lucassen.

It’s not the crops, but the livestock, that are to blame for our expulsion from hunter-gatherer Eden. If notions of private property had to arise anywhere, they surely arose, Lucassen argues, among those innocent-looking shepherds and shepherdesses, whose waterholes may have been held in common but whose livestock most certainly were not. Animals were owned by individuals or households, whose success depended on them knowing every single individual in their herd.

Having dispatched the idea that agriculture made markets, Lucassen then demolishes the idea that markets made inequality. Inequality came first. It does not take much specialisation within a group before some acquire more resources than others. Managing this inequality doesn’t need anything so complex as a market. All it needs is an agreement. Lucassen turns to India, and the social ideologies that gave rise, from about 600 BC, to the Upanishads and the later commentaries on the Vedas: the evolving caste system, he says, is a textbook example of how human suffering can be explained to an entire culture’s satisfaction “without victims or perpetrators being able to or needing to change anything about the situation”.

Markets, by this light, become a way of subverting the iniquitous rhetorics cooked up by rulers and their priests. Why, then, have markets not ushered in a post-political Utopia? The problem is not to do with power. It’s to do with knowledge. Jobs used to be *hard*. They used to be intellectually demanding. Never mind the seven-year apprenticeships of Medieval Europe, what about the jobs a few are still alive to remember? Everything, from chipping slate out of a Welsh quarry to unloading a cargo boat while maintaining its trim, took what seem now to be unfeasible amounts of concentration, experience and skill.

Now, though — and even as they are getting fed rather more, and rather more fairly, than at any other time in world history — the global proletariat are being starved, by automation, of the meaning of their labour. The bloodlessness of this future is not a subject Lucassen spends a great many words on, but it informs his central and abiding worry, which is that slavery — a depressing constant in his deep history of labour — remains a constant threat and a strong future possibility. The logics of a slave economy run frighteningly close to the skin in many cultures: witness the wrinkle in the 13th Amendment of the US constitution that legalises the indentured servitude of (largely black) convicts, or the profits generated for the global garment industry by interned Uighurs in China. Automation, and its ugly sister machine surveillance, seem only to encourage such experiments in carceral capitalism.

But if workers of the world are to unite, around what banner should they gather? Lucassen identifies only two forms of social agreement that have ever reconciled us to the unfair distribution of reward. One is redistributive theocracy. “Think of classical Egypt and the pre-Columbian civilizations,” he writes, “but also of an ‘ideal state’ like the Soviet Union.”

The other is the welfare state. But while theocracies have been sustained for centuries or even millennia, the welfare state, thus far, has a shelf life of only a few decades, and is easily threatened.

Exhausted yet enlightened, any reader reaching the end of Lucassen’s marathon will understand that the problem of work runs far deeper than politics, and that the grail of a fair society will only come nearer if we pay attention to real experiences, and resist the lure of utopias.

Sod provenance

Is the digital revolution that Pixar began with Toy Story stifling art – or saving it? An article for the Telegraph, 24 July 2021

In 2011 the Westfield shopping mall in Stratford, East London, acquired a new public artwork: a digital waterfall by the Shoreditch-based Jason Bruges Studio. The liquid-crystal facets of the 12-metre-high sculpture form a subtle semi-random flickering display, as though water were pouring down its sides. Depending on the shopper’s mood, this either slakes their visual appetite, or leaves them gasping for a glimpse of real rocks, real water, real life.

Over its ten-year life, Bruges’s piece has gone from being a comment about natural processes (so soothing, so various, so predictable!) to being a comment about digital images, a nagging reminder that underneath the apparent smoothness of our media lurks the jagged line and the stair-stepped edge, the grid, the square: the pixel, in other words.

We suspect that the digital world is grainier than the real, coarser, more constricted, and stubbornly rectilinear. But this is a prejudice, and one that’s neatly punctured by a new book by electrical engineer and Pixar co-founder Alvy Ray Smith, A Biography of the Pixel. This eccentric work traces the intellectual genealogy of Toy Story (Pixar’s 1995 film, the first feature-length computer animation) over bump-maps and around occlusions, along traced rays and through endless samples, computations and transformations, back to the mathematics of the eighteenth century.

Smith’s Whig history is a little hard to take — as though, say, Joseph Fourier’s efforts in 1822 to visualise how heat passed through solids were merely a way-station on the way to Buzz Lightyear’s calamitous launch from the banister rail — but it’s a superb shorthand in which to explain the science.

We can use Fourier’s mathematics to record an image as a series of waves. (Visual patterns, patterns of light and shade and movement, “can be represented by the voltage patterns in a machine,” Smith explains.) And we can recreate these waves, and the image they represent, with perfect fidelity, so long as we have a record of the points at the crests and troughs of each wave.

The locations of these high- and low-points, recorded as numerical coordinates, are pixels. (The little dots you see if you stare far too closely at your computer screen are not pixels; strictly speaking, they’re “display elements”.)
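Smith’s claim — that the samples alone pin down the whole wave — is the Nyquist–Shannon sampling theorem, and it can be sketched in a few lines of Python (my illustration, not Smith’s; the signal and the rates are invented):

```python
import numpy as np

# A signal band-limited below 4 Hz: Fourier analysis says samples
# taken at 8 Hz or faster capture it completely.
def signal(t):
    return np.sin(2 * np.pi * 1.0 * t) + 0.5 * np.sin(2 * np.pi * 3.0 * t)

fs = 10.0                 # sample at 10 Hz, above the 8 Hz Nyquist rate
k = np.arange(200)        # 20 seconds of samples
samples = signal(k / fs)  # the "pixels": bare numbers, no little squares

# Whittaker-Shannon (sinc) interpolation rebuilds the continuous wave
# from nothing but those numbers.
def reconstruct(t):
    return np.sum(samples * np.sinc(fs * t - k))

t = 10.05  # a point between two samples, far from the window's edges
print(abs(reconstruct(t) - signal(t)))  # small: the samples recover the wave
```

The recovery works only because the toy signal really is band-limited; undersample it and the reconstruction aliases, which is the theorem’s other face.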

Digital media do not cut up the world into little squares (only crappy screens do that). They don’t paint by numbers. On the contrary, they faithfully mimic patterns in the real world.

This leads Smith to his wonderfully upside-down-sounding catch-line: “Reality,” he says, “is just a convenient measure of complexity.”

Once pixels are converted to images on a screen, they can be used to create any world, rooted in any geometry, and obeying any physics. And yet these possibilities remain largely unexplored. Almost every computer animation is shot through a fictitious “camera lens”, faithfully recording a Euclidean landscape. Why are digital animations so conservative?

I think this is the wrong question: its assumptions are faulty. I think the ability to ape reality at such high fidelity creates compelling and radical possibilities of its own.

I discussed some of these possibilities with Paul Franklin, co-founder of the SFX company DNEG, who won Oscars for his work on Christopher Nolan’s sci-fi blockbusters Interstellar (2014) and Inception (2010). Franklin says the digital technologies appearing on film sets in the past decade — from lighter cameras and cooler lights to 3-D printed props and LED front-projection screens — are positively disrupting the way films are made. They are making film sets creative spaces once again, and giving the director and camera crew more opportunities for on-the-fly creative decision making. “We used a front-projection screen on the film Interstellar, so the actors could see what visual effects they were supposed to be responding to,” he remembers. “The actors loved being able to see the super-massive black hole they were supposed to be hurtling towards. Then we realised that we could capture an image of the rotating black hole’s disc reflecting in Matthew McConaughey’s helmet: now that’s not the sort of shot you plan.”

Now those projection screens are interactive. Franklin explains: “Say I’m looking down a big corridor. As I move the camera across the screen, instead of it flattening off and giving away the fact that it’s actually just a scenic backing, the corridor moves with the correct perspective, creating the illusion of a huge volume of space beyond the screen itself.”

Effects can be added to a shot in real-time, and in full view of cast and crew. More to the point, what the director sees through their viewfinder is what the audience gets. This encourages the sort of disciplined and creative filmmaking Méliès and Chaplin would recognise, and spells an end to the deplorable industry habit of kicking important creative decisions into the long grass of post-production.

What’s taking shape here isn’t a “good enough for TV” reality. This is a “good enough to reveal truths” reality. (Gargantua, the spinning black hole at Interstellar’s climax, was calculated and rendered so meticulously, it ended up in a paper for the journal Classical and Quantum Gravity.) In some settings, digital facsimile is becoming, literally, a replacement reality.

In 2012 the EU High Representative Baroness Ashton gave a physical facsimile of the burial chamber of Tutankhamun to the people of Egypt. The digital studio responsible for its creation, Factum Foundation, has been working in the Valley of the Kings since 2001, creating ever-more faithful copies of places that were never meant to be visited. They also print paintings (by Velázquez, by Murillo, by Raphael…) that are indistinguishable from the originals.

From the perspective of this burgeoning replacement reality, much that is currently considered radical in the art world appears no more than a frantic shoring-up of old ideas and exhausted values. A couple of days ago Damien Hirst launched The Currency, a set of physical dot paintings whose digitally tokenised images can be purchased, traded, and exchanged for the real paintings.

Eventually the purchaser has to choose whether to retain the token, or trade it in for the physical picture. They can’t own both. This, says Hirst, is supposed to challenge the concept of value through money and art. Every participant is confronted with their perception of value, and how it influences their decision.

But hang on: doesn’t money already do this? Isn’t this what money actually is?

It can be no accident that non-fungible tokens (NFTs), which make bits of the internet ownable, have emerged even as the same digital technologies are actually erasing the value of provenance in the real world. There is nothing sillier, or more dated-looking, than the Neues Museum’s scan of its iconic bust of Nefertiti, released free to the public after a complex three-year legal battle. It comes complete with a copyright licence in the bottom of the bust itself — a copyright claim to the scan of a 3,000-year-old sculpture created 3,000 miles away.

Digital technologies will not destroy art, but they will erode and ultimately extinguish the value of an artwork’s physical provenance. Once facsimiles become indistinguishable from originals, then originals will be considered mere “first editions”.

Of course literature has thrived for many centuries in such an environment; why should the same environment damage art? That would happen only if art had somehow already been reduced to a mere vehicle for financial speculation. As if!


How many holes has a straw?

Reading Jordan Ellenberg’s Shape for the Telegraph, 7 July 2021

“One can’t help feeling that, in those opening years of the 1900s, something was in the air,” writes mathematician Jordan Ellenberg.

It’s page 90, and he’s launching into the second act of his dramatic, complex history of geometry (think “History of the World in 100 Shapes”, some of them very screwy indeed).
For page after reassuring page, we’ve been introduced to symmetry, to topology, and to the kinds of notation that make sense of knotty-sounding questions like “how many holes has a straw?”

Now, though, the gloves are off, as Ellenberg records the fin de siècle’s “painful recognition of some unavoidable bubbling randomness at the very bottom of things.”
Normally when sentiments of this sort are trotted out, they’re there to introduce readers to the wild world of quantum mechanics (and, incidentally, we can expect a lot of that sort of thing in the next few years: there’s a centenary looming). Quantum’s got such a grip on our imagination, we tend to forget that it was the johnny-come-lately icing on an already fairly indigestible cake.

A good twenty years before physical reality was shown to be unreliable at small scales, mathematicians were pretzeling our very ideas of space. They had no choice: at the Louisiana Purchase Exposition in 1904, Henri Poincaré, by then the world’s most famous geometer, described how he was trying to keep reality stuck together in light of Maxwell’s famous equations of electromagnetism (Maxwell’s work absolutely refused to play nicely with space). In that talk, he came startlingly close to gazumping Einstein to a theory of relativity.
Also at the same exposition was Sir Ronald Ross, who had discovered that malaria was carried by the bite of the anopheles mosquito. He baffled and disappointed many with his presentation of an entirely mathematical model of disease transmission — the one we use today to predict, well, just about everything, from pandemics to political elections.
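Ross’s mathematics survives essentially intact in today’s compartmental epidemic models. A minimal SIR (susceptible–infected–recovered) sketch in Python — my illustration, with invented rates, not Ross’s own equations:

```python
# A minimal SIR epidemic model, the lineage that runs back to Ross.
# beta (infection rate) and gamma (recovery rate) are invented here.
def sir(s, i, r, beta=0.3, gamma=0.1, dt=0.1, steps=2000):
    for _ in range(steps):
        n = s + i + r
        new_infections = beta * s * i / n * dt
        new_recoveries = gamma * i * dt
        s -= new_infections
        i += new_infections - new_recoveries
        r += new_recoveries
    return s, i, r

s, i, r = sir(s=999.0, i=1.0, r=0.0)
# With beta/gamma = 3 (the basic reproduction number, R0), a single
# case sweeps through most of a population of a thousand.
print(round(r))
```

The same three-compartment logic, suitably elaborated, is what drives modern pandemic forecasts.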
It’s hard to imagine two mathematical talks less alike than those of Poincaré and Ross. And yet they had something vital in common: both shook their audiences out of mere three-dimensional thinking.

And thank goodness for it: Ellenberg takes time to explain just how restrictive Euclidean thinking is. For Euclid, the first geometer, living in the 4th century BC, everything was geometry. When he multiplied two numbers, he thought of the result as the area of a rectangle. When he multiplied three numbers, he called the result a “solid”. Euclid’s geometric imagination gave us number theory; but tying mathematical values to physical experience locked him out of more or less everything else. Multiplying four numbers? Now how are you supposed to imagine that in three-dimensional space?

For the longest time, geometry seemed exhausted: a mental gym; sometimes a branch of rhetoric. (There’s a reason Lincoln’s Gettysburg Address characterises the United States as “dedicated to the proposition that all men are created equal”. A proposition is a Euclidean term, meaning a fact that follows logically from self-evident axioms.)

The more dimensions you add, however, the more capable and surprising geometry becomes. And this, thanks to runaway advances in our calculating ability, is why geometry has become our go-to manner of explanation for, well, everything. For games, for example: and extrapolating from games, for the sorts of algorithmic processes we saddle with that profoundly unhelpful label “artificial intelligence” (“artificial alternatives to intelligence” would be better).

All game-playing machines (from the chess player on my phone to DeepMind’s AlphaGo) share the same ghost, the “Markov chain”, formulated by Andrei Markov to map the probabilistic landscape generated by sequences of likely choices. An atheist before the Russian revolution, and treated with predictable shoddiness after it, Markov used his eponymous chain, rhetorically, to strangle religiose notions of free will in their cradle.
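Markov’s idea is simple enough to sketch in full. In this toy weather model (the states and transition probabilities are mine, invented for illustration), tomorrow depends only on today — and however the chain starts, its long-run frequencies settle to a fixed “stationary” distribution, the probabilistic landscape Markov mapped:

```python
import random

# A toy two-state Markov chain: tomorrow's weather depends only on today's.
chain = {
    "sunny": [("sunny", 0.8), ("rainy", 0.2)],
    "rainy": [("sunny", 0.4), ("rainy", 0.6)],
}

def step(state):
    states, weights = zip(*chain[state])
    return random.choices(states, weights=weights)[0]

random.seed(0)
state, sunny_days, n = "rainy", 0, 100_000
for _ in range(n):
    state = step(state)
    sunny_days += state == "sunny"

# Start it rainy, start it sunny: the long-run share of sunny days
# converges to the stationary value, two days in three.
print(sunny_days / n)
```

The game-playing machines walk vastly bigger versions of the same landscape, one likely move at a time.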

From isosceles triangles to free will is quite a leap, and by now you will surely have gathered that Shape is anything but a straight story. That’s the thing about mathematics: it does not advance; it proliferates. It’s the intellectual equivalent of Stephen Leacock’s Lord Ronald, who “flung himself upon his horse and rode madly off in all directions”.

Containing multitudes as he must, Ellenberg’s eyes grow wider and wider, his prose more and more energetic, as he moves from what geometry means to what geometry does in the modern world.

I mean no complaint (quite the contrary, actually) when I say that, by about two-thirds of the way in, Ellenberg comes to resemble his friend John Horton Conway. Of this game-playing, toy-building celebrity of the maths world, who died from COVID last year, Ellenberg writes, “He wasn’t being wilfully difficult; it was just the way his mind worked, more associative than deductive. You asked him something and he told you what your question reminded him of.”
This is why Ellenberg took the trouble to draw out a mind map at the start of his book. This and the index offer the interested reader (and who could possibly be left indifferent?) a whole new way (“more associative than deductive”) of re-reading the book. And believe me, you will want to. Writing with passion for a nonmathematical audience, Ellenberg is a popular educator at the top of his game.

Heading north

Reading Forecast by Joe Shute for the Telegraph, 28 June 2021

As a child, journalist Joe Shute came upon four Ladybird nature books from the early 1960s called What to Look For. They described “a world in perfect balance: weather, wildlife and people all living harmoniously as the seasons progress.”

Not any more: “the crisply defined seasons of my Ladybird series, neatly quartered like an apple,” he writes, “are these days a mush.”

Forecast is a book about phenology: the study of lifecycles, and how they are affected by season, location and other factors. Unlike behemothic “climate science”, phenology doesn’t issue big data sets or barnstorming visualisations. Its subject cannot be so easily metricised. How life responds to changes in the seasons, and changes in those changes, and changes in the rates of those changes, is a multidimensional study whose richness would be entirely lost if abstracted. Instead, phenology depends on countless parochial diaries describing changes on small patches of land.

Shute, who for more than a decade has used his own diary to fuel the “Weather Watch” column in the Daily Telegraph, can look back and see “where the weather is doing strange things and nature veering spectacularly off course.” Watching his garden coming prematurely to life in late winter, Shute is left “with a slightly sickly sensation… I started to sense not a seasonal cycle, but a spiral.”

Take Shute’s diary together with countless others and tabulate the findings, and you will find that all life has started shifting northwards — insects at a rate of five metres a day, some dragonflies at between 17 and 28 metres a day.

How to write about this great migration? Immediately following several affecting and quite horrifying eye-witness scenes from the global refugee crisis, Shute writes: “The same climate crisis that is rendering swathes of the earth increasingly inhospitable and driving so many young people to their deaths, is causing a similar decline in migratory bird populations.”

I’m being unkind to make a point (in context the passage isn’t nearly so wince-making), but Shute’s not the first to discover it’s impossible to speak across all scales of the climate crisis at once.

Amitav Ghosh’s 2016 The Great Derangement is canonical here. Ghosh explained in painful detail why the traditional novel can’t handle global warming. Here, Shute seems to be proving the same point for non-fiction — or at least, for non-fiction of the meditative sort.

Why doesn’t Shute reach for abstractions? Why doesn’t he reach for climate science, and for the latest IPCC report? Why doesn’t he bloviate?

No, Shute’s made of sterner stuff: he would rather go down with his coracle, stitching together a planet on fire (11 wildfires raging in the Arctic circle in July 2018), human catastrophe, bird armageddon, his and his partner’s fertility problems, and the snore of a sleeping dormouse, across just 250 pages.

And the result? Forecast is a triumph of the most unnerving sort. By the end it’s clearly not Shute’s book that’s coming unstuck: it’s us. Shute begins his book asking “what happens to centuries of folklore, identity and memory when the very thing they subsist on is changing, perhaps for good”, and the answer he arrives at is horrific: folklore, identity and memory just vanish. There is no reverse gear to this thing.

I was delighted (if that is quite the word) to see Shute nailing the creeping unease I’ve felt every morning since 2014. That was the year the Met Office decided to give storms code-names. The reduction of our once rich, allusive weather vocabulary to “weather bombs” and “thunder snow”, as though weather events were best captured in “the sort of martial language usually preserved for the defence of the realm” is Shute’s most telling measure of how much, in this emergency, we have lost of ourselves.