Return From The Stars

Here’s my foreword to the new MIT edition of Stanislaw Lem’s Return from the Stars.

For a while I lived on the penthouse level of a gherkin-shaped glass tower, five minutes from DIFC, Dubai’s international financial centre. I’d spend the day perched at my partner’s kitchen counter, or sprawled in her hammock, writing long-hand into a green notebook. In the evening I would stand at the glass wall, stir-crazy by then, fighting an ice-cream headache brought on by the air conditioning, counting lime-green Lamborghinis crawling nose-to-tail along the six lanes of Al Saada Street, far below.

Once the sun bled out into the mist, I would leave the tower and pick my way over broken and unfinished pavements to the DIFC. There were several ways into the complex; none of them resembled civic space. A narrow, wood-panelled atrium led to the foot of a single, slow escalator. It deposited me on a paved mezzanine from which several huge towers grew. There weren’t any pavements as such, just stretches of empty space between chairs and tables. There were art galleries and offices and a destination restaurant run by a car company, each unit folding into the next, giving me at one moment a feeling of exhilaration, as though this entire place was mine for the taking, and the next a sense of tremendous awkwardness, as though I’d stumbled uninvited into an important person’s private party.

I couldn’t tell from the outside which mirrored door led to a mall, which to a reception desk, which to a lobby. I would burst into every building with an aggressive confidence, penetrating as far as I could into these indeterminate spaces until I found an exit — though an exit into what, exactly? Another atrium. A bank. A water feature. A beauty salon. Eventually I would be stopped and politely redirected.

*

Of middling years, the space explorer Hal Bregg is having to contend with two kinds of ageing. The first is historical. The other, biological.

He has just returned to Earth after a gap of 127 years, and the place he once called home has turned incomprehensibly strange in the intervening time. Thanks to the dilation effect brought on by interstellar travel, only ten ship-board years have passed. But this interval has been hard on Bregg. The trauma and beauty of his harrowing missions aboard the Prometheus still haunt him, and power some of the Return’s most affecting passages.
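The arithmetic pins down just how fast the Prometheus must have been travelling. Here is a back-of-envelope check using the standard time-dilation formula (my own calculation, and a simplification: it assumes a single constant cruising speed, where Lem’s mission would have involved long accelerations):

$$\gamma = \frac{\Delta t_{\text{Earth}}}{\Delta t_{\text{ship}}} = \frac{127}{10} = 12.7, \qquad v = c\sqrt{1 - \frac{1}{\gamma^{2}}} \approx 0.9969\,c$$

In other words, Bregg’s ship would have spent its cruise within a third of one per cent of the speed of light.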

Bregg’s arrival on Earth, fresh from acclimatisation sessions on the Moon, is about as exciting as a bus ride. At the terminal there are no parades, no gawkers, no journalists. He’s a curiosity at best, and not even a rare one (there have been other ships, other crews). Arriving at the terminus, he’s just another passenger. He misses the official sent to meet him, decides he’ll find his own way about, sets off for the city, and quickly becomes lost. He can’t find the edge of the terminus. He can’t find the floor he needs. Everything seems to be rising. Is this a giant elevator? “It was hard to rest the eye on anything that was not in motion, because the architecture on all sides appeared to consist in motion alone, in change…” He decides to follow the crowd. Is this a hotel? Are these rooms? People are fetching objects from the walls. He does the same. In his hands is a coloured translucent tube, slightly warm. Is this a room key? A map? Food?

“And suddenly I felt like a monkey that has been given a fountain pen or a lighter; for an instant I was seized by a blind rage.”

Among Stanislaw Lem’s many gifts is his understanding of how the future works. The future is not a series of lucky predictions (though Lem was luckier than most at that particular craps table). The future is not wholly exotic; it is, by necessity, still chock full of the past. The future is not one place, governed by one idea. Suppose you get a handle on some major change (Return from the Stars boasts one of those — but we’ll get to “betrization” in a minute). Grasping it won’t give you some mysterious, magical insight into everything else. You’ll be armed, but you’ll still be lost. The future is bigger than you think.

The point about the future is that it’s unreadable. You recognise the language, but you can no longer speak it. The words fell out of order years ago. The vowels shifted.

“I was numb from the strain of trying not to do anything wrong. This, for four days now. From the very first moment I was invariably behind in everything that went on, and the constant effort to understand the simplest conversation or situation turned that tension into a feeling horribly like despair.”

Bregg contemplates the horrid beauty of the terminal (yes, he’s still in the terminal. Settle in: he’s going to be wandering that terminal for a very long time) and admires (if that is quite the word) its “coloured galaxies of squares, clusters of spiral lights, glows shimmering above skyscrapers, the streets: a creeping peristalsis with necklaces of light, and over this, in the perpendicular, cauldrons of neon, feather crests and lightning bolts, circles, airplanes, and bottles of flame, red dandelions made of needle-signal lights, momentary suns and haemorrhages of advertising, mechanical and violent.”

But if you think Lem will stop there, in contemplation of an overdriven New York skyline, then you need to read more Lem. The window gutters out. “You have been watching clips from newsreels of the seventies,” the air announces, “in the series Views of the Ancient Capitals.”

*

Science fiction delights in expensive real estate. Sky-high homes give its protagonists the geostationary perspective they require to scry large amounts of weird information very quickly. In cinema, Rick Deckard and his replicant protégé K are both penthouse dwellers. And in J G Ballard’s High-Rise (1975) Robert Laing eats roast dog on the balcony of, yes, a 25th-floor apartment.

The world of Return from the Stars is altogether more socially progressive. Its buildings are only partly real: their continuation is an image, “so that the people living on each level do not feel deprived. Not in any way.” The politics of this future, which teaches its infant children “the principles of tolerance, coexistence, respect for other beliefs and attitudes,” are disconcertingly familiar, and mostly admirable.

But let’s stay with the media for a moment. This is a future adept at immersing its denizens in a thoroughly mediated environment — thus an enhanced movie becomes a “real” in Michael Kandel’s savvy, punning translation. For those who know the canon, there’s a delicious hint here of what’s to come: in Lem’s The Futurological Congress (1971) 29 billion people live out lives of unremitting squalor, yet each thinks they’re a tycoon, wrapt as they are in drug-induced hallucination.

The Return’s future, by contrast, handles the physical world perfectly well, thank you. Its population are not narcotised. Not at all: they’re full of good ideas, hold down rewarding jobs, maintain close friendships, enjoy working marriages, and nurture happy children. They have all manner of things to live for. This future — “a world of tranquility, of gentle manners and customs, easy transitions, undramatic situations” — is anything but a dystopia. Why, it’s not even dull!

*

An operation, “betrization”, conducted in early childhood, renders people incapable of serious violence. Murder becomes literally inconceivable, and as a side-effect, risk loses its allure. Betrization has been universally adopted across the Earth, and it’s mandatory.

Bregg and his ancient fellows cannot help but view betrization with horror. Consequently, no-one has the stomach to force it upon them. This leaves the returning astronauts as predators in fields full of friendly, intelligent, accommodating sheep.

But why would Hal Bregg want to predate? Why would any of them? What would it gain them, beyond a spoiled conscience? Everyone on this contented Earth assumes the returnees are savages, though most are far too polite (or risk-averse) to say so. For Bregg and his fellows, it’s enraging. It’s alienating. It’s a prompt to the sort of misbehaviour that they would otherwise never dream of indulging.

Lem himself considered Return from the Stars a failure, and blamed betrization for it. As an idea, it was too on-the-nose: a melodramatic conceit that he had to keep underplaying so the story — a quiet affair about friendship and love and misunderstanding — would stay on track. But times and customs change, and history has been kind to the Return. It has become, in 2020, a better book than it was in the 1960s. This is because we have grown into the very future it predicted. Indeed, we are embroiled in precisely the kind of cultural conflict Lem said would ensue, once betrization was invented.

“Young people, betrizated, became strangers to their own parents, whose interests they did not share. They abhorred their parents’ bloody tastes. For a quarter of a century it was necessary to have two types of periodicals, books, plays: one for the old generation, one for the new.”

Western readers with experience of university life and politics over the last thirty years will surely recognise themselves in these pages, where timid passions dabble with free love, where thin skins heal in safe spaces, and intellectual gadflies navigate a landscape of extreme emotional delicacy, under constant threat of cancellation.

Will they blush, to see themselves thus reflected? I doubt it. Emulsify them as you like, kindness and a sense of humour do not mix. Anyway, the Return is not a satire, any more than it is a dystopia. It is not, when push comes to shove, a book about a world at all (for which some may read: not science fiction).

It is a book about Hal Bregg. About his impulse towards solitude and his need for company. About his deep respect for old friends, and his earnest desire for new ones. It’s about a kind and thoughtful bull in an emotional china shop, trying desperately not to rape things. It’s about men.

*

We would meet at last and order a cab and within the hour we would be sitting overlooking the Gulf in a bar fashioned to resemble the hollowed interior of a golden nugget. I remember one evening my partner chose what to order and I was handed a glass of a colourless liquid topped with a film of crude oil. I thought: Do I drink this or do I light it? And suddenly I felt like a monkey that has been given a fountain pen or a lighter; for an instant I was seized by a blind rage.

That was the evening she told me why she didn’t want to see me any more. The next day I left for London. 19 March 2017: World Happiness Day. As we turned north for the airport I noticed that the sign at the junction had changed. In place of Al Saada: “Happiness Street.”

And a ten-storey-high yellow smiley icon had been draped across the face of a government building.

To hell with the philosopause!

Reading Hawking Hawking: The Selling of a Scientific Celebrity by Charles Seife for the Spectator, 1 May 2021

I could never muster much enthusiasm for the theoretical physicist Stephen Hawking. His work, on the early universe and the nature of spacetime, was Nobel-worthy, but those of us outside his narrow community were horribly short-changed. His 1988 global best-seller A Brief History of Time was incomprehensible, not because it was difficult, but because it was bad.

Nobody, naturally, wanted to ascribe Hawking’s popular success to his rare form of Motor Neurone Disease, Hawking least of all. He afforded us no room for horror or, God forbid, pity. In 1990, asked a dumb question about how his condition might have shaped his work (because people who suffer ruinous, debilitating illnesses acquire compensating superpowers, right?) Hawking played along: “I haven’t had to lecture or teach undergraduates, and I haven’t had to sit on tedious and time-consuming committees. So I have been able to devote myself completely to research.”

The truth — that Hawking was one of the worst popular communicators of his day — is as evident as it is unsayable. A Brief History of Time was incomprehensible because after nearly five years’ superhuman effort, the author proved incapable of composing a whole book unaided. He couldn’t even do mathematics the way most people do it, by doodling, since he’d already lost the use of his hands. He could not jot notes. He could not manipulate equations. He had to turn every problem he encountered into a species of geometry, just to be able to think about it. He held his own in an impossibly rarefied profession for years, but the business of popular communication was beyond him. As was communication, in the end, according to Hawking’s late collaborator Andy Strominger: “You would talk about words per minute, and then it went to minutes per word, and then, you know, it just got slower and slower until it just sort of stopped.”

Hawking became, in the end, a computerised patchwork of hackneyed, pre-stored utterances and responses. Pull the string at his back and marvel. Charles Seife, a biographer braver than most, begins by staring down the puppet. His conceit is to tell Stephen Hawking’s story backwards, peeling back the layers of celebrity and incapacity to reveal the wounded human within.

It’s a tricksy idea that works so well, you wonder why no-one thought of it before (though ordering his material and his arguments in this way must have nearly killed the poor author).

Hawking’s greatest claim to fame is that he discovered things about black holes — still unobserved at that time — that set the two great schools of theoretical physics, quantum mechanics and relativity, at fresh and astonishingly creative loggerheads.

But a new golden era of astronomical observation dawned almost immediately after, and A Brief History was badly outdated before it even hit the shelves. It couldn’t even get the age of the universe right.

It used to be that genius that outlived its moment could reinvent itself. When new-fangled endocrine science threw Ivan Pavlov’s Nobel-winning physiology into doubt, he reinvented himself as a psychologist (and not a bad one at that).

Today’s era of narrow specialism makes such a move almost impossible but, by way of intellectual compensation, there is always philosophy — a perennially popular field more or less wholly abandoned by professional philosophers. Images of the middle-aged scientific genius indulging its philosopause in book after book about science and art, science and God, science and society and so on and so forth, may raise a wry smile, but work of real worth has come out of it.

Alas, even if Hawking had shown the slightest aptitude for philosophy (and he didn’t), he couldn’t possibly have composed it.

In our imaginations, Hawking is the cartoon embodiment of the scientific sage, effectively disembodied and above ordinary mortal concerns. In truth, life denied him a path to sagacity even as it steeped him in the spit and stew of physical being. Hawking’s libido never waned. So to hell with the philosopause! Bring on the dancing girls! Bring on the cheques, from Specsavers, BT, Jaguar, Paddy Power. (Hawking never had enough money: the care he needed was so intensive and difficult, a transatlantic air flight could set him back around a quarter of a million pounds.) Bring on the billionaires with their fat chequebooks (naifs, the lot of them, but decent enough, and generous to a fault). Bring on the countless opportunities to bloviate about subjects he didn’t understand, a sort of Prince Charles only without Charles’s efforts at warmth.

I find it impossible, having read Seife, not to see Hawking through the lens of Jacobean tragedy, warped and raging, unable even to stick a finger up at a world that could not — or, much worse, *chose* not to — understand him. Of course he was a monster, and years too late, and through a book that will anger many, I have come to love him for it.

Bacon is grey

Reading Who Poisoned Your Bacon Sandwich? by Guillaume Coudray and Hooked: How processed food became addictive by Michael Moss for the Financial Times, 21 April 2021

The story of how food arrives on our plate is a living, breathing sci-fi epic. Fertiliser produced by sucking nitrogen out of the air now sustains about half the global population. Farmers worldwide depend on the data spewing from 160 or so environmental satellite missions in low-earth orbit, not to mention literally thousands of weather satellites of various kinds.

That such a complex system is precarious hardly needs saying. It only takes one innovative product, or cheeky short-cut, to transform the health, appearance and behaviour of nations. What gastronomic historian, 50 years ago, would have predicted that China would grow fat, or that four out of five French cafés would shut up shop in a single generation?

To write about the food supply is to wrestle with problems of scale, as two new books on the subject demonstrate. To explain how we turned the green revolution of the 1960s into a global obesity pandemic in less than half a century, Michael Moss must reach beyond history entirely, and into the contested territories of evolutionary biology. Guillaume Coudray, a Paris-based investigative journalist, prefers a narrower argument, focusing on the historical accidents, and subsequent cover-ups, that even now add cancer-causing compounds to our processed meat. The industry attitudes and tactics he reveals strongly resemble those of the tobacco industry in the 1970s and 1980s.

Ably translated as Who Poisoned Your Bacon Sandwich?, Coudray’s 2017 exposé tells the story of the common additives used to cure — and, crucially, colour — processed meats. Until 1820, saltpetre (potassium nitrate; a constituent of gunpowder) was our curing agent of choice — most likely because hunters in the 16th century discovered that game birds shot with their newfangled muskets kept for longer. Then sodium nitrate appeared, and — in the mid 1920s — sodium nitrite. All three give meats a convincing colour in a fraction of the time traditional salting requires. Also, their disinfectant properties allow unscrupulous producers to operate in unsanitary conditions.

Follow basic rules of hygiene, and you can easily cure meat using ordinary table salt. But traditional meats often take upwards of a year to mature; no wonder that the 90-day hams pouring out of Chicago’s meatpacking district at the turn of the 20th century conquered the world market. Parma ham producers still use salt; most everyone else has resorted to nitrates and nitrites just to survive.

It wasn’t until the 1970s that researchers found a link between these staple curing agents and cancer. This was, significantly, also the moment industry lobbyists began to rewrite food history. The claim that we’ve been preserving meat with saltpetre for over 5,000 years is particularly inventive: back then it was used to preserve Egyptian mummies, not cure hams. Along with the massaged history came obfuscation – for instance arguments that nitrates and nitrites are not carcinogenic in themselves, even if they give rise to carcinogenic agents during processing, cooking or, um, digestion.

And when, in 2015, experts of the International Agency for Research on Cancer classified all processed meats in “group 1: carcinogenic to humans” (they can cause colorectal cancer, the second most deadly cancer we face) the doubt-mongers redoubled their efforts — in particular the baseless claim that nitrates and nitrites are our only defence against certain kinds of food poisoning.

There are alternatives. If it’s a disinfectant-cum-curing agent you’re after, organic water-soluble salts called sorbates work just fine.

Crucially, though, sorbates have no colourant effect, while nitrates and nitrites give cured meat that rosy glow. Their use is so widespread, we have clean forgotten that the natural colour of ham, pâté, wiener sausages and bacon is (deal with it) grey.

That the food industry wants to make food as attractive as possible, so that it can sell as much as possible, is, of itself, hardly news.

And in Hooked (a rather different beast to his 2013 exposé Salt Sugar Fat), American journalist Michael Moss finds that — beyond the accusations and litigations around different foodstuffs — there’s something systemically wrong with our relationship to food. US consumers now fill three-quarters of their shopping carts with processed food. Snacking now accounts for around a quarter of our daily calorie intake. Pointing the finger at Coca-Cola or McDonald’s is not going to solve the bigger problem, which Moss takes to be changes in the biology of our ancestors that have made it extremely difficult to recover healthy eating habits once they’ve run out of control.

Moss’s argument is cogent, but not simple. We have to get to grips, first, with the latest thinking on addiction, which has more or less dispensed with the idea that substances are mind-altering. Rather, they are mind-engaging, and the speed of their effect has quite as much to do with their addictive power as their pharmacology does, if not more.

By this measure, food is an incredibly powerful drug (a taste of sugar hits the brain 20 times faster than a lungful of tobacco smoke). But does it make any sense to say we’re all addicted to food?

Moss says it does — only we need to dip our toes in evolutionary biology to understand why. As primates, we have lost a long bone, called the transverse lamina, that used to separate the mouth from the nose. Consequently, we can smell food as we taste it.

No one can really explain why an enhanced appreciation of flavour gave us such a huge evolutionary advantage, but the biology is ungainsayable: we are an animal obsessed with gustatory variety. In medieval France, this obsession inspired hundreds of different sauces. Today, in my local supermarket, it manifests as 50-odd different varieties of potato chip.

The problem, Moss says, is not that food manufacturers are trying to addict us. It is that they have learned how to exploit an addiction baked into our biology.

So what’s the solution? Stop drinking anything with calories? Avoid distractions when we eat? Favour foods we have to chew? All of the above, of course — though it’s hard to see how good advice on its own could ever persuade us all to act against our own appetites.

Hooked works, in a rambunctious, shorthand sort of way. Ultimately, though, it may prove to be a transitional book for an author who is edging towards a much deeper reappraisal of the relationship between food, convenience (time, in other words), and money.

Neither Moss nor Coudray demands we take to the barricades just yet. But the pale, unappetisingly grey storm clouds of a food revolution are gathering.

“To penetrate humbly…”

Reading Beyond by Stephen Walker for the Telegraph, 18 April 2021

On 30 May 2020 US astronauts Bob Behnken and Doug Hurley flew to the International Space Station. It was the first time a crew had left the planet from US soil since 2011.

In the interim, something — not wrong, exactly, but certainly strange — had happened to space travel. Behnken and Hurley’s SpaceX-branded space suits looked like something I would throw together as a child, even down to my dad’s biking helmet and — were those Wellington boots? The stark interior of SpaceX’s Crew Dragon capsule was even more disconcerting. Poor Behnken and Hurley! They looked as if they were riding in the back of an Uber.

Well, what goes around comes around, I suppose. The capsule that carried Yuri Gagarin into space on 12 April 1961 boasted an almost ludicrously bare central panel of just four dials. Naysayers sniped that Gagarin had been a mere passenger — a human guinea pig.

By contrast, the design of the Mercury cockpit that carried America’s first astronaut into space was magnificently, and possibly redundantly, fussy, says Stephen Walker in his long and always thrilling blow-by-blow account of the United States’ and the Soviet Union’s race into orbit: “Almost every inch of it was littered with dials, knobs, indicators, lights and levers just like a ‘real’ aeroplane cockpit.”

America’s “Mercury Seven” (their one-man Mercury capsules were quickly succeeded by two-seater Geminis) were celebrities, almost absurdly over-qualified for their task of being rattled around in the nose of an intercontinental ballistic missile. Their space programme was public — and so were its indignities, like the fact that virtually everything they were being asked to do, a chimpanzee had done before them.

It drove Alan Shepard — the man fated to be the first American in space — into a rage. During one training session somebody joked, “Maybe we should get somebody who works for bananas”. The ashtray Shepard threw only just missed the joker’s head.

The Soviet Union’s space programme was secret. Not even their wives knew what the “Vanguard Six” were up to. They won no privileges. Sometimes they’d polish other people’s floors to make ends meet.

Those looking for evidence of the gimcrack quality of the Soviet space effort will find ammunition in Beyond. Contrast, for example, NASA’s capsule escape plans (involving a cherry-picker platform and an armoured vehicle) with the Soviet equivalent (involving a net and a bath tub).

But Walker’s research for this book stretches back a decade and his acknowledgements salute significant historians (Asif Siddiqi in particular), generous interviewees and a small army of researchers. He’ll not fall for such clichés. Instead, he shows how the efforts of each side in the race to space were shaped by the technology they had to hand.

Soviet hydrogen bombs were huge and heavy, and needed big, powerful rockets to carry them. Soviet space launches were correspondingly epic. The Baikonur cosmodrome in Soviet Kazakhstan — a desolate, scorpion-infested region described in Soviet encyclopaedias as “the Home of the Black Death” — was around a hundred times the size of Cape Canaveral. Its launch bunkers were buried beneath several metres of reinforced concrete and earth because, says Walker, “a rocket the size and power of the R-7 would probably have flattened the sort of surface blockhouse near the little Redstone in Cape Canaveral.”

Because the US had better (lighter, smaller) nuclear bombs, its available rocket technology was — in space-piercing terms — seriously underpowered. When Alan Shepard finally launched from Cape Canaveral on 5 May 1961, twenty-three days after Yuri Gagarin circled the earth, his flight lasted just over fifteen minutes. He splashed down in the Atlantic Ocean 302 miles from the Cape. Gagarin travelled some 26,000 miles around the planet.

The space race was the Soviets’ to lose. Once Khrushchev discovered the political power of space “firsts” he couldn’t get enough of them. “Each successive space ‘spectacular’ was exactly that,” Walker writes, “not so much part of a carefully structured progressive space programme but yet another glittering showpiece, preferably tied to an important political anniversary”. Attempts to build a co-ordinated strategy were rejected or simply ignored. This is a book as much about disappointment as triumph.

Beyond began life as a film documentary, but the newly discovered footage Walker was offered proved too damaged for use. Thank goodness he kept his notes and his nerve. This is not a field that’s starved of insight: Jamie Doran and Piers Bizony wrote a cracking biography of Gagarin called Starman in 1998; the autobiography of Soviet systems designer Boris Chertok runs to four volumes. Still, Walker brings a huge amount that is new and fresh to our understanding of the space race.

Over the desk of the Soviets’ chief designer Sergei Korolev hung a portrait of the nineteenth-century Russian space visionary Konstantin Tsiolkovsky, and with it his words: “Mankind will not stay on Earth for ever but in its quest for light and space it will first penetrate humbly beyond the atmosphere and then conquer the whole solar system.”

Beyond shows how that dream — what US aviation pioneer James Smith McDonnell called “the creative conquest of space” — was exploited by blocs committed to their substitute for war — and how, for all that, it survived.

Tally of a lost world

Reading Delicious: The evolution of flavor and how it made us human by Rob Dunn and Monica Sanchez for New Scientist, 31 March 2021

Dolphins need only hunger and a mental image of what food looks like. Their taste receptors broke long ago: they can no longer taste sweet, salty or even umami, and thrive on hunger and satisfaction alone.

Omnivores and herbivores have a more various diet, and more chances of getting things badly wrong, so they are guided by much more highly developed senses (related, even intertwined, but not at all the same) of flavour (how something tastes) and aroma (how something smells).

Evolutionary biologist Rob Dunn and anthropologist Monica Sanchez weave together what chefs now know about the experience of food, what ecologists know about the needs of animals, and what evolutionary biologists know about how our senses evolved, to tell the story of how we have been led by our noses through evolutionary history, and turned from chimpanzee-like primate precursor to modern, dinner-obsessed Homo sapiens.

Much of the work described here dovetails neatly with work described in biological anthropologist Richard Wrangham’s 2009 book Catching Fire: How cooking made us human. Wrangham argued that releasing the calories bound up in raw food by cooking it led to a cognitive explosion in the genus Homo around 1.9 million years ago.

As Dunn and Sanchez rightly point out, Wrangham’s book was not short of a speculation or two: there is, after all, no evidence of fire-making this far back. Still, they incline very much to Wrangham’s hypothesis. There’s no firm evidence of hominins fermenting food at this time, either — indeed, it’s hard to imagine what such evidence would even look like. Nonetheless, the authors are convinced it took place.

Where Wrangham focused on fire, Dunn and Sanchez are more interested in other forms of basic food processing: cutting, pounding and especially fermenting. The authors make a convincing, closely argued case for their perhaps rather surprising contention that “fermenting a mastodon, mammoth, or a horse so that it remains edible and is not deadly appears to be less challenging than making fire.”

“Flavor is our new hammer,” the authors admit, “and so we are probably whacking some shiny things here that aren’t nails.” It would be all too easy, out of a surfeit of enthusiasm, for them to distort their readers’ impressions of a new and exciting field, tracing the evolution of flavour. Happily, Dunn and Sanchez are thoroughly scrupulous in the way they present their evidence and their arguments.

As primates, our experience of aroma and flavour is unusual, in that we experience retronasal aromas — the aromas that rise up from our mouths into the backs of our noses. This is because we have lost a long bone, called the transverse lamina, that helps to separate the mouth from the nose. This loss had huge consequences for olfaction, enabling humans to search out convoluted tastes and aromas so complex, we have to associate them with memories in order to individually categorise them all.

The story of how Homo sapiens developed such a sophisticated palate is also, of course, the story of how it contributed to the extinction of hundreds of the largest, most unusual animals on the planet. (Delicious is a charming book, but it does have its melancholy side.)

To take one dizzying example, the Clovis peoples of North America — direct ancestors of roughly 80 per cent of all living native populations in North and South America — definitely ate mammoths, mastodons, gomphotheres, bison and giant horses; they may also have eaten Jefferson’s ground sloths, giant camels, dire wolves, short-faced bears, flat-headed peccaries, long-headed peccaries, tapirs, giant llamas, giant bison, stag moose, shrub-ox, and Harlan’s Muskox.

“The Clovis menu,” the authors write, “if written on a chalkboard, would be a tally of a lost world.”

We may never have a pandemic again

Reading The Code Breaker, Walter Isaacson’s biography of Jennifer Doudna, for the Telegraph, 27 March 2021

In a co-written account of her work published in 2017, biochemist Jennifer Doudna creates a system that can cut and paste genetic information as simply as a word processor can manipulate text. Having conceived a technology that promises to predict, correct and even enhance a person’s genetic destiny she says, not without cause, “I began to feel a bit like Doctor Frankenstein.”

When it comes to breakthroughs in biology, references to Mary Shelley are irresistible. One of Walter Isaacson’s minor triumphs, in a book not short of major triumphs, is that, over 500 pages, he mentions that over-quoted, under-read novel less than half a dozen times. In biotechnology circles, this is probably a record.

We explain science by telling stories of discovery. It’s a way of unpacking complicated ideas in narrative form. It’s not really history, or if it is, it’s whig history, defined by a young Herbert Butterfield in 1931 as “the tendency… to praise revolutions provided they have been successful, to emphasise certain principles of progress in the past and to produce a story which is the ratification if not the glorification of the present.”

To explain the science, you falsify the history. So all discoverers and inventors are heroes on the Promethean (or Frankensteinian) model, working in isolation, taking the whole weight of the world on their shoulders!

Alas, the reverse is also true. Telling the true history of discovery makes the science very difficult to unpack. And though Walter Isaacson, whose many achievements include a spell as CEO of the Aspen Institute, clearly knows his science, his account of the most significant biological breakthrough since understanding the structure of DNA is not the very best account of CRISPR out there. His folksy cajoling — inviting us to celebrate “wily bacteria” and the “plucky little molecule” RNA — suggests exasperation. Explaining CRISPR is *hard*.

The Code Breaker excels precisely where, having read Isaacson’s 2011 biography of Steve Jobs, you might expect it to excel. Isaacson understands that all institutions are political. Every institutional activity — be it blue-sky research into the genome, or the design of a consumer product — is a species of political action.

The politics of science is uniquely challenging, because its standards of honesty, precision and rigour stretch the capabilities of language itself. Again and again, Doudna’s relationships with rivals, colleagues, mentors and critics are seen to hang on fine threads of contested interpretation. We see that Doudna’s fiercest rivalry, with Feng Zhang of the Broad Institute of MIT and Harvard, was conducted in an entirely ethical manner — and yet we see both of them stumbling away, bloodied.

Isaacson’s style of biography — already evident in his appreciations of Einstein and Franklin and Leonardo — can be dubbed “qualified hagiography”. He’s trying to hit a balance between the kind of whig history that will make complex materials accessible, and the kind of account that will stand the inspection of academic historians. His heroes’ flaws are explored, but their heroism is upheld. It’s a structural device, and pick at it however you want, it makes for a rattlingly good story.

Jennifer Doudna was born in 1964 and grew up on Big Island, Hawaii. Inspired by an old paperback copy of The Double Helix by DNA pioneer James Watson, she devoted her life to understanding the chemistry of living things. Over her career she championed DNA’s smaller, more active cousin RNA, which brought to her notice a remarkable mechanism, developed by single-celled organisms in their 3.1-billion-year war with viruses. Each of these cells used RNA to build its very own immune system.

Understanding that mechanism was Doudna’s triumph, shared with her colleague Emmanuelle Charpentier; both conspicuously deserved the Nobel prize awarded them last year.

Showing that this mechanism worked in cells like our own, though, would change everything, including our species’ relationship with its own evolution. This technology has the power to eradicate both disease (good) and ordinary human variety (really not so good at all).

In 2012, the year of the great race, Doudna’s Berkeley lab knew nothing like enough about working with human cells. Zhang’s lab knew nothing like enough about the biochemical wrinkles that drove CRISPR. Their rivalrous decision not to pool CRISPR-Cas9 intellectual property would pave the way for an epic patent battle.

COVID-19 has changed all that, ushering in an extraordinary cultural shift. Led by Doudna and Zhang, last year most academic labs declared that their discoveries would be made available to anyone fighting the virus. New on-line forums have blossomed, breaking the stranglehold of expensive paywall-protected journals.

Doudna’s lab and others have developed home testing kits for COVID-19 that have a potential impact beyond this one fight, “bringing biology into the home,” as Isaacson writes, “the way that personal computers in the 1970s brought digital products and services… into people’s daily lives and consciousness.”

Meanwhile genetic vaccines — like the mRNA vaccines developed for COVID-19 by Moderna and BioNTech/Pfizer — portend a sudden shift of the evolutionary balance between human beings and viruses. Moderna’s chair Noubar Afeyan is punchy about the prospects: “We may never have a pandemic again,” he says.

The Code Breaker catches us at an extraordinary moment. Isaacson argues with sincerity and conviction that, blooded by this pandemic, we should now grasp the nettle, make a stab at the hard ethical questions, and apply Doudna’s Promethean knowledge, now, and everywhere, to help people. Given the growing likelihood of pandemics, we may not have a choice.


Reality trumped

Reading You Are Here: A field guide for navigating polarized speech, conspiracy theories, and our polluted media landscape by Whitney Phillips and Ryan M. Milner (MIT Press) for New Scientist, 3 March 2021

This is a book about pollution, not of the physical environment, but of our civic discourse. It is about disinformation (false and misleading information deliberately spread), misinformation (false and misleading information inadvertently spread), and malinformation (information with a basis in reality spread pointedly and specifically to cause harm).

Communications experts Whitney Phillips and Ryan M. Milner completed their book just prior to the US presidential election that replaced Donald Trump with Joe Biden. That election, and the seditious activities that prompted Trump’s second impeachment, have clarified many of the issues Phillips and Milner have gone to such pains to explore. Though events have stolen some of their thunder, You Are Here remains an invaluable snapshot of our current social and technological problems around news, truth and fact.

The authors’ US-centric (but universally applicable) account of “fake news” begins with the rise of the Ku Klux Klan. Its deliberately silly name, cartoonish robes, and quaint routines (which accompanied all its activities, from rallies to lynchings) prefigured the “only-joking” subcultures (Pepe the Frog and the like) dominating so much of our contemporary social media. Next, an examination of the Satanic panics of the 1980s reveals much about the birth and growth of conspiracy theories. The authors’ last act is an unpicking of QAnon — a current far-right conspiracy theory alleging that a secret cabal of cannibalistic Satan-worshippers plotted against former U.S. president Donald Trump. This brings the threads of their argument together in a conclusion all the more apocalyptic for being so closely argued.

Polluted information is, they argue, our latest public health emergency. By treating the information sphere as an ecology under threat, the authors push past factionalism to reveal how, when we use media, “the everyday actions of everyone else feed into and are reinforced by the worst actions of the worst actors”.

This is their most striking takeaway: that the media machine that enabled QAnon isn’t a machine out of alignment, or out of control, or somehow infected: it’s a system working exactly as designed — “a system that damages so much because it works so well”.

This media machine is founded on principles that, in and of themselves, seem only laudable. Top of the list is the idea that to counter harms, we have to call attention to them: “in other words, that light disinfects”.

This is a grand philosophy, for so long as light is hard to generate. But what happens when the light — the confluence of competing information sets, depicting competing realities — becomes blinding?

Take Google as an example. Google is an advertising platform that makes money the more its users use the internet to “get to the bottom of things”. The deeper the rabbit-holes go, the more money Google makes. This sets up a powerful incentive for “conspiracy entrepreneurs” to produce content, creating “alternative media echo-systems”. When the facts run out, create alternative facts. “The algorithm” (if you’ll forgive this reviewer’s dicey shorthand) doesn’t care. “The algorithm” is, in fact, designed to serve up as much pollution as possible.

What’s to be done? Here the authors hit a quite sizeable snag. They claim they’re not asking “for people to ‘remain civil’”. They claim they’re not commanding us, “don’t feed the trolls.” But so far as I could see, this is exactly what they’re saying — and good for them.

With the machismo typical of the social sciences, the authors call for “foundational, systematic, top-to-bottom change,” whatever that is supposed to mean, when what they are actually advocating is a sense of personal decency, a contempt for anonymity, a willingness to stand by what one says come hell or high water, politeness and consideration, and a willingness to listen.

These are not political ideas. These are qualities of character. One might even call them virtues, of a sort that were once particularly prized by conservatives.

Phillips and Milner bemoan the way market capitalism has swallowed political discourse. They teeter on a much more important truth: that politics has swallowed our moral discourse. Social media has made whining cowards of us all. You Are Here comes dangerously close to saying so. If you listen carefully, there’s a still, small voice hidden in this book, telling us all to grow up.

A Faustian bargain, freely made

Reading The Rare Metals War by Guillaume Pitron for New Scientist, 27 January 2021

We reap seven times as much energy from the wind, and 44 times as much energy from the sun, as we did just a decade ago. Is this good news? Guillaume Pitron, a journalist and documentary-maker for French television, is not sure.

He’s neither a climate sceptic, nor a fan of inaction. But as the world begins to adopt a common target of net-zero carbon emissions by 2050, Pitron worries that we’re becoming selectively blind to the costs that effort will incur. His figures are stark. Changing our energy model means doubling rare metal production approximately every fifteen years, mostly to satisfy our demand for non-ferrous magnets and lithium-ion batteries. “At this rate,” says Pitron, “over the next thirty years we will need to mine more mineral ores than humans have extracted over the last 70,000 years.”
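A back-of-envelope unpacking of that claim (the arithmetic here is mine, not Pitron’s): if annual production doubles every fifteen years, it grows as $p(t) = p_0 \cdot 2^{t/15}$, quadrupling in thirty years. The cumulative tonnage mined over those thirty years is then

$$\int_0^{30} p_0 \cdot 2^{t/15}\,dt = \frac{15\,p_0}{\ln 2}\left(2^{2} - 1\right) \approx 65\,p_0,$$

roughly sixty-five years’ worth of today’s production compressed into three decades. Against that, out-mining the last 70,000 years starts to look like plain arithmetic.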

Before the Renaissance, humans had found a use for just seven metals. Over the course of the industrial revolution, this number increased to just a dozen. Today, we’ve found uses for all 86 of them, and some of them are very rare indeed. For instance, neodymium and gallium are found in iron ore, but there’s 1,200 times less neodymium and up to 2,650 times less gallium than there is iron.

Zipping from an abandoned Mountain Pass mine in the Mojave Desert to the toxic lakes and cancer villages of Baotou in China, Pitron weighs the terrible price paid for refining such materials, ably blending his investigative journalism with insights from science, politics and business.

There are two sides to Pitron’s story, woven seamlessly together. First there’s the economic story, of how the Chinese government elected to dominate the global energy and digital transition, so that it now controls 95 per cent of the rare metals market, manufacturing between 80 and 90 per cent of the batteries for electric vehicles, and over half the magnets used in wind turbines and electric motors.

Then there’s the ecological story in which, to ensure success, China took on the West’s own ecological burden. Now 10 per cent of its arable land is contaminated by heavy metals, 80 per cent of its ground water is unfit for consumption and 1.6 million people die every year due to air pollution alone (a recent paper in The Lancet reckons only 1.24 million people die each year — but let’s not quibble).

China’s was a Faustian bargain, freely entered into, but it would not have been possible had Europe and the rest of the Western world not outsourced their own industrial activities, creating a world divided, as Pitron memorably describes it, “between the dirty and those who pretend to be clean”.

The West’s economic comeuppance is now at hand, as its manufacturers, starved of the rare metals they need, are coerced into taking their technologies to China. And we in the West really should have seen this coming: how our reliance on Chinese raw materials would quickly morph into a reliance on China for the very technologies of the energy and digital transition. (Pitron tells us that without magnets produced by China’s ChengDu Magnetic Material Science & Technology Company, the United States’ F-35 fifth-generation stealth fighter cannot fly.)

By 2040, in our pursuit of ever-greater connectivity and a cleaner atmosphere, we will need to mine three times more rare earths, five times more tellurium, twelve times more cobalt, and sixteen times more lithium than we do today. China’s ecological ruination and its global technological dominance advance in lockstep, unstoppably — unless we start mining for rare metals ourselves — in the United States, Brazil, Russia, South Africa, Thailand, Turkey, and in the “dormant mining giant” of Pitron’s native France.

Better, says Pitron, that we attain some small shred of supply security, and start mining our own land. At least if mining takes place in the backyards of vocal First World consumers, they can agitate for (and pay for) cleaner processes. And nothing will change “so long as we do not experience, in our own backyards, the full cost of attaining our standard of happiness.”

Seventy minutes of concrete

Watching Last and First Men (2020) directed by Jóhann Jóhannsson for New Scientist

“It’s a big ask for people to sit for 70 minutes and look at concrete,” mused the Icelandic composer Jóhann Jóhannsson, about his first and only feature-length film. He was still working on Last and First Men at the time of his death, aged 48, in February 2018.

Admired in the concert hall for his subtle, keening orchestral pieces, Jóhann Jóhannsson was well known for his film work: Prisoners (2013) and Sicario (2015) are made strange by his sometimes terrifying, thumping soundtracks. Arrival (2016) — about the visitation of aliens whose experience of time proves radically different to our own — inspired a yearning, melancholy score that is, in retrospect, a kind of blockbuster-friendly version of Last and First Men. (It’s worth noting that all three films were directed by Denis Villeneuve, himself no stranger to the aesthetics of concrete — witness 2017’s Blade Runner 2049.)

Jóhannsson’s Last and First Men is, by contrast, contemplative and surreal. It’s no blockbuster. A series of zooms and tracking shots against eerie architectural forms, mesmerisingly shot in monochrome 16mm by Norwegian cinematographer Sturla Brandth Grøvlen, it draws its inspiration and its script (a haunting, melancholy, sometimes chilly off-screen monologue performed by Tilda Swinton) from the 1930 novel by British philosopher William Olaf Stapledon.

Stapledon’s day job — lecturing on politics and ethics at the University of Liverpool — seems now of little moment, but his science fiction novels have never been out of print, and continue to set a dauntingly high bar for successors. Last and First Men is a history of the solar system across two billion years, detailing the dreams and aspirations, achievements and failings of 17 different kinds of future Homo (not including sapiens).

In the light of our ageing sun, these creatures evolve, blossom, speciate, and die, and it’s in the final chapters, and the melancholy moment of humanity’s ultimate extinction, that Jóhannsson’s film is set. Last and First Men is not a drama. There are no actors. There is no action. Mind you, it’s hard to see how any attempt to film Stapledon’s future history could work otherwise. It’s not really a novel; more a haunting academic paper from the beyond.

The idea to use passages from the book came quite late in Jóhannsson’s project, which began life as a film essay on (and this is where the concrete comes in) the huge, brutalist war memorials, called spomeniks, erected across the former Yugoslavia between the 1960s and the 1980s.

“Spomeniks were commissioned by Marshal Tito, the dictator and creator of Yugoslavia,” Jóhannsson explained in 2017 when the film, accompanied by a live rendition of an early score, was screened at the Manchester International Festival. “Tito constructed this artificial state, a Utopian experiment uniting the Slavic nations, with so many differences of religion. The spomeniks were intended as symbols of unification. The architects couldn’t use religious iconography, so instead, they looked to prehistoric, Mayan and Sumerian art. That’s why they look so alien and otherworldly.”

Swinton’s cool, regretful monologue proves an ideal foil for the film’s architectural explorations, lifting what would otherwise be a stunning but slight art piece into dizzying, speculative territory: the last living human, contemplating the leavings of two billion years of human history.

The film was left unfinished at Jóhannsson’s death; it took his friend, the Berlin-based composer and sound artist Yair Elazar Glotman, about a year to realise Jóhannsson’s scattered and chaotic notes. No-one, hearing the story of how Last and First Men was put together, would imagine it would ever amount to anything more than a tribute piece to the composer.

Sometimes, though, the gods are kind. This is a hugely successful science fiction film, wholly deserving of a place beside Tarkovsky’s Solaris and Kubrick’s 2001. Who knew that staring at concrete, and listening to the end of humanity, could wet the watcher’s eye, and break their heart?

It is a terrible shame that Jóhannsson did not live to see his hope fulfilled; that, in his own words, “we’ve taken all these elements and made something beautiful and poignant. Something like a requiem.”


More than the naughty world deserves

Reading Wikipedia @ 20, edited by Joseph Reagle and Jackie Koerner, for the Telegraph, 10 January 2021

In 2005 the US talk show host and comedian Stephen Colbert coined “truthiness”, one of history’s more chilling buzzwords. “We’re not talking about truth,” he declared, “we’re talking about something that seems like truth — the truth we want to exist.”

Colbert thought the poster-boy for our disinformation culture would be Wikipedia, the open-source internet encyclopedia started, more or less as an afterthought, by Jimmy Wales and Larry Sanger in 2001. If George Washington’s ownership of slaves troubles you, Colbert suggested “bringing democracy to knowledge” by editing his Wikipedia page.

Thirteen years later The Atlantic was calling Wikipedia “the last bastion of shared reality in Trump’s America”. Yes, its coverage is lumpy, idiosyncratic, often persnickety, and not terribly well written. But it’s accurate to a fault, extensive beyond all imagining, and energetically policed. (Wikipedia nixes toxic user content within minutes. Why can’t YouTube? Why can’t Twitter?)

Editors Joseph Reagle and Jackie Koerner — both energetic Wikipedians — know better than to go hunting for Wikipedia’s secret sauce. (A community adage goes that Wikipedia always works better in practice than in theory.) They neither praise nor blame Wikipedia for what it has become, but — and this comes across very strongly indeed — they love it with a passion. The essays they have selected for this volume (you can find the full roster of contributions on-line) reflect, always readably and almost always sympathetically, on the way this utopian project has bedded down in the flaws of the real world.

Wikipedia says it exists “to benefit readers by acting as an encyclopedia, a comprehensive written compendium that contains information on all branches of knowledge”. Improvements are possible. Wikipedia is shaped by the way its unvetted contributors write about what they know and delete what they do not. That women represent only about 12 per cent of the editing community is, then, not ideal.

Harder to correct is the wrinkle occasioned by language. Wikipedias written in different languages are independent of each other. There might not be anything actually wrong, but there’s certainly something screwy about the way India, Australia, the US and the UK and all the rest of the Anglophone world share a single English-language Wikipedia, while only the Finns get to enjoy the Finnish one. And it says something (obvious) about the unevenness of global development that Hindi speakers (the third largest language group in the world) read a Wikipedia that’s 53rd in a ranking of size.

To encyclopedify the world is an impossible goal. Surely the philosophes of eighteenth-century France knew that much when they embarked on their Encyclopédie. Paul Otlet’s Universal Repertory and H. G. Wells’s World Brain were similarly quixotic.

Attempting to define Wikipedia through its intellectual lineage may, however, be to miss the point. In his stand-out essay “Wikipedia As A Role-Playing Game” Dariusz Jemielniak (author of the first ethnography of Wikipedia, Common Knowledge?, in 2014) stresses the playfulness of the whole enterprise. Why else, he asks, would academics avoid it? “When you are a soldier, you do not necessarily spend your free time playing paintball with friends.”

Since its inception, pundits have assumed that it’s Wikipedia’s reliance on the great mass of unwashed humanity — sorry, I mean “user-generated content” — that will destroy it. Contributor Heather Ford, a South African open source activist, reckons it’s not its creators that will eventually ruin Wikipedia but its readers — specifically, data aggregation giants like Google, Amazon and Apple, who fillet Wikipedia content and disseminate it through browsers like Chrome and personal assistants like Alexa and Siri. They have turned Wikipedia into the internet’s go-to source of ground truth, inflating its importance to an unsustainable level.

Wikipedia’s entries are now like swords of Damocles, suspended on threads over the heads of every major commercial and political actor in the world. How long before the powerful find a way to silence this capering non-profit fool, telling motley truths to power? As Jemielniak puts it, “A serious game that results in creating the most popular reliable knowledge source in the world and disrupts existing knowledge hierarchies and authority, all in the time of massive anti-academic attacks — what is there not to hate?”

Though one’s dislike of Wikipedia needn’t spring from principles or ideas or even self-interest. Plain snobbery will do. Wikipedia has pricked the pretensions of the humanities like no other cultural project. Editor Joseph Reagle discovered as much ten years ago in email conversation with founder Jimmy Wales (a conversation that appears in Good Faith Collaboration, Reagle’s excellent, if by now slightly dated study of Wikipedia). “One of the things that I noticed,” Wales wrote, “is that in the humanities, a lot of people were collaborating in discussions, while in programming… people weren’t just talking about programming, they were working together to build things of value.”

This, I think, is what sticks in the craw of so many educated naysayers: that while academics were busy paying each other for the eccentricity of their beautiful opinions, nerds were out in the world winning the culture wars; that nerds stand ready on the virtual parapet to defend us from truthy, Trumpist oblivion; that nerds actually kept the promise held out by the internet, and turned it into the fifth biggest site on the Web.

Wikipedia’s guidelines to its editors include “Assume Good Faith” and “Please Do Not Bite the Newcomers.” This collection suggests to me that this is more than the naughty world deserves.