Tally-ho!

Reading Sentient by Jackie Higgins for the Times, 19 June 2021

In May 1971 a young man from Portsmouth, Ian Waterman, lost all sense of his body. He wasn’t just numb. A person has a sense of the position of their body in space. In Waterman, that sense fell away, mysteriously and permanently.

Waterman, now in his seventies, has learned to operate his body rather as the rest of us operate a car. He has executive control over his movements, but no very intimate sense of what his flesh is up to.

What must this be like?

In a late chapter of her epic account of how the senses make sense, and exhibiting the kind of left-field thinking that makes for great TV documentaries, writer-director-producer Jackie Higgins goes looking for answers among the octopuses.

The octopus’s brain, you see, has no fine control over its arms. They pretty much do their own thing. They do, though, respond to the occasional high-level executive order. “Tally-ho!” cries the brain, and the arms gallop off, the brain in no more (or less) control of its transport system than a girl on a pony at a gymkhana.

Is being Ian Waterman anything like being an octopus? Attempts to imagine our way into other animals’ experiences — or other people’s experience, for that matter — have for a long time fallen under the shadow of an essay written in 1974 by American philosopher Thomas Nagel.

“What Is It Like to Be a Bat?” wasn’t about bats so much as about consciousness (continuity of). I can (with enough tequila inside me) imagine what it would be like for me to be a bat. But that’s not the same as knowing what it’s like for a bat to be a bat.

Nagel’s lesson in gloomy solipsism is all very well in philosophy. Applied to natural history, though — where even a vague notion of what a bat feels like might help a naturalist towards a moment of insight — it merely sticks the perfect in the way of the good. Every sparky natural history writer cocks a snook at poor Nagel whenever the opportunity arises.

Advances in media technology over the last twenty years (including, for birds, tiny monitor-stuffed backpacks) have deluged us in fine-grained information about how animals behave. We now have a much better idea of what (and how) they feel.

Now, you can take this sort of thing only so far. The mantis shrimp (not a shrimp at all, but a stomatopod) has up to sixteen kinds of narrow-band photoreceptor, each tuned to a different wavelength of light! Humans have only three. Does this mean that the mantis shrimp enjoys better colour vision than we do?

Nope. The mantis shrimp is blind to colour in the human sense of the word, perceiving only wavelengths. The human brain, meanwhile, by comparing the relative intensities reported by its three receptor types, distinguishes between millions of colours. (Some women have four colour receptors, which is why you should never argue with a woman about which curtains match the sofa.)

What about the star-nosed mole, whose octopus-like head is a mass of feelers? (Relax: it’s otherwise quite cute, and its star is only about a centimetre across.) Its weird nose is sensitive: it gathers as much information about what it touches as a rodent’s eye gathers about what it sees. This makes the star-nosed mole the fastest hunter we know of, identifying and capturing prey (worms) in literally less than an eyeblink.

What can such a creature tell us about our own senses? A fair bit, actually. That nose is so sensitive that the mole’s visual cortex is used to process the information. It literally sees through its nose.

But that turns out not to be so very strange: Braille readers, for example, really do read through their fingertips, harnessing their visual cortex to the task. One veteran researcher, the late Paul Bach-y-Rita, built prosthetic eyes from the late 1960s onwards, using glorified pin-art machines to (literally) impress the visual world upon his volunteers’ backs, chests, even their tongues.

From touch to sound: in the course of reading about bats, I learned that blind people have been using echolocation for years, especially when it rains (more auditory information, you see); researchers are only now getting a measure of their abilities.

How many senses are there that we might not have noticed? Over thirty, it seems, all served by dedicated receptors, and many of them eluding our consciousness entirely. (We may even share the magnetic sense enjoyed by migrating birds! But don’t get too excited. Most mammals seem to have this sense. Your pet dog prefers to pee with its body aligned to magnetic north.)

This embarrassment of riches leaves Higgins having to decide what to include and what to leave out. There’s a cracking chapter here on how animals sense time, and some exciting details about a sense of touch common to social mammals: one that responds specifically to cuddling.

On the other hand there’s very little about our extremely rare ability to smell what we eat while we eat it. This retronasal olfaction gives us a palate unrivalled in the animal kingdom, capable of discriminating between nearly two trillion savours: an ability which has all kinds of implications for memory and behaviour.

Is this a problem? Not at all. For all that it’s stuffed with entertaining oddities, Sentient is not a book about oddities, and Higgins’s argument, though colourful, is rigorous and focused. Over 400 exhilarating pages, she leads us to adopt an entirely unfamiliar way of thinking about the senses.

Because their mechanics are fascinating and to some degree reproducible (the human eye is, mechanically speaking, very much like a camera), we grow up thinking of the senses as mechanical outputs.

Looking at our senses this way, however, is rather like studying fungi but only looking at the pretty fruiting bodies. The real magic of fungi is their networks. And the real magic of our senses is the more than 100 billion nerve cells in each human nervous system — greater, Higgins says, than the number of stars in the Milky Way.

And that vast complexity — adapting to reflect and organise the world, not just over evolutionary time but also over the course of an individual life — gives rise to all kinds of surprises. In some humans, the ability to see with sound. In vampire bats (who can sense the location of individual veins to sink their little fangs into), the ability to detect heat using receptors that in most other mammals are used to detect acute pain.

In De Anima, the ancient philosopher Aristotle really let the side down in listing just five senses. No one expects him to have spotted exotica like cuddlesomeness and where-to-face-when-you-pee. But what about pain? What about balance? What about proprioception?

Aristotle’s restrictive and mechanistic list left him, and generations after him, with little purchase on the subject. Insights have been hard to come by.

Aristotle himself took one look at the octopus and declared it stupid.

Let’s see him driving a car with eight legs.

Variation and brilliance

Reading Barnabas Calder’s Architecture: from prehistory to climate emergency for New Scientist, 9 June 2021

For most of us, buildings are functional. We live, work, and store things in them. They are as much a part of us as the nest is a part of a community of termites.

And were this all there was to say about buildings, architectural historian Barnabas Calder might have found his book easier to write. Calder wants to ask “how humanity’s access to energy has shaped the world’s buildings through history.” And had his account remained so straightforward, we might have ended up with an eye-opening mathematical description of the increase in the energy available for work — derived first from wood, charcoal and straw, then from coal, then from oil — and of how it first transformed, and (because of global warming) now threatens, our civilisation.

And sure enough the book is full of startling statistics. (Fun fact: the charcoal equivalent of today’s cement industry would have to cover an area larger than Australia in coppiced timber.)

But of course, buildings aren’t simply functional. They’re aspirational acts of creative expression. However debased it might seem, the most ordinary structure is a work of a species of artist, and to get built at all it must be bankrolled by people who are (at least relatively) wealthy and powerful. This was as true of the buildings of Uruk (our first known city, founded in what is now Iraq around 3200 BCE) as it is of the buildings of Shenzhen (in 1980 a Chinese fishing hamlet, today a city of nearly 13 million people).

While the economics of the built environment are crucially important, then, they don’t really make sense without the sociology, and even the psychology, especially when it comes to “the mutual stirring of hysteria between architect and client” that gave us St Peter’s Basilica in the 16th century and Chengdu’s New Century Global Center (currently the world’s biggest building) in the 21st.

Calder knows this: “What different societies chose to do with [their] energy surplus has produced endless variation and brilliance,” he says. So if sometimes his account seems to wander, this is why: architecture itself is not a wholly economic activity, and certainly not a narrowly rational one.

At the end of an insightful and often impassioned journey through the history of buildings, Calder does his level best to explain how architecture can address the climate emergency. But his advice and encouragement vanish under the enormity of the crisis. The construction and running of buildings account for 39 per cent of all human greenhouse gas emissions. Concrete is the most used material on Earth after water. And while there is plenty of “sustainability” talk in the construction sector, Calder finds precious little sign of real change. We still demolish too often, and build too often, using unsustainable cement, glass and steel.

It may be that solutions are out there, but are simply invisible. The history of architecture is curiously incomplete, as Calder himself acknowledges, pointing out that “entire traditions of impressive tent-like architecture are known mainly from pictures rather than physical remnants.”

Learning to tread more lightly on the earth means exactly that: a wholly sustainable architecture wouldn’t necessarily show up in the archaeological record. The remains of pre-fossil fuel civilisations can, then, only offer us a partial guide to what our future architecture should look like.

Perhaps we should look to existing temporary structures — to refugee camps, perhaps. The idea may be distressing, but fashions change.

Calder’s long love-poem to buildings left me, rather paradoxically, thinking about the Mongols of the 13th century, for whom a walled city was a symbol of bondage and barbarism.

They would have no more settled in a fixed house than they would have submitted to slavery. And their empire, which covered 23 million square kilometres, demolished more architecture than it raised.

Nothing happens without a reason

Reading Journey to the Edge of Reason: The Life of Kurt Gödel by Stephen Budiansky for the Spectator, 29 May 2021

The 20th-century Austrian mathematician Kurt Gödel did his level best to live in the world as his philosophical hero Gottfried Wilhelm Leibniz imagined it: a place of pre-established harmony, whose patterns are accessible to reason.

It’s an optimistic world, and a theological one: a universe presided over by a God who does not play dice. It’s most decidedly not a 20th-century world, but “in any case”, as Gödel himself once commented, “there is no reason to trust blindly in the spirit of the time.”

His fellow mathematician Paul Erdős was appalled: “You became a mathematician so that people should study you,” he complained, “not that you should study Leibniz.” But Gödel always did prefer study to self-expression, and this is chiefly why we know so little about him, and why the spectacular deterioration of his final years — a phantasmagoric tale of imagined conspiracies, strange vapours and shadowy intruders, ending in his self-starvation in 1978 — has come to stand for the whole of his life.

“Nothing, Gödel believed, happened without a reason,” says Stephen Budiansky. “It was at once an affirmation of ultrarationalism, and a recipe for utter paranoia.”

You need hindsight to see the paranoia waiting to pounce. But the ultrarationalism — that was always tripping him up. There was something worryingly non-stick about him. He didn’t so much resist the spirit of the time as blunder about totally oblivious of it. He barely noticed the Anschluss, barely escaped Vienna as the Nazis assumed control, and, once ensconced at the Institute for Advanced Study at Princeton, barely credited that tragedy was even possible, or that, say, a friend might die in a concentration camp (it took three letters for his mother to convince him).

Many believed that he’d blundered, in a way typical to him, into marriage with his lifelong partner, a foot-care specialist and divorcée called Adele Nimbursky. Perhaps he did. But Budiansky does a spirited job of defending this “uneducated but determined” woman against the sneers of snobs. If anyone kept Gödel rooted to the facts of living, it was Adele. She once stuck a concrete flamingo, painted pink and black, in a flower bed right outside his study window. All evidence suggests he adored it.

Idealistic and dysfunctional, Gödel became, in mathematician Jordan Ellenberg’s phrase, “the romantic’s favourite mathematician”, a reputation cemented by the fact that we knew hardly anything about him. Key personal correspondence was destroyed at his death, while his journals and notebooks — written in Gabelsberger script, a German shorthand that had fallen into disuse by the mid-1920s — resisted all comers until Cheryl Dawson, wife of the man tasked with sorting through Gödel’s mountain of posthumous papers, learned how to transcribe it all.

Biographer Stephen Budiansky is the first to try to give this pile of new information a human shape, and my guess is it hasn’t been easy.

Budiansky handles the mathematics very well, capturing the air of scientific optimism that held sway over intellectual Vienna and induced Germany’s leading mathematician David Hilbert to declare that “in mathematics there is *nothing* unknowable!”

Solving Hilbert’s four “Problems of Laying Foundations for Mathematics” of 1928 was supposed to secure the foundations of mathematics for good, and Gödel, a 22-year-old former physics student, solved one of them. Unfortunately for Hilbert and his disciples, however, Gödel also proved the insolubility of the other three. So much for the idea that all mathematics could be derived from the propositions of logic: Gödel demonstrated that logic itself was flawed.

This discovery didn’t worry Gödel nearly so much as it did his contemporaries. For Gödel, as Budiansky explains, mathematical objects and a priori truth were “as real to him as anything the senses could directly perceive”. If our reason failed, well, that was no reason to throw away the world: we would always be able to recognise some truths through intuition that could never be established through computation. That, for Gödel, was the whole point of being human.

It’s one thing to be a Platonist in a world dead set against Platonism, or an idealist in a world that’s gone all-in with materialism. It’s quite another to see acts of sabotage in the errors of TV listings magazines, or political conspiracy in the suicide of King Ludwig II of Bavaria. The Elysian calm and concentration afforded Gödel after the second world war at the Institute for Advanced Study probably did him more harm than good. “Gödel is too alone,” his friend Oskar Morgenstern fretted: “he should be given teaching duties; at least an hour a week.”

In the end, though, neither his friendships nor his marriage nor that ridiculous flamingo could tether to the Earth a man who had always preferred to write for his desk drawer, and Budiansky, for all his tremendous efforts and exhaustive interrogations of Gödel’s times and places, acquaintances and offices, can only leave us, at the end, with an immeasurably enriched version of Gödel the wise child. It’s an undeniably distracting and reductive picture. But — and this is the trouble — it’s not wrong.

Life at all costs

Reading The Next 500 Years by Chris Mason for New Scientist, 12 May 2021

Humanity’s long-term prospects don’t look good. If we don’t all kill each other with nuclear weapons, that overdue planet-killing asteroid can’t be too far off; anyway, the Sun itself will (eventually) swell and die, obliterating all trace of life in our planetary system.

As if awareness of our own mortality hasn’t given us enough to fret about, we are also capable of imagining our own species’ extinction. Once we do that, though, are we not ethically bound to do something about it?

Cornell geneticist Chris Mason thinks so. “Engineering,” he writes, “is humanity’s innate duty, needed to ensure the survival of life.” And not just human life; Mason is out to ensure the cosmic future of all life, including species that are currently extinct.

Mason is not the first to think this way, but he arrives at a fascinating moment in the history of technology, when we may, after all, be able to avoid some previously unavoidable catastrophes.

Mason’s 500-year plan for our future involves reengineering human and other genomes so that we can tolerate the (to us) extreme environments of other worlds. Our ultimate goal, Mason says, should be to settle new solar systems.

Spreading humanity to the stars would hedge our bets nicely, only we currently lack the tools to survive the trip, never mind the stay. That’s where Mason comes in. He was principal investigator on NASA’s Twins Study, begun in 2015: a foundational investigation into the health of identical twins Scott Kelly and Mark Kelly during the 340 days Scott was in space and Mark was on Earth.

Mason explains how the Twins Study informed NASA’s burgeoning understanding of the human biome, how a programme once narrowly focused on human genetics now extends to embrace bacteria and viruses, and how new genetic engineering tools like CRISPR and its hopeful successors may enable us to address the risks of spaceflight (exposure to cosmic radiation is considered the most serious) and protect the health of settlers on the Moon, on Mars, and even, one day, on Saturn’s moon Titan.

Outside his specialism, Mason has some fun (a photosynthesizing human would need skin flaps the size of two tennis courts — so now you know) then flounders slightly, reaching for familiar narratives to hold his sprawling vision together. More informed readers may start to lose interest in the later chapters. The role of spectroscopy in the detection of exoplanets is certainly relevant, but in a work of this gargantuan scope, I wonder if it needed rehearsing. And will readers of a book like this really need reminding of Frank Drake’s equation (regarding the likelihood of extra-terrestrial civilisations)?

Uneven as it is, Mason’s book is a genuine, timely, and very personable addition to a 1,000-year-old Western tradition, grounded in religious expectations and a quest for transcendence and salvation. Visionaries from Isaac Newton to Joseph Priestley to Russian space pioneer Konstantin Tsiolkovsky have espoused the very tenets that underpin Mason’s account: that the apocalypse is imminent; and that, by increasing human knowledge, we may recover the Paradise we enjoyed before the Flood.

Masonic beliefs follow the same pattern; significantly, many famous NASA astronauts, including John Glenn, Buzz Aldrin and Gordo Cooper, were Freemasons.

Mason puts a new layer of flesh on what have, so far, been some ardent but very sketchy dreams. And, though a proud child of his engineering culture, he is no dupe. He understands and explores all the major risks associated with genetic tinkering, and entertains all the most pertinent counter-arguments. He knows where 19th-century eugenics led. He knows the value of biological and neurological diversity. He’s not Frankenstein. His deepest hope is not that his plans are realised in any recognisable form, but that we continue to make plans, test them and remake them, for the sake of all life.

Return From The Stars

Here’s my foreword to the new MIT edition of Stanislaw Lem’s Return from the Stars.

For a while I lived on the penthouse level of a gherkin-shaped glass tower, five minutes from DIFC, Dubai’s international financial centre. I’d spend the day perched at my partner’s kitchen counter, or sprawled in her hammock, writing long-hand into a green notebook. In the evening I would stand at the glass wall, stir-crazy by then, fighting an ice-cream headache brought on by the air conditioning, counting lime-green Lamborghinis crawling nose-to-tail along the six lanes of Al Saada Street, far below.

Once the sun bled out into the mist, I would leave the tower and pick my way over broken and unfinished pavements to the DIFC. There were several ways into the complex; none of them resembled civic space. A narrow, wood-panelled atrium led to the foot of a single, slow escalator. It deposited me on a paved mezzanine from which several huge towers grew. There weren’t any pavements as such, just stretches of empty space between chairs and tables. There were art galleries and offices and a destination restaurant run by a car company, each unit folding into the next, giving me at one moment a feeling of exhilaration, as though this entire place was mine for the taking, the next a sense of tremendous awkwardness, as though I’d stumbled uninvited into an important person’s private party.

I couldn’t tell from the outside which mirrored door led to a mall, which to a reception desk, which to a lobby. I would burst into every building with an aggressive confidence, penetrating as far as I could into these indeterminate spaces until I found an exit — though an exit into what, exactly? Another atrium. A bank. A water feature. A beauty salon. Eventually I would be stopped and politely redirected.

*

Of middling years, the space explorer Hal Bregg is having to contend with two kinds of ageing. The first is historical. The other, biological.

He has just returned to Earth after a gap of 127 years, and the place he once called home has turned incomprehensibly strange in the intervening time. Thanks to the dilation effect brought on by interstellar travel, only ten ship-board years have passed. But this interval has been hard on Bregg. The trauma and beauty of his harrowing missions aboard the Prometheus still haunt him, and power some of the Return’s most affecting passages.

Bregg’s arrival on Earth, fresh from acclimatisation sessions on the Moon, is about as exciting as a bus ride. At the terminal there are no parades, no gawkers, no journalists. He’s a curiosity at best, and not even a rare one (there have been other ships, other crews). Arriving at the terminus, he’s just another passenger. He misses the official sent to meet him, decides he’ll find his own way about, sets off for the city, and becomes quickly lost. He can’t find the edge of the terminus. He can’t find the floor he needs. Everything seems to be rising. Is this a giant elevator? “It was hard to rest the eye on anything that was not in motion, because the architecture on all sides appeared to consist in motion alone, in change…” He decides to follow the crowd. Is this a hotel? Are these rooms? People are fetching objects from the walls. He does the same. In his hands is a coloured translucent tube, slightly warm. Is this a room key? A map? Food?

“And suddenly I felt like a monkey that has been given a fountain pen or a lighter; for an instant I was seized by a blind rage.”

Among Stanislaw Lem’s many gifts is his understanding of how the future works. The future is not a series of lucky predictions (though Lem was luckier than most at that particular craps table). The future is not wholly exotic; it is, by necessity, still chock-full of the past. The future is not one place, governed by one idea. Suppose you get a handle on some major change (Return from the Stars boasts one of those — but we’ll get to “betrization” in a minute). This won’t give you some mysterious, magical purchase on everything else. You’ll be armed, but you’ll still be lost. The future is bigger than you think.

The point about the future is that it’s unreadable. You recognise the language, but you can no longer speak it. The words fell out of order years ago. The vowels shifted.

“I was numb from the strain of trying not to do anything wrong. This, for four days now. From the very first moment I was invariably behind in everything that went on, and the constant effort to understand the simplest conversation or situation turned that tension into a feeling horribly like despair.”

Bregg contemplates the horrid beauty of the terminal (yes, he’s still in the terminal. Settle in: he’s going to be wandering that terminal for a very long time) and admires (if that is quite the word) its “coloured galaxies of squares, clusters of spiral lights, glows shimmering above skyscrapers, the streets: a creeping peristalsis with necklaces of light, and over this, in the perpendicular, cauldrons of neon, feather crests and lightning bolts, circles, airplanes, and bottles of flame, red dandelions made of needle-signal lights, momentary suns and haemorrhages of advertising, mechanical and violent.”

But if you think Lem will stop there, in contemplation of an overdriven New York skyline, then you need to read more Lem. The window gutters out. “You have been watching clips from newsreels of the seventies,” the air announces, “in the series Views of the Ancient Capitals.”

*

Science fiction delights in expensive real estate. Sky-high homes give its protagonists the geostationary perspective they require to scry large amounts of weird information very quickly. In cinema, Rick Deckard and his replicant protégé K are both penthouse dwellers. And in J G Ballard’s High-Rise (1975) Robert Laing eats roast dog on the balcony of, yes, a 25th-floor apartment.

The world of Return from the Stars is altogether more socially progressive. Its buildings are only partly real: their continuation is an image, “so that the people living on each level do not feel deprived. Not in any way.” The politics of this future, which teaches its infant children “the principles of tolerance, coexistence, respect for other beliefs and attitudes,” are disconcertingly familiar, and mostly admirable.

But let’s stay with the media for a moment. This is a future adept at immersing its denizens in a thoroughly mediated environment — thus an enhanced movie becomes a “real” in Michael Kandel’s savvy, punning translation. For those who know the canon, there’s a delicious hint here of what’s to come: in Lem’s The Futurological Congress (1971) 29 billion people live out lives of unremitting squalor, yet each thinks they’re a tycoon, wrapped as they are in drug-induced hallucination.

The Return’s future, by contrast, handles the physical world perfectly well, thank you. Its population are not narcotised. Not at all: they’re full of good ideas, hold down rewarding jobs, maintain close friendships, enjoy working marriages, and nurture happy children. They have all manner of things to live for. This future — “a world of tranquility, of gentle manners and customs, easy transitions, undramatic situations” — is anything but a dystopia. Why, it’s not even dull!

*

An operation, “betrization”, conducted in early childhood, renders people incapable of serious violence. Murder becomes literally inconceivable, and as a side-effect, risk loses its allure. Betrization has been universally adopted across the Earth, and it’s mandatory.

Bregg and his ancient fellows cannot help but view betrization with horror. Consequently, no-one has the stomach to force it upon them. This leaves the returning astronauts as predators in fields full of friendly, intelligent, accommodating sheep.

But why would Hal Bregg want to predate? Why would any of them? What would it gain them, beyond a spoiled conscience? Everyone on this contented Earth assumes the returnees are savages, though most are far too polite (or risk-averse) to say so. For Bregg and his fellows, it’s enraging. It’s alienating. It’s a prompt to the sort of misbehaviour that they would otherwise never dream of indulging.

Lem himself considered Return from the Stars a failure, and blamed betrization for it. As an idea, it was too on-the-nose: a melodramatic conceit that he had to keep underplaying so the story — a quiet affair about friendship and love and misunderstanding — would stay on track. But times and customs change, and history has been kind to the Return. It has become, in 2020, a better book than it was in the 1960s. This is because we have grown into the very future it predicted. Indeed, we are embroiled in precisely the kind of cultural conflict Lem said would ensue, once betrization was invented.

“Young people, betrizated, became strangers to their own parents, whose interests they did not share. They abhorred their parents’ bloody tastes. For a quarter of a century it was necessary to have two types of periodicals, books, plays: one for the old generation, one for the new.”

Western readers with experience of university life and politics over the last thirty years will surely recognise themselves in these pages, where timid passions dabble with free love, where thin skins heal in safe spaces, and intellectual gadflies navigate a landscape of extreme emotional delicacy, under constant threat of cancellation.

Will they blush, to see themselves thus reflected? I doubt it. Emulsify them as you like, kindness and a sense of humour do not mix. Anyway, the Return is not a satire, any more than it is a dystopia. It is not, when push comes to shove, a book about a world at all (for which some may read: not science fiction).

It is a book about Hal Bregg. About his impulse towards solitude and his need for company. About his deep respect for old friends, and his earnest desire for new ones. It’s about a kind and thoughtful bull in an emotional china shop, trying desperately not to rape things. It’s about men.

*

We would meet at last and order a cab and within the hour we would be sitting overlooking the Gulf in a bar fashioned to resemble the hollowed interior of a golden nugget. I remember one evening my partner chose what to order and I was handed a glass of a colourless liquid topped with a film of crude oil. I thought: Do I drink this or do I light it? And suddenly I felt like a monkey that has been given a fountain pen or a lighter; for an instant I was seized by a blind rage.

That was the evening she told me why she didn’t want to see me any more. The next day I left for London. 19 March 2017: World Happiness Day. As we turned north for the airport I noticed that the sign at the junction had changed. In place of Al Saada: “Happiness Street.”

And a ten-storey-high yellow smiley icon had been draped across the face of a government building.

To hell with the philosopause!

Reading Hawking Hawking: The Selling of a Scientific Celebrity by Charles Seife for the Spectator, 1 May 2021

I could never muster much enthusiasm for the theoretical physicist Stephen Hawking. His work, on the early universe and the nature of spacetime, was Nobel-worthy, but those of us outside his narrow community were horribly short-changed. His 1988 global best-seller A Brief History of Time was incomprehensible, not because it was difficult, but because it was bad.

Nobody, naturally, wanted to ascribe Hawking’s popular success to his rare form of Motor Neurone Disease, Hawking least of all. He afforded us no room for horror or, God forbid, pity. In 1990, asked a dumb question about how his condition might have shaped his work (because people who suffer ruinous, debilitating illnesses acquire compensating superpowers, right?) Hawking played along: “I haven’t had to lecture or teach undergraduates, and I haven’t had to sit on tedious and time-consuming committees. So I have been able to devote myself completely to research.”

The truth — that Hawking was one of the worst popular communicators of his day — is as evident as it is unsayable. A Brief History of Time was incomprehensible because after nearly five years’ superhuman effort, the author proved incapable of composing a whole book unaided. He couldn’t even do mathematics the way most people do it, by doodling, since he’d already lost the use of his hands. He could not jot notes. He could not manipulate equations. He had to turn every problem he encountered into a species of geometry, just to be able to think about it. He held his own in an impossibly rarefied profession for years, but the business of popular communication was beyond him. As was communication, in the end, according to Andy Strominger, who collaborated with Hawking in his final years: “You would talk about words per minute, and then it went to minutes per word, and then, you know, it just got slower and slower until it just sort of stopped.”

Hawking became, in the end, a computerised patchwork of hackneyed, pre-stored utterances and responses. Pull the string at his back and marvel. Charles Seife, a biographer braver than most, begins by staring down the puppet. His conceit is to tell Stephen Hawking’s story backwards, peeling back the layers of celebrity and incapacity to reveal the wounded human within.

It’s a tricksy idea that works so well, you wonder why no-one thought of it before (though ordering his material and his arguments in this way must have nearly killed the poor author).

Hawking’s greatest claim to fame is that he discovered things about black holes — still unobserved at that time — that set the two great schools of theoretical physics, quantum mechanics and relativity, at fresh and astonishingly creative loggerheads.

But a new golden era of astronomical observation dawned almost immediately after, and A Brief History was badly outdated before it even hit the shelves. It couldn’t even get the age of the universe right.

It used to be that genius that outlived its moment could reinvent itself. When new-fangled endocrine science threw Ivan Pavlov’s Nobel-winning physiology into doubt, he reinvented himself as a psychologist (and not a bad one at that).

Today’s era of narrow specialism makes such a move almost impossible but, by way of intellectual compensation, there is always philosophy — a perennially popular field more or less wholly abandoned by professional philosophers. Images of the middle-aged scientific genius indulging its philosopause in book after book about science and art, science and God, science and society and so on and so forth, may raise a wry smile, but work of real worth has come out of it.

Alas, even if Hawking had shown the slightest aptitude for philosophy (and he didn’t), he couldn’t possibly have composed it.

In our imaginations, Hawking is the cartoon embodiment of the scientific sage, effectively disembodied and above ordinary mortal concerns. In truth, life denied him a path to sagacity even as it steeped him in the spit and stew of physical being. Hawking’s libido never waned. So to hell with the philosopause! Bring on the dancing girls! Bring on the cheques, from Specsavers, BT, Jaguar, Paddy Power. (Hawking never had enough money: the care he needed was so intensive and difficult that a transatlantic flight could set him back around a quarter of a million pounds.) Bring on the billionaires with their fat cheque books (naifs, the lot of them, but decent enough, and generous to a fault). Bring on the countless opportunities to bloviate about subjects he didn’t understand, a sort of Prince Charles only without Charles’s efforts at warmth.

I find it impossible, having read Seife, not to see Hawking through the lens of Jacobean tragedy, warped and raging, unable even to stick a finger up at a world that could not — but much worse, *chose* not to — understand him. Of course he was a monster, and years too late, and through a book that will anger many, I have come to love him for it.

Bacon is grey

Reading Who Poisoned Your Bacon Sandwich? by Guillaume Coudray and Hooked: How processed food became addictive by Michael Moss for the Financial Times, 21 April 2021

The story of how food arrives on our plate is a living, breathing sci-fi epic. Fertiliser produced by sucking nitrogen out of the air now sustains about half the global population. Farmers worldwide depend on the data spewing from 160 or so environmental satellite missions in low-earth orbit, not to mention literally thousands of weather satellites of various kinds.

That such a complex system is precarious hardly needs saying. It only takes one innovative product, or cheeky short-cut, to transform the health, appearance and behaviour of nations. What gastronomic historian, 50 years ago, would have predicted that China would grow fat, or that four out of five French cafés would shut up shop in a single generation?

To write about the food supply is to wrestle with problems of scale, as two new books on the subject demonstrate. To explain how we turned the green revolution of the 1960s into a global obesity pandemic in less than half a century, Michael Moss must reach beyond history entirely, and into the contested territories of evolutionary biology. Guillaume Coudray, a Paris-based investigative journalist, prefers a narrower argument, focusing on the historical accidents, and subsequent cover-ups, that even now add cancer-causing compounds to our processed meat. The industry attitudes and tactics he reveals strongly resemble those of the tobacco industry in the 1970s and 1980s.

Ably translated as Who Poisoned Your Bacon Sandwich?, Coudray’s 2017 exposé tells the story of the common additives used to cure — and, crucially, colour — processed meats. Until 1820, saltpetre (potassium nitrate; a constituent of gunpowder) was our curing agent of choice — most likely because hunters in the 16th century discovered that game birds shot with their newfangled muskets kept for longer. Then sodium nitrate appeared, and — in the mid 1920s — sodium nitrite. All three give meats a convincing colour in a fraction of the time traditional salting requires. Also, their disinfectant properties allow unscrupulous producers to operate in unsanitary conditions.

Follow basic rules of hygiene, and you can easily cure meat using ordinary table salt. But traditional meats often take upwards of a year to mature; no wonder that the 90-day hams pouring out of Chicago’s meatpacking district at the turn of the 20th century conquered the world market. Parma ham producers still use salt; almost everyone else has resorted to nitrates and nitrites just to survive.

It wasn’t until the 1970s that researchers found a link between these staple curing agents and cancer. This was, significantly, also the moment industry lobbyists began to rewrite food history. The claim that we’ve been preserving meat with saltpetre for over 5,000 years is particularly inventive: back then it was used to preserve Egyptian mummies, not cure hams. Along with the massaged history came obfuscation — for instance, arguments that nitrates and nitrites are not carcinogenic in themselves, even if they give rise to carcinogenic agents during processing, cooking or, um, digestion.

And when, in 2015, experts at the International Agency for Research on Cancer classified all processed meats in “group 1: carcinogenic to humans” (they can cause colorectal cancer, the second most deadly cancer we face), the doubt-mongers redoubled their efforts — in particular pushing the baseless claim that nitrates and nitrites are our only defence against certain kinds of food poisoning.

There are alternatives. If it’s a disinfectant-cum-curing agent you’re after, organic water-soluble salts called sorbates work just fine.

Crucially, though, sorbates have no colourant effect, while nitrates and nitrites give cured meat that rosy glow. Their use is so widespread, we have clean forgotten that the natural colour of ham, pâté, wiener sausages and bacon is (deal with it) grey.

That the food industry wants to make food as attractive as possible, so that it can sell as much as possible is, of itself, hardly news.

And in Hooked (a rather different beast to his 2013 exposé Salt Sugar Fat), American journalist Michael Moss finds that — beyond the accusations and litigations around different foodstuffs — there’s something systemically wrong with our relationship to food. US consumers now fill three-quarters of their shopping carts with processed food. Snacking now accounts for around a quarter of our daily calorie intake. Pointing the finger at Coca-Cola or McDonald’s is not going to solve the bigger problem, which Moss takes to be changes in the biology of our ancestors that make it extremely difficult to restore healthy eating habits once they’ve run out of control.

Moss’s argument is cogent, but not simple. We have to get to grips, first, with the latest thinking on addiction, which has more or less dispensed with the idea that substances are mind-altering. Rather, they are mind-engaging, and the speed of their effect has quite as much to do with their strength as their pharmacology does, if not more.

By this measure, food is an incredibly powerful drug (a taste of sugar hits the brain 20 times faster than a lungful of tobacco smoke). But does it make any sense to say we’re all addicted to food?

Moss says it does — only we need to dip our toes in evolutionary biology to understand why. As primates, we have lost a long bone, called the transverse lamina, that used to separate the mouth from the nose. Consequently, we can smell food as we taste it.

No one can really explain why an enhanced appreciation of flavour gave us such a huge evolutionary advantage, but the biology is ungainsayable: we are an animal obsessed with gustatory variety. In medieval France, this inspired hundreds of different sauces. Today it fills my local supermarket with 50-odd different varieties of potato chip.

The problem, Moss says, is not that food manufacturers are trying to addict us. It is that they have learned how to exploit an addiction baked into our biology.

So what’s the solution? Stop drinking anything with calories? Avoid distractions when we eat? Favour foods we have to chew? All of the above, of course — though it’s hard to see how good advice on its own could ever persuade us all to act against our own appetites.

Hooked works, in a rambunctious, shorthand sort of way. Ultimately, though, it may prove to be a transitional book for an author who is edging towards a much deeper reappraisal of the relationship between food, convenience (time, in other words), and money.

Neither Moss nor Coudray demands we take to the barricades just yet. But the pale, unappetisingly grey storm clouds of a food revolution are gathering.

“To penetrate humbly…”

Reading Beyond by Stephen Walker for the Telegraph, 18 April 2021

On 30 May 2020 US astronauts Bob Behnken and Doug Hurley flew to the International Space Station. It was the first time a crew had left the planet from US soil since 2011.

In the interim, something — not wrong, exactly, but certainly strange — had happened to space travel. Behnken and Hurley’s SpaceX-branded space suits looked like something I would have thrown together as a child, even down to my dad’s biking helmet and — were those Wellington boots? The stark interior of SpaceX’s Crew Dragon capsule was even more disconcerting. Poor Behnken and Hurley! They looked as if they were riding in the back of an Uber.

Well, what goes around comes around, I suppose. The capsule that carried Yuri Gagarin into space on 12 April 1961 boasted an almost ludicrously bare central panel of just four dials. Naysayers sniped that Gagarin had been a mere passenger — a human guinea pig.

By contrast, says Stephen Walker in his long and always thrilling blow-by-blow account of the United States’ and the Soviet Union’s race into orbit, the design of the Mercury cockpit that carried America’s first astronaut into space was magnificently, and possibly redundantly, fussy: “Almost every inch of it was littered with dials, knobs, indicators, lights and levers just like a ‘real’ aeroplane cockpit.”

America’s “Mercury Seven” (two-seater Gemini capsules quickly succeeded the Mercuries) were celebrities, almost absurdly over-qualified for their task of being rattled around in the nose of an intercontinental ballistic missile. Their space programme was public — and so were its indignities, like the fact that virtually everything they were being asked to do, a chimpanzee had done before them.

It drove Alan Shepard — the man fated to be the first American in space — into a rage. In one training session somebody joked, “Maybe we should get somebody who works for bananas”. The ashtray Shepard threw only just missed his head.

The Soviet Union’s space programme was secret. Not even their wives knew what the “Vanguard Six” were up to. They won no privileges. Sometimes they’d polish other people’s floors to make ends meet.

Those looking for evidence of the gimcrack quality of the Soviet space effort will find ammunition in Beyond. Contrast, for example, NASA’s capsule escape plans (involving a cherry-picker platform and an armoured vehicle) with the Soviet equivalent (involving a net and a bath tub).

But Walker’s research for this book stretches back a decade and his acknowledgements salute significant historians (Asif Siddiqi in particular), generous interviewees and a small army of researchers. He’ll not fall for such clichés. Instead, he shows how the efforts of each side in the race to space were shaped by the technology they had to hand.

Soviet hydrogen bombs were huge and heavy, and needed big, powerful rockets to carry them. Soviet space launches were correspondingly epic. The Baikonur cosmodrome in Soviet Kazakhstan — a desolate, scorpion-infested region described in Soviet encyclopaedias as “the Home of the Black Death” — was around a hundred times the size of Cape Canaveral. Its launch bunkers were buried beneath several metres of reinforced concrete and earth because, says Walker, “a rocket the size and power of the R-7 would probably have flattened the sort of surface blockhouse near the little Redstone in Cape Canaveral.”

Because the US had better (lighter, smaller) nuclear bombs, its available rocket technology was — in space-piercing terms — seriously underpowered. When Alan Shepard finally launched from Cape Canaveral on 5 May 1961, twenty-three days after Yuri Gagarin circled the earth, his flight lasted just over fifteen minutes. He splashed down in the Atlantic Ocean 302 miles from the Cape. Gagarin travelled some 26,000 miles around the planet.

The space race was the Soviets’ to lose. Once Khrushchev discovered the political power of space “firsts” he couldn’t get enough of them. “Each successive space ‘spectacular’ was exactly that,” Walker writes, “not so much part of a carefully structured progressive space programme but yet another glittering showpiece, preferably tied to an important political anniversary”. Attempts to build a co-ordinated strategy were rejected or simply ignored. This is a book as much about disappointment as triumph.

Beyond began life as a film documentary, but the newly discovered footage Walker was offered proved too damaged for use. Thank goodness he kept his notes and his nerve. This is not a field that’s starved of insight: Jamie Doran and Piers Bizony wrote a cracking biography of Gagarin called Starman in 1998; the autobiography of Soviet systems designer Boris Chertok runs to four volumes. Still, Walker brings a huge amount that is new and fresh to our understanding of the space race.

Over the desk of the Soviets’ chief designer Sergei Korolev hung a portrait of the nineteenth-century Russian space visionary Konstantin Tsiolkovsky, and with it his words: “Mankind will not stay on Earth for ever but in its quest for light and space it will first penetrate humbly beyond the atmosphere and then conquer the whole solar system.”

Beyond shows how that dream — what US aviation pioneer James Smith McDonnell called “the creative conquest of space” — was exploited by blocs committed to their substitute for war — and how, for all that, it survived.

Tally of a lost world

Reading Delicious: The evolution of flavor and how it made us human by Rob Dunn and Monica Sanchez for New Scientist, 31 March 2021

Dolphins need only hunger and a mental image of what food looks like. Their taste receptors broke long ago: no longer able to taste sweet, salty or even umami, they thrive on hunger and satisfaction alone.

Omnivores and herbivores have a more various diet, and more chances of getting things badly wrong, so they are guided by much more highly developed senses (related, even intertwined, but not at all the same) of flavour (how something tastes) and aroma (how something smells).

Evolutionary biologist Rob Dunn and anthropologist Monica Sanchez weave together what chefs now know about the experience of food, what ecologists know about the needs of animals, and what evolutionary biologists know about how our senses evolved, to tell the story of how we have been led by our noses through evolutionary history, and turned from chimpanzee-like primate precursor to modern, dinner-obsessed Homo sapiens.

Much of the work described here dovetails neatly with biological anthropologist Richard Wrangham’s 2009 book Catching Fire: How cooking made us human. Wrangham argued that releasing the calories bound up in raw food by cooking it led to a cognitive explosion in our ancestors, around 1.9 million years ago.

As Dunn and Sanchez rightly point out, Wrangham’s book was not short of a speculation or two: there is, after all, no evidence of fire-making this far back. Still, they incline very much to Wrangham’s hypothesis. There’s no firm evidence of hominins fermenting food at this time, either — indeed, it’s hard to imagine what such evidence would even look like. Nonetheless, the authors are convinced it took place.

Where Wrangham focused on fire, Dunn and Sanchez are more interested in other forms of basic food processing: cutting, pounding and especially fermenting. The authors make a convincing, closely argued case for their perhaps rather surprising contention that “fermenting a mastodon, mammoth, or a horse so that it remains edible and is not deadly appears to be less challenging than making fire.”

“Flavor is our new hammer,” the authors admit, “and so we are probably whacking some shiny things here that aren’t nails.” It would be all too easy, out of a surfeit of enthusiasm, for them to distort their readers’ impressions of a new and exciting field, tracing the evolution of flavour. Happily, Dunn and Sanchez are thoroughly scrupulous in the way they present their evidence and their arguments.

As primates, our experience of aroma and flavour is unusual, in that we experience retronasal aromas — the aromas that rise up from our mouths into the backs of our noses. This is because we have lost a long bone, called the transverse lamina, that helps to separate the mouth from the nose. This loss had huge consequences for olfaction, enabling humans to search out convoluted tastes and aromas so complex, we have to associate them with memories in order to individually categorise them all.

The story of how Homo sapiens developed such a sophisticated palate is also, of course, the story of how it contributed to the extinction of hundreds of the largest, most unusual animals on the planet. (Delicious is a charming book, but it does have its melancholy side.)

To take one dizzying example, the Clovis peoples of North America — direct ancestors of roughly 80 per cent of all living native populations in North and South America — definitely ate mammoths, mastodons, gomphotheres, bison and giant horses; they may also have eaten Jefferson’s ground sloths, giant camels, dire wolves, short-faced bears, flat-headed peccaries, long-headed peccaries, tapirs, giant llamas, giant bison, stag moose, shrub-ox, and Harlan’s Muskox.

“The Clovis menu,” the authors write, “if written on a chalkboard, would be a tally of a lost world.”

We may never have a pandemic again

Reading The Code Breaker, Walter Isaacson’s biography of Jennifer Doudna, for the Telegraph, 27 March 2021

In a co-written account of her work published in 2017, biochemist Jennifer Doudna describes creating a system that can cut and paste genetic information as simply as a word processor manipulates text. Having conceived a technology that promises to predict, correct and even enhance a person’s genetic destiny, she says, not without cause, “I began to feel a bit like Doctor Frankenstein.”

When it comes to breakthroughs in biology, references to Mary Shelley are irresistible. One of Walter Isaacson’s minor triumphs, in a book not short of major triumphs, is that, over 500 pages, he mentions that over-quoted, under-read novel less than half a dozen times. In biotechnology circles, this is probably a record.

We explain science by telling stories of discovery. It’s a way of unpacking complicated ideas in narrative form. It’s not really history, or if it is, it’s whig history, defined by a young Herbert Butterfield in 1931 as “the tendency… to praise revolutions provided they have been successful, to emphasise certain principles of progress in the past and to produce a story which is the ratification if not the glorification of the present.”

To explain the science, you falsify the history. So all discoverers and inventors are heroes on the Promethean (or Frankensteinian) model, working in isolation, and taking the whole weight of the world on their shoulders!

Alas, the reverse is also true. Telling the true history of discovery makes the science very difficult to unpack. And though Walter Isaacson, whose many achievements include a spell as CEO of the Aspen Institute, clearly knows his science, his telling of the most significant biological breakthrough since the discovery of the structure of DNA is not the very best account of CRISPR out there. His folksy cajoling — inviting us to celebrate “wily bacteria” and the “plucky little molecule” RNA — suggests exasperation. Explaining CRISPR is *hard*.

The Code Breaker excels precisely where, having read Isaacson’s 2011 biography of Steve Jobs, you might expect it to excel. Isaacson understands that all institutions are political. Every institutional activity — be it blue-sky research into the genome, or the design of a consumer product — is a species of political action.

The politics of science is uniquely challenging, because its standards of honesty, precision and rigour stretch the capabilities of language itself. Again and again, Doudna’s relationships with rivals, colleagues, mentors and critics are seen to hang on fine threads of contested interpretation. We see that Doudna’s fiercest rivalry, with Feng Zhang of the Broad Institute of MIT and Harvard, was conducted in an entirely ethical manner — and yet we see both of them stumbling away, bloodied.

Isaacson’s style of biography — already evident in his appreciations of Einstein and Franklin and Leonardo — can be dubbed “qualified hagiography”. He’s trying to hit a balance between the kind of whig history that will make complex materials accessible, and the kind of account that will stand the inspection of academic historians. His heroes’ flaws are explored, but their heroism is upheld. It’s a structural device, and pick at it however you want, it makes for a rattlingly good story.

Jennifer Doudna was born in 1964 and grew up on Hawaii’s Big Island. Inspired by an old paperback copy of The Double Helix by DNA pioneer James Watson, she devoted her life to understanding the chemistry of living things. Over her career she championed DNA’s smaller, more active cousin RNA, which brought to her notice a remarkable mechanism, developed by single-celled organisms in their 3.1-billion-year war with viruses. Each of these cells used RNA to build its very own immune system.

Understanding that mechanism was Doudna’s triumph, shared with her colleague Emmanuelle Charpentier; both conspicuously deserved the Nobel prize awarded them last year.

Showing that this mechanism worked in cells like our own, though, would change everything, including our species’ relationship with its own evolution. This technology has the power to eradicate both disease (good) and ordinary human variety (really not so good at all).

In 2012, the year of the great race, Doudna’s Berkeley lab knew nothing like enough about working with human cells. Zhang’s lab knew nothing like enough about the biochemical wrinkles that drove CRISPR. Their rivalrous decision not to pool CRISPR-Cas9 intellectual property would pave the way for an epic patent battle.

COVID-19 has changed all that, ushering in an extraordinary cultural shift. Led by Doudna and Zhang, last year most academic labs declared that their discoveries would be made available to anyone fighting the virus. New online forums have blossomed, breaking the stranglehold of expensive paywall-protected journals.

Doudna’s lab and others have developed home testing kits for COVID-19 that have a potential impact beyond this one fight, “bringing biology into the home,” as Isaacson writes, “the way that personal computers in the 1970s brought digital products and services… into people’s daily lives and consciousness.”

Meanwhile genetic vaccines — like the mRNA ones developed for COVID-19 by Moderna and BioNTech/Pfizer — portend a sudden shift of the evolutionary balance between human beings and viruses. Moderna’s chair Noubar Afeyan is punchy about the prospects: “We may never have a pandemic again,” he says.

The Code Breaker catches us at an extraordinary moment. Isaacson argues with sincerity and conviction that, blooded by this pandemic, we should now grasp the nettle, make a stab at the hard ethical questions, and apply Doudna’s Promethean knowledge, now, and everywhere, to help people. Given the growing likelihood of pandemics, we may not have a choice.