To hell with the philosopause!

Reading Hawking Hawking: The Selling of a Scientific Celebrity by Charles Seife for the Spectator, 1 May 2021

I could never muster much enthusiasm for the theoretical physicist Stephen Hawking. His work, on the early universe and the nature of spacetime, was Nobel-worthy, but those of us outside his narrow community were horribly short-changed. His 1988 global best-seller A Brief History of Time was incomprehensible, not because it was difficult, but because it was bad.

Nobody, naturally, wanted to ascribe Hawking’s popular success to his rare form of Motor Neurone Disease, Hawking least of all. He afforded us no room for horror or, God forbid, pity. In 1990, asked a dumb question about how his condition might have shaped his work (because people who suffer ruinous, debilitating illnesses acquire compensating superpowers, right?) Hawking played along: “I haven’t had to lecture or teach undergraduates, and I haven’t had to sit on tedious and time-consuming committees. So I have been able to devote myself completely to research.”

The truth — that Hawking was one of the worst popular communicators of his day — is as evident as it is unsayable. A Brief History of Time was incomprehensible because after nearly five years’ superhuman effort, the author proved incapable of composing a whole book unaided. He couldn’t even do mathematics the way most people do it, by doodling, since he’d already lost the use of his hands. He could not jot notes. He could not manipulate equations. He had to turn every problem he encountered into a species of geometry, just to be able to think about it. He held his own in an impossibly rarefied profession for years, but the business of popular communication was beyond him. As was communication, in the end, according to Hawking’s late collaborator Andy Strominger: “You would talk about words per minute, and then it went to minutes per word, and then, you know, it just got slower and slower until it just sort of stopped.”

Hawking became, in the end, a computerised patchwork of hackneyed, pre-stored utterances and responses. Pull the string at his back and marvel. Charles Seife, a biographer braver than most, begins by staring down the puppet. His conceit is to tell Stephen Hawking’s story backwards, peeling back the layers of celebrity and incapacity to reveal the wounded human within.

It’s a tricksy idea that works so well, you wonder why no-one thought of it before (though ordering his material and his arguments in this way must have nearly killed the poor author).

Hawking’s greatest claim to fame is that he discovered things about black holes — still unobserved at that time — that set the two great schools of theoretical physics, quantum mechanics and relativity, at fresh and astonishingly creative loggerheads.

But a new golden era of astronomical observation dawned almost immediately after, and A Brief History was badly outdated before it even hit the shelves. It couldn’t even get the age of the universe right.

It used to be that genius that outlived its moment could reinvent itself. When new-fangled endocrine science threw Ivan Pavlov’s Nobel-winning physiology into doubt, he reinvented himself as a psychologist (and not a bad one at that).

Today’s era of narrow specialism makes such a move almost impossible but, by way of intellectual compensation, there is always philosophy — a perennially popular field more or less wholly abandoned by professional philosophers. Images of the middle-aged scientific genius indulging its philosopause in book after book about science and art, science and God, science and society and so on and so forth, may raise a wry smile, but work of real worth has come out of it.

Alas, even if Hawking had shown the slightest aptitude for philosophy (and he didn’t), he couldn’t possibly have composed it.

In our imaginations, Hawking is the cartoon embodiment of the scientific sage, effectively disembodied and above ordinary mortal concerns. In truth, life denied him a path to sagacity even as it steeped him in the spit and stew of physical being. Hawking’s libido never waned. So to hell with the philosopause! Bring on the dancing girls! Bring on the cheques, from Specsavers, BT, Jaguar, Paddy Power. (Hawking never had enough money: the care he needed was so intensive and difficult, a transatlantic flight could set him back around a quarter of a million pounds). Bring on the billionaires with their fat cheque books (naifs, the lot of them, but decent enough, and generous to a fault). Bring on the countless opportunities to bloviate about subjects he didn’t understand, a sort of Prince Charles only without Charles’s efforts at warmth.

I find it impossible, having read Seife, not to see Hawking through the lens of Jacobean tragedy, warped and raging, unable even to stick a finger up at a world that could not — or, much worse, *chose* not to — understand him. Of course he was a monster, and years too late, and through a book that will anger many, I have come to love him for it.

Bacon is grey

Reading Who Poisoned Your Bacon Sandwich? by Guillaume Coudray and Hooked: How processed food became addictive by Michael Moss for the Financial Times, 21 April 2021

The story of how food arrives on our plate is a living, breathing sci-fi epic. Fertiliser produced by sucking nitrogen out of the air now sustains about half the global population. Farmers worldwide depend on the data spewing from 160 or so environmental satellite missions in low-earth orbit, not to mention literally thousands of weather satellites of various kinds.

That such a complex system is precarious hardly needs saying. It only takes one innovative product, or cheeky short-cut, to transform the health, appearance and behaviour of nations. What gastronomic historian, 50 years ago, would have predicted that China would grow fat, or that four out of five French cafés would shut up shop in a single generation?

To write about the food supply is to wrestle with problems of scale, as two new books on the subject demonstrate. To explain how we turned the green revolution of the 1960s into a global obesity pandemic in less than half a century, Michael Moss must reach beyond history entirely, and into the contested territories of evolutionary biology. Guillaume Coudray, a Paris-based investigative journalist, prefers a narrower argument, focusing on the historical accidents, and subsequent cover-ups, that even now add cancer-causing compounds to our processed meat. The industry attitudes and tactics he reveals strongly resemble those of the tobacco industry in the 1970s and 1980s.

Ably translated as Who Poisoned Your Bacon Sandwich?, Coudray’s 2017 exposé tells the story of the common additives used to cure — and, crucially, colour — processed meats. Until 1820, saltpetre (potassium nitrate; a constituent of gunpowder) was our curing agent of choice — most likely because hunters in the 16th century discovered that game birds shot with their newfangled muskets kept for longer. Then sodium nitrate appeared, and — in the mid 1920s — sodium nitrite. All three give meats a convincing colour in a fraction of the time traditional salting requires. Also, their disinfectant properties allow unscrupulous producers to operate in unsanitary conditions.

Follow basic rules of hygiene, and you can easily cure meat using ordinary table salt. But traditional meats often take upwards of a year to mature; no wonder that the 90-day hams pouring out of Chicago’s meatpacking district at the turn of the 20th century conquered the world market. Parma ham producers still use salt; almost everyone else has resorted to nitrates and nitrites just to survive.

It wasn’t until the 1970s that researchers found a link between these staple curing agents and cancer. This was, significantly, also the moment industry lobbyists began to rewrite food history. The claim that we’ve been preserving meat with saltpetre for over 5,000 years is particularly inventive: back then it was used to preserve Egyptian mummies, not cure hams. Along with the massaged history came obfuscation — for instance, arguments that nitrates and nitrites are not carcinogenic in themselves, even if they give rise to carcinogenic agents during processing, cooking or, um, digestion.

And when, in 2015, experts at the International Agency for Research on Cancer classified all processed meats in “group 1: carcinogenic to humans” (they can cause colorectal cancer, the second most deadly cancer we face), the doubt-mongers redoubled their efforts — in particular the baseless claim that nitrates and nitrites are our only defence against certain kinds of food poisoning.

There are alternatives. If it’s a disinfectant-cum-curing agent you’re after, organic water-soluble salts called sorbates work just fine.

Crucially, though, sorbates have no colourant effect, while nitrates and nitrites give cured meat that rosy glow. Their use is so widespread, we have clean forgotten that the natural colour of ham, pâté, wiener sausages and bacon is (deal with it) grey.

That the food industry wants to make food as attractive as possible, so that it can sell as much as possible is, of itself, hardly news.

And in Hooked (a rather different beast to his 2013 exposé Salt Sugar Fat), American journalist Michael Moss finds that — beyond the accusations and litigations around different foodstuffs — there’s something systemically wrong with our relationship to food. US consumers now fill three-quarters of their shopping carts with processed food. Snacking now accounts for around a quarter of our daily calorie intake. Pointing the finger at Coca-Cola or McDonalds is not going to solve the bigger problem, which Moss takes to be changes in the biology of our ancestors which have made it extremely difficult to recover healthy eating habits once they’ve run out of control.

Moss’s argument is cogent, but not simple. We have to get to grips, first, with the latest thinking on addiction, which has more or less dispensed with the idea that substances are mind-altering. Rather, they are mind-engaging, and the speed of their effect has quite as much to do with their strength as their pharmacology does — if not more.

By this measure, food is an incredibly powerful drug (a taste of sugar hits the brain 20 times faster than a lungful of tobacco smoke). But does it make any sense to say we’re all addicted to food?

Moss says it does — only we need to dip our toes in evolutionary biology to understand why. As primates, we have lost a long bone, called the transverse lamina, that used to separate the mouth from the nose. Consequently, we can smell food as we taste it.

No one can really explain why an enhanced appreciation of flavour gave us such a huge evolutionary advantage, but the biology is ungainsayable: we are an animal obsessed with gustatory variety. In medieval France, this inspired hundreds of different sauces. Today, in my local supermarket, it takes the form of 50-odd different varieties of potato chip.

The problem, Moss says, is not that food manufacturers are trying to addict us. It is that they have learned how to exploit an addiction baked into our biology.

So what’s the solution? Stop drinking anything with calories? Avoid distractions when we eat? Favour foods we have to chew? All of the above, of course — though it’s hard to see how good advice on its own could ever persuade us all to act against our own appetites.

Hooked works, in a rambunctious, shorthand sort of way. Ultimately, though, it may prove to be a transitional book for an author who is edging towards a much deeper reappraisal of the relationship between food, convenience (time, in other words), and money.

Neither Moss nor Coudray demands we take to the barricades just yet. But the pale, unappetisingly grey storm clouds of a food revolution are gathering.

“To penetrate humbly…”

Reading Beyond by Stephen Walker for the Telegraph, 18 April 2021

On 30 May 2020 US astronauts Bob Behnken and Doug Hurley flew to the International Space Station. It was the first time a crew had left the planet from US soil since 2011.

In the interim, something — not wrong, exactly, but certainly strange — had happened to space travel. Behnken and Hurley’s SpaceX-branded space suits looked like something I would throw together as a child, even down to my dad’s biking helmet and — were those Wellington boots? The stark interior of SpaceX’s Crew Dragon capsule was even more disconcerting. Poor Behnken and Hurley! They looked as if they were riding in the back of an Uber.

Well, what goes around comes around, I suppose. The capsule that carried Yuri Gagarin into space on 12 April 1961 boasted an almost ludicrously bare central panel of just four dials. Naysayers sniped that Gagarin had been a mere passenger — a human guinea pig.

By contrast, the design of the Mercury cockpit that carried America’s first astronaut into space was magnificently, and possibly redundantly, fussy, says Stephen Walker, in his long and always thrilling blow-by-blow account of the United States’ and the Soviet Union’s race into orbit: “Almost every inch of it was littered with dials, knobs, indicators, lights and levers just like a ‘real’ aeroplane cockpit.”

America’s “Mercury Seven” (their one-man capsules were quickly succeeded by two-seater Geminis) were celebrities, almost absurdly over-qualified for their task of being rattled around in the nose of an intercontinental ballistic missile. Their space programme was public — and so were its indignities, like the fact that virtually everything they were being asked to do, a chimpanzee had done before them.

It drove Alan Shepard — the man fated to be the first American in space — into a rage. During one training session somebody joked, “Maybe we should get somebody who works for bananas”. The ashtray Shepard threw only just missed his head.

The Soviet Union’s space programme was secret. Not even their wives knew what the “Vanguard Six” were up to. They won no privileges. Sometimes they’d polish other people’s floors to make ends meet.

Those looking for evidence of the gimcrack quality of the Soviet space effort will find ammunition in Beyond. Contrast, for example, NASA’s capsule escape plans (involving a cherry-picker platform and an armoured vehicle) with the Soviet equivalent (involving a net and a bath tub).

But Walker’s research for this book stretches back a decade and his acknowledgements salute significant historians (Asif Siddiqi in particular), generous interviewees and a small army of researchers. He’ll not fall for such clichés. Instead, he shows how the efforts of each side in the race to space were shaped by the technology they had to hand.

Soviet hydrogen bombs were huge and heavy, and needed big, powerful rockets to carry them. Soviet space launches were correspondingly epic. The Baikonur cosmodrome in Soviet Kazakhstan — a desolate, scorpion-infested region described in Soviet encyclopaedias as “the Home of the Black Death” — was around a hundred times the size of Cape Canaveral. Its launch bunkers were buried beneath several metres of reinforced concrete and earth because, says Walker, “a rocket the size and power of the R-7 would probably have flattened the sort of surface blockhouse near the little Redstone in Cape Canaveral.”

Because the US had better (lighter, smaller) nuclear bombs, its available rocket technology was — in space-piercing terms — seriously underpowered. When Alan Shepard finally launched from Cape Canaveral on 5 May 1961, twenty-three days after Yuri Gagarin circled the earth, his flight lasted just over fifteen minutes. He splashed down in the Atlantic Ocean 302 miles from the Cape. Gagarin travelled some 26,000 miles around the planet.

The space race was the Soviets’ to lose. Once Khrushchev discovered the political power of space “firsts” he couldn’t get enough of them. “Each successive space ‘spectacular’ was exactly that,” Walker writes, “not so much part of a carefully structured progressive space programme but yet another glittering showpiece, preferably tied to an important political anniversary”. Attempts to build a co-ordinated strategy were rejected or simply ignored. This is a book as much about disappointment as triumph.

Beyond began life as a film documentary, but the newly discovered footage Walker was offered proved too damaged for use. Thank goodness he kept his notes and his nerve. This is not a field that’s starved of insight: Jamie Doran and Piers Bizony wrote a cracking biography of Gagarin called Starman in 1998; the autobiography of Soviet systems designer Boris Chertok runs to four volumes. Still, Walker brings a huge amount that is new and fresh to our understanding of the space race.

Over the desk of the Soviets’ chief designer Sergei Korolev hung a portrait of the nineteenth-century Russian space visionary Konstantin Tsiolkovsky, and with it his words: “Mankind will not stay on Earth for ever but in its quest for light and space it will first penetrate humbly beyond the atmosphere and then conquer the whole solar system.”

Beyond shows how that dream — what US aviation pioneer James Smith McDonnell called “the creative conquest of space” — was exploited by blocs committed to their substitute for war — and how, for all that, it survived.

Oh, shut up

Watching Chaos Walking for New Scientist, 12 April 2021

Young Todd Hewitt (Tom Holland) is learning to be a man, and in Prentisstown (ostensibly the only settlement to survive humanity’s arrival on the planet New World) this means keeping your thoughts to yourself.

Something about the planet makes men’s thoughts both audible and visible to others. Men are constantly having to hide their thoughts, by thinking of something else, by rehearsing daily chores, or even just by reciting their own names, again and again. Women were unaffected, apparently, but the native (and rarely glimpsed) Spackle killed them all years ago.

(If this account of things seems a little off, imagine it delivered by an especially troubled-looking Mads Mikkelsen, playing the settlement’s mysterious mayor. Watching his settlement’s secrets come to light, one by one, is one of this film’s chief delights.)

Viola, played by Daisy Ridley, has arrived from space, scouting for a second settlement wave when her landing craft all but burns up, leaving her at the mercy of the men of Prentisstown. You’d think they’d be glad of her arrival and her company — but you would be wrong.

Chaos Walking arrives under something of a cloud; to begin with, no one could fix on a script they liked. Charlie Kaufman (of Being John Malkovich fame) got first bite of the cherry, before the project was passed from pillar to post and ended up being crafted by Christopher Ford (Spider-Man: Homecoming, 2017) and Patrick Ness, author of the book on which this film is based, The Knife of Never Letting Go. Chaos Walking should, by all measures, have ended up a mess.

But if it’s not the blockbuster the studio expected or needed, Chaos Walking is nonetheless a real accomplishment: a disconcerting little masterpiece of sensitive acting and well-judged design.

In this film, men quite literally cannot shut up, and in her very first conversation with Mayor Prentiss, it dawns on Viola that this gives her huge advantages. She can lie, she can keep secrets, and she’s the only one here who can — crucial points made almost entirely in dialogue-less reaction shots. Daisy Ridley’s talents weren’t wildly well served in the last three Star Wars films, but she’s given her head here.

Tom Holland’s Todd is a naif who must save Viola and get her to a neighbouring settlement he never even realised existed — a place where women survive and (understandably) dominate.

Todd is the model of what a man must be in this New World: polite, honest, and circumspect. Todd’s bid to “be a man” in such circumstances is anything but straightforward — but Holland keeps our sympathy and our regard.

Indeed, the great strength of Chaos Walking is that it interrogates gender roles by creating genuine difficulties for its characters. Even Prentisstown’s lunatic and misogynist preacher Aaron — surely David Oyelowo’s most unrewarding role yet, all beetle brows and gnashing teeth — turns out to make a dreadful kind of sense.

No gender is well served by the strange telepathic gifts bestowed on half the human settlers of New World. Only good will and superhuman patience prevent human society going up like a powder keg.

This has happened once, in Prentisstown, and — given the weirdly stalled settlement of the planet — it has almost certainly happened elsewhere. The planet’s architecture and technology are an uneasy and creative mishmash of battered industrial machinery and Western-genre make-do-and-mend. The effect is oddly unsettling, particularly in the sequence where horse-riders pursue each other through a forest that had very obviously been planted in rows.

Chaos Walking is not a western. Neither is it, in any easy sense, a feminist fable. Chaos Walking is about people’s struggles in unreasonable circumstances, and for all the angst bound up in its premise, it becomes, by the end, a charming and uplifting film about love and reconciliation.

Tally of a lost world

Reading Delicious: The evolution of flavor and how it made us human by Rob Dunn and Monica Sanchez for New Scientist, 31 March 2021

Dolphins need only hunger and a mental image of what food looks like. Their taste receptors broke long ago, and they no longer taste sweet, salty or even umami, thriving on hunger and satisfaction alone.

Omnivores and herbivores have a more varied diet, and more chances of getting things badly wrong, so they are guided by much more highly developed senses (related, even intertwined, but not at all the same) of flavour (how something tastes) and aroma (how something smells).

Evolutionary biologist Rob Dunn and anthropologist Monica Sanchez weave together what chefs now know about the experience of food, what ecologists know about the needs of animals, and what evolutionary biologists know about how our senses evolved, to tell the story of how we have been led by our noses through evolutionary history, and turned from chimpanzee-like primate precursor to modern, dinner-obsessed Homo sapiens.

Much of the work described here dovetails neatly with work described in biological anthropologist Richard Wrangham’s 2009 book Catching Fire: How cooking made us human. Wrangham argued that releasing the calories bound up in raw food by cooking it led to a cognitive explosion in our hominin ancestors, around 1.9 million years ago.

As Dunn and Sanchez rightly point out, Wrangham’s book was not short of a speculation or two: there is, after all, no evidence of fire-making this far back. Still, they incline very much to Wrangham’s hypothesis. There’s no firm evidence of hominins fermenting food at this time, either — indeed, it’s hard to imagine what such evidence would even look like. Nonetheless, the authors are convinced it took place.

Where Wrangham focused on fire, Dunn and Sanchez are more interested in other forms of basic food processing: cutting, pounding and especially fermenting. The authors make a convincing, closely argued case for their perhaps rather surprising contention that “fermenting a mastodon, mammoth, or a horse so that it remains edible and is not deadly appears to be less challenging than making fire.”

“Flavor is our new hammer,” the authors admit, “and so we are probably whacking some shiny things here that aren’t nails.” It would be all too easy, out of a surfeit of enthusiasm, for them to distort their reader’s impressions of a new and exciting field, tracing the evolution of flavour. Happily, Dunn and Sanchez are thoroughly scrupulous in the way they present their evidence and their arguments.

As primates, our experience of aroma and flavour is unusual, in that we experience retronasal aromas — the aromas that rise up from our mouths into the backs of our noses. This is because we have lost a long bone, called the transverse lamina, that helps to separate the mouth from the nose. This loss had huge consequences for olfaction, enabling humans to search out convoluted tastes and aromas so complex, we have to associate them with memories in order to individually categorise them all.

The story of how Homo sapiens developed such a sophisticated palate is also, of course, the story of how it contributed to the extinction of hundreds of the largest, most unusual animals on the planet. (Delicious is a charming book, but it does have its melancholy side.)

To take one dizzying example, the Clovis peoples of North America — direct ancestors of roughly 80 per cent of all living native populations in North and South America — definitely ate mammoths, mastodons, gomphotheres, bison and giant horses; they may also have eaten Jefferson’s ground sloths, giant camels, dire wolves, short-faced bears, flat-headed peccaries, long-headed peccaries, tapirs, giant llamas, giant bison, stag moose, shrub-ox, and Harlan’s Muskox.

“The Clovis menu,” the authors write, “if written on a chalkboard, would be a tally of a lost world.”

We may never have a pandemic again

Reading The Code Breaker, Walter Isaacson’s biography of Jennifer Doudna, for the Telegraph, 27 March 2021

In a co-written account of her work published in 2017, biochemist Jennifer Doudna creates a system that can cut and paste genetic information as simply as a word processor can manipulate text. Having conceived a technology that promises to predict, correct and even enhance a person’s genetic destiny she says, not without cause, “I began to feel a bit like Doctor Frankenstein.”

When it comes to breakthroughs in biology, references to Mary Shelley are irresistible. One of Walter Isaacson’s minor triumphs, in a book not short of major triumphs, is that, over 500 pages, he mentions that over-quoted, under-read novel less than half a dozen times. In biotechnology circles, this is probably a record.

We explain science by telling stories of discovery. It’s a way of unpacking complicated ideas in narrative form. It’s not really history, or if it is, it’s whig history, defined by a young Herbert Butterfield in 1931 as “the tendency… to praise revolutions provided they have been successful, to emphasise certain principles of progress in the past and to produce a story which is the ratification if not the glorification of the present.”

To explain the science, you falsify the history. So all discoverers and inventors are heroes on the Promethean (or Frankensteinian) model, working in isolation, and taking the whole weight of the world on their shoulders!

Alas, the reverse is also true. Telling the true history of discovery makes the science very difficult to unpack. And though Walter Isaacson, whose many achievements include a spell as CEO of the Aspen Institute, clearly knows his science, his account of the most significant biological breakthrough since understanding the structure of DNA is not the very best account of CRISPR out there. His folksy cajoling — inviting us to celebrate “wily bacteria” and the “plucky little molecule” RNA — suggests exasperation. Explaining CRISPR is *hard*.

The Code Breaker excels precisely where, having read Isaacson’s 2011 biography of Steve Jobs, you might expect it to excel. Isaacson understands that all institutions are political. Every institutional activity — be it blue-sky research into the genome, or the design of a consumer product — is a species of political action.

The politics of science is uniquely challenging, because its standards of honesty, precision and rigour stretch the capabilities of language itself. Again and again, Doudna’s relationships with rivals, colleagues, mentors and critics are seen to hang on fine threads of contested interpretation. We see that Doudna’s fiercest rivalry, with Feng Zhang of the Broad Institute of MIT and Harvard, was conducted in an entirely ethical manner — and yet we see both of them stumbling away, bloodied.

Isaacson’s style of biography — already evident in his appreciations of Einstein and Franklin and Leonardo — can be dubbed “qualified hagiography”. He’s trying to hit a balance between the kind of whig history that will make complex materials accessible, and the kind of account that will stand the inspection of academic historians. His heroes’ flaws are explored, but their heroism is upheld. It’s a structural device, and pick at it however you want, it makes for a rattlingly good story.

Jennifer Doudna was born in 1964 and grew up on Big Island, Hawaii. Inspired by an old paperback copy of The Double Helix by DNA pioneer James Watson, she devoted her life to understanding the chemistry of living things. Over her career she championed DNA’s smaller, more active cousin RNA, which brought to her notice a remarkable mechanism, developed by single-celled organisms in their 3.1-billion-year war with viruses. Each of these cells used RNA to build its very own immune system.

Understanding that mechanism was Doudna’s triumph, shared with her colleague Emmanuelle Charpentier; both conspicuously deserved the Nobel prize awarded them last year.

Showing that this mechanism worked in cells like our own, though, would change everything, including our species’ relationship with its own evolution. This technology has the power to eradicate both disease (good) and ordinary human variety (really not so good at all).

In 2012, the year of the great race, Doudna’s Berkeley lab knew nothing like enough about working with human cells. Zhang’s lab knew nothing like enough about the biochemical wrinkles that drove CRISPR. Their rivalrous decision not to pool CRISPR-Cas9 intellectual property would pave the way for an epic patent battle.

COVID-19 has changed all that, ushering in an extraordinary cultural shift. Led by Doudna and Zhang, last year most academic labs declared that their discoveries would be made available to anyone fighting the virus. New online forums have blossomed, breaking the stranglehold of expensive paywall-protected journals.

Doudna’s lab and others have developed home testing kits for COVID-19 that have a potential impact beyond this one fight, “bringing biology into the home,” as Isaacson writes, “the way that personal computers in the 1970s brought digital products and services… into people’s daily lives and consciousness.”

Meanwhile genetic vaccines built on RNA — like the ones developed for COVID-19 by Moderna and BioNTech/Pfizer — portend a sudden shift of the evolutionary balance between human beings and viruses. Moderna’s chair Noubar Afeyan is punchy about the prospects: “We may never have a pandemic again,” he says.

The Code Breaker catches us at an extraordinary moment. Isaacson argues with sincerity and conviction that, blooded by this pandemic, we should now grasp the nettle, make a stab at the hard ethical questions, and apply Doudna’s Promethean knowledge, now, and everywhere, to help people. Given the growing likelihood of pandemics, we may not have a choice.


Perfect in a special way

Watching An Impossible Project for New Scientist, 24 March 2021

Jens Meurer is a hard figure to pin down. As a producer he’s seen major mainstream movies like Black Book (2006) and Rush (2013) to the big screen; the European Academy named him ‘documentary filmmaker of the year’ in 1995; he’s also quite prepared to spend months following in the wake of an eccentric Viennese entrepreneur who’s convinced that the future of technology is analogue, or at any rate post-digital — a strange and hard-to-monetise mash-up of the two, perhaps.

An Impossible Project is Meurer’s passion project about Florian Kapps (everyone calls him “Doc” on account of his work studying the eye muscles of spiders). Though he can never be too sure how to meet next month’s bills, Kapps nonetheless moves in interesting circles. We follow him around Berlin, New York and Menlo Park, and say goodbye to him as he’s hosting a dinner party for “analogue champions” including higher-ups in Moleskine, Polaroid and Facebook (yes, Facebook: it has an analog research lab) in a mothballed (hence wholly analogue) grand hotel just outside Vienna.

Kapps is a one-man cultural revolution. He bought the last surviving Polaroid factory in 2008, just before it was due to be demolished. He got it running again, only to discover that several chemicals needed to make Polaroid’s signature instant-developing film were no longer in production. That film was “the most chemically complicated man-made product ever,” claims Steve Herchen, a former Polaroid product manager. Early attempts to replicate the original formula were, in Kapps’s memorable phrase, “perfect in a special way” (the colours were wildly unreliable; half the time the image would melt off the backing).

Still, Kapps persevered. He reckoned analogue technology had an irresistible mystique; that if he rebuilt the technology, new customers would appear. And he was right: Impossible, the company he founded, now bears the Polaroid name and sells a million instant films a year. Kapps, though, is a dreamer, not a manager, and Impossible’s board has long since kicked him out.

It is hard to feel too sorry for him. His subsequent ventures in analogue — including a museum-cum-bar-cum-store in Vienna called Supersense — address, in a much more direct and personally satisfying fashion, his scattergun delight in goods you can touch and smell, and machines you can hear working and can take apart and understand. Kapps curates analogue printing machinery, recording equipment, cameras and telephones. All the machines work, and those that are for sale, sell quickly. Every few weeks he traipses across Austria in search of just the right meats to serve in his cafe. After hours he uses his shop floor to stage concerts that are cut straight to vinyl, creating one-of-a-kind records of live events. David Bohnett, creator of Geocities and one of Silicon Valley’s first millionaires, reckons Kapps is inventing a whole new class of luxury item — unique records of unique experiences. Is he right?

People under 25 seem to think so. It’s this cohort, who grew up in a digital world, who are Kapps’s most eager customers. Kapps believes a monotonously digital diet has starved them of sensory pleasure, and that “after a long period of analogue companies trying hard to become digital, it’s now time for the digital companies to start thinking how to connect with people in analogue ways.”

An Impossible Project is a highly ingenious movie. Meurer has gone to extraordinary lengths to portray the man who saved Polaroid in a film that captures that casual, magical, slightly unreliable Polaroid feel. It’s informal. Practically every take looks like an outtake. People grin at the camera as if they’ve never seen a camera before. The shots don’t seem particularly well framed, and yet they add up to an extraordinarily beautiful film. And the colours are gorgeous.

Reality trumped

Reading You Are Here: A field guide for navigating polarized speech, conspiracy theories, and our polluted media landscape by Whitney Phillips and Ryan M. Milner (MIT Press) for New Scientist, 3 March 2021

This is a book about pollution, not of the physical environment, but of our civic discourse. It is about disinformation (false and misleading information deliberately spread), misinformation (false and misleading information inadvertently spread), and malinformation (information with a basis in reality spread pointedly and specifically to cause harm).

Communications experts Whitney Phillips and Ryan M. Milner completed their book just prior to the US presidential election that replaced Donald Trump with Joe Biden. That election, and the seditious activities that prompted Trump’s second impeachment, have clarified many of the issues Phillips and Milner have gone to such pains to explore. Though events have stolen some of their thunder, You Are Here remains an invaluable snapshot of our current social and technological problems around news, truth and fact.

The authors’ US-centric (but universally applicable) account of “fake news” begins with the rise of the Ku Klux Klan. Its deliberately silly name, cartoonish robes, and quaint routines (which accompanied all its activities, from rallies to lynchings) prefigured the “only-joking” subcultures (Pepe the Frog and the like) dominating so much of our contemporary social media. Next, an examination of the Satanic panics of the 1980s reveals much about the birth and growth of conspiracy theories. The authors’ last act is an unpicking of QAnon — a current far-right conspiracy theory alleging that a secret cabal of cannibalistic Satan-worshippers plotted against former U.S. president Donald Trump. This brings the threads of their argument together in a conclusion all the more apocalyptic for being so closely argued.

Polluted information is, they argue, our latest public health emergency. By treating the information sphere as an ecology under threat, the authors push past factionalism to reveal how, when we use media, “the everyday actions of everyone else feed into and are reinforced by the worst actions of the worst actors”.

This is their most striking takeaway: that the media machine that enabled QAnon isn’t a machine out of alignment, or out of control, or somehow infected: it’s a system working exactly as designed — “a system that damages so much because it works so well”.

This media machine is founded on principles that, in and of themselves, seem only laudable. Top of the list is the idea that to counter harms, we have to call attention to them: “in other words, that light disinfects”.

This is a grand philosophy, for so long as light is hard to generate. But what happens when the light — the confluence of competing information sets, depicting competing realities — becomes blinding?

Take Google as an example. Google is an advertising platform that makes money the more its users use the internet to “get to the bottom of things”. The deeper the rabbit-holes go, the more money Google makes. This sets up a powerful incentive for “conspiracy entrepreneurs” to produce content, creating “alternative media echo-systems”. When the facts run out, create alternative facts. “The algorithm” (if you’ll forgive this reviewer’s dicey shorthand) doesn’t care. “The algorithm” is, in fact, designed to serve up as much pollution as possible.

What’s to be done? Here the authors hit a quite sizeable snag. They claim they’re not asking “for people to ‘remain civil’”. They claim they’re not commanding us, “don’t feed the trolls.” But so far as I could see, this is exactly what they’re saying — and good for them.

With the machismo typical of the social sciences, the authors call for “foundational, systematic, top-to-bottom change,” whatever that is supposed to mean, when what they are actually advocating is a sense of personal decency, a contempt for anonymity, a willingness to stand by what one says come hell or high water, politeness and consideration, and a willingness to listen.

These are not political ideas. These are qualities of character. One might even call them virtues, of a sort that were once particularly prized by conservatives.

Phillips and Milner bemoan the way market capitalism has swallowed political discourse. They teeter on a much more important truth: that politics has swallowed our moral discourse. Social media has made whining cowards of us all. You Are Here comes dangerously close to saying so. If you listen carefully, there’s a still, small voice hidden in this book, telling us all to grow up.