Two hundred years of electro-foolery come good

Reading We Are Electric by Sally Adee for the Times, 28 January 2023

In an attempt to elucidate the role of electricity in biology, German polymath Alexander von Humboldt once stuck a charged wire up his bum and found that “a bright light appears before both eyes”.

Why the study of biological electricity should prove so irremediably smutty — so that serious “electricians” (as the early researchers called themselves) steered well clear of bodies for well over a century — is a mystery science journalist Sally Adee would rather not have to re-hash, though her by-the-by account of “two hundred years of electro-foolery”, during which quacks peddled any number of cockeyed devices to treat everything from cancer to excessive masturbation, is highly entertaining.

And while this history of electricity’s role in the body begins, conventionally enough, with Volta and Galvani, with spasming frog’s legs and other fairly gruesome experiments, this is really just necessary groundwork, so that Adee can better explain recent findings that are transforming our understanding of how bodies grow and develop, heal and regenerate.

Why bodies turn out the way they do has proved a vexing puzzle for the longest while. Genetics offers no answer, as DNA contains no spatial information. There are genes for, say, eye colour, but no genes for “grow two eyes”, and no genes for “stick two eyes in front of your head”.

So if genes don’t tell us the shape we should take as we grow, what does? The clue is in the title: we are, indeed, electric.

Adee explains that the forty trillion or so cells in our bodies are in constant electrical communication with each other. This chatter generates a field that dictates the form we take. For every structure in the body there is a specific membrane voltage range, and our cells specialise to perform different functions in line with the electrical cues they pick up from their neighbours. Which is (by way of arresting illustration) how in 2011 a grad student by the name of Sherry Aw managed, by manipulating electrical fields, to grow eyes on a developing frog’s belly.

The wonder is that this news will come as such a shock to so many readers (including, I dare say, many jobbing scientists). That our cells communicate electrically with each other without the mediation of nerves, and that the nervous system is only one of at least two (and probably many more) electrical communications systems — all this will come as a disconcerting surprise to many. Did you know you only have to put skin, bone, blood, nerve — indeed, any biological cell — into a petri dish and apply an electric field, and you will find all the cells will crawl to the same end of the dish? It took decades before anyone thought to unpick the enormous implications of that fact.

Now we have begun to understand the importance of electrical fields in biology, we can begin to manipulate them. We’ve restored some function after severe spinal injury (in humans), regrown whole limbs (in mice), and even turned cancerous tumours back into healthy tissue (in petri dishes).

Has bio-electricity — once the precinct of quacks and contrarians — at last come into its own? Has it matured? Has it grown up?

Well, yes and no. Adee would like to deliver a clear, single message about bioelectricity, but the field itself is still massively divided. On the one hand there is ground-breaking research being conducted into development, regeneration and healing. On the other, there are those who think electricity in the body is mostly to do with nerves and brains, and their project — to hack people’s minds through their central nervous systems and usher in some sort of psychoelectric utopia — shows no sign of faltering.

In the 1960s the American neurophysiologist Warren McCulloch worked on the assumption that the way neurons fire is a kind of biological binary code. This led to a new school of thought, called cybernetics — a science of communications and automatic control systems, both living and mechanical. The idea was we should be able to drive an animal like a robot by simply activating specific circuits, an idea “so compelling” says Adee, “there wasn’t much point bothering with whether it was based in fact.”

Very many other researchers Adee writes about are just as wedded to the idea of the body as a meat machine.

This book arose from an article Adee wrote for the magazine New Scientist about her experiences playing DARWARS Ambush!, a military training simulation conducted in a Californian defence lab that (maybe) amped up her response times and (maybe) increased her focus — all by means of a headset that electrically tickled precise regions in her brain.

Within days of the article’s publication in early 2012, Adee had become a sort of Joan of Arc figure for the online posthumanist community, and even turns up in Yuval Noah Harari’s book, where she serves as an Awful Warning about men becoming gods.

Adee finally admits that she would “love to take this whole idea of the body as an inferior meat puppet to be augmented with metal and absolutely launch it into the sun.” Coming clean at last, she admits she is much more interested in the basic research going on into the communications within and between individual cells — a field where the more we know, the more we realise just how much we don’t understand.

Adee’s enthusiasm is infectious, and she conveys well the jaw-dropping scale and complexity of this newly discovered “electrome”. This is more than medicine. “The real excitement of the field,” she writes, “hews closer to the excitement around cosmology.”

Rogues and heroes

Reading Sam Miller’s Migrants: The Story of Us All for the Telegraph, 23 January 2023

The cultural opprobrium attached to immigration has been building at least since Aristotle’s day, according to Sam Miller’s flawed, fascinating stab at a global history of migration.

Today, “having a permanent home and a lifelong nationality are considered normal, as if they were part of the human condition.” On the contrary, says Miller: humankind is the migratory species par excellence, settling every continent bar Antarctica, not once, but many times over.

Mixed feelings about this process have a deep anthropological foundation. Forget national and regional rivalries; those came later, and are largely explanations after the fact. What really upsets settled people is the reminder that, long ago, their kind chose to live an urban life and became less as a consequence: less wily, less tough, less resilient. The emergence of the first cities coincided with the first poems in uneasy praise of wild men: think of Mesopotamian Enkidu, or Greek Heracles. Aristotle, writing in 330 BC, declares that “he who is without a city-state by nature, and not by circumstance, is either a rogue or greater than a human being” — a wonderfully uneasy and double-edged observation that acknowledges a pre-urban past populated by formidable feral heroes.

Athenians suppressed this awareness; they were the first Western people to take pride in being, in Herodotus’s words, “the only Greeks who never migrated.” Ming Dynasty China performed the same flim-flam, a 15th-century administrator declaring: “There exists a paramount boundary within Heaven and Earth: Chinese on this side, foreigners on the other. The only way to set the world in order is to respect this boundary”.

“History books have, on the whole, been written by the sedentary for the sedentary,” says Miller, and naturally reflect a settled people’s chauvinism. The migration stories we learned at school are often wrong. The Vandals who “sacked” Rome in 455 did not, as a general rule, kill or rape or burn.

Alas, neither did they write; nor did the Roma, until the nineteenth century; nor did the (handsomely literate) Chinese of Victorian London. Migrants rarely find time to write, and where first-person accounts are missing, fantasy is bred. Some of it (Asterix) is charming, some of it (Fu Manchu) is anything but.

Miller thinks that humans naturally emigrate, and our unease about this is the result of pastoralism, cities, and other historical accidents.

The trouble with this line of argument is that there are umpteen “natural” reasons why people move about the earth. Humans naturally consume and lay waste to their immediate environment. Humans naturally overbreed. Humans naturally go to war. Why invoke some innate “outward urge”?

Different distances on the human story allow one to tell wildly different stories. If you follow humanity through deep time, our settlement of almost the entire planet looks very much like manifest destiny, and we’ll all surely end up on Mars tomorrow. If on the other hand you trace the movements of people over a few dozen generations, you’ll discover that, absent force majeure, people are homebodies, moving barely a few weeks’ walking distance from their birthplaces.

What is migration, anyway? Not much more than a hundred years ago, women regularly “migrated” (as Miller says, “it might take as long to cross a large English county as it would to fly halfway around the world today”) to marry or to work as governesses, domestic servants and shop workers. And yet they would never have called themselves “migrants”.

Miller, in a praiseworthy bid to tell a global story, adopts the broadest possible definition of migration: one that embraces “slaves and spouses, refugees and retirees, nomads and expats, conquerors and job-seekers.”

Alas, the broader one’s argument, the less one ends up saying. Handsomely researched and stirringly written as they are, Miller’s brief tilts at historical behemoths like slavery and the maritime spice route do not, I think, much enrich our concepts of migration.

What emerges from this onion of a book (fascinating digressions around no detectable centre) is, however, more than sufficient compensation. We have here the seed of a much more enticing and potentially more influential project: a modern history that treats the modern nation state — pretending to self-reliance behind ever-more-futile barriers — as but a passing political arrangement, and not always a very useful one.

In view of the geopolitical crises being triggered by climate change, we may very soon need (or else be forced by circumstances) to come up with forms of government outside the rickety and brittle nation state. And in that case, peripatetic perspectives like Miller’s may be just what we need.

 

A Gigeresque melange

Reading Cold People by Tom Rob Smith for the Times, 14 January 2023

Harvard medical student Liza is on holiday in Lisbon with her parents and younger sister when gigantic alien fish-shapes descend from the sky and order all humans to vacate the habitable bits of their planet for Antarctica, the only continent humans have never been able to settle.

Twenty years on, in a ramshackle, endlessly retrofitted settlement on the Antarctic Peninsula called Hope Town, Liza — one of very few survivors — gives birth to Echo, a genetically engineered daughter whose modifications allow her to withstand the bitter cold. Echo is an early prototype of future human being designed in McMurdo City (the ramshackle, ice-bound, over-serious new capital of humanity) by the heroically unprincipled geneticist Song Fu, aided and abetted by her assistant Yotam Penzak, the book’s splendidly drawn antagonist. (The author of Child 44 knows how to tell a story; you know you’re in safe hands when your villain is motivated by love.)

Yotam, who attended her birth, thinks Echo and her posthuman kind are a worthy end in themselves: powerful and humane, capable of nurturing unengineered humanity in their impossible new environment, even as they succeed them over evolutionary time.

His boss disagrees. The remains of humanity will die out in not much more than a century, says Song Fu. A more radical succession is required if humans are to survive in any form.

Yotam’s unlucky love life leaves him vulnerable to browbeating by his boss, and then to seduction by Song Fu’s posthumous final creation, a Giger-esque melange of human, alligator and shark.

In this wasteland, “Eitan” and his kind are by far the dominant species — or will be, if Yotam lets them out of their cave.
Much as Roald Amundsen and his party consumed the husky dogs that had got them to the Pole in 1911, they will consume their human creators, not out of hate or revenge, but simply because they have no other use for them.

Can Yotam’s convictions be shaken? Can Eitan be stopped?

Cold People does not explore ideas; it animates them. Plot is king. Smith’s characters aren’t so much pretend people as they are admirable, animated types. The result is a page-turner that, without offering much by way of ordinary human feeling, reveals Tom Rob Smith’s view of the human condition: what he thinks about the plight of thinking, would-be ethical beings who still need to consume and burn and exploit in order to survive. In Smith’s vision, humanity’s reach so far exceeds its grasp that its downfall at its own hand seems more or less assured.

These are chewy and worthwhile themes, and Cold People cleverly distils them to the point where they play out, and reach a satisfying climax, at ordinary human scale. If Echo can protect her human family, there’s hope for humanity at large. If not, we’re all for the chum bucket.

Cold People will entertain and impress readers who enjoy novels that are containers for ideas. The rest of us may regret that Smith did not linger longer among the Polynesian navigators, seal hunters and stir-crazy researchers populating his largely irrelevant but wonderfully evocative prologue. Slow down, Smith! You were so set on your destination, you missed the scenery.

Never say die

Reading Remnants of Ancient Life by Dale Greenwalt for New Scientist, 11 January 2023

What is a fossil made of? Mineralised rocky fossils are what spring to mind at a first mention of the word, but the preserved fauna of the Burgess Shale are pure carbon, a kind of proto-coal. Then there are those tantalising Cretaceous insects preserved in amber.

Whatever they are made of, fossils contain treasures. The first really good microscopic study of (mineralised) dinosaur bone, revealing its internal structure, was written up in 1850 by the British palaeontologist Gideon Mantell.

Still, classifying fossil organisms on the basis of their shape and their location seemed to be virtually the only weapon in the palaeobiologist’s arsenal — until 1993. That was the year the film of Michael Crichton’s novel Jurassic Park famously captured the excitement of a field in turmoil, as ancient pigments, proteins, and DNA were being detected (not too reliably at first) in all manner of fossil substrates, including rock.

Jurassic Park’s blood-sucking insects fossilised in amber were a bust. Though seemingly perfectly preserved on the outside, they turned out to be hollow.

Mind you, the author of Remnants (a dull title for this vivid and gripping book) has himself managed to get traces of ancient haemoglobin out of the bloated stomach of a fossilised mosquito — so never say die.

Greenwalt, who spends eleven months of every year “buried deep in the bowels of the Smithsonian’s National Museum of Natural History in Washington, DC,” has brought to the surface a riveting account of a field achieving insights quite as revolutionary as any conjectured by Crichton. The finds are extraordinary enough: a cholesterol-like molecule in a 380-million-year-old crustacean; chitin from the exoskeleton of a fossil from the 505-million-year-old Burgess Shale. Even more extraordinary are the inferences we can then draw about the physiology, behaviour, and evolution of these extinct organisms. Even from traces that are smeared, fragmented, degraded, and condensed, even from cyclized and polymerized materials, valuable insights can be drawn. It is even possible to calculate and construct putative “ancestral proteins” and, from their study, conclude that Earth’s life had its origins at the mouths of deep ocean vents!

The story of biomolecules in palaeontology has its salutary side. A generation of brilliant innovators have had to calm down, learn the limitations of their new techniques, and return, as often as not, to the insights of comparative anatomy to confirm and calibrate their work. The polymerase chain reaction (PCR) is the engine powering our ever older and ever more complete ancient DNA sequences, but early teething problems included publication of a DNA sequence thought to be from a 120-million-year-old weevil that actually belonged to a fungus. Technologies prove their worth over time.

More problematic are the cul-de-sacs. In 2007 Greenwalt’s colleague, the palaeontologist Mary Schweitzer, reported her lab had recovered short sequences of collagen from the femur of a 68-million-year-old Tyrannosaurus rex. As Matthew Collins at the University of Copenhagen complains, “It’s great work. I just can’t replicate it.” Schweitzer’s methodology has survived 15 years’ hard interrogation; it may simply be that animal proteins cannot survive more than about 4 million years. That still makes them much hardier than plant proteins, which only last for about 30,000 years.

Against these fascinating controversies and surprising dead-ends Greenwalt sets many wonders, not least “the seemingly unlimited potential of ancient DNA to shed light on the ancestry of our species, Homo sapiens”. And for short-changed botanists, there’s an extraordinary twist in Greenwalt’s tale whereby it may become possible to classify plants based, not on their morphology or even their DNA, but on the repertoire of small biomolecules they leave behind. “The biomolecular components of plants have been found as biomarkers in rocks that are two and a half billion — with a ‘b’! — years old,” Greenwalt exclaims (p204). The 3.7-billion-year-old cyanobacteria that produced stromatolites in Greenland are the same age as the rocks at Mars’s Gale Crater: “Are authentic ancient biomolecules on Mars so implausible?” Greenwalt asks.

His day job may keep him for months at a time in the Smithsonian’s basement, but Greenwalt’s gaze is set firmly on the stars.

Pulling a Steerpike

Reading Michael Moorcock’s The Citadel of Forgotten Myths for the Times (who spiked it)

Leaving his defeated rival Yyrkoon on the throne as regent, Elric, titular emperor of decayed, decadent, dragon-blooded Melniboné “pulls a Steerpike” (as Mervyn Peake, Elric’s chief influence, might say) and goes wandering off the edge of the Earth (literally) in search of Answers (no time: don’t ask).

Sword and sorcery began, not with sui generis Tolkien, but with the Elric canon. This sort-of-prequel tucks itself neatly away between the very first Elric tales. It’s a delight.

There are three stories. Two of them were published in the late 2000s. In the first, Elric gets caught up in a family dispute. (“You are my sister’s son. Your sentient acid blood demands you help me!” exclaims Elric’s dragon-scaled aunt.) In the second, he battles the noibuluscus, a bone-chomping, gut-sucking succulent tended by dwarfish cannibals. The call-backs in the last, longest story (an original, and a worthy addition to the evergreen “man-into-bee” subgenre) bind it to its companions, creating what the genre calls a “fix-up”. Who needs proper novels when you can have this much fun?

Moorcock began the saga of Elric of Melniboné in 1961, largely to support New Worlds, the science fiction magazine that, over a single cash-strapped four-year span, introduced us to J G Ballard, Pamela Zoline, John Sladek, M John Harrison — oh, too many to mention.

The first thing to say about Elric — pale loiterer, kin-slayer, absentee emperor of Melniboné — is that he makes no physical or psychological sense whatsoever. One moment he’s chewing the furniture, the next he’s sprawled across a chaise longue. If a scene demands that he be vulpine, hear him howl! If an emotional outpouring is required, feel the floodgates tremble! Decency? No problem. Indecency? Have at it. Elric is his saga, as surely as Gilgamesh and Ulysses are theirs, not because these people are meticulously rendered but because they aren’t. Elric is not heroic or anti-heroic. He is simply whatever his story needs him to be in that moment.

Considered as beings that occupy a span of time, such protean protagonists are impossibly shallow. But that’s to misread them. Like pre-school children, they each occupy their eternal present, radically committed to an ever-shifting now. Elric, a curse to his friends and a bane to his lovers (supposedly), vampirically dependent upon his ravenous soul-hungry sword Stormbringer (when convenient) and constitutionally unable to bring happiness to the world (really?), is never properly melancholic. He can be as solemn as an owl, but his adventures are a hoot. Yet when he weeps (which is often, and never for long) it’s with a rare and captivating intensity.

To write quickly — and Moorcock has always been a fast worker — the language has to get under the reader’s skin (and the heightened diction on display here is uncut cocaine). Repetition is your friend (so long as it’s the right repetition; Stormbringer’s muffled grumblings are as welcome as that cowbell riff in “Don’t Fear the Reaper”). Stock characters add the illusion of texture (and Elric’s sidekick Moonglum, surprisingly accomplished for a Sancho Panza stand-in, is one of the genre’s best). Above all, turn everyone’s appetite up to eleven (for food, for wine, for cheer, for sex).

That some if not all human appetites have become culturally “problematic” is hardly Elric’s fault. He is like one of those incorrigible elder relatives whose arrival has the politically correct neighbours clutching their pearls. He needs to be given things to do that are slightly beneath him, just so he doesn’t let slip anything untoward. Quick, somebody: give him a giant plant to battle (in Book Two), or a big blue bee (in Book Three)!

Moorcock is too canny an operator to have let the years tarnish his most lucrative creation, and these days he keeps poor Elric locked out of the ladies’ bedrooms. The effect is not so much to make Elric grow up as to infantilise him. This is a very minor matter, but it’s what you get for creating so long-lived a character. The world will grind them down.

Perhaps Moorcock still writes Elric at speed. It’s just as likely that he’s learned, from long practice, how to simulate the effect. This increasingly rare technique is not one that garners much critical approval, let alone appreciation. Our current ability to revise texts electronically ad nauseam places a premium on an author’s nuance and erudition, insight and (God help us) wisdom. Even a friendly critic finds little to say about a book’s grip and speed and visceral impact, though these will always be the biggest drivers of sales.

Now that even James Bond has succumbed to nuance and insight, Elric may, by my reckoning, be the last towering 1960s kaiju left alive.

Delight and devilry

Reading Douglas Futuyma’s How Birds Evolve for New Scientist, 7 December 2022

In Douglas Futuyma’s evolutionary history of birds, the delight is in the detail, and some of the devilry too — this is not a light read. Futuyma tells a double tale: he explains how the study of birds advanced our understanding of evolution, and he shows how advances in evolutionary science solve some long-standing ornithological mysteries, even as they throw up others.

He has written How Birds Evolve for birders, and being a birder himself (he began bird-spotting around the age of eleven in New York’s public parks), he knows just how fiercely the birding bug can bite. Many are half-way to being field scientists already, and many celebrated field scientists — from Ernst Mayr to Konrad Lorenz to Niko Tinbergen — have been birders.

“I suspect few of my teachers in the 1960s imagined that we would be studying birds by combining information from geology and molecular biology — disciplines that are miles apart,” says Futuyma, giving the reader an early hint of the complexities to come.

Birds are a curious, and curiously productive field of study for evolutionary biologists: less useful in understanding the mechanisms of evolution than insects, plants, and bacteria because they don’t reproduce as quickly, but, being various and everywhere, vital to the study of behaviour, longevity, ecology, speciation, cultural evolution and a host of other specialisms.

The ability to study populations and how they interact gave evolutionary biologists a foretaste of what their science would become. “The models of how variation might persist,” Futuyma remarks, “were developed by evolutionary biologists who might not have known a hawk from a handsaw but were adept in mathematics.” Applying lessons from birds to ourselves, though risky, has also proved both irresistible and, at least in the science’s early stages, highly productive. In pondering human evolution, Darwin developed the idea of sexual selection, which takes up more than half of The Descent of Man (1871). “Darwin devotes four full chapters to birds and cites at least 170 species,” Futuyma points out. “Birds provided more evidence for his ideas about sexual selection than any other group of animals”.

To grapple with bird diversity, one pretty much has to conjure up an idea of evolution. Peculiar and apparently inutile features abound in the bird world, a sure sign of unceasing adaptation. There are also, to complicate matters, many instances of convergent evolution. Feathers may have evolved only once, and through a bizarre genetic accident at that. (They don’t arise easily, as we once assumed, from reptilian scales.) Feather and wing shapes, however, recur again and again in even distantly related species. Darwin once predicted: “Our classifications will come to be, as far as they can be so made, genealogies,” but even his credulity would have been stretched by the news that flamingoes are related genetically to grebes.

“I don’t know how similar to birds a creature would have to be for us to call it an ‘avioid’ or an ‘ornithoid’,” Futuyma speculates, but for it “to be bipedal with feathers, toothless kinetic jaws, highly developed vision, a gizzard, and a high constant body temperature… I think… is very unlikely indeed.”

Futuyma unpacks the story of evolutionary science alongside the story of how birds evolved, acquiring bipedal locomotion and simple filamentous feathers as Dinosauria, then clavicles fused into a wishbone in Theropoda, on and on, until we arrive at what we might as well call the modern bird, with its large, keeled breastbone, rapid growth, and unfeasibly lightweight construction.

How Birds Evolve is not meant to be an introduction to birds (though one imagines readers of this magazine would lap it up). It is personable, entertaining and deeply passionate about its subject.

Futuyma, the author of two successful textbooks about evolution, is out to inspire, and his comprehensive book more than makes up in wonder what it might lack in an easy and seductive narrative.

A sack of tech cats

Reading Long Shot by Kate Bingham and Tim Hames for the Telegraph, 15 October 2022

“Not only were we building the plane as we were flying it,” writes Kate Bingham, appointed by Boris Johnson in May 2020 to chair the UK Vaccine Task Force, “we were flying in the dark and simultaneously writing the instruction manual, and fielding endless petty questions from air traffic control asking about the strength of the orange juice we were serving to passengers.”

The tale of Britain’s vaccination effort against Covid-19 ends happily, of course: the task force arranges for clinical trials, secures 350 million doses of six vaccines, oversees any amount of novel infrastructure for their manufacture and distribution, and delivers Covid-19 as close to a knock-out blow as one could reasonably dream of.

At the time of Bingham’s first phone call, in January 2020, things looked rather different: it seemed to this British venture capitalist, who had no specialist knowledge of vaccine development, that she was being asked to take responsibility for a huge amount of government expenditure “that would, most likely, prove completely wasted.”

Vaccines normally take decades to develop, while viruses can mutate in a matter of weeks. There was, Bingham insists, very little chance of success.

Britain’s internationally celebrated vaccine development and production regime was set in motion, from something like a standing start, by a team that included a bomb disposal expert, an Indian rowing star, an Italian consultant, a former ambassador, a football pundit, and the redoubtable Ruth Todd, whose day job was to see that submarines were delivered on time. This is a book about the skills and experiences necessary to build extraordinary ventures under pressure. Although the science is sketched ably enough here and there, this is not a science book.

Bingham’s background is in drug discovery — a notoriously unpredictable business where some of the brightest minds in biochemistry stake their future careers on one roll of the clinical-trial dice. VCs, if they’re wealthy enough, can spread their risks over a few dozen companies, but even they barely survived the dot-com crash of 2001. Bingham’s one of the new breed that emerged from the wreckage, an investor altogether more interested in managing the companies she helps create than in driving start-ups into premature IPOs. She still carries in her DNA an instinct for spreading risk, though: the VTF spent £4.6 billion on six vaccines in the hope that one might work, one day, maybe.

Coming up to speed with the science was no easy task, even for a major biotech player. There was no shortage of brilliant work to choose from, but little way of telling which innovations would pay off on time. Bingham salutes the work of Robin Shattock at Imperial College, London, graciously acknowledging that his “amplified mRNA” technology (a relative of, though distinct from, the mRNA technology harnessed by Pfizer), which triggers large immune responses from very small amounts of vaccine, will surely be mature enough to help combat the next pandemic (and be in no doubt, there will be one).

On the other hand, those teams who were fully ready were anxious to make an impact. Bingham only found out about Oxford University’s non-profit vaccine partnership with AstraZeneca over the radio, and Pfizer, partnering with BioNTech, much preferred to throw its own money at things rather than rely on any taskforce handholding.

Choreographing this sack of tech cats was not, says Bingham, her hardest task, and with the help of Tim Hames (a former chief leader writer for the Times) she patiently anatomises how she weathered the formless paranoia of politicians, the hampering good intentions of civil servants, and (most distressing of all, by some measure) the random mischief-making of government communications demons.

The ears of the National Audit Office may burn (she dubs their “help” a foolish and expensive joke); most everyone else emerges from this tale with due credit and generous thanks. Indeed, Bingham’s candid account will be uncomfortable reading for those who nurse a dogmatic hostility to “insiders”. Bingham’s husband is Jesse Norman MP, then the financial secretary to the Treasury. Bingham has no difficulty demonstrating that this has nothing to do with anything, but that didn’t stop the Guardian’s jibes, its “chumocracy” tables and the rest.

The trouble is, given a problem as complex and fluid as democratic government, expertise is only authoritative in relation to some particular subject. Without insiders — those with firsthand knowledge in a particular affair or circumstance (like, yes, the PM’s adviser Dominic Cummings, who insisted the VTF should be handled as a business) — experts are merely preaching in the wilderness. On the other hand, without experts to inform them (people like Sir Patrick Vallance, the government’s chief scientific adviser) insiders have nothing to offer but windbaggery.

Knowledge and power must meet. Bingham and Hames’s accessible, edge-of-the-seat account of how British innovators vaccinated the UK and much of the rest of the world is also a quiet, compelling, non-partisan argument for dialogue between business and politics.

Being kind to the blarney

Reading Yuval Noah Harari’s How Humans Took Over the World (Unstoppable Us 1) for the Telegraph, 9 October 2022

“And just think how sad the last mammoth must have been all on her own,” writes Yuval Noah Harari, as he invites his pre-teen audience to contemplate our species’ long track record in wiping out countless varieties of big animal (giant flightless birds; elephantine sloths; the list is long).

Harari is spreading his young person’s history of humankind across four illustrated volumes. This is the first, and describes how we managed to exterminate our way to planetary dominance (so a certain mawkishness is allowable). Following the Bauplan of Harari’s 2011 adult bestseller Sapiens: A Brief History of Humankind, we can reasonably look forward to a triumphalist futuristic fourth volume in which all the monsters that terrified and/or fed our ancestors are resurrected by our CRISPR-wielding post-human descendants and released into some sort of 3D-printed world zoo.

Why survey such a vast sweep of evolutionary history through the keyhole of “what we really know”? Why not say what you’ve got to say, and leave the error-correction to your young reader’s own curiosity and further reading? When I was a kid, Patrick Moore was still writing about Venus’s already rather unlikely world ocean. I was inspired by such unchained speculation, and I don’t think I sustained any lasting intellectual harm.

But we are where we are: the world is haunted by the spectre of untruth and is besotted with the wisdom of crowds, and Harari is at pains in his afterword to point out how carefully staff at Penguin and his own “social impact company” Sapienship weighed every sentence and every illustration, lest it might misrepresent something or “hurt people”.

As a consequence, How Humans Took Over the World is, for all its many strengths, one of the least odd books I have ever read.

How Humans Took Over the World is an easy-to-read epic that sets out to be scrupulously truthful about what we do and do not know about the past. In simple, direct terms, Harari explains that we’re the only species that believes stories; stories enabled cooperation; and cooperation made it possible for us to smother and consume large amounts of the planet.

Harari’s ebullience as a storyteller is infectious. No sooner does he dry his eyes over the fate of the mammoth than he is gleefully explaining how easy they were to get rid of. (With that long a gestation period, and that small a herd, you only had to kill a couple of mammoths a year to wipe them out.)

Harari concludes that we’re not a very nice species. This is risky, if only because self-hate is cheap and saves us the trouble of doing anything or changing anything about ourselves. The gloomy shade of Jean-Jacques Rousseau hovers over Harari’s dismissal of religion as a means by which powerful hominins cozened an unfair quantity of bananas from their weaker brethren. The idea that religion might be humanity’s millennia-old effort to tell uplifting stories about itself, all in the teeth of cosmic meaninglessness and the inevitability of death, gets no look-in here, though Harari still spends an inordinate amount of time being kind to the blarney and tosh spun by animists and shamans, those snake-oil salesmen of yore.

Harari’s setting us up for a thunderous and inspiring last chapter, in which we see Homo sapiens poised to use its storytelling superpower to more constructive effect.

If thousands of people believe in the same story, then they’ll all follow the same rules, and this is why we rule the world (“whereas poor chimps are locked up in zoos”).

To save the world we have been so busy consuming, we need to come up with a story about ourselves that’s better than the ones we’ve told each other in the past (or, to be less judgemental about it, a story that’s better suited to our planet’s present).

“If you can invent a good story that enough people believe,” Harari writes, “you can conquer the world.”

It is unlikely that this will be written by Harari. Though he’s packaged as a seer, there’s little in his work that is truly surprising or sui generis.

His chief skill — displayed here even more remarkably than in his work for adults — is his ability to spin complex material into a rollicking tale while still telling the truth.

Dreams of a fresh crab supper

Reading David Peña-Guzmán’s When Animals Dream for New Scientist, 17 August 2022

Heidi the octopus is dreaming. As she sleeps, her skin changes from smooth and white to flashing yellow and orange, to deepest purple, to a series of light greys and yellows, criss-crossed by ridges and spiky horns. Heidi’s human carer David Scheel has seen this pattern before in waking octopuses: Heidi, he says, is dreaming of catching and eating a crab.

The story of Heidi’s dream, screened in 2019 in the documentary “Octopuses: Making Contact”, provides the starting point for When Animals Dream, an exploration of non-human imaginations by David Peña-Guzmán, a philosopher at San Francisco State University.

The Roman philosopher-poet Lucretius thought animals dreamt. So did Charles Darwin. The idea only lost its respectability for about a century, roughly from 1880 to 1980, when the reflex was king and behaviourism ruled the psychology laboratory.

In the classical conditioning developed by Ivan Pavlov, it is possible to argue that your trained salivation to the sound of a bell is “just a reflex”. But later studies in this mould never really banished the interior, imaginative lives of animals. These later studies relied on a different kind of conditioning, called “operant conditioning”, in which you behave in a certain way before you receive a reward or avoid a punishment. The experimenter can claim all they want that the trained rat is “conditioned”; still, that rat running through its maze is acting for all the world as though it expects something.

In fact, there’s no “as though” about it. Peña-Guzmán, in a book rich in laboratory and experimental detail, describes how rats, during their exploration of a maze, will dream up imaginary mazes and imaginary rewards — all as revealed by distinctive activity in their hippocampuses.

Clinical proofs that animals have imaginations are intriguing enough, but what really dragged the study of animal dreaming back into the light was our better understanding of how humans dream.

From the 1950s to the 1970s we were constantly being assured that our dreams were mere random activity in the pons (the part of the brainstem that connects the medulla to the midbrain). But we’ve since learned that dreaming involves many more brain areas, including the parietal lobes (involved in the representation of physical spaces) and frontal lobes (responsible among other things for emotional regulation).

At this point, the sight of a dog dreaming of chasing a ball became altogether too provocative to discount. The dog’s movements while dreaming mirror its waking behaviours too closely for us to say that they lack any significance.

Which animals dream? Peña-Guzmán’s list is too long to quote in its entirety. There are mice, dogs and platypuses, beluga whales and ostriches, penguins, chameleons and iguanas, cuttlefish and octopuses — “the jury is still out on crocodiles and turtles.”

The brain structures of these animals may be nothing like our own; nonetheless, studies of sleeping brains throw up startling commonalities, suggesting, perhaps, that dreaming is a talent on which many different branches of the evolutionary tree have converged.

Peña-Guzmán poses big questions. When did dreaming first emerge and why? By what paths did it find its way into so many branches of the evolutionary tree? And — surely the biggest question of all — what are we to do with this information?

Peña-Guzmán says dreams are morally significant “because they reveal animals to be both carriers and sources of moral value, which is to say, beings who matter and for whom things matter.”

In short, dreams imply the existence of a self. And whether or not that self can think rationally, act voluntarily, or produce linguistic reports, just like a human, is neither here nor there. The fact is, animals that dream “have a phenomenally charged experience of the world… they sense, feel and perceive.”

Starting from the unlikely-sounding assertion that Heidi the octopus dreams of fresh crab suppers, Peña-Guzmán assembles a short, powerful, closely argued and hugely well-evidenced case for animal personhood. This book will change minds.

 

 

Some rude remarks about Aberdeen

Reading Sarah Chaney’s Am I Normal? for New Scientist, 10 August 2022

In the collections of University College London there is a pair of gloves belonging to the nineteenth-century polymath Francis Galton. Galton’s motto was “Whenever you can, count”. The left glove has a pin in the thumb and a pad of felt across the fingers. Placing a strip of paper over the felt, Galton could then, by touching different fingers with the pin, keep track of what he saw without anyone noticing. A beautiful female, passing him by, was registered on one finger: her plain companion was registered on another. With these tallies, Galton thought he might in time be able to assemble a beauty map of Great Britain. The project foundered, though not before Galton had committed to paper some rude remarks about Aberdeen.

Galton’s beauty map is easy to throw rocks at. Had he completed it, it would have been not so much a map of British physiognomic variation, as a record of his own tastes, prejudices and shifting predilections during a long journey.

But as Sarah Chaney’s book makes clear, when it comes to the human body, the human mind, and human society, there can be no such thing as an altogether objective study. There is no moral or existential “outside” from which to begin such a study. The effort to gain such a perspective is worthwhile, but the best studies will always need reinterpreting for new audiences and next generations.

Am I Normal? gives often very uncomfortable social and political context to the historical effort to identify norms of human physiology, behaviour and social interaction. Study after study is shown to be hopelessly tied to its historical moment. (The less said about “drapetomania”, the putative mental illness discovered among runaway slaves, the better.)

And it would be the easiest job in the world, and the cheapest, to wield these horrors as blunt weapons to tear down both medicine and the social sciences. It is true that in some areas, measurement has elicited surprisingly little insight — witness the relative lack of progress made in the last century in the field of mental health. But while conditions like schizophrenia are real, and ruinous, do we really want to give up our effort at understanding?

It is certainly true that we have paid not nearly enough attention, at least until recently, to where our data was coming from. Research has to begin somewhere, of course, but should we really still be basing so much of our medicine, our social policy and even our design decisions on data drawn (and sometimes a very long time ago) from people in Western, educated, industrialised, rich and democratic (WEIRD) societies?

Chaney shows how studies that sought human norms can just as easily detect diversity. All it needs is a little humility, a little imagination, and an underlying awareness that in these fields, the truth does not stay still.