Reality trumped

Reading You Are Here: A field guide for navigating polarized speech, conspiracy theories, and our polluted media landscape by Whitney Phillips and Ryan M. Milner (MIT Press)
for New Scientist, 3 March 2021

This is a book about pollution, not of the physical environment, but of our civic discourse. It is about disinformation (false and misleading information deliberately spread), misinformation (false and misleading information inadvertently spread), and malinformation (information with a basis in reality spread pointedly and specifically to cause harm).

Communications experts Whitney Phillips and Ryan M. Milner completed their book just prior to the US presidential election that replaced Donald Trump with Joe Biden. That election, and the seditious activities that prompted Trump’s second impeachment, have clarified many of the issues Phillips and Milner have gone to such pains to explore. Though events have stolen some of their thunder, You Are Here remains an invaluable snapshot of our current social and technological problems around news, truth and fact.

The authors’ US-centric (but universally applicable) account of “fake news” begins with the rise of the Ku Klux Klan. Its deliberately silly name, cartoonish robes, and quaint routines (which accompanied all its activities, from rallies to lynchings) prefigured the “only-joking” subcultures (Pepe the Frog and the like) dominating so much of our contemporary social media. Next, an examination of the Satanic panics of the 1980s reveals much about the birth and growth of conspiracy theories. The authors’ last act is an unpicking of QAnon — a current far-right conspiracy theory alleging that a secret cabal of cannibalistic Satan-worshippers plotted against former US president Donald Trump. This brings the threads of their argument together in a conclusion all the more apocalyptic for being so closely argued.

Polluted information is, they argue, our latest public health emergency. By treating the information sphere as an ecology under threat, the authors push past factionalism to reveal how, when we use media, “the everyday actions of everyone else feed into and are reinforced by the worst actions of the worst actors”.

This is their most striking takeaway: that the media machine that enabled QAnon isn’t a machine out of alignment, or out of control, or somehow infected: it’s a system working exactly as designed — “a system that damages so much because it works so well”.

This media machine is founded on principles that, in and of themselves, seem only laudable. Top of the list is the idea that to counter harms, we have to call attention to them: “in other words, that light disinfects”.

This is a grand philosophy, for so long as light is hard to generate. But what happens when the light — the confluence of competing information sets, depicting competing realities — becomes blinding?

Take Google as an example. Google is an advertising platform that makes money the more its users use the internet to “get to the bottom of things”. The deeper the rabbit-holes go, the more money Google makes. This sets up a powerful incentive for “conspiracy entrepreneurs” to produce content, creating “alternative media echo-systems”. When the facts run out, create alternative facts. “The algorithm” (if you’ll forgive this reviewer’s dicey shorthand) doesn’t care. “The algorithm” is, in fact, designed to serve up as much pollution as possible.

What’s to be done? Here the authors hit a quite sizeable snag. They claim they’re not asking “for people to ‘remain civil’”. They claim they’re not commanding us, “don’t feed the trolls.” But so far as I could see, this is exactly what they’re saying — and good for them.

With the machismo typical of the social sciences, the authors call for “foundational, systematic, top-to-bottom change,” whatever that is supposed to mean, when what they are actually advocating is a sense of personal decency, a contempt for anonymity, a willingness to stand by what one says come hell or high water, politeness and consideration, and a willingness to listen.

These are not political ideas. These are qualities of character. One might even call them virtues, of a sort that were once particularly prized by conservatives.

Phillips and Milner bemoan the way market capitalism has swallowed political discourse. They teeter on a much more important truth: that politics has swallowed our moral discourse. Social media has made whining cowards of us all. You Are Here comes dangerously close to saying so. If you listen carefully, there’s a still, small voice hidden in this book, telling us all to grow up.

A Faustian bargain, freely made

Reading The Rare Metals War by Guillaume Pitron for New Scientist, 27 January 2021

We reap seven times as much energy from the wind, and 44 times as much energy from the sun, as we did just a decade ago. Is this good news? Guillaume Pitron, a journalist and documentary-maker for French television, is not sure.

He’s neither a climate sceptic, nor a fan of inaction. But as the world begins to adopt a common target of net-zero carbon emissions by 2050, Pitron worries that we’re becoming selectively blind to the costs that effort will incur. His figures are stark. Changing our energy model means doubling rare metal production approximately every fifteen years, mostly to satisfy our demand for non-ferrous magnets and lithium-ion batteries. “At this rate,” says Pitron, “over the next thirty years we will need to mine more mineral ores than humans have extracted over the last 70,000 years.”
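A back-of-envelope sketch (my own, not Pitron’s, and resting on the crude assumption that extraction has always grown at today’s exponential rate) shows why such comparisons are plausible. If production doubles every $T = 15$ years, then

$$P(t) = P_0\,2^{t/T},\qquad \int_{-\infty}^{0} P(t)\,dt = \frac{P_0 T}{\ln 2} \approx 21.6\,P_0,\qquad \int_{0}^{30} P(t)\,dt = \frac{P_0 T}{\ln 2}\left(2^{2}-1\right) \approx 64.9\,P_0.$$

The next thirty years demand roughly three times everything the exponential past ever produced: compound growth front-loads everything into the present.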

Before the Renaissance, humans had found a use for just seven metals. Over the course of the industrial revolution, this number increased to just a dozen. Today, we’ve found uses for all 86 of them, and some of them are very rare indeed. For instance, neodymium and gallium are found in iron ore, but there’s 1,200 times less neodymium and up to 2,650 times less gallium than there is iron.

Zipping from an abandoned Mountain Pass mine in the Mojave Desert to the toxic lakes and cancer villages of Baotou in China, Pitron weighs the terrible price paid for refining such materials, ably blending his investigative journalism with insights from science, politics and business.

There are two sides to Pitron’s story, woven seamlessly together. First there’s the economic story, of how the Chinese government elected to dominate the global energy and digital transition, so that it now controls 95 per cent of the rare metals market, manufacturing between 80 and 90 per cent of the batteries for electric vehicles, and over half the magnets used in wind turbines and electric motors.

Then there’s the ecological story in which, to ensure success, China took on the West’s own ecological burden. Now 10 per cent of its arable land is contaminated by heavy metals, 80 per cent of its ground water is unfit for consumption and 1.6 million people die every year due to air pollution alone (a recent paper in The Lancet reckons only 1.24 million people die each year — but let’s not quibble).

China’s was a Faustian bargain, freely entered into, but it would not have been possible had Europe and the rest of the Western world not outsourced their own industrial activities, creating a world divided, as Pitron memorably describes it, “between the dirty and those who pretend to be clean”.

The West’s economic comeuppance is now at hand, as its manufacturers, starved of the rare metals they need, are coerced into taking their technologies to China. And we in the West really should have seen this coming: how our reliance on Chinese raw materials would quickly morph into a reliance on China for the very technologies of the energy and digital transition. (Pitron tells us that without magnets produced by China’s ChengDu Magnetic Material Science & Technology Company, the United States’ F-35 fifth-generation stealth fighter cannot fly.)

By 2040, in our pursuit of ever-greater connectivity and a cleaner atmosphere, we will need to mine three times more rare earths, five times more tellurium, twelve times more cobalt, and sixteen times more lithium than we do today. China’s ecological ruination and its global technological dominance advance in lockstep, unstoppably — unless we start mining for rare metals ourselves — in the United States, Brazil, Russia, South Africa, Thailand, Turkey, and in the “dormant mining giant” of Pitron’s native France.

Better, says Pitron, that we attain some small shred of supply security, and start mining our own land. At least if mining takes place in the backyards of vocal First World consumers, they can agitate for (and pay for) cleaner processes. And nothing will change “so long as we do not experience, in our own backyards, the full cost of attaining our standard of happiness.”

Seventy minutes of concrete

Watching Last and First Men (2020) directed by Jóhann Jóhannsson for New Scientist

“It’s a big ask for people to sit for 70 minutes and look at concrete,” mused the Icelandic composer Jóhann Jóhannsson, about his first and only feature-length film. He was still working on Last and First Men at the time of his death, aged 48, in February 2018.

Admired in the concert hall for his subtle, keening orchestral pieces, Jóhann Jóhannsson was well known for his film work: Prisoners (2013) and Sicario (2015) are made strange by his sometimes terrifying, thumping soundtracks. Arrival (2016) — about the visitation of aliens whose experience of time proves radically different to our own — inspired a yearning, melancholy score that is, in retrospect, a kind of blockbuster-friendly version of Last and First Men. (It’s worth noting that all three films were directed by Denis Villeneuve, himself no stranger to the aesthetics of concrete — witness 2017’s Blade Runner 2049.)

Jóhannsson’s Last and First Men is, by contrast, contemplative and surreal. It’s no blockbuster. A series of zooms and tracking shots against eerie architectural forms, mesmerisingly shot in monochrome 16mm by Norwegian cinematographer Sturla Brandth Grøvlen, it draws its inspiration and its script (a haunting, melancholy, sometimes chilly off-screen monologue performed by Tilda Swinton) from the 1930 novel by British philosopher William Olaf Stapledon.

Stapledon’s day job — lecturing on politics and ethics at the University of Liverpool — seems now of little moment, but his science fiction novels have never been out of print, and continue to set a dauntingly high bar for successors. Last and First Men is a history of the solar system across two billion years, detailing the dreams and aspirations, achievements and failings of 17 different kinds of future Homo (not including sapiens).

In the light of our ageing sun, these creatures evolve, blossom, speciate, and die, and it’s in the final chapters, and the melancholy moment of humanity’s ultimate extinction, that Jóhannsson’s film is set. Last and First Men is not a drama. There are no actors. There is no action. Mind you, it’s hard to see how any attempt to film Stapledon’s future history could work otherwise. It’s not really a novel; more a haunting academic paper from the beyond.

The idea to use passages from the book came quite late in Jóhannsson’s project, which began life as a film essay on (and this is where the concrete comes in) the huge, brutalist war memorials, called spomeniks, erected across the former Yugoslavia between the 1960s and the 1980s.

“Spomeniks were commissioned by Marshal Tito, the dictator and creator of Yugoslavia,” Jóhannsson explained in 2017 when the film, accompanied by a live rendition of an early score, was screened at the Manchester International Festival. “Tito constructed this artificial state, a Utopian experiment uniting the Slavic nations, with so many differences of religion. The spomeniks were intended as symbols of unification. The architects couldn’t use religious iconography, so instead, they looked to prehistoric, Mayan and Sumerian art. That’s why they look so alien and otherworldly.”

Swinton’s cool, regretful monologue proves an ideal foil for the film’s architectural explorations, lifting what would otherwise be a stunning but slight art piece into dizzying, speculative territory: the last living human, contemplating the leavings of two billion years of human history.

The film was left unfinished at Jóhannsson’s death; it took his friend, the Berlin-based composer and sound artist Yair Elazar Glotman, about a year to realise Jóhannsson’s scattered and chaotic notes. No-one, hearing the story of how Last and First Men was put together, would imagine it would ever amount to anything more than a tribute piece to the composer.

Sometimes, though, the gods are kind. This is a hugely successful science fiction film, wholly deserving of a place beside Tarkovsky’s Solaris and Kubrick’s 2001. Who knew that staring at concrete, and listening to the end of humanity, could wet the watcher’s eye, and break their heart?

It is a terrible shame that Jóhannsson did not live to see his hope fulfilled; that, in his own words, “we’ve taken all these elements and made something beautiful and poignant. Something like a requiem.”

More than the naughty world deserves

Reading Wikipedia @ 20, edited by Joseph Reagle and Jackie Koerner, for the Telegraph, 10 January 2021

In 2005 the US talk show host and comedian Stephen Colbert coined “truthiness”, one of history’s more chilling buzzwords. “We’re not talking about truth,” he declared, “we’re talking about something that seems like truth — the truth we want to exist.”

Colbert thought the poster-boy for our disinformation culture would be Wikipedia, the open-source internet encyclopedia started, more or less as an afterthought, by Jimmy Wales and Larry Sanger in 2001. If George Washington’s ownership of slaves troubles you, Colbert suggested “bringing democracy to knowledge” by editing his Wikipedia page.

Over a decade later, the Atlantic was calling Wikipedia “the last bastion of shared reality in Trump’s America”. Yes, its coverage is lumpy, idiosyncratic, often persnickety, and not terribly well written. But it’s accurate to a fault, extensive beyond all imagining, and energetically policed. (Wikipedia nixes toxic user content within minutes. Why can’t YouTube? Why can’t Twitter?)

Editors Joseph Reagle and Jackie Koerner — both energetic Wikipedians — know better than to go hunting for Wikipedia’s secret sauce. (A community adage goes that Wikipedia always works better in practice than in theory.) They neither praise nor blame Wikipedia for what it has become, but — and this comes across very strongly indeed — they love it with a passion. The essays they have selected for this volume (you can find the full roster of contributions on-line) reflect, always readably and almost always sympathetically, on the way this utopian project has bedded down in the flaws of the real world.

Wikipedia says it exists “to benefit readers by acting as an encyclopedia, a comprehensive written compendium that contains information on all branches of knowledge”. Improvements are possible. Wikipedia is shaped by the way its unvetted contributors write about what they know and delete what they do not. That women represent only about 12 per cent of the editing community is, then, not ideal.

Harder to correct is the wrinkle occasioned by language. Wikipedias written in different languages are independent of each other. There might not be anything actually wrong, but there’s certainly something screwy about the way India, Australia, the US and the UK and all the rest of the Anglophone world share a single English-language Wikipedia, while only the Finns get to enjoy the Finnish one. And it says something (obvious) about the unevenness of global development that Hindi speakers (the third largest language group in the world) read a Wikipedia that’s 53rd in a ranking of size.

To encyclopedify the world is an impossible goal. Surely the philosophes of eighteenth-century France knew that much when they embarked on their Encyclopédie. Paul Otlet’s Universal Repertory and H. G. Wells’s World Brain were similarly quixotic.

Attempting to define Wikipedia through its intellectual lineage may, however, be to miss the point. In his stand-out essay “Wikipedia As A Role-Playing Game” Dariusz Jemielniak (author of the first ethnography of Wikipedia, Common Knowledge?, in 2014) stresses the playfulness of the whole enterprise. Why else, he asks, would academics avoid it? “When you are a soldier, you do not necessarily spend your free time playing paintball with friends.”

Since its inception, pundits have assumed that it’s Wikipedia’s reliance on the great mass of unwashed humanity — sorry, I mean “user-generated content” — that will destroy it. Contributor Heather Ford, a South African open source activist, reckons it’s not its creators that will eventually ruin Wikipedia but its readers — specifically, data aggregation giants like Google, Amazon and Apple, who fillet Wikipedia content and disseminate it through search engines, browsers like Chrome, and personal assistants like Alexa and Siri. They have turned Wikipedia into the internet’s go-to source of ground truth, inflating its importance to an unsustainable level.

Wikipedia’s entries are now like swords of Damocles, suspended on threads over the heads of every major commercial and political actor in the world. How long before the powerful find a way to silence this capering non-profit fool, telling motley truths to power? As Jemielniak puts it, “A serious game that results in creating the most popular reliable knowledge source in the world and disrupts existing knowledge hierarchies and authority, all in the time of massive anti-academic attacks — what is there not to hate?”

Though one’s dislike of Wikipedia needn’t spring from principles or ideas or even self-interest. Plain snobbery will do. Wikipedia has pricked the pretensions of the humanities like no other cultural project. Editor Joseph Reagle discovered as much ten years ago in email conversation with founder Jimmy Wales (a conversation that appears in Good Faith Collaboration, Reagle’s excellent, if by now slightly dated, study of Wikipedia). “One of the things that I noticed,” Wales wrote, “is that in the humanities, a lot of people were collaborating in discussions, while in programming… people weren’t just talking about programming, they were working together to build things of value.”

This, I think, is what sticks in the craw of so many educated naysayers: that while academics were busy paying each other for the eccentricity of their beautiful opinions, nerds were out in the world winning the culture wars; that nerds stand ready on the virtual parapet to defend us from truthy, Trumpist oblivion; that nerds actually kept the promise held out by the internet, and turned Wikipedia into the fifth biggest site on the Web.

Wikipedia’s guidelines to its editors include “Assume Good Faith” and “Please Do Not Bite the Newcomers.” This collection suggests to me that this is more than the naughty world deserves.

An engine for understanding

Reading Fundamentals by Frank Wilczek for the Times, 2 January 2021

It’s not given to many of us to work at the bleeding edge of theoretical physics, discovering for ourselves the way the world really works.

The nearest most of us will ever get is the pop-science shelf, and this has been dominated for quite a while now by the lyrical outpourings of Italian theoretical physicist Carlo Rovelli. Rovelli’s upcoming one, Helgoland, promises to have his reader tearing across a universe made, not of particles, but of the relations between them.

It’s all too late, however: Frank Wilczek’s Fundamentals has gazumped Rovelli handsomely, with a vision that replaces our classical idea of physical creation — “atoms and the void” — with one consisting entirely of spacetime, self-propagating fields and properties.

Born in 1951 and awarded the Nobel Prize in Physics in 2004 for figuring out why atoms don’t just fly apart, Wilczek is out to explain why “the history of Sweden is more complicated than the history of the universe”. The ingredients of the universe are surprisingly simple, but their fates, playing out through time in accordance with just a handful of rules, generate a world of unimaginable complexity, contingency and abundance. Measures of spin, charge and mass allow us to describe the whole of physical reality, but they won’t help us at all in depicting, say, the history of the royal house of Bernadotte.

Wilczek’s “ten keys to reality”, mentioned in his subtitle, aren’t to do with the 19 or so physical constants that exercised Martin Rees, the UK’s Astronomer Royal, in his 1990s pop-science heyday. The focus these days has shifted more to the spirit of things. When Wilczek describes the behaviour of electrons around an atom, for example, gone are the usual Bohr-ish mechanics, in which electrons leap from one nuclear orbit to another. Instead we get a vibrating cymbal, the music of the spheres, a poetic understanding of fields, and not a fragment of matter in sight.

So will you plump for the Wilczek, or will you wait for the Rovelli? A false choice, of course; this is not a race. Popular cosmology is more like the jazz scene: the facts (figures, constants, models) are the standards everyone riffs off. After one or two exposures you find yourself returning for the individual performances: their poetry, their unique expression.

Wilczek’s ten keys are more like ten book ideas, exploring the spatial and temporal abundance of the universe; how it all began; the stubborn linearity of time; how it all will end. What should we make of his decision to have us swallow the whole of creation in one go?

In one respect this book was inevitable. It’s what people of Wilczek’s peculiar genius and standing do. There’s even a sly name for the effort: the philosopause. The implication here being that Wilczek has outlived his most productive years and is now pursuing philosophical speculations.

Wilczek is not short of insights. His idea of what the scientific method consists of is refreshingly robust: a style of thinking that “combines the humble discipline of respecting the facts and learning from Nature with the systematic chutzpah of using what you think you’ve learned aggressively”. If you apply what you think you’ve discovered everywhere you can, even in situations that have nothing to do with your starting point, then, if it works, “you’ve discovered something useful; if it doesn’t, then you’ve learned something important.”

However, works of the philosopause are best judged on character. Richard Dawkins seems to have discovered, along with Johnny Rotten, that anger is an energy. Martin Rees has been possessed by the shade of that dutiful bureaucrat C P Snow. And in this case? Wilczek, so modest, so straight-dealing, so earnest in his desire to conciliate between science and the rest of culture, turns out to be a true visionary, writing — as his book gathers pace — a human testament to the moment when the discipline of physics, as we used to understand it, came to a stop.

Wilczek’s is the first generation whose intelligence — even at the far end of the bell-curve inhabited by genius — is insufficient to conceptualise its own scientific findings. Machines are even now taking over the work of hypothesis-making and interpretation. “The abilities of our machines to carry lengthy yet accurate calculations, to store massive amounts of information, and to learn by doing at an extremely fast pace,” Wilczek explains, “are already opening up qualitatively new paths toward understanding. They will move the frontier of knowledge in directions, and arrive at places, that unaided human brains can’t go.”

Or put it this way: physicists can pursue a Theory of Everything all they like. They’ll never find it, because if they did find it, they wouldn’t understand it.

Where does that leave physics? Where does that leave Wilczek? His response is gloriously matter-of-fact:

“… really, this should not come as fresh news. Humans themselves know many things that are not available to human consciousness, such as how to process visual information at incredible speeds, or how to make their bodies stay upright, walk and run.”

Right now physicists have come to the conclusion that the vast majority of mass in the universe reacts so weakly to the bits of creation we can see, we may never know its nature. Though Wilczek makes a brave stab at the problem of so-called “dark matter”, he is equally prepared to accept that a true explanation may prove incomprehensible.

Human intelligence turns out to be just one kind of engine for understanding. Wilczek would have us nurture it and savour it, and not just for what it can do, but because it is uniquely ours.

The seeds of indisposition

Reading Ageless by Andrew Steele for the Telegraph, 20 December 2020

The first successful blood transfusions were performed in 1665, by the English physician Richard Lower, on dogs. The idea, for some while, was not that transfusions would save lives, but that they might extend them.

Turns out they did. The Philosophical Transactions of the Royal Society mentions an experiment in which “an old mongrel curr, all over-run with the mainge” was transfused with about fifteen ounces of blood from a young spaniel and was “perfectly cured.”

Aleksandr Bogdanov, who once vied with Vladimir Lenin for control of the Bolsheviks (before retiring to write science fiction novels) brought blood transfusion to Russia, and hoped to rejuvenate various exhausted colleagues (including Stalin) by the method. On 24 March 1928 he mutually transfused blood with a 21-year-old student, suffered a massive transfusion reaction, and died, two weeks later, at the age of fifty-four.

Bogdanov’s theory was stronger than his practice. His essay on ageing speaks a lot of sense. “Partial methods against it are only palliative,” he wrote, “they merely address individual symptoms, but do not help fight the underlying illness itself.” For Bogdanov, ageing is an illness — unavoidable, universal, but no more “normal” or “natural” than any other illness. By that logic, ageing should be no more invulnerable to human ingenuity and science than any other malady. It should, in theory, be curable.

Andrew Steele agrees. Steele is an Oxford physicist who switched to computational biology, drawn by the field of biogerontology — or the search for a cure for ageing. “Treating ageing itself rather than individual diseases would be transformative,” he writes, and the data he brings to this argument is quite shocking. It turns out that curing cancer would add less than three years to a person’s typical life expectancy, and curing heart disease, barely two, as there are plenty of other diseases waiting in the wings.

Is ageing, then, simply a statistical inevitability — a case of there always being something out there that’s going to get us?

Well, no. In 1825 Benjamin Gompertz, a British mathematician, explained that there are two distinct drivers of human mortality. There are extrinsic events, such as injuries or diseases. But there’s also an internal deterioration — what he called “the seeds of indisposition”.
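Gompertz’s split survives more or less intact in what actuaries now call the Gompertz–Makeham law (the equation is my gloss, not a quotation from Steele):

$$\mu(x) = A + B\,e^{\gamma x},$$

where $\mu(x)$ is the risk of death at age $x$, the constant $A$ covers the extrinsic events, and the exponential term captures the seeds of indisposition. In humans that second term doubles roughly every eight years, which is why no amount of accident-prevention alone can buy us indefinite life.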

It’s Steele’s job here to explain why we should treat those “seeds” as a disease, rather than a divinely determined limit. In the course of that explanation Steele gives us, in effect, a tour of the whole of human biology. It’s an exhilarating journey, but by no means always a pretty one: a tale of senescent cells, misfolded proteins, intracellular waste and reactive metals. Readers of advanced years, wondering why their skin is turning yellow, will learn much more here than they bargained for.

Ageing isn’t evolutionarily useful; but because it comes after our breeding period, evolution just hasn’t got the power to do anything about it. Mutations whose negative effects occur late in our lives accumulate in the gene pool. Worse, if they had a positive effect on our lives early on, then they will be actively selected for. Ageing, in other words, is something we inherit.

It’s all very well conceptualising old age as one disease. But if your disease amounts to “what happens to a human body when 525 million years of evolution stop working”, then you’re reduced to curing everything that can possibly go wrong, with every system, at once. Ageing, it turns out, is just thousands upon thousands of “individual symptoms”, arriving all at once.

Steele believes the more we know about human biology, the more likely it is we’ll find systemic ways to treat these multiple symptoms. The challenge is huge, but the advances, as Steele describes them, are real and rapid. If, for example, we can persuade senescent cells to die, then we can shed the toxic biochemical garbage they accumulate, and enjoy once more all the benefits of (among other things) young blood. This is no fond hope: human trials of senolytics started in 2018.

Steele is a superb guide to the wilder fringes of real medicine. He pretends to nothing else, and nothing more. So whether you find Ageless an incredibly focused account, or just an incredibly narrow one, will come down, in the end, to personal taste.

Steele shows us what happens to us biologically as we get older — which of course leaves a lot of blank canvas for the thoughtful reader to fill. Steele’s forebears in this (frankly, not too edifying) genre have all too often claimed that there are no other issues to tackle. In the 1930s the surgeon Alexis Carrel declared that “Scientific civilization has destroyed the world of the soul… Only the strength of youth gives the power to satisfy physiological appetites and to conquer the outer world”.

Charming.

And he wasn’t the only one. Books like Successful Aging (Rowe & Kahn, 1998) and How and Why We Age (Hayflick, 1996) aspire to a sort of overweening authority, not by answering hard questions about mortality, long life and ageing, but merely by denying a gerontological role for anyone outside their narrow specialism: philosophers, historians, theologians, ethicists, poets — all are shown the door.

Steele is much more sensible. He simply sticks to his subject. To the extent that he expresses a view, I am confident that he understands that ageing is an experience to be lived meaningfully and fully, as well as a fascinating medical problem to be solved.

Steele’s vision is very tightly controlled: he wants us to achieve “negligible senescence”, in which, as we grow older, we suffer no obvious impairments. What he’s after is a risk of death that stays constant no matter how old we get. This sounds fanciful, but it does happen in nature. Giant tortoises succumb to statistical inevitability, not decrepitude.
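In the terms of the Gompertz–Makeham equation sketched earlier, negligible senescence means switching off the exponential term, leaving a constant hazard $\mu$ and a survival curve that decays exponentially but never falls off a cliff:

$$S(t) = e^{-\mu t}, \qquad t_{1/2} = \frac{\ln 2}{\mu}.$$

At a constant 1 per cent annual risk of death, half of any cohort — whatever its age — would still be alive 69 years on. (The figures are my illustration, not Steele’s.)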

I have a fairly entrenched problem with books that treat ageing as a merely medical phenomenon. But I heartily recommend this one. It’s modest in scope, and generous in detail. It’s an honest and optimistic contribution to a field that tips very easily indeed into Tony Stark-style boosterism.

Life expectancy in the developed world has doubled from 40 in the 1800s to over 80 today. But it is in our nature to be always craving for more. One colourful outfit called Ambrosia is offering anyone over 35 the opportunity to receive a litre of youthful blood plasma for $8000. Steele has some fun with this: “At the time of writing,” he tells us, “a promotional offer also allows you to get two for $12000 — buy one, get one half-price.”

Soaked in ink and paint

Reading Dutch Light: Christiaan Huygens and the making of science in Europe
by Hugh Aldersey-Williams for the Spectator, 19 December 2020

This book, soaked, like the Dutch Republic itself, “in ink and paint”, is enchanting to the point of escapism. The author calls it “an interior journey, into a world of luxury and leisure”. It is more than that. What he says of Huygens’s milieu is true also of his book: “Like a ‘Dutch interior’ painting, it turns out to contain everything.”

Hugh Aldersey-Williams says that Huygens was the first modern scientist. This is a delicate argument to make — the word “scientist” didn’t enter the English language before 1834. And he’s right to be sparing with such rhetoric, since a little of it goes a very long way. What inadvertent baggage comes attached, for instance, to the (not unreasonable) claim that the city of Middelburg, supported by the market for spectacles, became “a hotbed of optical innovation” at the end of the 16th century? As I read about the collaboration between Christiaan’s father Constantijn (“with his trim dark beard and sharp features”) and his lens-grinder Cornelis Drebbel (“strapping, ill-read… careless of social hierarchies”) I kept getting flashbacks to the Steve Jobs and Steve Wozniak double-act in Aaron Sorkin’s Steve Jobs (2015).

This is the problem of popular history, made double by the demands of explaining the science. Secretly, readers want the past to be either deeply exotic (so they don’t have to worry about it) or fundamentally familiar (so they, um, don’t have to worry about it).

Hugh Aldersey-Williams steeps us in neither fantasy for too long, and Dutch Light is, as a consequence, an oddly disturbing read: we see our present understanding of the world, and many of our current intellectual habits, emerging through the accidents and contingencies of history, through networks and relationships, friendships and fallings-out. Huygens’s world *is* distinctly modern — disturbingly so: the engine itself, the pipework and pistons, without any of the fancy fairings and decals of liberalism.

Trade begets technology begets science. The truth is out there but it costs money. Genius can only swim so far up the stream of social prejudice. Who your parents are matters.

Under Dutch light — clean, caustic, calvinistic — we see, not Enlightenment Europe emerging into the comforts of the modern, but a mirror in which we moderns are seen squatting a culture, full of flaws, that we’ve never managed to better.

One of the best things about Aldersey-Williams’s absorbing book (and how many 500-page biographies do you know that feel too short when you finish them?) is the interest he shows in everyone else. Christiaan arrives in the right place, at the right time, among the right people, to achieve wonders. His father, born in 1596, was a diplomat, architect, poet (he translated John Donne) and artist (he discovered Rembrandt). His longevity exasperated him: “Cease murderous years, and think no more of me” he wrote, on his 82nd birthday. He lived eight years more. But the space and energy Aldersey-Williams devotes to Constantijn and his four other children — “a network that stretched across Europe” — is anything but exasperating. It immeasurably enriches our idea of what Christiaan’s work meant, and what his achievements signified.

Huygens worked at the meeting point of maths and physics, at a time when some key physical aspects of reality still resisted mathematical description. Curves provide a couple of striking examples. The cycloid is the path made by a point on the circumference of a turning wheel. The catenary is the curve made by a chain or rope hanging under gravity. Huygens was the first to explain these curves mathematically, doing more than most to embed mathematics in the physical sciences. He tackled problems in geometry and probability, and had some fun in the process (“A man of 56 years marries a woman of 16 years, how long can they live together without one or the other dying?”) Using telescopes he designed and made himself, he discovered Saturn’s ring system and its largest moon, Titan. He was the first to describe the concept of centrifugal force. He invented the pendulum clock.
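For the curious, the curves themselves (standard results, not quotations from the book): a wheel of radius $r$ rolling along a line traces the cycloid, and a uniform chain hanging under gravity follows the catenary,

$$x(t) = r(t-\sin t),\quad y(t) = r(1-\cos t) \qquad\text{and}\qquad y(x) = a\cosh\frac{x}{a},$$

where the constant $a$ is set by the chain’s tension and weight. The cycloid, Huygens showed, is also the curve between whose “cheeks” a pendulum swings in perfect time whatever its amplitude — the insight behind his clock.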

Most extraordinary of all, Huygens — though a committed follower of Descartes (who was once a family friend) — came up with a model of light as a wave, wholly consistent with everything then known about the nature of light apart from colour, and streets ahead of the “corpuscular” theory promulgated by Newton, which had light consisting of a stream of tiny particles.

Huygens’s radical conception of light seems even stranger, when you consider that, as much as his conscience would let him, Huygens stayed faithful to Descartes’ vision of physics as a science of bodies in collision. Newton’s work on gravity, relying as it did on an unseen force, felt like a retreat to Huygens — a step towards occultism.

Because we turn our great thinkers into fetishes, we allow only one per generation. Newton has shut out Huygens, as Galileo shut out Kepler. Huygens became an also-ran in Anglo-Saxon eyes; ridiculous busts of Newton, meanwhile, were knocked out to adorn the salons of Britain’s country estates, “available in marble, terracotta and plaster versions to suit all pockets.”

Aldersey-Williams insists that this competition between the elder Huygens and the enfant terrible Newton was never so cheap. Set aside the notorious priority dispute over calculus (Newton’s quarrel, strictly, with Huygens’s one-time pupil Leibniz), and we find the two men in lively and, yes, friendly correspondence. Cooperation and collaboration were on the rise: “Gone,” Aldersey-Williams writes, “is the quickness to feel insulted and take umbrage that characterised so many exchanges — domestic as well as international — in the early days of the French and English academies of science.”

When Henry Oldenburg, the prime mover of the Royal Society, died suddenly in 1677, a link was broken between scientists everywhere, and particularly between Britain and the continent. The 20th century did not forge a culture of international scientific cooperation. It repaired the one Oldenburg and Huygens had built over decades of eager correspondence and clever diplomacy.

What else you got?

Reading Benjamin Labatut’s When We Cease to Understand the World for the Spectator, 14 November 2020

One day someone is going to have to write the definitive study of Wikipedia’s influence on letters. What, after all, are we supposed to make of all these wikinovels? I mean novels that leap from subject to subject, anecdote to anecdote, so that the reader feels as though they are toppling like Alice down a particularly erudite Wikipedia rabbit-hole.

The trouble with writing such a book, in an age of ready internet access, and particularly Wikipedia, is that, however effortless your erudition, no one is any longer going to be particularly impressed by it.

We can all be our own Don DeLillo now; our own W G Sebald. The model for this kind of literary escapade might not even be literary at all; does anyone here remember James Burke’s Connections, a 1978 BBC TV series which took an interdisciplinary approach to the history of science and invention, and demonstrated how various discoveries, scientific achievements, and historical world events were built from one another successively in an interconnected way?

And did anyone notice how I ripped the last 35 words from the show’s Wikipedia entry?

All right, I’m sneering, and I should make clear from the off that When We Cease… is a chilling, gripping, intelligent, deeply humane book. It’s about the limits of human knowledge, and the not-so-very-pleasant premises on which physical reality seems to be built. The author, a Chilean born in Rotterdam in 1980, writes in Spanish. Adrian Nathan West — himself a cracking essayist — fashioned this spiky, pitch-perfect English translation. The book consists, in the main, of four broadly biographical essays. The chemist Fritz Haber finds an industrial means of fixing nitrogen, enabling the revolution in food supply that sustains our world, while also pioneering modern chemical warfare. Karl Schwarzschild imagines the terrible uber-darkness at the heart of a black hole, dies in a toxic first world war and ushers in a thermonuclear second. Alexander Grothendieck is the first of a line of post-war mathematician-paranoiacs convinced they’ve uncovered a universal principle too terrible to discuss in public (and after Oppenheimer, really, who can blame them?). In the longest essay-cum-story, Erwin Schrödinger and Werner Heisenberg slug it out for dominance in a field — quantum physics — increasingly consumed by uncertainty and (as Labatut would have it) dread.

The problem here — if problem it is — is that no connection, in this book of artfully arranged connections, is more than a keypress away from the internet-savvy reader. Wikipedia, twenty years old next year, really has changed our approach to knowledge. There’s nothing aristocratic about erudition now. It is neither a sign of privilege, nor (and this is more disconcerting) is it necessarily a sign of industry. Erudition has become a register, like irony, like sarcasm, like melancholy. It’s become, not the fruit of reading, but a way of perceiving the world.

Literary attempts to harness this great power are sometimes laughable. But this has always been the case for literary innovation. Look at the gothic novel. Fifty-odd years before the peerless masterpiece that is Mary Shelley’s Frankenstein we got Horace Walpole’s The Castle of Otranto, which is jolly silly.

Now, a couple of hundred years after Frankenstein was published, When We Cease to Understand the World dutifully repeats the rumours (almost certainly put about by the local tourist industry) that the alchemist Johann Conrad Dippel, born outside Darmstadt in the original Burg Frankenstein in 1673, wielded an uncanny literary influence over our Mary. This is one of several dozen anecdotes which Labatut marshals to drive home the message that There Are Things In This World That We Are Not Supposed to Know. It’s artfully done, and chilling in its conviction. Modish, too, in the way it interlaces fact and fiction.

It’s also laughable, and for a couple of reasons. First, it seems a bit cheap of Labatut to treat all science and mathematics as one thing. If you want to build a book around the idea of humanity’s hubris, you can’t just point your finger at “boffins”.

The other problem is Labatut’s mixing of fact and fiction. He’s not out to cozen us. But here and there this reviewer was disconcerted enough to check his facts — and where else but on Wikipedia? I’m not saying Labatut used Wikipedia. (His bibliography lists a handful of third-tier sources including, I was amused to see, W G Sebald.) Nor am I saying that using Wikipedia is a bad thing.

I think, though, that we’re going to have to abandon our reflexive admiration for erudition. It’s always been desperately easy to fake. (John Fowles.) And today, thanks in large part to Wikipedia, it’s not beyond the wit of most of us to actually *acquire*.

All right, Benjamin, you’re erudite. We get it. What else you got?

A fanciful belonging

Reading The Official History of Britain: Our story in numbers as told by the Office for National Statistics by Boris Starling with David Bradbury for The Telegraph, 18 October 2020

Next year’s national census may be our last. Opinions are being sought as to whether it makes sense, any longer, for the nation to keep taking its own temperature every ten years. Discussions will begin in 2023. Our betters may conclude that the whole rigmarole is outdated, and that its findings can be gleaned more cheaply and efficiently by other methods.

How the UK’s national census was established, what it achieved, and what it will mean if it’s abandoned, is the subject of The Official History of Britain — a grand title for what is, to be honest, a rather messy book, its facts and figures slathered in weak and irrelevant humour, most of it to do with football, I suppose as an intellectual sugar lump for the proles.

Such condescension is archetypally British; and so too is the gimcrack team assembled to write this book. There is something irresistibly Dad’s Army about the image of David Bradbury, an old hand at the Office for National Statistics, comparing dad jokes with novelist Boris Starling, creator of Messiah’s DCI Red Metcalfe, who was played on the telly by Ken Stott.

The charm of the whole enterprise is undeniable. Within these pages you will discover, among other tidbits, the difference between critters and spraggers, whitsters and oliver men. Such were the occupations introduced into the Standard Classification of 1881. (Recent additions include YouTuber and dog sitter.) Nostalgia and melancholy come to the fore when the authors say a fond farewell to John and Margaret — names, deeply unfashionable now, that were pretty much compulsory for babies born between 1914 and 1964. But there’s rigour, too; I recommend the authors’ highly illuminating analysis of today’s gender pay gap.

Sometimes the authors show us up for the grumpy tabloid zombies we really are. Apparently a sizeable sample of us, quizzed in 2014, opined that 15 per cent of all our girls under sixteen were pregnant. The lack of mathematical nous here is as disheartening as the misanthropy. The actual figure was a still worryingly high 0.5 per cent, or one in 200 girls. A 10-year Teenage Pregnancy Strategy was created to tackle the problem, and the figure for 2018 — 16.8 conceptions per 1000 women aged between 15 and 17 — is the lowest since records began.

This is why census records are important: they inform enlightened and effective government action. The statistician John Rickman said as much in a paper written in 1796, but his campaign for a national census only really caught on two years later, when the clergyman Thomas Malthus scared the living daylights out of everyone with his “Essay on the Principle of Population”. Three years later, ministers rattled by Malthus’s catalogue of checks on the population of primitive societies — war, pestilence, famine, and the rest — peeked through their fingers at the runaway population numbers for 1801.

The population of England then was the same as the population of Greater London now. The population of Scotland was almost exactly the current population of metropolitan Glasgow.

Better to have called it “The Official History of Britons”. Chapter by chapter, the authors lead us (wisely, if not too well) from Birth, through School, into Work and thence down the maw of its near neighbour, Death, reflecting all the while on what a difference two hundred years have made to the character of each life stage.

The character of government has changed, too. Rickman wanted a census because he and his parliamentary colleagues had almost no useful data on the population they were supposed to serve. The job of the ONS now, the writers point out, “is to try to make sure that policymakers and citizens can know at least as much about their populations and economies as the internet behemoths.”

It’s true: a picture of the state of the nation taken every ten years just doesn’t provide the granularity that could be fetched, more cheaply and more efficiently, from other sources: “smaller surveys, Ordnance Survey data, GP registrations, driving licence details…”

But this too is true: near where I live there is a pedestrian crossing. There is a button I can push, to change the lights, to let me cross the road. I know that in daylight hours, the button is a dummy, that the lights are on a timer, set in some central office, to smooth the traffic flow. Still, I press that button. I like that button. I appreciate having my agency acknowledged, even in a notional, fanciful way.

Next year, 2021, I will tell the census who and what I am. It’s my duty as a citizen, and also my right, to answer how I will. If, in 2031, the state decides it does not need to ask me who I am, then my idea of myself as a citizen, notional as it is, fanciful as it is, will be impoverished.

Modernity in Mexico

Reading Connected: How a Mexican village built its own cell phone network by Roberto J González for New Scientist, 14 October 2020

In 2013 the world’s news media fell in love with Talea, a Mexican pueblo (population 2400) in Rincón, a remote corner of Northern Oaxaca. América Móvil, the telecommunications giant that ostensibly served their area, had refused to provide them with a mobile phone service, so the plucky Taleans had built a network of their own.

Imagine it: a bunch of indigenous maize growers, subsistence farmers with little formal education, besting and embarrassing Carlos Slim, América Móvil’s owner and, according to Forbes magazine at the time, the richest person in the world!

The full story of that short-lived, homegrown network is more complicated, says Roberto González in his fascinating, if somewhat self-conscious account of rural innovation.

Talea was never a backwater. A community that survives Spanish conquest and resists 500 years of interference by centralised government may become many things, but “backward” is not one of them.

On the other hand, González harbours no illusions about how communities, however sophisticated, might resist the pall of globalising capital — or why they would even want to. That homogenising whirlwind of technology, finance and bureaucracy also brings with it roads, hospitals, schools, entertainment, jobs, and medicine that actually works.

For every outside opportunity seized, however, an indigenous skill must be forgotten. Talea’s farmers can now export coffee and other cash crops, but many fields lie abandoned, as the town’s youth migrate to the United States. The village still tries to run its own affairs — indeed, the entire Oaxaca region staged an uprising against centralised Mexican authority in 2006. But the movement’s brutal repression by the state augurs ill for the region’s autonomy. And if you’ve no head for history, well, just look around. Pueblos are traditionally made of mud. It’s a much easier, cheaper, more repairable and more ecologically sensitive material than the imported alternatives. Still, almost every new building here is made of concrete.

In 2012, Talea gave its backing to another piece of imported modernity — a do-it-yourself phone network, assembled by Peter Bloom, a US-born rural development specialist, and Erick Huerta, a Mexican telecommunications lawyer. Both considered access to mobile phone networks and the internet to be a human right.

Also helping — and giving the lie to the idea that the network was somehow a homegrown idea — were “Kino”, a hacker who helped indigenous communities evade state controls, and Minerva Cuevas, a Mexican artist best known for hacking supermarket bar codes.

By 2012 Talea’s telephone network was running off an open-source mobile phone network program called OpenBTS (BTS stands for base transceiver station). Mobiles within range of a base station can communicate with each other, and connect globally over the internet using VoIP (or Voice over Internet Protocol). All the network needed was an electrical power socket and an internet connection — utilities Talea had enjoyed for years.
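For readers curious about the plumbing, here is a toy sketch — my own illustration, not OpenBTS code, and every name in it is invented — of the architecture González describes: a local base station that knows its subscribers, bridging village handsets onto the internet as voice-over-IP traffic.

```python
# Toy model of a community cell network (illustration only, not
# OpenBTS code): handsets register with the local base transceiver
# station, and calls leave the village as voice-over-IP traffic.
from dataclasses import dataclass


@dataclass
class Subscriber:
    imsi: str     # the SIM identity the base station sees over radio
    sip_uri: str  # the internet voice endpoint the call is bridged to


class CommunityNetwork:
    def __init__(self):
        self.subscribers = {}  # maps imsi -> Subscriber

    def register(self, imsi, sip_uri):
        # In a real deployment this happens automatically when a
        # phone within radio range attaches to the base station.
        self.subscribers[imsi] = Subscriber(imsi, sip_uri)

    def route_call(self, caller_imsi, callee_imsi):
        # Both handsets only need to reach the local antenna; the
        # station carries the call over an ordinary internet link.
        caller = self.subscribers[caller_imsi]
        callee = self.subscribers[callee_imsi]
        return f"bridging {caller.sip_uri} -> {callee.sip_uri} over VoIP"


net = CommunityNetwork()
net.register("234150999000001", "sip:ana@talea.example")
net.register("234150999000002", "sip:beto@talea.example")
print(net.route_call("234150999000001", "234150999000002"))
```

The design point the sketch is meant to make is the one in the paragraph above: all the radio cleverness stays local, and the only infrastructure the village needs from outside is a power socket and an internet connection.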

The network never worked very well. Whenever the internet went down, which it did occasionally, the whole town lost its mobile coverage. Recently the phone company Movistar has moved in with an aggressive plan to provide the region with regular (if costly) commercial coverage. Talea’s autonomous network idea lives on, however, in a cooperative organization of community cell phone networks which today represents nearly seventy pueblos across several different regions in Oaxaca.

Connected is an unsentimental account of how a rural community takes control (even if only for a little while) over the very forces that threaten its cultural existence. Talea’s people are dispersing ever more quickly across continents and platforms in search of a better life. The “virtual Taleas” they create on Facebook and other sites to remember their origins are touching, but the fact remains: 50 years of development have done more to unravel a local culture than 500 years of conquest.