“A wonderful moral substitute for war”

Reading Oliver Morton’s The Moon and Robert Stone and Alan Andres’s Chasing the Moon for The Telegraph, 18 May 2019

I have Arthur to thank for my earliest memory: being woken and carried into the living room on 20 July 1969 to see Neil Armstrong set foot on the moon.

Arthur is a satellite dish, part of the Goonhilly Satellite Earth Station in Cornwall. It carried the first ever transatlantic TV pictures from the USA to Europe. And now, in a fit of nostalgia, I am trying to build a cardboard model of the thing. The anniversary kit I bought comes with a credit-card-sized Raspberry Pi computer that will cause a little red light to blink at the centre of the dish every time the International Space Station flies overhead.
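
Out of curiosity, here is a minimal sketch of the sort of thing the kit’s little computer presumably does. The Open Notify ISS-position API and the gpiozero library are my assumptions, not details taken from the kit itself; the GPIO pin, the Goonhilly coordinates and the ten-degree “overhead” threshold are likewise illustrative.

```python
# A sketch, not the kit's actual firmware: poll the ISS's position and
# flash an LED whenever the station passes roughly over Goonhilly.
# Assumptions: the Open Notify API, the gpiozero library, GPIO pin 17.
import math
import time

import requests
from gpiozero import LED

GOONHILLY = (50.048, -5.182)   # approximate latitude, longitude of the earth station
led = LED(17)                  # hypothetical pin driving the little red light

def iss_position():
    """Return the ISS's current (lat, lon) from the Open Notify API."""
    data = requests.get("http://api.open-notify.org/iss-now.json", timeout=10).json()
    pos = data["iss_position"]
    return float(pos["latitude"]), float(pos["longitude"])

def angular_separation(a, b):
    """Great-circle separation, in degrees, between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    cos_c = (math.sin(lat1) * math.sin(lat2)
             + math.cos(lat1) * math.cos(lat2) * math.cos(lon1 - lon2))
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_c))))

while True:
    if angular_separation(GOONHILLY, iss_position()) < 10:   # roughly "overhead"
        led.blink(on_time=0.5, off_time=0.5, n=5, background=False)
    time.sleep(30)
```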

The geosynchronous-satellite network that Arthur C. Clarke envisioned in 1945 came into being at the same time as men landed on the Moon. Intelsat III F-3 was moved into position over the Indian Ocean a few days before Apollo 11’s launch, completing the world’s first geostationary-satellite network. The Space Race has bequeathed us a world steeped in fractured televisual reflections of itself.

Of Apollo itself, though, what actually remains? The Columbia capsule is touring the United States: it’s at Seattle’s Museum of Flight for this year’s fiftieth anniversary. And Apollo’s Mission Control Center in Houston is getting a makeover, its flight control consoles refurbished, its trash cans, book cases, ashtrays and orange polyester seat cushions all restored.

On the Moon there are some flags; some experiments, mostly expired; an abandoned car.

In space, where it matters, there’s nothing. The intention had been to build moon-going craft in orbit. This would have involved building a space station first. In the end, spooked by a spate of Soviet launches, NASA decided to cut to the chase, sending two small spacecraft up on a single rocket. One got three astronauts to the moon. The other, a tiny landing bug (standing room only), dropped two of them onto the lunar surface and puffed them back up into lunar orbit, where they rejoined the command module and headed home. It was an audacious, dangerous and triumphant mission — but it left nothing useful or reusable behind.

In The Moon: A History for the Future, science writer Oliver Morton observes that without that peculiar lunar-orbit rendezvous plan, Apollo would at least have left some lasting infrastructure in orbit to pique someone’s ambition. As it was, “Every Apollo mission would be a single shot. Once they were over, it would be in terms of hardware — even, to a degree, in terms of expertise — as if they had never happened.”

Morton and I belong to the generation sometimes dubbed Apollo’s orphans. We grew up (rightly) dazzled by Apollo’s achievement. It left us, however, with the unshakable (and wrong) belief that our enthusiasm was common, something to do with what we were taught to call humanity’s “outward urge”. The refrain was constant: how in people there was this inborn desire to leave their familiar surroundings and explore strange new worlds.

Nonsense. Over a century elapsed between Columbus’s initial voyage and the first permanent English settlements. One of the more surprising findings of recent research into the human genome is that, left to their own devices, people hardly move more than a few weeks’ walking distance from where they were born.

This urge, that felt so visceral, so essential to one’s idea of oneself: how could it possibly turn out to be the psychic artefact of a passing political moment?

Documentary makers Robert Stone and Alan Andres answer that particular question in Chasing the Moon, a tie-in to their forthcoming series on PBS. It’s a comprehensive account of the Apollo project, and sends down deep roots: to the cosmist speculations of fin de siècle Russia, the individualist eccentricities of Germany’s Verein für Raumschiffahrt (Society for Space Travel), and the deceptively chummy brilliance of the British Interplanetary Society, who used to meet in the pub.

The strength of Chasing the Moon lies not in any startling new information it divulges (that boat sailed long ago) but in the connections it makes, and the perspectives it brings to bear. It is surprising to find the New York Times declaring, shortly after the Bay of Pigs fiasco, that Kennedy isn’t nearly as interested in building a space programme as he should be. (“So far, apparently, no one has been able to persuade President Kennedy of the tremendous political, psychological, and prestige importance, entirely apart from the scientific and military results, of an impressive space achievement.”) And it is worthwhile to be reminded that, less than a month after his big announcement, Kennedy was trying to persuade Khrushchev to collaborate on the Apollo project, and that he approached the Soviets with the idea a second time, just days before his assassination in Dallas.

For Kennedy, Apollo was a strategic project, “a wonderful moral substitute for war” (to slightly misapply Ray Bradbury’s phrase), and all to do with manned missions. NASA administrator James Webb, on the other hand, was a true believer. He could see no end to the good big organised government projects could achieve by way of education and science and civil development. In his modesty and dedication, Webb resembled no-one so much as the first tranche of bureaucrat-scientists in the Soviet Union. He never featured on a single magazine cover, and during his entire tenure he attended only one piloted launch from Cape Kennedy. (“I had a job to do in Washington,” he explained.)

The two men worked well enough together, their priorities dovetailing neatly in the role NASA took in promoting the Civil Rights Act and the Voting Rights Act and the government’s equal opportunities program. (NASA’s Saturn V designer, the former Nazi rocket scientist Wernher von Braun, became an unlikely and very active campaigner, the New York Times naming him “one of the most outspoken spokesmen for racial moderation in the South.”) But progress was achingly slow.

At its height, the Apollo programme employed around two per cent of the US workforce and swallowed some four per cent of the federal budget. It was never going to be agile enough, or quotidian enough, to achieve much in the area of effecting political change. There were genuine attempts to recruit and train a black pilot for the astronaut programme. But comedian Dick Gregory had the measure of this effort: “A lot of people was happy that they had the first Negro astronaut. Well, I’ll be honest with you, not myself. I was kind of hoping we’d get a Negro airline pilot first.”

The big social change the Apollo program did usher in was television. (Did you know that failing to broadcast the colour transmissions from Apollo 11 proved so embarrassing to the apartheid government in South Africa that they afterwards created a national television service?)

But the moon has always been a darling of the film business. Never mind Georges Méliès’s A Trip to the Moon. How about Fritz Lang ordering a real rocket launch for the premiere of Frau im Mond? This was the film that followed Metropolis, and Lang roped in no less a physicist than Hermann Oberth to build it for him. When his 1.8-metre tall liquid-propellant rocket came to nought, Oberth set about building one eleven metres tall powered by liquid oxygen. They were going to launch it from the roof of the cinema. Luckily they ran out of money.

The Verein für Raumschiffahrt was founded by men who had acted as scientific consultants on Frau im Mond. Von Braun became one of their number, before he was whisked away by the Nazis to build rockets for the war effort. Without von Braun, the VfR grew nuttier by the year. Oberth, who worked for a time in the US after the war, went the same way, his whole conversation swallowed by UFOs and extraterrestrials and glimpses of Atlantis. When he went back to Germany, no-one was very sorry to see him go.

What is it about dreaming of new worlds that encourages the loner in us, the mooncalf, the cave-dweller, wedded to asceticism, always shying from the light?

After the first Moon landing, the philosopher (and sometime Nazi supporter) Martin Heidegger said in an interview, “I at any rate was frightened when I saw pictures coming from the moon to the earth… The uprooting of man has already taken place. The only thing we have left is purely technological relationships. This is no longer the earth on which man lives.”

Heidegger’s worries need a little unpacking, and for that we turn to Morton’s cool, melancholy The Moon: A History for the Future. Where Stone and Andres collate and interpret, Morton contemplates and introspects. Stone and Andres are no stylists. Morton’s flights of informed fancy include a geological formation story for the moon that Lars von Trier’s film Melancholia cannot rival for spectacle and sentiment.

Stone and Andres stand with Walter Cronkite, whose puzzled response to young people’s opposition to Apollo — “How can anybody turn off from a world like this?” — stands as an epitaph for Apollo’s orphans everywhere. Morton, by contrast, does understand why it’s proved so easy for us to switch off from the Moon. At any rate he has some good ideas.

Gertrude Stein, never a fan of Oakland, once wrote of the place, “There is no there there.” If Morton’s right she should have tried the Moon, a place whose details “mostly make no sense.”

“The landscape,” Morton explains, “may have features that move one into another, slopes that become plains, ridges that roll back, but they do not have stories in the way a river’s valley does. It is, after all, just the work of impacts. The Moon’s timescape has no flow; just punctuation.”

The Moon is Heidegger’s nightmare realised. It can never be a world of experience. It can only be a physical environment to be coped with technologically. It’s dumb, without a story of its own to tell, so much “in need of something but incapable of anything”, in Morton’s telling phrase, that you can’t even really say that it’s dead.

So why did we go there, when we already knew that it was, in the words of US columnist Milton Mayer, a “pulverised rubble… like Dresden in May or Hiroshima in August”?

Apollo was the US’s biggest, brashest entry in its heart-stoppingly exciting – and terrifying – political and technological competition with the Soviet Union. This is the matter of Stone and Andres’s Chasing the Moon, as full a history as one could wish for, clear-headed about the era and respectful of the extraordinary efforts and qualities of the people involved.

But while Morton is no less moved by Apollo’s human adventure, we turn to his book for a cooler and more distant view. Through Morton’s eyes we begin to see, not only what the moon actually looks like (meaningless, flat, gentle, a South Downs gone horribly wrong) but why it conjures so much disbelief in those who haven’t been there.

A year after the first landing the novelist Norman Mailer joked: “In another couple of years there will be people arguing in bars about whether anyone even went to the Moon.” He was right. Claims that the moon landings were faked arose the moment the Saturn Vs stopped flying in 1972, and no wonder. In a deep and tragic sense Apollo was fake: it didn’t deliver the world it had promised.

And let’s be clear here: the world it promised would have been wonderful. Never mind the technology: that was never the core point. What really mattered was that at the height of the Vietnam war, we seemed at last to have found that wonderful moral substitute for war. “All of the universe doesn’t care if we exist or not,” Ray Bradbury wrote, “but we care if we exist… This is the proper war to fight.”

Why has space exploration not united the world around itself? It’s easy to blame ourselves and our lack of vision. “It’s unfortunate,” Lyndon Johnson once remarked to the astronaut Wally Schirra, “but the way the American people are, now that they have developed all of this capability, instead of taking advantage of it, they’ll probably just piss it all away…” This is the mordant lesson of Stone and Andres’s otherwise uplifting Chasing the Moon.

Oliver Morton’s The Moon suggests a darker possibility: that the fault lies with the Moon itself, and, by implication, with everything that lies beyond our little home.

Morton’s Moon is a place defined by absences, gaps, and silence. He makes a poetry of it for a while; he toys with thoughts of future settlement; he explores the commercial possibilities. In the end, though, what can this uneventful satellite of ours ever possibly be, but what it is: “just dry rocks jumbled”?


Asking for it

Reading The Metric Society: On the Quantification of the Social by Steffen Mau (Polity Press) for the Times Literary Supplement, 30 April 2019 

Imagine Steffen Mau, a macrosociologist (he plays with numbers) at Humboldt University of Berlin, writing a book about information technology’s invasion of the social space. The very tools he uses are constantly interrupting him. His bibliographic software wants him to assign a star rating to every PDF he downloads. A paper-sharing site exhorts him repeatedly to improve his citation score (rather than his knowledge). In a manner that would be funny, were his underlying point not so serious, Mau records how his tools keep getting in the way of his job.

Why does Mau use these tools at all? Is he too good for a typewriter? Of course he is: the whole history of civilisation is the story of us getting as much information as possible out of our heads and onto other media. It’s why, nigh-on 5000 years ago, the Sumerians dreamt up the abacus. Thinking is expensive. How much easier to stop thinking, and rely on data records instead!

The Metric Society is not a story of errors made, or of wrong paths taken. This is a story, superbly reduced to the chill essentials of an executive summary, of how human society is getting exactly what it’s always been asking for. The last couple of years have seen more than 100 US cities pledge to use evidence and data to improve their decision-making. In the UK, “What Works Centres”, first conceived in the 1990s, are now responsible for billions in funding. The acronyms grow more bellicose, the more obscure they become. The Alliance for Useful Evidence (with funding from ESRC, Big Lottery and Nesta) champions the use of evidence in social policy and practice.

Mau describes the emergence of a society trapped in “data-driven perpetual stock-taking”, in which the new Juggernaut of auditability lays waste to creativity, production, and even simple efficiency. “The magic attraction of numbers and comparisons is simply irresistible,” Mau writes.

It’s understandable. Our first great system of digital abstraction, money, enabled a more efficient and less locally bound exchange of goods and services, and introduced a certain level of rational competition into the world of work.

But look where money has led us! Capital is not the point here. Neither is capitalism. The point is our relationship with information. Amazon’s algorithms are sucking all the localism out of the retail system, to the point where whole high streets have vanished — and entire communities with them. Amazon is in part powered by the fatuous metricisation of social variety through systems of scores, rankings, likes, stars and grades, which are (not coincidentally) the methods by which social media structures — from clownish Twitter to China’s Orwellian Social Credit System — turn qualitative differences into quantitative inequalities.

Mau leaves us thoroughly in the lurch. He’s a diagnostician, not a snake-oil salesman, and his bedside manner is distinctly chilly. Dazzled by data, which have relieved us of the need to dream and imagine, we fight for space on the foothills of known territory. The peaks our imaginations might have trod — as a society, and as a species — tower above us, ignored.

“The English expedition of 1919 is to blame for this whole misery”

Four books to celebrate the centenary of Eddington’s 1919 eclipse observations. For The Spectator, 11 May 2019.

Einstein’s War: How relativity triumphed amid the vicious nationalism of World War I
Matthew Stanley
Dutton

Gravity’s Century: From Einstein’s eclipse to images of black holes
Ron Cowen
Harvard University Press

No Shadow of a Doubt
Daniel Kennefick
Princeton University Press

Einstein’s Wife: The real story of Mileva Einstein-Maric
Allen Esterson and David C Cassidy; contribution by Ruth Lewin Sime.
MIT Press

On 6 November 1919, at a joint meeting of the Royal Astronomical Society and the Royal Society, held at London’s Burlington House, the stars went all askew in the heavens.

That, anyway, was the rhetorical flourish with which the New York Times hailed the announcement of the results of a pair of astronomical expeditions conducted in 1919, after the Armistice but before the official end of the Great War. One expedition, led by Arthur Stanley Eddington, director of the Cambridge Observatory, had repaired to the plantation island of Principe off the coast of West Africa; the other, led by Andrew Crommelin, who worked at the Royal Greenwich Observatory, headed to a racecourse in Brazil. Together, in the few minutes afforded by the 29 May solar eclipse, the teams used telescopes to photograph shifts in the apparent location of stars as the edge of the sun approached them.

The possibility that a heavy body like the sun might cause some distortion in the appearance of the star field was not particularly outlandish. Newton, who had assigned “corpuscles” of light some tiny mass, supposed that such a massive body might draw light in like a lens, though he imagined the effect was too slight to be observable.

The degree of distortion the Eddington expeditions hoped to observe was something else again: 1.75 arc-seconds, roughly the angle subtended by a coin a couple of miles away. A fine observation, but not impossible at the time. Only the theory of the German-born physicist Albert Einstein — respected well enough at home but little known to the Anglophone world — would explain such a (relatively) large distortion, and Eddington’s confirmation of his hypothesis brought the “famous German physician” (as the New York Times would have it) instant celebrity.
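
As a back-of-the-envelope check on that comparison (my own arithmetic, assuming a coin about 2.5 cm across, not a figure from any of the books):

```latex
% At what distance does a ~2.5 cm coin subtend 1.75 arc-seconds?
\theta = 1.75'' \approx \frac{1.75}{206\,265}\ \text{rad} \approx 8.5\times10^{-6}\ \text{rad},
\qquad
d \approx \frac{0.025\ \text{m}}{8.5\times10^{-6}} \approx 2.9\ \text{km} \approx 1.8\ \text{miles}.
```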

“The English expedition of 1919 is ultimately to blame for this whole misery, by which the general masses seized possession of me,” Einstein once remarked; but he was not so very sorry for the attention. Forget the usual image of Einstein the loveable old eccentric. Picture instead a forty-year-old who, when he steps into a room, literally causes women to faint. People wanted his opinions even about stupid things. And for years, if anyone said anything wise, within a few months their words were being attributed to Einstein.

“Why is it that no one understands me and everyone likes me?” Einstein wondered. His appeal lay in his supposed incomprehensibility. Charlie Chaplin understood: “They cheer me because they all understand me,” he remarked, accompanying the theoretical physicist to a film premiere, “and they cheer you because no one understands you.”

Several books serve to mark the centenary of the 1919 eclipse observations. Though their aims diverge, they all to some degree capture the likeness of Einstein the man, messy personal life and all, while rendering his physics a little bit more comprehensible to the rest of us. Each successfully negotiates the single besetting difficulty facing books of this sort, namely the way science lends itself to bad history.

Science uses its past as an object lesson, clearing all the human messiness away to leave the ideas standing. History, on the other hand, factors in as much human messiness as possible to show how the business of science is as contingent and dramatic as any other human activity.

In human matters, some ambiguity over causes and effects is welcome. There are two sides to every story, and so on and so forth: any less nuanced approach seems suspiciously moralistic. One need only look at the way various commentators have interpreted Einstein’s relationship with his first wife.

Einstein was, by the end of their failing marriage, notoriously horrible to Mileva Einstein-Maric; this in spite of their great personal and intellectual closeness as first-year physics students at the Swiss Federal Polytechnic. Einstein once reassured Elsa Lowenthal, his cousin and second-wife-to-be, that “I treat my wife as an employee I can not fire.” (Why Elsa, reading that, didn’t run a mile, is not recorded.)

Albert was a bad husband. His wife was a mathematician. Therefore Albert stole his theory of special relativity from Mileva. This shibboleth, bandied about since the 1970s, is a sort of evil twin of Whig history, distorted by teleology, anachronism and present-mindedness. It does no one any favours. The three separately authored parts of Einstein’s Wife: The real story of Mileva Einstein-Maric unpick the myth of Mileva’s influence over Albert, while increasing, rather than diminishing, our interest in and admiration of the woman herself. It’s a hard job to do well, without preciousness or special pleading, especially in today’s resentment-ridden and over-sensitive political climate, and the book is an impressive, compassionate accomplishment.

Matthew Stanley’s Einstein’s War, on the other hand, tips ever so slightly in the other direction, towards the simplistic and the didactic. His intentions, however, are benign — he is here to praise Einstein and Eddington and their fellows, not bury them — and his slightly on-the-nose style is ultimately mandated by the sheer scale of what he is trying to do, for he succeeds in wrapping up the global, national and scientific politics of an era in a compelling story of one man’s wild theory, lucidly sketched, and its experimental confirmation in the unlikeliest and most exotic circumstances.

The world science studies is truly a blooming, buzzing confusion. It is not in the least bit causal, in the ordinary human sense. Far from there being a paucity of good stories in science, there are a limitless number of perfectly valid, perfectly accurate, perfectly true stories, all describing the same phenomenon from different points of view.

Understanding the stories abroad in the physical sciences at the fin de siècle, seeing which ones Einstein adopted, why he adopted them, and why, in some cases, he swapped them for others, certainly doesn’t make his theorising easy. But it does give us a gut sense of why he was so baffled by the public’s response to his work. The moment we are able to put him in the context of co-workers, peers and friends, we see that Einstein was perfecting classical physics, not overthrowing it, and that his supposedly peculiar theory of relativity, as the man said himself, “harmonizes with every possible outlook of philosophy and does not interfere with being an idealist or materialist, pragmatist or whatever else one likes.”

In science, we need simplification. We welcome a didactic account. Choices must be made, and held to. Gravity’s Century by the science writer Ron Cowen is the most condensed of the books mentioned here; it frequently runs right up to the limit of how far complex ideas can be compressed without slipping into outright falsehood. I reckon I spotted a couple of questionable interpretations. But these were so minor as to be hardly more than matters of taste, when set against Cowen’s overall achievement. This is as good a short introduction to Einstein’s thought as one could wish for. It even contrives to discuss confirmatory experiments and observations whose final results were only announced as I was writing this piece.

No Shadow of a Doubt is more ponderous, but for good reason: the author Daniel Kennefick, an astrophysicist and historian of science, is out to defend the astronomer Eddington against criticisms more serious, more detailed, and framed more conscientiously, than any thrown at that cad Einstein.

Eddington was an English pacifist and internationalist who made no bones about wanting his eclipse observations to champion the theories of a German-born physicist, even as jingoism reached its height on both sides of the Great War. Given the sheer bloody difficulty of the observations themselves, and considering the political inflection given them by the man orchestrating the work, are Eddington’s results to be trusted?

Kennefick is adamant that they are, modern naysayers to the contrary, and in conclusion to his always insightful biography, he says something interesting about the way historians, and especially historians of science, tend to underestimate the past. “Scientists regard continuous improvement in measurement as a hallmark of science that is unremarkable except where it is absent,” he observes. “If it is absent, it tells us nothing except that someone involved has behaved in a way that is unscientific or incompetent, or both.” But, Kennefick observes, such improvement is only possible with practice — and eclipses come round too infrequently for practice to make much difference. Contemporary attempts to recreate Eddington’s observations face the exact same challenges Eddington did, and “it seems, as one might expect, that the teams who took and handled the data knew best after all.”

It was Einstein’s peculiar fate that his reputation for intellectual and personal weirdness has concealed the architectural elegance of his work. Higher-order explanations of general relativity have become clichés of science fiction. The way massive bodies bend spacetime like a rubber sheet is an image that saturates elementary science classes, to the point of tedium.

Einstein hated those rubber-sheet metaphors for a different reason. “Since the mathematicians pounced on the relativity theory,” he complained, “I no longer understand it myself.” We play about with thoughts of bouncy sheets. Einstein had to understand their behaviours mathematically in four dimensions (three of space and one of time), crunching equations so radically non-linear, their results would change the value of the numbers originally put into them in feedback loops that drove the man out of his mind. “Never in my life have I tormented myself anything like this,” he moaned.

For the rest of us, however, a little prophylactic exposure to Einstein’s actual work pays huge dividends. It sweeps some of the weirdness away and reveals Einstein’s actual achievement: theories that set all the forces above the atomic scale dancing with an elegance Isaac Newton, founding father of classical physics, would have half-recognised, and wholly admired.


Choose-your-own-adventure

Reading The Importance of Small Decisions by Michael O’Brien, R. Alexander Bentley and William Brock for New Scientist, 13 April 2019

What if you could map all kinds of human decision-making and use it to chart society’s evolution?

This is what academics Michael O’Brien, Alexander Bentley and William Brock try to do in The Importance of Small Decisions. It is an attempt to expand on a 2014 paper, “Mapping collective behavior in the big-data era”, that they wrote in Behavioral and Brain Sciences. While contriving to be somehow both too short and rambling, it bites off more than it can chew, nearly chokes to death on the ins and outs of group selection, and coughs up its best ideas in the last 40 pages.

Draw a graph. The horizontal axis maps decisions according to how socially influenced they are. The vertical axis tells you how clear the costs and pay-offs are for each decision. Rational choices sit in the north-western quadrant of the map. To the north-east, bearded capuchins teach each other how to break into palm nuts in a charming example of social learning. Twitter storms generated by fake news swirl about the south-east.

The more choices you face, the greater the cognitive load. The authors cite economist Eric Beinhocker, who in The Origin of Wealth calculated that human choices had multiplied a hundred million-fold in the past 10,000 years. Small and insignificant decisions now consume us.

Worse, costs and pay-offs are increasingly hidden in an ocean of informational white noise, so that it is easier to follow a trend than find an expert. “Why worry about the underlying causes of global warming when we can see what tens of millions of our closest friends think?” ask the authors, building to a fine, satirical climax.

In an effort to communicate widely, the authors have, I think, left out a few too many details from their original paper. And a mid-period novel by Philip K. Dick would paint a more visceral picture of a world created by too much information. Still, there is much fun to be had reading the garrulous banter of these three extremely smart academics.

Come on, Baggy, get with the beat!

Reading The Evolving Animal Orchestra: In search of what makes us musical by Henkjan Honing for New Scientist, 6 April 2019

“The perception, if not the enjoyment, of musical cadences and of rhythm,” wrote Darwin in his 1871 book The Descent of Man, “is probably common to all animals.”

Henkjan Honing has tested this eminently reasonable idea, and in his book, The Evolving Animal Orchestra, he reports back. He details his disappointment, frustration and downright failure with such wit, humility and a love of the chase that any young person reading it will surely want to run away to become a cognitive scientist.

No culture has yet been found that doesn’t have music, and all music shares certain universal characteristics: melodies composed of seven or fewer discrete pitches; a regular beat; a limited sequence of rhythmic patterns. All this would suggest a biological basis for musicality.

A bird flies with regular beats of its wings. Animals walk with a particular rhythm. So you might expect beat perception to be present in everything that doesn’t want to falter when moving. But it isn’t. Honing describes experiments that demonstrate conclusively that we are the only primates with a sense of rhythm, possibly deriving from advanced beat perception.

Only strongly social animals, he writes, from songbirds and parrots to elephants and humans, have beat perception. What if musicality was acquired by all prosocial species through a process of convergent evolution? Like some other cognitive scientists, Honing now wonders whether language might derive from music, in a similar way to how reading uses much older neural structures that recognise contrast and sharp corners.

Honing must now test this exciting hypothesis. And if The Evolving Animal Orchestra is how he responds to disappointment, I can’t wait to see what he makes of success.

This God has taste

The Guardian spiked this one: a review of I am God by Giacomo Sartori, translated from the Italian by Frederika Randall (Restless Books)

This sweet, silly, not-so-shallow entertainment from 2016 (Sono Dio, the first of Giacomo Sartori’s works to receive an English translation) takes an age before naming its young protagonist. For a long while, she’s simply “the tall one”; sometimes, “the sodomatrix” (she inseminates cattle for a living).

Her name is Daphne, “a militant atheist who spends her nights trying to sabotage the Vatican website,” and she ekes out a precarious professional living in the edgeland laboratories of post-industrial Italy. The narrator sketches her relationship with her stoner dad and her love triangle with Lothario (or Apollo, or Randy — it doesn’t really matter) and his diminutive girlfriend. His eye is sharp: at one point we get to glimpse “the palm of [Daphne’s] hand moving over [Lothario’s] chest as if washing a window.” But the narrator keeps slipping off the point into a welter of self-absorbed footnotes. Daphne interests him — indeed, he’s besotted — but really he’s more interested in himself. And no wonder. As he never tires of repeating, with an ever more desperate compulsion: “I am God”.

This is a God with time on his hands. Not for him a purely functional creation with “trees of shapeless gelatin broth, made of a revolting goo like industrial waste. Neon lights that suddenly flick off, instead of sunsets.” This God has taste.

Why, then, does he find himself falling for such an emotionally careless mortal as Daphne? Could it be “that this gimpy human language hasn’t already contaminated me with some human germ…?” Sly comic business ensues as, with every word He utters, God paints Himself further into a corner it will take a miracle to escape.

The author Giacomo Sartori is a soil specialist turned novelist and one of the founders of Nazione Indiana, a blog and cultural project created to give voice to Italy’s literary eccentrics. Italy’s stultifying rural culture has been his main target up to now. Here, though, he’s taking shots at humanity in general: “They’re such hucksters,” he sighs, from behind the novel’s divine veil, “so reliably unpredictable, immoral and nuts that anyone observing them is soon transfixed.”

Of course, Sartori’s theological gags could be read just as easily as the humdrum concerns of a writer falling under the spell of their characters. But there’s much to relish in the way God comes to appreciate more deeply the lot of his favourite playthings, “telling a million stories, twisting the facts, philosophizing, drowning in their own words. All vain efforts; unhappy they are, unhappy they remain.”

A world that has run out of normal

Reading The Uninhabitable Earth: A Story of the Future by David Wallace-Wells for the Telegraph, 16 February 2019

As global temperatures rise, and the mean sea-level with them, I have been tracing the likely flood levels of the Thames Valley, to see which of my literary rivals will disappear beneath the waves first. I live on a hill, and what I’d like to say is: you’ll be stuck with me a while longer than most. But on the day I had set aside to consume David Wallace-Wells’s terrifying account of climate change and the future of our species (there isn’t one), the water supply to my block was unaccountably cut off.

Failing to make a cup of tea reminded me, with some force, of what ought to be obvious: that my hill is a post-apocalyptic death-trap. I might escape the floods, but without clean water, food or power, I’ll be lucky to last a week.

The first half of The Uninhabitable Earth is organised in chapters that deal separately with famines, floods, fires, droughts, brackish oceans, toxic winds, war, and all the other manifest effects of anthropogenic climate change (there are many more than four horsemen in this Apocalypse). At the same time, the author reveals, paragraph by paragraph, how these ever-more-frequent disasters join up in horrific cascades, all of which erode human trust to the point where civic life collapses.

The human consequences of climate disaster are going to be ugly. When a million refugees from the Syrian civil war started arriving in Europe in 2015, far-right parties entered mainstream political discourse for the first time in decades. The United Nations predicts as many as 200 million climate refugees worldwide by 2050. So buckle up. The disgust response with which we greet strangers on our own land is something we conscientiously suppress these days. But it’s still there: an evolved response that in less sanitary times got us through more than one plague.

That such truths go largely unspoken says something about the cognitive dissonance in which our culture is steeped. We just don’t have the mental tools to hold climate change in our heads. Amitav Ghosh made this clear enough in The Great Derangement (2016), which explains why the traditional novel is so hopeless at handling a world that has run out of normal, forgotten how to repeat itself, and will never be any sort of normal again.

Writers, seeking to capture the contemporary moment, resort to science fiction. But the secret, sick appeal of post-apocalyptic narratives, from Richard Jefferies’s After London on, is that in order to be stories at all their heroes must survive. You can only push nihilism so far. J G Ballard couldn’t escape that bind. Neither could Cormac McCarthy. Despite our most conscientious attempts at utter bloody bleakness, the human spirit persists.

Wallace-Wells admits as much. When he thinks of his own children’s future, denizens of a world plunging ever deeper into its sixth major extinction event, he admits that despair melts and his heart fills with excitement. Humans will cling to life on this ever less habitable earth for as long as they can. Quite right, too.

Wallace-Wells is deputy editor of New York magazine. In July 2017 he wrote a cover story outlining worst-case scenarios for climate change. His pessimism proved salutary: The Uninhabitable Earth has been much anticipated.

In the first half of the book the author channels former US vice-president Al Gore, delivering a blizzard of terrifying facts, and knocking socks off his predecessor’s An Inconvenient Truth (2006) not thanks to his native gifts (considerable as they are) but because the climate has deteriorated since then to the point where its declines can now be observed directly, and measured over the course of a human lifetime.

More than half the extra carbon dioxide released into the atmosphere by burning fossil fuels has been added in the past 30 years. This means that “we have done as much damage to the fate of the planet and its ability to sustain human life and civilization since Al Gore published his first book on climate than in all the centuries – all the millennia – that came before.” (4) Oceans are carrying at least 15 per cent more heat energy than they did in 2000. Twenty-two per cent of the earth’s landmass was altered by humans just between 1992 and 2015. In Sweden, in 2018, forests in the Arctic Circle went up in flames. On and on like this. Don’t shoot the messenger, but “we have now engineered as much ruin knowingly as we ever managed in ignorance.”

The trouble is not that the future is bleak. It’s that there is no future. We’re running out of soil. In the United States, it’s eroding ten times faster than it is being replaced. In China and India, soil is disappearing thirty to forty times as fast. Wars over fresh water have already begun. The CO2 in the atmosphere has reduced the nutrient value of plants by about thirty per cent since the 1950s. Within the lifetimes of our children, the hajj will no longer be a feature of Islamic practice: the heat in Mecca will be such that walking seven times counterclockwise around the Kaaba will kill you.

This book may come to be regarded as the last truly great climate assessment ever made. (Is there even time left to pen another?) Some of the phrasing will give persnickety climate watchers conniptions. (Words like “eventually” will be a red rag for them, because they catalyse the reader’s imagination without actually meaning anything.) But the research is extensive and solid, the vision compelling and eminently defensible.

Alas, The Uninhabitable Earth is also likely to be one of the least-often finished books of the year. I’m not criticising the prose, which is always clear and engaging and often dazzling. It’s simply that the more we are bombarded with facts, the less we take in. Treating the reader like an empty bucket into which facts may be poured does not work very well, and even less well when people are afraid of what you are telling them. “If you have made it this far, you are a brave reader,” Wallace-Wells writes on page 138. Many will give up long before then. Climate scientists have learned the hard way how difficult it is to turn fact into public engagement.

The second half of The Uninhabitable Earth asks why our being made aware of climate disaster doesn’t lead to enough reasonable action being taken against it. There’s a nuanced mathematical account to be written of how populations reach carrying capacity, run out of resources, and collapse; and an even more difficult book that will explain why we ever thought human intelligence would be powerful enough to elude this stark physical reality.

The final chapters of The Uninhabitable Earth provide neither, but neither are they narrowly partisan. Wallace-Wells mostly resists the temptation to blame the mathematical inevitability of our species’ growth and decline on human greed. The worst he finds to say about the markets and market capitalism – our usual stock villains – is not that they are evil, or psychopathic (or certainly no more evil or psychopathic than the other political experiments we’ve run in the past 150 years) but that they are not nearly as clever as we had hoped they might be. There is a twisted magnificence in the way we are exploiting, rather than adapting to, the End Times. (Whole Foods in the US, we are told, is now selling “GMO-free” fizzy water.)

The Paris accords of 2016 established keeping warming to just two degrees as a global goal. Only a few years ago we were hoping for a rise of just 1.5 degrees. What’s the difference? According to the IPCC, that half-degree concession spells death for about 150 million people. Without significantly improved pledges, however, the IPCC reckons that instituting the Paris accords overnight (and no-one has) will still see us topping 3.2 degrees of warming. At this point the Antarctic’s ice sheets will collapse, drowning Miami, Dhaka, Shanghai, Hong Kong and a hundred other cities around the world. (Not my hill, though.)

And to be clear: this isn’t what could happen. This is what is already guaranteed to happen. Greenhouse gases work on too long a timescale to avoid it. “You might hope to simply reverse climate change,” writes Wallace-Wells; “you can’t. It will outrun all of us.”

“How widespread alarm will shape our ethical impulses toward one another, and the politics that emerge from those impulses,” says Wallace-Wells, “is among the more profound questions being posed by the climate to the planet of people it envelops.”

My bet is the question will never tip into public consciousness: that, on the contrary, we’ll find ways, through tribalism, craft and mischief, to engineer what Wallace-Wells dubs “new forms of indifference”, normalising climate suffering, and exploiting novel opportunities, even as we live and more often die through times that will never be normal again.

And so we wait

Thinking about Delayed Response by Jason Farman (Yale) for the Telegraph, 6 February 2019.

In producer Jon Favreau’s career-making 1996 comedy film Swingers, Favreau himself plays Mike, a young man in love with love, and at war with the answerphones of the world.

“Hi,” says one young woman’s machine, “this is Nikki. Leave a message,” prompting Mike to work, flub after flub, through an entire, entirely fictitious, relationship with the absent Nikki.

“This just isn’t working out,” he sighs, on about his twentieth attempt to leave a message that’s neither creepy nor incoherent. “I — I think you’re great, but, uh, I — I… Maybe we should just take some time off from each other. It’s not you, it’s me. It’s what I’m going through.”

There are a couple of lessons in this scene, and once they’re learned, there’ll be no pressing need for you to read Jason Farman’s Delayed Response. (I think you’d enjoy reading him, quite a bit, but, in the spirit of this project, let those reasons wait till the end.)

First lesson of two: “non-verbal communication never stops; non-verbal cues are always being produced whether we want them to be or not.” Those in the know may recognise here Farman’s salute to Edward T. Hall’s book The Silent Language (1959), for which Delayed Response is a useful foil. But the point — that any delay between transmission and reception is part of the message — is no mere intellectual nicety. Anyone who has had a love affair degenerate into an exchange of ever more flippant WhatsApp messages, or has waited for a prospective employer to get back to them about a job application, knows that silent time carries meaning.

Second lesson: delay can be used to manifest power. In Swingers, Mike crashes into what an elusive novelist friend of mine dubs, with gleeful malevolence, “the power of absence,” which is more or less the same power my teenage daughter wields when she “ghosts” some boy. In the words of the French sociologist Pierre Bourdieu, “Waiting is one of the privileged ways of experiencing the effect of power, and the link between time and power.” We’re none of us immune; we’re all in thrall to what Farman calls the “waiting economy”, and as our civics crumble (don’t pretend you haven’t noticed) the hucksters driving that economy get more and more brazen. (Consider, as an example, the growing discrepancy in UK delivery times between public and private postal services.)

Delays carry meanings. We cannot control them with any finesse; but we can use them as blunt weapons on each other.

What’s left for Farman to say?

Farman’s account of wait times is certainly exhaustive, running the full gamut of history, from contemporary Japanese smartphone messaging apps to Aboriginal message sticks that were being employed up to 50,000 years ago. (To give you some idea how venerable that communication system is, consider that papyrus dates from around 2900 BC.) He spans on-line wait times both so short as to be barely perceptible, and delays so long that they may be used to calculate the distance between planets. His examples are sometimes otherworldly (literally so in the case of the New Horizons mission to Pluto), sometimes unnervingly prosaic: he recounts the Battle of Fredericksburg in the American Civil War as a piling up of familiar and ordinary delays, conjuring up a picture of war-as-bureaucracy that is truly mortifying.

Farman acknowledges how much more quickly we send and receive messages these days — but his is no paean to technological progress. The dismal fact is: the instantaneous provision of information degrades our ability to interpret it. As long ago as 1966 the neurobiologist James L McGaugh reported that as the time increases between learning and testing, memory retention actually improves. And yet the purveyors of new media continue to equate speed with wisdom, promising that ever-better worlds will emerge from ever-more-efficient media. Facebook’s Mark Zuckerberg took this to an extreme typical of him in April 2017 when he announced that “We’re building further out beyond augmented reality, and that includes work around direct brain interfaces that one day will let you communicate using only your mind, although that stuff is pretty far out.”

This kind of hucksterism is harmless in itself, but it doesn’t come out of nothing. The thing we should be really afraid of is the creeping bureaucratisation of human experience. I remember, three years before Zuckerberg slugged back his own Kool-Aid, I sat listening to UCL neuroscientist Nilli Lavie lecturing about attention. Lavie was clearly a person of good will and good sense, but what exactly did she mean by her claim that wandering attention loses the US economy around two billion dollars a year? Were our minds to be perfectly focused, all year round, would that usher in some sort of actuarial New Jerusalem? Or would it merely extinguish all dreaming? Without a space for minds to wander in, where would a new idea – any new idea – actually come from?

This, of course, is the political flim-flam implicit in crisis thinking. So long as we are occupied with urgent problems, we are unable to articulate nuanced and far-reaching political ideas. “Waiting, ultimately, is essential for imagining that which does not yet exist and innovating on the knowledge we encounter,” Farman writes, to which I’m inclined to add the obvious point: Progressives are shrill and failing because their chosen media — Twitter and the like — deprive them of any register other than crisis-triggered outrage.

We may dream up a dystopia in which the populace is narcotised into bovine contentment by the instantaneous supply of undigested information, but as Farman makes clear, this isn’t going to happen. The anxiety generated by delay doesn’t disappear with quicker response times; it simply gets redistributed and reshaped. People under 34 years of age check their phones an average of 150 times a day, a burden entirely alien to soldiers waiting for Death and the postman at Fredericksburg. Farman writes: “Though the mythologies of the digital age continue to argue that we are eliminating waiting from daily life, we are actually putting it right at the centre of how we connect with one another.”

This has a major, though rarely articulated, consequence for us: anxiety balloons to fill the vacuum left by a vanished emotion, one we once considered pleasurable and positive. I refer, of course, to anticipation. Anticipation is no longer something we relish. This is because, in our world of immediate satisfactions, we’re simply not getting enough exposure to it. Waiting has ceased to be the way we measure projected pleasure. Now it’s merely an index of our frustration.

Farman is very good on this subject, and this is why Delayed Response is worth reading. (There: I told you we’d get around to this.) The book’s longueurs, and Farman’s persnickety academic style, pale beside his main point, very well expressed, that “the meaning of life isn’t deferred until that thing we hope for arrives; instead, in the moment of waiting, meaning is located in our ability to recognise the ways that such hopes define us.”

Whose head is it anyway?

Reading Hubert Haddad’s novel Desirable Body for the Guardian, 22 December 2018

English speakers have only two or three translations from the French by which to judge the sometimes dreamy, sometimes nightmarish output of Tunisian poet and novelist Hubert Haddad. He began writing long prose in the 1970s and has been turning out a novel a year, more or less, since the turn of the century.

First published as Corps désirable in 2015, this novel sews a real-life maverick neurosurgeon, Sergio Canavero, into a narrative that coincides with the bicentenary of the first ever neurosurgical horror story, Mary Shelley’s Frankenstein.

Cédric Allyn-Weberson, scion of a big pharma plutocrat, has set sail for the coast of Paros with his war correspondent girlfriend Lorna Leer, on a yacht called Evasion. A horrible accident crushes his spine but leaves his head intact. Funded by Cédric’s estranged father Morice, Canavero sets about transplanting Cédric’s head on to a donor body. Assuming the operation succeeds, how will Cédric cope?

Yet this short, sly novel is not about Canavero’s surgery so much as about the existential questions it raises. Emotions are physiological phenomena, interpreted by the mind. It follows that Cédric’s head, trapped “in a merciless battle … abandoned to this slow, living enterprise, to the invading hysteria of muscles and organs”, can’t possibly know how to read his new body. His life has, sure enough, been reduced to “a sort of crystalline, luminous, almost abstract dream”.

Cédric doesn’t forget who he is; he simply ceases to care, and adopts a creaturely attitude in which self hardly matters, and beings are born and die nameless. In his world, “There was no one, with the exception of a few chance encounters and sometimes some embraces. Did birds or rats worry about their social identity?”

There is something dated about Haddad’s book: an effect as curious as it is, I am sure, deliberate, with piquant hints of Ian Fleming in his use of glamorous European locations. It’s in its glancing, elliptical relationship to technology that Desirable Body takes its most curious backward step. Yet this elusive approach feels like a breath of fresh air after decades spent wading through big infrastructure-saturated fictions such as Don DeLillo’s Underworld and Richard Powers’s The Overstory. Haddad focuses succinctly on formal existential questions: questions for which there are no handy apps, and which can in no way be evaded by the application of ameliorating technology.

The besetting existential problem for the book, and, indeed, for poor Cédric himself, is pleasure. He discovers this with a vengeance when he once again (and at last) goes to bed with his girlfriend: “Getting used to this new body after so much time seems like an appropriation of a sexual kind, a disturbing usurpation, a rape almost.” Lorna’s excitement only adds to his confusion: “The last straw is the jealous impulse that overtakes him when he sees her writhing on top of him.”

French critics have received Desirable Body with due solemnity. Surely this was a mistake: Haddad’s nostalgic gestures are playful, not ponderous, and I don’t think we are required to take them too seriously. Following Cédric’s dismal post-operative sexual experience, the book changes gear from tragedy to farce; indeed, becomes laugh-out-loud funny as he finds himself king-for-a-day in a buffoonish and clockwork world where “no one is really loved because we constantly go to the wrong house or the wrong person with the same extraordinary obstinacy”.

Desirable Body is about more than one decapitated man’s unusual plight; it’s about how surprisingly little our choices have to do with our feelings and passions. A farce, then, and a sharp one: it’s funny to contemplate, but if you fell into its toils for a second, you’d die screaming in horror.

Religion is more than opium

A review for the Telegraph: A Sacred Space is Never Empty: A history of Soviet atheism by Victoria Smolkin

On 12 April 1961 Yuri Gagarin clambered into Vostok I, blasted into space, and became the first human being to orbit the earth.

“And suddenly I hear: Man is in space! My God! I stopped heating up the oven, sat next to the radio receiver, afraid to step away even for a minute.” This is the recollection of a 73-year-old woman from the Kuibyshev region, published in the state newspaper Izvestiia a little over a month later.

“We were always told that God is in the heavens, so how can a man fly there and not bump into Elijah the Prophet or one of God’s angels? What if God punishes him for his insolence?… Now I am convinced that God is Science, is Man.”

The opposition between religion and science set up in this letter is charmingly naive — as though a space capsule might shatter the crystal walls of heaven! But the official Soviet attitude to these matters was not much different. Lenin considered religion “merely a product and reflection of the economic yoke within society.” Religion was simply a vicious exploitation of uneducated people’s urge to superstition. As socialism developed, superstitions in general would disappear, and religions along with them.

The art theorist Aleksey Gan called the constructivist Moscow Planetarium, completed in the late 1920s, “a building in which religious services are held… until society grows to the level of a scientific understanding, and the instinctual need for spectacle comes up against the real phenomena of the world and technology.”

The assumption here — that religion evaporates as soon as people learn to behave rationally — was no less absurd at the time than it is now; how it survived as a political instinct over generations is going to take a lot of explanation. Victoria Smolkin, an associate professor at Wesleyan University, delivers, but with a certain flatness of style that can grate after a while.

By the time she reaches 1973, with 70 planetariums littering the urban landscape and an ideologically oblivious populace gearing itself for a religious revival, I found myself wishing that her gloves would come off.

For the people she is writing about, the stakes could not have been higher. What can be more important than the meaning of life? We are all going to die, after all, and everything we do in our little lives is going to be forgotten. Had we absolutely no convictions about the value of anything beyond our little lives, we would most likely stay in bed, starving and soiling ourselves. The severely depressed do exactly this, for they have grown pathologically realistic about their survival chances.

Cultures are engines of enchantment. They give us reasons to get up in the morning. They give us people, institutions, ideas, and even whole planes of magical reality to live for.

The 1917 Revolution’s great blow-hards were more than happy to accept this role for their revolutionary culture. “Let thousands of us die to resurrect millions of people all over the earth!” exclaims Rybin in Maxim Gorky’s 1906 novel Mother. “That’s what: dying’s easy for the sake of resurrection! If only the people rise!” And in his two-volume Religion and Socialism, the Commissar of Education Anatoly Lunacharsky prophesies the coming of a culture in which the masses will willingly “die for the common good… sacrificing to realise a state that starts not with his ‘I’ but with our ‘we’.”

Given the necessary freedom and funding, perhaps the early Soviet Union’s self-styled “God-builders” — Alexander Bogdanov, Leon Trotsky and the rest — might have cooked up a uniquely Soviet metaphysics, with rituals for birth, marriage and death that were worth the name. In suprematism, constructivism, cosmism, and all the other millenarian “-isms” floating about at the turn of the century, Russia had no shortage of fresh ingredients.

But Lenin’s lumpen anticlericals held the day. Bogdanov was sidelined and Trotsky was (literally) axed. Lenin looted the church. Stalin coopted the Orthodox faith to bolster patriotism in a time of war. Khrushchev criminalised all religions and most folk practices in the name of state unity. And none of them had a clue why the state’s materialism — even as it matured into a coherent philosophy — failed to replace the religion to which it was contrivedly opposed.

A homegrown sociology became possible under Leonid Brezhnev’s supposedly ossifying rule, and with it there developed something like a mature understanding of what religion actually was. These insights were painfully and often clumsily won — like Jack Skellington in Tim Burton’s The Nightmare Before Christmas, trying to understand the business of gift-giving by measuring all the boxes under the tree. And the understanding, when it came, came far too late.

The price of generations of off-again, on-again religious repression was not a confident secularism, or even a convulsive religious reaction. It was poison: ennui and cynicism and ideological indifference that proved impossible for the state to oppose because it had no structure, no leaders, no flag, no clergy or dogma.

In the end there was nothing left but to throw in the towel. Mikhail Gorbachev met with Patriarch Pimen on 29 April 1988, and embraced the millennium of Russian Christianity as a national celebration. Konstantin Kharchev, chair of the Council for Religious Affairs, commented: “The church has survived, and has not only survived, but has rejuvenated itself. And the question arises: which is more useful to the party — someone who believes in God, someone who believes in nothing at all, or someone who believes in both God and Communism? I think we should choose the lesser evil.” (234)

Cultures do not collapse from war or drought or earthquake. They fall apart because, as the archaeologist Joseph Tainter points out, they lose the point of themselves. Now that the heavy lifting of this volume is done, let us hope Smolkin takes a breath and describes the desolation wrought by the institutions she has researched in such detail. There’s a warning here that we, deep in our own contemporary disenchantment, should heed.