Normal fish and stubby dinosaurs

Reading Imagined Life by James Trefil and Michael Summers for New Scientist, 20 September 2019

“If you can imagine a world that is consistent with the laws of physics,” say physicist James Trefil and planetary scientist Michael Summers, “then there’s a good chance that it exists somewhere in our galaxy.”

The universe is dark, empty, and expanding, true. But the few parts of it that are populated by matter at all are full of planets. Embarrassingly so: interstellar space itself is littered with hard-to-spot rogue worlds, ejected early on in their solar systems’ histories, and these worlds may outnumber orbiting planets by a factor of two to one. (Not everyone agrees: some experts reckon rogues may outnumber orbital worlds 1000 to one. One of the reasons the little green men have yet to sail up to the White House is that they keep hitting space shoals.)

Can we conclude, then, that this cluttered galaxy is full of life? The surprising (and frustrating) truth is that we genuinely have no idea. And while Trefil and Summers are obviously primed to receive with open arms any visitors who happen by, they do a splendid job, in this, their second slim volume together, of explaining just how tentative and speculative our thoughts about exobiology actually are, and why.

Exoplanets came out in 2013; Imagined Life is a sort of sequel and is, if possible, even more accessible. In just 14 pages, the authors outline the physical laws constraining the universe. Then they rattle through the various ways we can define life, and why spotting life on distant worlds is so difficult (“For just about every molecule that we could identify [through spectroscopy] as a potential biomarker of life on an exoplanet, there is a nonbiological production mechanism.”). They list the most likely types of environment on which life may have evolved, from water worlds to Mega Earths (expect “normal fish… and stubby dinosaurs”), from tidally locked planets to wildly exotic (but by no means unlikely) superconducting rogues. And we haven’t even reached the meat of this tiny book yet – a tour, planet by imaginary planet, of the possibilities for life, intelligence, and civilisation in our and other galaxies.

Most strange worlds are far too strange for life, and the more one learns about chemistry, the more sober one’s speculations become. Water is common in the universe, and carbon not hard to find, which is just as well, given the relative uselessness of their nearest equivalents (benzene and silicon, say). The authors argue enthusiastically for the possibilities of life that’s “really not like us”, but they have a hard time making it stick. Carbon-based life is pretty various, of course, but even here there may be unexpected limits on what’s possible. Given that, out of 140 amino acids, only 22 have been recruited in nature, it may be that mechanisms of inheritance converge on a surprisingly narrow set of possibilities.

The trick to finding life in odd places, we discover, is to look not out, but in, and through. “Scientists are beginning to abandon the idea that life has to evolve and persist on the surface of planets,” the authors write, laying the groundwork for their description of an aquatic alien civilisation: a mission to the ocean surface “would be no stranger to them than a mission to Mars is to us.”

I’m not sure I buy the authors’ stock assumption that life most likely breeds intelligence, which most likely breeds technology. Nothing in biology, or in human history, suggests as much. Humans in their current iteration may be far odder than we imagine. But what the hell: Imagined Life reminds me of those books I grew up with, full of artists’ impressions of the teeming oceans of Venus. Only now, the science is better; the writing is better; and the possibilities, being more focused, are altogether more intoxicating.

The weather forecast: a triumph hiding in plain sight

Reading The Weather Machine by Andrew Blum (Bodley Head) for the Telegraph, 6 July 2019

Reading New York journalist Andrew Blum’s new book has cured me of a foppish and annoying habit. I no longer dangle an umbrella off my arm on sunny days, tripping up my fellow commuters before (inevitably) mislaying the bloody thing on the train to Coulsdon Town. Very late, and to my considerable embarrassment, I have discovered just how reliable the weather forecast is.

My thoroughly English prejudice against the dark art of weather prediction was already set by the time the European Centre for Medium-Range Weather Forecasts opened in Reading in 1979. Then the ECMWF claimed to be able to see three days into the future. Six years later, it could see five days ahead. It knew about Sandy, the deadliest hurricane of 2012, eight days ahead, and it expects to predict high-impact events a fortnight before they happen by the year 2025.

The ECMWF is a world leader, but it’s not an outlier. Look at the figures: weather forecasts have been getting consistently better for 40 straight years. Blum reckons this makes the current global complex of machines, systems, networks and acronyms (and there are lots of acronyms) “a high point of science and technology’s aspirations for society”.

He knows this is a minority view: “The weather machine is a wonder we treat as a banality,” he writes: “a tool that we haven’t yet learned to trust.” The Weather Machine is his attempt to convey the technical brilliance and political significance of an achievement that hides in plain sight.

The machine’s complexity alone is off all familiar charts, and sets Blum a significant challenge. “As a rocket scientist at the Jet Propulsion Laboratory put it to me… landing a spacecraft on Mars requires dealing with hundreds of variables,” he writes; “making a global atmospheric model requires hundreds of thousands.” Blum does an excellent job of describing how meteorological theory and observation were first stitched together, and why even today their relationship is a stormy one.

His story opens in heroic times, with Robert FitzRoy one of his more engaging heroes. FitzRoy is best remembered for captaining HMS Beagle and weathering the puppyish enthusiasm of a young Charles Darwin. But his real claim to fame is as a meteorologist. He dreamt up the term “forecast”, turned observations into predictions that saved sailors’ lives, and foresaw with clarity what a new generation of naval observers would look like. Distributed in space and capable of communicating instantaneously with each other, they would be “as if an eye in space looked down on the whole North Atlantic”.

You can’t produce an accurate forecast from observation alone, however. You also need a theory of how the weather works. The Norwegian physicist Vilhelm Bjerknes came up with the first mathematical model of the weather: a set of seven interlinked partial differential equations that handled the fact that the atmosphere is a far from ideal fluid. Sadly, Bjerknes’ model couldn’t yet predict anything — as he himself said, solutions to his equations “far exceed the means of today’s mathematical analysis”. As we see our models of the weather evolve, so we see works of individual genius replaced by systems of machine computation. In the observational realm, something similar happens: the heroic efforts of individual observers throw up trickles of insight that are soon subsumed in the torrent of data streaming from the orbiting artefacts of corporate and state engineering.

The American philosopher Timothy Morton dreamt up the term “hyperobject” to describe things that are too complex and numinous to describe in plain terms. Blum, whose earlier book was Tubes: Behind the Scenes at the Internet (2012), fancies his chances at explaining human-built hyperobjects in solid, clear terms, without recourse to metaphor and poesy. In this book, for example, he recognises the close affinity of military and meteorological infrastructures (the staple of many a modish book on the surveillance state), but resists any suggestion that they are the same system.

His sobriety is impressive, given how easy it is to get drunk on this stuff. In October 1946, technicians at the White Sands Proving Ground in New Mexico installed a camera in the nose cone of a captured V2; its launch yielded photographs of a quarter of the US — nearly a million square miles banded by clouds “stretching hundreds of miles in rows like streets”. This wasn’t the first time a bit of weather kit acted as an expendable test in a programme of weapons development, and it certainly wasn’t the last. Today’s global weather system has not only benefited from military advances in satellite positioning and remote sensing; it has made those systems possible. Blum allows that “we learned to see the whole earth thanks to the technology built to destroy the whole earth”. But he avoids paranoia.

Indeed, he is much more impressed by the way countries going at each other hammer and tongs on the political stage nevertheless collaborated closely and well on a global weather infrastructure. Point four of John F Kennedy’s famous 1961 speech on “Urgent National Needs” called for “a satellite system for worldwide weather observation”, and it wasn’t just militarily useful American satellites he had in mind for the task: in 1962 Harry Wexler of the U.S. Weather Bureau worked with his Soviet counterpart Viktor Bugaev on a report proposing a “World Weather Watch”, and by 1963 there was, Blum finds, “a conscious effort by scientists — on both sides of the Iron Curtain, in all corners of the earth — to design an integrated and coordinated apparatus” — this at a time when weather satellites were so expensive they could be justified only on national security grounds.

Blum’s book comes a little bit unstuck at the end. A final chapter that could easily have filled a third of the book is compressed into just a few pages’ handwaving and special pleading, as he conjures up a vision of a future in which the free and global nature of weather information has ceased to be a given and the weather machine, that “last bastion of international cooperation”, has become just one more atomised ghost of a future the colonial era once promised us.

Why end on such a minatory note? The answer, which is by no means obvious, is to be found in Reading. Today 22 nations pay for the ECMWF’s maintenance of a pair of Cray supercomputers. The fastest in the world, these machines must be upgraded every two years. In the US, meanwhile, weather observations rely primarily on the health of four geostationary satellites, procured at a cost of $11 billion. (America’s whole National Weather Service budget runs to only around $1 billion.)

Blum leaves open the question: how is an organisation built by nation-states, committed to open data and born of a global view, supposed to work in a world where information lives on private platforms and travels across private networks — a world in which billions of tiny temperature and barometric sensors, “in smartphones, home devices, attached to buildings, buses or airliners,” are aggregated by the likes of Google, IBM or Amazon?

One thing is disconcertingly clear: Blum’s weather machine, which in one sense is a marvel of continuing modernity, is also, truth be told, a dinosaur. It is ripe for disruption, of a sort that the world, grown so reliant on forecasting, could well do without.

“The English expedition of 1919 is to blame for this whole misery”

Four books to celebrate the centenary of Eddington’s 1919 eclipse observations. For The Spectator, 11 May 2019.

Einstein’s War: How relativity triumphed amid the vicious nationalism of World War I
Matthew Stanley
Dutton

Gravity’s Century: From Einstein’s eclipse to images of black holes
Ron Cowen
Harvard University Press

No Shadow of a Doubt
Daniel Kennefick
Princeton University Press

Einstein’s Wife: The real story of Mileva Einstein-Maric
Allen Esterson and David C Cassidy; contribution by Ruth Lewin Sime.
MIT Press

On 6 November 1919, at a joint meeting of the Royal Astronomical Society and the Royal Society, held at London’s Burlington House, the stars went all askew in the heavens.

That, anyway, was the rhetorical flourish with which the New York Times hailed the announcement of the results of a pair of astronomical expeditions conducted in 1919, after the Armistice but before the official end of the Great War. One expedition, led by Arthur Stanley Eddington, assistant to the Astronomer Royal, had repaired to the plantation island of Principe off the coast of West Africa; the other, led by Andrew Crommelin, who worked at the Royal Greenwich Observatory, headed to a racecourse in Brazil. Together, in the few minutes afforded by the 29 May solar eclipse, the teams used telescopes to photograph shifts in the apparent location of stars as the edge of the sun approached them.

The possibility that a heavy body like the sun might cause some distortion in the appearance of the star field was not particularly outlandish. Newton, who had assigned “corpuscles” of light some tiny mass, supposed that such a massive body might draw light in like a lens, though he imagined the effect was too slight to be observable.

The degree of distortion the Eddington expeditions hoped to observe was something else again: 1.75 arc-seconds, roughly the angle subtended by a coin a couple of miles away. It was a fine observation, but not an impossible one at the time. Only the theory of the German-born physicist Albert Einstein — respected well enough at home but little known to the Anglophone world — would explain such a (relatively) large distortion, and Eddington’s confirmation of his hypothesis brought the “famous German physician” (as the New York Times would have it) instant celebrity.

“The English expedition of 1919 is ultimately to blame for this whole misery, by which the general masses seized possession of me,” Einstein once remarked; but he was not so very sorry for the attention. Forget the usual image of Einstein the loveable old eccentric. Picture instead a forty-year-old who, when he steps into a room, literally causes women to faint. People wanted his opinions even about stupid things. And for years, if anyone said anything wise, within a few months their words were being attributed to Einstein.

“Why is it that no one understands me and everyone likes me?” Einstein wondered. His appeal lay in his supposed incomprehensibility. Charlie Chaplin understood: “They cheer me because they all understand me,” he remarked, accompanying the theoretical physicist to a film premiere, “and they cheer you because no one understands you.”

Several books serve to mark the centenary of the 1919 eclipse observations. Though their aims diverge, they all to some degree capture the likeness of Einstein the man, messy personal life and all, while rendering his physics a little bit more comprehensible to the rest of us. Each successfully negotiates the single besetting difficulty facing books of this sort, namely the way science lends itself to bad history.

Science uses its past as an object lesson, clearing all the human messiness away to leave the ideas standing. History, on the other hand, factors in as much human messiness as possible to show how the business of science is as contingent and dramatic as any other human activity.

In human matters, some ambiguity over causes and effects is welcome. There are two sides to every story, and so on and so forth: any less nuanced approach seems suspiciously moralistic. One need only look at the way various commentators have interpreted Einstein’s relationship with his first wife.

Einstein was, by the end of their failing marriage, notoriously horrible to Mileva Einstein-Maric; this in spite of their great personal and intellectual closeness as first-year physics students at the Federal Swiss Polytechnic. Einstein once reassured Elsa Lowenthal, his cousin and second-wife-to-be, that “I treat my wife as an employee I can not fire.” (Why Elsa, reading that, didn’t run a mile is not recorded.)

Albert was a bad husband. His wife was a mathematician. Therefore Albert stole his theory of special relativity from Mileva. This shibboleth, bandied about since the 1970s, is a sort of evil twin of Whig history, distorted by teleology, anachronism and present-mindedness. It does no one any favours. The three separately authored parts of Einstein’s Wife: The real story of Mileva Einstein-Maric unpick the myth of Mileva’s influence over Albert, while increasing, rather than diminishing, our interest in and admiration of the woman herself. It’s a hard job to do well, without preciousness or special pleading, especially in today’s resentment-ridden and over-sensitive political climate, and the book is an impressive, compassionate accomplishment.
Matthew Stanley’s Einstein’s War, on the other hand, tips ever so slightly in the other direction, towards the simplistic and the didactic. His intentions, however, are benign — he is here to praise Einstein and Eddington and their fellows, not bury them — and his slightly on-the-nose style is ultimately mandated by the sheer scale of what he is trying to do, for he succeeds in wrapping the global, national and scientific politics of an era up in a compelling story of one man’s wild theory, lucidly sketched, and its experimental confirmation in the unlikeliest and most exotic circumstances.

The world science studies is truly a blooming, buzzing confusion. It is not in the least bit causal, in the ordinary human sense. Far from there being a paucity of good stories in science, there are a limitless number of perfectly valid, perfectly accurate, perfectly true stories, all describing the same phenomenon from different points of view.

Understanding the stories abroad in the physical sciences at the fin de siècle, seeing which ones Einstein adopted, why he adopted them, and why, in some cases, he swapped them for others, certainly doesn’t make his theorising easy. But it does give us a gut sense of why he was so baffled by the public’s response to his work. The moment we are able to put him in the context of co-workers, peers and friends, we see that Einstein was perfecting classical physics, not overthrowing it, and that his supposedly peculiar theory of relativity – as the man said himself – “harmonizes with every possible outlook of philosophy and does not interfere with being an idealist or materialist, pragmatist or whatever else one likes.”

In science, we need simplification. We welcome a didactic account. Choices must be made, and held to. Gravity’s Century by the science writer Ron Cowen is the most condensed of the books mentioned here; it frequently runs right up to the limit of how far complex ideas can be compressed without slipping into unavoidable falsehood. I reckon I spotted a couple of questionable interpretations. But these were so minor as to be hardly more than matters of taste, when set against Cowen’s overall achievement. This is as good a short introduction to Einstein’s thought as one could wish for. It even contrives to discuss confirmatory experiments and observations whose final results were only announced as I was writing this piece.

No Shadow of a Doubt is more ponderous, but for good reason: the author Daniel Kennefick, an astrophysicist and historian of science, is out to defend the astronomer Eddington against criticisms more serious, more detailed, and framed more conscientiously, than any thrown at that cad Einstein.

Eddington was an English pacifist and internationalist who made no bones about wanting his eclipse observations to champion the theories of a German-born physicist, even as jingoism reached its crescendo on both sides of the Great War. Given the sheer bloody difficulty of the observations themselves, and considering the political inflection given them by the man orchestrating the work, are Eddington’s results to be trusted?

Kennefick is adamant that they are, modern naysayers to the contrary, and in conclusion to his always insightful biography, he says something interesting about the way historians, and especially historians of science, tend to underestimate the past. “Scientists regard continuous improvement in measurement as a hallmark of science that is unremarkable except where it is absent,” he observes. “If it is absent, it tells us nothing except that someone involved has behaved in a way that is unscientific or incompetent, or both.” But, Kennefick observes, such improvement is only possible with practice — and eclipses come round too infrequently for practice to make much difference. Contemporary attempts to recreate Eddington’s observations face the exact same challenges Eddington did, and “it seems, as one might expect, that the teams who took and handled the data knew best after all.”

It was Einstein’s peculiar fate that his reputation for intellectual and personal weirdness has concealed the architectural elegance of his work. Higher-order explanations of general relativity have become clichés of science fiction. The way massive bodies bend spacetime like a rubber sheet is an image that saturates elementary science classes, to the point of tedium.

Einstein hated those rubber-sheet metaphors for a different reason. “Since the mathematicians pounced on the relativity theory,” he complained, “I no longer understand it myself.” We play about with thoughts of bouncy sheets. Einstein had to understand their behaviours mathematically in four dimensions (three of space and one of time), crunching equations so radically non-linear, their results would change the value of the numbers originally put into them in feedback loops that drove the man out of his mind. “Never in my life have I tormented myself anything like this,” he moaned.

For the rest of us, however, a little prophylactic exposure to Einstein’s actual work pays huge dividends. It sweeps some of the weirdness away and reveals Einstein’s actual achievement: theories that set all the forces above the atomic scale dancing with an elegance Isaac Newton, founding father of classical physics, would have half-recognised, and wholly admired.


A series of apparently impossible events

Exploring Smoke and Mirrors at Wellcome Collection for New Scientist, 1 May 2019

ACCORDING to John Nevil Maskelyne, “a bad conjurer will make a good medium any day”. He meant that, as a stage magician in 19th-century London, he had to produce successful effects night after night, while rivals who claimed their illusions were powered by the spirit world could simply blame a bad set on “unhelpful spirits”, or even on the audience’s own scepticism.

A gaffe-ridden performance in the UK by one set of spiritualists, the US Davenport Brothers, drove Maskelyne to invent his own act. With his friend, the cabinet maker George Alfred Cooke, he created an “anti-spiritualist” entertainment, at once replicating and debunking the spiritualist movement’s stock-in-trade effects.

Matthew Tompkins teases out the historical implications of Maskelyne’s story in The Spectacle of Illusion: Magic, the paranormal and the complicity of the mind (Thames & Hudson). It is a lavishly illustrated history to accompany Smoke and Mirrors, a new and intriguing exhibition at the Wellcome Collection in London.

Historical accident was partly responsible for spiritualism’s grip on the public imagination. In 1895, Guglielmo Marconi sent long-wave radio signals over a distance of a couple of kilometres, and, for decades after, hardly a year passed in which some researcher didn’t announce a new type of invisible ray. The world turned out to have aspects hidden from unaided human perception. Was it so unreasonable of people to speculate about what, or who, might lurk in those hidden corners of reality? Was it so gullible of them, reeling as they were from the mass killings of the first world war, to populate these invisible realms with their dead?

In 1924, the magazine Scientific American offered $2500 to any medium who could demonstrate their powers under scientific controls. The medium Mina “Margery” Crandon decided to try her hand, but she reckoned without the efforts of one Harry “Handcuff” Houdini, who eventually exposed her as a fraud.

Yet spiritualism persisted, shading off into parapsychology, quantum speculation and any number of cults. Understanding why is more the purview of a psychologist such as Gustav Kuhn, who, as well as being a major contributor to the show, offers insight into magic and magical belief in his own new book, Experiencing the Impossible (MIT Press).

Kuhn, a member of the Magic Circle, finds Maskelyne’s “anti-spiritualist” form of stage magic alive in the hands of illusionist Derren Brown. He suggests that Brown is more of a traditional magician than he lets on, dismissing the occult while he endorses mysterious psychological phenomena, mostly to do with “subconscious priming”, that, at root, are non-scientific.

Kuhn defines magic as “the experience of wonder that results from perceiving an apparently impossible event”. Definitions of what is impossible differ, and different illusions work for different people. You can even design illusions for animals, as a torrent of YouTube videos, based largely on Finnish magician Jose Ahonen’s “Magic for Dogs”, attests.

Tricking dogs is one thing, but why do our minds fall for magic? It was the 18th-century Scottish Enlightenment philosopher, David Hume, who argued that there is no metaphysical glue binding events, and that we only ever infer causal relationships, be they real or illusory.

Twinned with our susceptibility to wrongly infer relationships between events in the world is our ability to fool ourselves at an even deeper level. Numerous studies, including one by researcher and former magician Jay Olson and clinician Amir Raz which sits at the exit to the Wellcome show, conclude that our feeling of free will may be an essential trick of the mind.

Inferring connections makes us confident in ourselves and our abilities, and it is this confidence, this necessary delusion about the brilliance of our cognitive abilities, that lets us function… and be tricked. Even after reading both books, I defy you to see through the illusions and wonders in store at the exhibition.

And so we wait

Thinking about Delayed Response by Jason Farman (Yale) for the Telegraph, 6 February 2019.

In producer Jon Favreau’s career-making 1996 comedy film Swingers, Favreau himself plays Mike, a young man in love with love, and at war with the answerphones of the world.

“Hi,” says one young woman’s machine, “this is Nikki. Leave a message,” prompting Mike to work, flub after flub, through an entire, entirely fictitious, relationship with the absent Nikki.

“This just isn’t working out,” he sighs, on about his twentieth attempt to leave a message that’s neither creepy nor incoherent. “I — I think you’re great, but, uh, I — I… Maybe we should just take some time off from each other. It’s not you, it’s me. It’s what I’m going through.”

There are a couple of lessons in this scene, and once they’re learned, there’ll be no pressing need for you to read Jason Farman’s Delayed Response. (I think you’d enjoy reading him, quite a bit, but, in the spirit of this project, let those reasons wait till the end.)

First lesson of two: “non-verbal communication never stops; non-verbal cues are always being produced whether we want them to be or not.” Those in the know may recognise here Farman’s salute to Edward T. Hall’s book The Silent Language (1959), for which Delayed Response is a useful foil. But the point — that any delay between transmission and reception is part of the message — is no mere intellectual nicety. Anyone who has had a love affair degenerate into an exchange of ever more flippant WhatsApp messages, or has waited for a prospective employer to get back to them about a job application, knows that silent time carries meaning.

Second lesson: delay can be used to manifest power. In Swingers, Mike crashes into what an elusive novelist friend of mine dubs, with gleeful malevolence, “the power of absence,” which is more or less the same power my teenage daughter wields when she “ghosts” some boy. In the words of the French sociologist Pierre Bourdieu, “Waiting is one of the privileged ways of experiencing the effect of power, and the link between time and power.” We’re none of us immune; we’re all in thrall to what Farman calls the “waiting economy”, and as our civics crumble (don’t pretend you haven’t noticed) the hucksters driving that economy get more and more brazen. (Consider, as an example, the growing discrepancy in UK delivery times between public and private postal services.)

Delays carry meanings. We cannot control them with any finesse; but we can use them as blunt weapons on each other.

What’s left for Farman to say?

Farman’s account of wait times is certainly exhaustive, running the full gamut of history, from contemporary Japanese smartphone messaging apps to Aboriginal message sticks that were being employed up to 50,000 years ago. (To give you some idea how venerable that communication system is, consider that papyrus dates from around 2900 BC.) He spans online wait times both so short as to be barely perceptible, and delays so long that they may be used to calculate the distance between planets. His examples are sometimes otherworldly (literally so in the case of the New Horizons mission to Pluto), sometimes unnervingly prosaic: he recounts the Battle of Fredericksburg in the American Civil War as a piling up of familiar and ordinary delays, conjuring up a picture of war-as-bureaucracy that is truly mortifying.

Farman acknowledges how much more quickly we send and receive messages these days — but his is no paean to technological progress. The dismal fact is: the instantaneous provision of information degrades our ability to interpret it. As long ago as 1966 the neurobiologist James L McGaugh reported that as the time increases between learning and testing, memory retention actually improves. And yet the purveyors of new media continue to equate speed with wisdom, promising that ever-better worlds will emerge from ever-more-efficient media. Facebook’s Mark Zuckerberg took this to an extreme typical of him in April 2017 when he announced that “We’re building further out beyond augmented reality, and that includes work around direct brain interfaces that one day will let you communicate using only your mind, although that stuff is pretty far out.”

This kind of hucksterism is harmless in itself, but it doesn’t come out of nothing. The thing we should be really afraid of is the creeping bureaucratisation of human experience. I remember, three years before Zuckerberg slugged back his own Kool-Aid, I sat listening to UCL neuroscientist Nilli Lavie lecturing about attention. Lavie was clearly a person of good will and good sense, but what exactly did she mean by her claim that wandering attention loses the US economy around two billion dollars a year? Were our minds to be perfectly focused, all year round, would that usher in some sort of actuarial New Jerusalem? Or would it merely extinguish all dreaming? Without a space for minds to wander in, where would a new idea – any new idea – actually come from?

This, of course, is the political flim-flam implicit in crisis thinking. So long as we are occupied with urgent problems, we are unable to articulate nuanced and far-reaching political ideas. “Waiting, ultimately, is essential for imagining that which does not yet exist and innovating on the knowledge we encounter,” Farman writes, to which I’m inclined to add the obvious point: Progressives are shrill and failing because their chosen media — Twitter and the like — deprive them of any register other than crisis-triggered outrage.

We may dream up a dystopia in which the populace is narcotised into bovine contentment by the instantaneous supply of undigested information, but as Farman makes clear, this isn’t going to happen. The anxiety generated by delay doesn’t disappear with quicker response times, it simply gets redistributed and reshaped. People under 34 years of age check their phones an average of 150 times a day, a burden entirely alien to soldiers waiting for Death and the postman at Fredericksburg. Farman writes: “Though the mythologies of the digital age continue to argue that we are eliminating waiting from daily life, we are actually putting it right at the centre of how we connect with one another.”

This has a major, though rarely articulated, consequence for us: that anxiety balloons to fill the vacuum left by a vanished emotion: one we once considered pleasurable and positive. I refer, of course, to anticipation. Anticipation is no longer something we relish. This is because, in our world of immediate satisfactions, we’re simply not getting enough exposure to it. Waiting has ceased to be the way we measure projected pleasure. Now it’s merely an index of our frustration.

Farman is very good on this subject, and this is why Delayed Response is worth reading. (There: I told you we’d get around to this.) The book’s longueurs, and Farman’s persnickety academical style, pale beside his main point, very well expressed, that “the meaning of life isn’t deferred until that thing we hope for arrives; instead, in the moment of waiting, meaning is located in our ability to recognise the ways that such hopes define us.”

Whose head is it anyway?

Reading Hubert Haddad’s novel Desirable Body for the Guardian, 22 December 2018

English speakers have only two or three translations from the French by which to judge the sometimes dreamy, sometimes nightmarish output of Tunisian poet and novelist Hubert Haddad. He began writing long prose in the 1970s and has been turning out a novel a year, more or less, since the turn of the century.

First published as Corps désirable in 2015, this novel sews a real-life maverick neurosurgeon, Sergio Canavero, into a narrative that coincides with the bicentenary of the first ever neurosurgical horror story, Mary Shelley’s Frankenstein.

Cédric Allyn-Weberson, scion of a big pharma plutocrat, has set sail for the coast of Paros with his war correspondent girlfriend Lorna Leer, on a yacht called Evasion. A horrible accident crushes his spine but leaves his head intact. Funded by Cédric’s estranged father Morice, Canavero sets about transplanting Cédric’s head on to a donor body. Assuming the operation succeeds, how will Cédric cope?

Yet this short, sly novel is not about Canavero’s surgery so much as about the existential questions it raises. Emotions are physiological phenomena, interpreted by the mind. It follows that Cédric’s head, trapped “in a merciless battle … abandoned to this slow, living enterprise, to the invading hysteria of muscles and organs”, can’t possibly know how to read his new body. His life has, sure enough, been reduced to “a sort of crystalline, luminous, almost abstract dream”.

Cédric doesn’t forget who he is; he simply ceases to care, and adopts a creaturely attitude in which self hardly matters, and beings are born and die nameless. In his world, “There was no one, with the exception of a few chance encounters and sometimes some embraces. Did birds or rats worry about their social identity?”

There is something dated about Haddad’s book: an effect as curious as it is, I am sure, deliberate, with piquant hints of Ian Fleming in his use of glamorous European locations. It’s in its glancing, elliptical relationship to technology that Desirable Body takes its most curious backward step. Yet this elusive approach feels like a breath of fresh air after decades spent wading through big infrastructure-saturated fictions such as Don DeLillo’s Underworld and Richard Powers’s The Overstory. Haddad focuses succinctly on formal existential questions: questions for which there are no handy apps, and which can in no way be evaded by the application of ameliorating technology.

The besetting existential problem for the book, and, indeed, for poor Cédric himself, is pleasure. He discovers this with a vengeance when he once again (and at last) goes to bed with his girlfriend: “Getting used to this new body after so much time seems like an appropriation of a sexual kind, a disturbing usurpation, a rape almost.” Lorna’s excitement only adds to his confusion: “The last straw is the jealous impulse that overtakes him when he sees her writhing on top of him.”

French critics have received Desirable Body with due solemnity. Surely this was a mistake: Haddad’s nostalgic gestures are playful, not ponderous, and I don’t think we are required to take them too seriously. Following Cédric’s dismal post-operative sexual experience, the book changes gear from tragedy to farce; indeed, becomes laugh-out-loud funny as he finds himself king-for-a-day in a buffoonish and clockwork world where “no one is really loved because we constantly go to the wrong house or the wrong person with the same extraordinary obstinacy”.

Desirable Body is about more than one decapitated man’s unusual plight; it’s about how surprisingly little our choices have to do with our feelings and passions. A farce, then, and a sharp one: it’s funny to contemplate, but if you fell into its toils for a second, you’d die screaming in horror.

Prudery isn’t justice

Reading Objection: Disgust, morality, and the law by Debra Lieberman and Carlton Patrick for New Scientist, 15 September 2018

We want the law to be fair and objective. We also want laws that work in the real world, protecting and reassuring us, and maintaining our social and cultural values.

The moral dilemma is that we can’t have both. This may be because humans are hopelessly irrational and need a rational legal system to keep them in check. But it may also be that rationality has limits; trying to sit in judgement over everything is as cruel and farcical as gathering cats in a sack.

This dilemma is down to disgust, say Debra Lieberman, a psychologist at the University of Miami, and Carlton Patrick, a legal scholar at the University of Central Florida. In Objection, they join forces to consider why we find some acts disgusting without being reprehensible (like nose-picking), while others seem reprehensible without being disgusting (like drunk driving).

Disgust is such a powerful intuitive guide that it has informed our morality and hence our legal system. But it maps badly over a jurisprudence built on notions of harm and culpability.

Worse, terms of disgust are frequently wielded against people we intend to marginalise, making disgust a dangerously fissile element in our moral armoury.

Can science help us manage it? The prognosis is not good. If you were to ask a cultural anthropologist, a psychologist, a neuroscientist, a behavioural economist and a sociologist to explain disgust, you would receive different, often mutually contradictory, opinions.

The authors make their own job much more difficult, however, by endorsing a surreally naive model of the mind – one in which “both ’emotion’ and ‘cognition’ require circuitry” and it is possible to increase a child’s devotion to family by somehow manipulating this “circuitry”.

From here, the reader is ushered into the lollipop van of evolutionary psychology, where “disgust is best understood as a type of software program instantiated in our neural hardware”, which “evolved originally to guide our ancestors when making decisions about what to eat”.

The idea that disgust is to some degree taught and learned, conditioned by culture, class and contingency, is not something easily explored using the authors’ over-rigid model of the mind. Whenever they lay this model aside, however, they handle ambiguity well.

Their review of the literature on disgust is cogent and fair. They point out that although the decriminalisation of homosexuality and gay marriage argues persuasively for legal rationalism, there are other acts – like the violation of corpses – that we condemn without a strictly rational basis (the corpse isn’t complaining). This plays to the views of bioethicist Leon Kass, who calls disgust “the only voice left that speaks up to defend the central core of our humanity”.

Objection explores an ethical territory that sends legal purists sprawling. The authors emerge from this interzone battered, but essentially unbowed.

A History of Silence reviewed: Unlocking the world of infinitely small noises

Reading Alain Corbin’s A History of Silence (Polity Press) for The Telegraph, 3 September 2018

The Orientalist painter Eugene Fromentin adored the silence of the Sahara. “Far from oppressing you,” he wrote to a friend, “it inclines you to light thoughts.” People assume that silence, being an absence of noise, is the auditory equivalent of darkness. Fromentin was having none of it: “If I may compare aural sensations to those of sight, then silence that reigns over vast spaces is more a sort of aerial transparency, which gives greater clarity to the perceptions, unlocks the world of infinitely small noises, and reveals a range of inexpressible delights.” (26-27)

Silence invites clarity. Norwegian explorer and publisher Erling Kagge seized on this nostrum for his international bestseller Silence in the Age of Noise, published in Norway in 2016, the same year Alain Corbin’s A History of Silence was published in France.

People forget this, but Kagge’s short, smart airport read was more tough-minded than the fad it fed. In fact, Kagge’s crowd-pleasing Silence and Corbin’s erudite History make surprisingly good travelling companions.

For instance: while Corbin was combing through Fromentin’s Un été dans le Sahara of 1856, Kagge was talking to his friend the artist Marina Abramovic, whose experience of desert silence was anything but positive: “Despite the fact that everything was completely quiet around her, her head was flooded with disconnected thoughts… It seemed like an empty emptiness, while the goal is to experience a full emptiness, she says.” (115)

Abramovic’s trouble, Kagge tells us, was that she couldn’t stop thinking. She wanted a moment of Fromentinesque clarity, but her past and her future proved insuperable obstacles: the moment she tore her mind away from one, she found herself ruminating over the other.

It’s a common complaint, according to Kagge: “The present hurts, and our response is to look ceaselessly for fresh purposes that draw our attention outwards, away from ourselves.” (37)

Why should this be? The answer is explored in Corbin’s book, which is one of those cultural histories that come very close to being a virtually egoless compendium of quotations. Books of this sort can be a terrible mess but Corbin’s architecture, on the contrary, is as stable as it is artfully concealed. This is a temple: not a pile.

The present, properly attended to, alone and in silence, reveals time’s awful scale. When we think about the past or the future, what we’re actually doing is telling ourselves stories. It’s in the present moment, if we dare attend to it, that we glimpse the Void.

Jules Michelet, in his book La Montagne (1872), recognised that the great forces shaping human destiny are so vast as to be silent. The process of erosion, for example, “is the more successfully accomplished in silence, to reveal, one morning, a desert of hideous nakedness, where nothing shall ever again revive.” The equivalent creative forces are hardly less awful: the “silent toil of the innumerable polyps” of a coral reef, for example, creating “the future Earth; on whose surface, perhaps, Man shall hereafter reside”.

No wonder so many of us believe in God, when sitting alone in a quiet room for ten minutes confronts us with eternity. The Spanish Basque theologian Ignatius Loyola used to spend seven solid hours a day in silent prayer and came to the only possible conclusion: silence “cancels all rational and discursive activity, thereby enabling direct perception of the divine word.”

“God bestows, God trains, God accomplishes his work, and this can only be done in the silence that is established between the Creator and the creature.” (42-43)

Obvious as this conclusion may be, though, it could still be wrong. Even the prophet Isaiah complained: “Verily thou art a God that hidest thyself”.

What if God’s not there? What if sound and fury were our only bulwark against the sucking vacuum of our own meaninglessness? It’s not only the empty vessels that make the most noise: “Men with subtle minds who are quick to see every side of a question find it hard to refrain from expressing what they think,” said Eugene Delacroix. Anyway, “how is it possible to resist giving a favourable idea of one’s mind to a man who seems surprised and pleased to hear what one is saying?”

There’s a vibrancy in tumult, a civic value in conversation, and silence is by no means always golden — a subject Corbin explores ably and at length, carrying us satisfyingly far from our 2016-vintage hygge-filled comfort zone.

Silence suggests self-control, but self-control can itself be a response to oppression. Advice to courtiers to remain silent, or at least ensure that their words are more valuable than their silence, may at worst suggest a society where there is no danger in keeping quiet, but plenty of danger in speaking.

This has certainly been the case historically among peasants, immersed as they are in a style of life that has elements of real madness about it: an undercurrent of constant hate and bitterness expressed in feuding, bullying, bickering and family quarrels, the petty mentality, the self-deprecation, the superstition, the obsessive control of daily life by a strict authoritarianism. If you don’t believe me, read Zola. His peasant silences are “first and foremost a tactic” in a milieu where plans, ambitious or tragic, “were slow to come to fruition, which meant that it was essential not to show your hand.” (93)

Corbin’s history manages to anatomise silence without fetishising it. He leaves you with mixed feelings: a neat trick to pull, in a society that’s convinced it’s drowning in noise.

It’s not true. In Paris, Corbin reminds us, forges once operated on the ground floors of buildings throughout the city. Bells of churches, convents, schools and colleges only added to the cacophony. Carriages made the level of street noise even more deafening. (61) I was reminded, reading this, of how the Victorian computer pioneer Charles Babbage died, at the age of 79, at his home in Marylebone. Never mind the urinary tract infection: his last hours were spent being driven to distraction by itinerant hurdy-gurdy players pan-handling outside his window.

If anything, contemporary society suffers from too much control of its auditory environment. Passengers travelling in trains and trams observe each other in silence; a behaviour that was once considered damn rude. Pedestrians no longer like to be greeted. Among the many silences we may welcome, comes one that we surely must deplore: the silence that accompanies the diminution of our civic life.

Gendun Chopel: Putting the Kama Sutra in the shade

Reviewing a new verse translation of Gendun Chopel’s Treatise on Passion for The Spectator, 9 June 2018.

The Tibetan artist and poet Gendun Chopel was born in 1903. He was identified as an incarnate lama, and ordained as a Buddhist monk. In 1934 he renounced his vows, quit Tibet for India, learned Sanskrit and — if his long poem, usually translated as A Treatise on Passion, is to be taken at face value — copulated with every woman who let him.

Twelve years later he returned to Tibet, and was thrown into prison on trumped-up charges. The experience broke him. He died of cirrhosis in 1951, as troops of China’s People’s Liberation Army were marching through the streets of Lhasa.

Chopel’s reputation as the most important Tibetan writer of the 20th century is secure, mostly through his travelogue, Grains of Gold. The Passion Book is very different; it is Chopel’s reply to the kamasastra, a classical genre of Sanskrit erotica best known to us through one rather tame work, the Kama Sutra.

If Chopel had wanted to show off to his peers back home he could simply have translated the Kama Sutra —but where would have been the fun in that? The former monk spent four years researching and writing his own spectacularly explicit work of Tibetan kamasastra.

It is impossible not to like Chopel — ‘A monastic friend undoing his way of life,/ A narrow-minded poser losing his facade’ — if only for the sincerity of his effort. At one point he tries to get the skinny on female masturbation: ‘Other than scornful laughs and being hit with fists/ I could not find even one who would give an honest answer.’

Still, he gets it: ‘Since naked flesh and sinew are different,’ he warns his (literate, therefore male) readership, ‘How can a thorn sense what the wound feels?’

Thus, like touching an open wound,
The pleasure and pain of women is intense

Chopel insists that women’s and men’s experiences of sex differ, and that women are not mere sources of male pleasure but full partners in the play of passion. So far, so safe. But let’s not be too quick to tidy up Chopel’s long, dizzying, delirious mess of a poem, which jumps from folk wisdom about how to predict a woman’s future by studying the moles on her face, to a jeremiad against the hypocrisy of the rich and powerful, to evocations of tantric states, to the sexual preferences of women in various regions of India, to sexual positions, to full-blown sexual delirium:

They copulate squatting and they copulate standing;
Intertwined, with head and foot reversed, they copulate.
Hanging the woman in the air
With a rope of silk they copulate.

Chopel’s translators, Donald S Lopez Jr. and Thupten Jinpa — Tibetan and Buddhist scholars, who a few years ago also translated Grains of Gold — have appended a long afterword which goes some way to revealing what is going on here. In the Christian faith, sexual intercourse may lead to hell. The early tradition of Buddhism took a different position: sex is fine, so far as it goes; it’s everything that follows — marriage, home, property, domestic contentment, the pram in the hall — that paves the road to perdition.

This is what inspires Buddhism’s tradition of astounding misogyny. Something has got to stop you from having sex with your own wife — and a famous Mahayana sutra has the solution. Think of her as a demon. An ogre. A hag. As sickness, old age, or death:

As a huge wolf, a huge sea monster, and a huge cat; a black snake, a crocodile, and a demon that causes epilepsy; and as swollen, shrivelled and diseased.

The rise of the tantric tradition altered sexual attitudes to the extent that one was now actually obliged to have intercourse if one ever hoped to achieve buddhahood. But the ideal tantric playmate — a girl of 16 or younger, and ideally low-caste — was still no more than a tool for the enlightenment of an elite male.

Chopel, coming late to the ordinary delights and comforts of sex, was having none of it. Lopez and Jinpa speculate entertainingly about where Chopel sits in the pantheon of such early sexologists as Ellis, Freud and Reich. For sure, he was a believer in sexual liberation: ‘When suitable deeds are prohibited in public,’ he asserts, ‘Unsuitable deeds will be done in private.’

Jeffrey Hopkins translated Chopel’s A Treatise on Passion into prose in 1992 as Tibetan Arts of Love. This is the first effort in verse, and though it is a clear scholarly advance, the translators have struggled to render the carefully metered original into lines of even roughly the same number of syllables. You can understand their bind: even in the original Tibetan, there’s still no critical edition. With so much basic scholarship to be done, it would have been pointless if they had simply jazz-handed their way through a loose transliteration.

Their effort captures Chopel’s charm, and that’s the main thing. As Chopel said of the act itself: ‘It may not be a virtue, but how could it be a sin?’

Elements of surprise

Reading Vera Tobin’s Elements of Surprise for New Scientist, 5 May 2018

How do characters and events in fiction differ from those in real life? And what is it about our experience of life that fiction exaggerates, omits or captures to achieve its effects?

Effective fiction is Vera Tobin’s subject. And as a cognitive scientist, she knows how pervasive and seductive it can be, even in – or perhaps especially in – the controlled environment of an experimental psychology lab.

Suppose, for instance, you want to know which parts of the brain are active when forming moral judgements, or reasoning about false beliefs. These fields and others rest on fMRI brain scans. Volunteers receive short story prompts with information about outcomes or character intentions and, while their brains are scanned, have to judge what other characters ought to know or do.

“As a consequence,” writes Tobin in her new book Elements of Surprise, “much research that is putatively about how people think about other humans… tells us just as much, if not more, about how study participants think about characters in constructed narratives.”

Tobin is weary of economists banging on about the “flaws” in our cognitive apparatus. “The science on this phenomenon has tended to focus on cataloguing errors people make in solving problems or making decisions,” writes Tobin, “but… its place and status in storytelling, sense-making, and aesthetic pleasure deserve much more attention.”

Tobin shows how two major “flaws” in our thinking are in fact the necessary and desirable consequence of our capacity for social interaction. First, we wildly underestimate our differences. We model each other in our heads and have to assume this model is accurate, even while we’re revising it, moment to moment. At the same time, we have to assume no one else has any problem performing this task – which is why we’re continually mortified to discover other people have no idea who we really are.

Similarly, we find it hard to model the mental states of people, including our past selves, who know less about something than we do. This is largely because we forget how we came to that privileged knowledge.

There are implications for autism, too. It is, Tobin says, unlikely that many people with autism “lack” an understanding that others think differently – known as “theory of mind”. It is more likely they have difficulty inhibiting their knowledge when modelling others’ mental states.

And what about Emma, titular heroine of Jane Austen’s novel? She “is all too ready to presume that her intentions are unambiguous to others and has great difficulty imagining, once she has arrived at an interpretation of events, that others might believe something different”, says Tobin. Austen’s brilliance was to fashion a plot in which Emma experiences revelations that confront the consequences of her “cursed thinking” – a cognitive bias making us assume any person with whom we communicate has the background knowledge to understand what is being said.

Just as we assume others know what we’re thinking, we assume our past selves thought as we do now. Detective stories exploit this foible. Mildred Pierce, Michael Curtiz’s 1945 film, begins at the end, as it were, depicting the story’s climactic murder. We are fairly certain we know who did it, but we flash back to the past and work forward to the present only to find that we have misinterpreted everything.

I confess I was underwhelmed on finishing this excellent book. But then I remembered Sherlock Holmes’s complaint (mentioned by Tobin) that once he reveals the reasoning behind his deductions, people are no longer impressed by his singular skill. Tobin reveals valuable truths about the stories we tell to entertain each other, and those we tell ourselves to get by, and how they are related. Like any good magic trick, it is obvious once it has been explained.