Joy in the detail

Reading Charles Darwin’s Barnacle and David Bowie’s Spider by Stephen Heard for the Spectator, 16 May 2020

Heteropoda davidbowie is a species of huntsman spider. Though rare, it has been found in parts of Malaysia, Singapore, Indonesia and possibly Thailand. (The uncertainty arises because it’s often mistaken for a similar-looking species, the Heteropoda javana.) In 2008 a German collector sent photos of his unusual-looking “pet” to Peter Jäger, an arachnologist at the Senckenberg Research Institute in Frankfurt. Consequently, and in common with most other living finds, David Bowie’s spider was discovered twice: once in the field, and once in the collection.

Bowie’s spider is famous, but not exceptional. Jäger has discovered more than 200 species of spider in the last decade, and names them after politicians, comedians and rock stars to highlight our ecological plight. Other researchers find more pointed ways to further the same cause. In the first month of Donald Trump’s administration, Iranian-Canadian entomologist Vazrick Nazari discovered a moth with a head crowned with large, blond comb-over scales. There’s more to Neopalpa donaldtrumpi than a striking physical resemblance: it lives in a federally protected area around where the border wall with Mexico is supposed to go. Cue headlines.

Species are becoming extinct 100 times faster than they did before modern humans arrived. This makes reading a book about the naming of species a curiously queasy affair. Nor is there much comfort to be had in evolutionary ecologist Stephen Heard’s observation that, having described 1.5 million species, we’ve (at very best) only recorded half of what’s out there. There is, you may recall, that devastating passage in Cormac McCarthy’s western novel Blood Meridian in which Judge Holden meticulously records a Native American artifact in his sketchbook — then destroys it. Given that to discover a species you must, by definition, invade its environment, Holden’s sketch-and-burn habit appears to be a painfully accurate metonym for what the human species is doing to the planet. Since the 1970s (when there used to be twice as many wild animals as there are now) we’ve been discovering and endangering new species in almost the same breath.

Richard Spruce, one of the Victorian era’s great botanical explorers, who spent 15 years exploring the Amazon from the Andes to its mouth, is a star of this short, charming book about how we have named and ordered the living world. No detail of his bravery, resilience and grace under pressure comes close to the eloquence of this passing quotation, however: “Whenever rains, swollen streams, and grumbling Indians combined to overwhelm me with chagrin,” he wrote in his account of his travels, “I found reason to thank heaven which had enabled me to forget for the moment all my troubles in the contemplation of a simple moss.”

Stephen Heard, an evolutionary ecologist based in Canada, explains how extraordinary amounts of curiosity have been codified to create a map of the living world. The legalistic-sounding codes by which species are named are, it turns out, admirably egalitarian, ensuring that the names amateurs give species are just as valid as those of professional scientists.

Formal names are necessary because of the difficulty we have in distinguishing between similar species. Common names run into this difficulty all the time. There are too many of them, so the same species gets different names in different languages. At the same time, there aren’t enough of them, so that, as Heard points out, “Darwin’s finches aren’t finches, African violets aren’t violets, and electric eels aren’t eels”; robins, blackbirds and badgers are entirely different animals in Europe and North America; and virtually every flower has at one time or another been called a daisy.

Names also tend, reasonably enough, to be descriptive. This is fine when you’re distinguishing between, say, five different types of fish. When there are 500 different fish to sort through, however, absurdity beckons. Heard lovingly transcribes the pre-Linnaean species name of the English whiting, formulated around 1738: “Gadus, dorso tripterygio, ore cirrato, longitudine ad latitudinem tripla, pinna ani prima officulorum trigiata”. So there.

It takes nothing away from the genius of Swedish physician Carl Linnaeus, who formulated the naming system we still use today, to say that he came along at the right time. By Linnaeus’s day, it was possible to look things up. Advances in printing and distribution had made reference works possible. Linnaeus’s innovation was to decouple names from descriptions. And this, as Heard reveals in anecdote after anecdote, is where the fun now slips in: the mythopoeic cool of the baboon Papio anubis, the mischievous smarts of the beetle Agra vation, the nerd celebrity of the lemur Avahi cleesi.

Heard’s taxonomy of taxonomies makes for somewhat thin reading; this is less a book than a dozen interesting magazine articles flying in close formation. But its close focus, bringing to life the minutiae of both the living world and the practice of science, is welcome.

I once met Michael Land, the neurobiologist who figured out how the lobster’s eye works. He told me that the trouble with big ideas is that they get in the way of the small ones. Heard’s lesson, delivered with such a light touch, is the same. The joy, and much of the accompanying wisdom, lies in the detail.

Goodbye to all that

Reading Technologies of the Human Corpse by John Troyer for the Spectator, 11 April 2020

John Troyer, the director of the Centre for Death and Society at the University of Bath, has moves. You can find his interpretative dances punctuating a number of his lectures, which go by such arresting titles as ‘150 Years of the Human Corpse in American History in Under 15 Minutes with Jaunty Background Music’ and ‘Abusing the Corpse Even More: Understanding Necrophilia Laws in the USA — Now with more Necro! And more Philia!’ (Wisconsin and Ohio are, according to Troyer’s eccentric looking and always fascinating website, ‘two states that just keep giving and giving when it comes to American necrophilia cases’.)

Troyer’s budding stand-up career has taken a couple of recent knocks. First was the ever more pressing need for him to crack on with his PhD (his dilatoriness was becoming a family joke). Technologies of the Human Corpse is yanked, not without injury, from that career-establishing academic work. Even as he assembled the present volume, however, there came another, far more personal, blow.

Late in July 2017 Troyer’s younger sister Julie was diagnosed with an aggressive brain cancer. Her condition deteriorated far more quickly than anyone expected, and on 29 July 2018 she died. This left Troyer — the engaging young American death scholar sprung from a family of funeral directors — having to square his erudite and cerebral thoughts on death and dead bodies with the fact he’d just kissed his sister goodbye. He interleaves poetical journal entries composed during Julie’s dying and her death, her funeral and her commemoration, between chapters written by a younger, jollier and of course shallower self.

To be brutal, the poems aren’t up to much, and on their own they wouldn’t add a great deal by way of nuance or tragedy. Happily for us, however, and to Troyer’s credit, he has transformed them into a deeply moving 30-page memoir that now serves as the book’s preface. This, then, is Troyer’s monster: a powerful essay about dying and bereavement; a set of poems written off the cuff and under great stress; and seven rather disconnected chapters about what’s befallen the human corpse in the past century or so.

Even as the book was going to print, Troyer explains in a hurried postscript, his father, a retired undertaker, lost consciousness following a cardiac arrest and was very obviously dying:

“And seeing my father suddenly fall into a comatose state so soon after watching my sister die is impossible to fully describe: I understand what is happening, yet I do not want to understand what is happening.”

This deceptively simple statement from Troyer the writer is streets ahead of anything Troyer the postgrad can pull off.

But to the meat of the book. The American civil war saw several thousand corpses embalmed and transported on new-fangled railway routes across the continent. The ability to preserve bodies, and even lend them a lifelike appearance months after death, created a new industry that, in various configurations and under several names, now goes by the daunting neologism of ‘deathcare provision’. In the future, this industry will be seen ‘transforming the mostly funeralisation side of the business into a much broader, human body parts and tissue distribution system’, as technical advances make increasing use of cadavers and processed cadaver parts.

So how much is a dead body worth? Between $30,000 and $50,000, says Troyer — five times as much for donors processed into medical implants, dermal implants and demineralised bone matrices. Funds and materials are exchanged through a network of body brokers who serve as middlemen between biomedical corporations such as Johnson & Johnson and the usual sources of human cadavers — medical schools, funeral homes and mortuaries. It is by no stretch an illegal trade, nor is it morally problematic in most instances; but it is rife with scandal. As one involved party remarks: ‘If you’re cremated, no one is ever going to know if you’re missing your shoulders or knees or your head.’

Troyer is out to show how various industries serve to turn our dead bodies into ‘an unfettered source of capital’. The ‘fluid men’ of Civil War America — who toured the battlefields showing keen students how to embalm a corpse (and almost always badly) — had no idea what a strange story they had started. Today, as the anatomist Gunther von Hagens poses human cadavers in sexual positions to pique and titillate worldwide audiences, we begin to get a measure of how far we have come. Hagens’s posthumous pornography reveals, says Troyer, ‘the ultimate taxonomic power over nature: we humans, or at least our bodies, can live forever because we pull ourselves from nature’.

Technologies of the Human Corpse is a bit of a mess, but I have a lot of time for Troyer. His insights are sound, and his recent travails may yet (and at high human cost — but it was ever thus) make him a writer of some force.

 

Pluck

Reading Gunpowder and Glory: The Explosive Life of Frank Brock OBE by Harry Smee and Henry Macrory for the Spectator, 21 March 2020

Early one morning in October 1874, a barge carrying three barrels of benzoline and five tons of gunpowder blew up in the Regent’s Canal, close to London Zoo. The crew of three were killed outright, scores of houses were badly damaged, the explosion could be heard 25 miles away, and “dead fish rained from the sky in the West End.”

This is a book about the weird, if obvious, intersection between firework manufacture and warfare. It is, ostensibly, the biography of a hero of the First World War, Frank Brock. And if it were the work of more ambitious literary hands, Brock would have been all you got. His heritage, his school adventures, his international career as a showman, his inventions, his war work, his violent death. Enough for a whole book, surely?

But Gunpowder and Glory is not a “literary” work, by which I mean it is neither self-conscious nor overwrought. Instead Henry Macrory (who anyway has already proved his literary chops with his 2018 biography of the swindler Whitaker Wright) has opted for what looks like a very light touch here, assembling and ordering the anecdotes and reflections of Frank Brock’s grandson Harry Smee about his family, their business as pyrotechnical artists, and, finally, about Frank, his illustrious forebear.

I suspect a lot of sweat went into such artlessness, and it’s paid off, creating a book that reads like fascinating dinner conversation. Reading its best passages, I felt I was discovering Brock the way Harry had as a child, looking into his mother’s “ancient oak chests filled with papers, medals, newspapers, books, photographs, an Intelligence-issue knuckleduster and pieces of Zeppelin and Zeppelin bomb shrapnel.”

For eight generations, the Brock family produced pyrotechnic spectaculars of a unique kind. Typical set piece displays in the eighteenth century included “Jupiter discharging lightning and thunder, Two gladiators combating with fire and sword, and Neptune finely carv’d seated in his chair, drawn by two sea horses on fire-wheels, spearing a dolphin.”

Come the twentieth century, Brock’s shows were a signature of Empire. It would take a writer like Thomas Pynchon to do full justice to “a sixty foot-high mechanical depiction of the Victorian music-hall performer, Lottie Collins, singing the chorus of her famous song ‘Ta-ra-ra-boom-de-ay’ and giving a spirited kick of an automated leg each time the word ‘boom’ rang out.”

Frank was a Dulwich College boy, and one of that generation lost to the slaughter of the Great War. A spy and an inventor — James Bond and Q in one — he applied his inherited chemical and pyrotechnical genius to the war effort — by making a chemical weapon. It wasn’t any good, though: Jellite, developed during the summer of 1915 and named after its jelly-like consistency during manufacture, proved insufficiently lethal.

On such turns of chance do reputations depend, since we remember Frank Brock for his many less problematic inventions. Dover flares burned for seven and a half minutes and lit up an area of three miles’ radius, as Winston Churchill put it, “as bright as Piccadilly”. U-boats, diving to avoid these lights, encountered mines. Frank’s artificial fogs, hardly bettered since, concealed whole British fleets, entire Allied battle lines.

Then there are his incendiary bullets.

At the time of the Great War a decent Zeppelin could climb to 20,000 feet, travel at 47 mph for more than 1,000 miles, and stay aloft for 36 hours. Smee and Macrory are well within their rights to call them “the stealth bombers of their time”.

Brock’s bullets tore them out of the sky. Sir William Pope, Brock’s advisor, and a professor of chemistry at Cambridge University, explained: “You need to imagine a bullet proceeding at several thousand feet a second, and firing as it passes through a piece of fabric which is no thicker than a pocket handkerchief.” All to rupture a gigantic sac of hydrogen sufficiently to make the gas explode. (Much less easy than you think; the Hindenburg only crashed because its entire outer envelope was set on fire.)

Frank died in an assault on the mole at Zeebrugge in 1918. He shouldn’t have been there. He should have been in a lab somewhere, cooking up another bullet, another light, another poison gas. Today, he surely would be suitably contained, his efforts efficiently channeled, his spirit carefully and surgically broken.

Frank lived at a time when it was possible — and men, at any rate, were encouraged — to be more than one thing. That this heroic idea overreached itself — that rugby field and school chemistry lab both dissolved seamlessly into the Somme — needs no rehearsing.

Still, we have lost something. When Frank went to school there was a bookstall near the station which sold “a magazine called Pluck, containing ‘the daring deeds of plucky sailors, plucky soldiers, plucky firemen, plucky explorers, plucky detectives, plucky railwaymen, plucky boys and plucky girls and all sorts of conditions of British heroes’.”

Frank was a boy moulded thus, and sneer as much as you want, we will not see his like again.

 

“So that’s how the negroes of Georgia live!”

Visiting W.E.B. Du Bois: Charting Black Lives, at the House of Illustration, London, for the Spectator, 25 January 2020

William Edward Burghardt Du Bois was born in Massachusetts in 1868, three years after the official end of slavery in the United States. He grew up among a small, tenacious business- and property-owning black middle class who had their own newspapers, their own schools and universities, their own elected officials.

After graduating with a PhD in history from Harvard University, Du Bois embarked on a sprawling study of African Americans living in Philadelphia. At the historically black Atlanta University in 1897, he established international credentials as a pioneer of the newfangled science of sociology. His students were decades ahead of their counterparts in the Chicago school.

In the spring of 1899, Du Bois’s son Burghardt died, succumbing to sewage pollution in the Atlanta water supply. ‘The child’s death tore our lives in two,’ Du Bois later wrote. His response: ‘I threw myself more completely into my work.’

A former pupil, the black lawyer Thomas Junius Calloway, thought that Du Bois was just the man to help him mount an exhibition to demonstrate the progress that had been made by African Americans. Funded by Congress and planned for the Paris Exposition of 1900, the project employed around a dozen clerks, students and former students to assemble and run ‘the great machinery of a special census’.

Two studies emerged. ‘The Georgia Negro’, comprising 32 handmade graphs and charts, captured a living community in numbers: how many black children were enrolled in public schools, how far family budgets extended, what people did for work, even the value of people’s kitchen furniture.

The other, a set of about 30 statistical graphics, was made by students at Atlanta University and considered the African American population of the whole of the United States. Du Bois was struck by the fact that the illiteracy of African Americans was ‘less than that of Russia, and only equal to that of Hungary’. A chart called ‘Conjugal Condition’ suggests that black Americans were more likely to be married than Germans.

The Exposition Universelle of 1900 brought all the world to the banks of the Seine. Assorted Africans, shipped over for the occasion, found themselves in model native villages performing bemused and largely made-up rituals for the visitors. (Some were given a truly lousy time by their bosses; others lived for the nightlife.) Meanwhile, in a theatre made of plaster and drapes, the Japanese geisha Sada Yacco, wise to this crowd from her recent US tour, staged a theatrical suicide for herself every couple of hours.

The expo also afforded visitors more serious windows on the world. Du Bois scraped together enough money to travel steerage to Paris to oversee his exhibition’s installation at the Palace of Social Economy.

He wasn’t overly impressed by the competition. ‘There is little here of the “science of society”,’ he remarked, and the organisers of the Exposition may well have agreed with him: they awarded him a gold medal for what Du Bois called, with justifiable pride, ‘an honest, straightforward exhibit of a small nation of people, picturing their life and development without apology or gloss, and above all made by themselves’.

At the House of Illustration in London you too can now follow the lines, bars and spirals that reveal how black wealth, literacy and land ownership expanded over the four decades since emancipation.

His exhibition also included what he called ‘the usual paraphernalia for catching the eye — photographs, models, industrial work, and pictures’, so why did Du Bois include so many charts, maps and diagrams?

The point about data is that it looks impersonal. It is a way of separating your argument from what people think of you, and this makes it a powerful weapon in the hands of those who find themselves mistrusted in politics and wider society. Du Bois and his community, let’s not forget, were besieged — by economic hardship, and especially by the Jim Crow laws that would outlive him by two years (he died in 1963).

Du Bois pioneered sociology, not statistics. Means of visualising data had entered academia more than a century before, through the biographical experiments of Joseph Priestley. His timeline charts of people’s lives and relative lifespans had proved popular, inspiring William Playfair’s invention of the bar chart. Playfair, an engineer and political economist, published his Commercial and Political Atlas in London in 1786. It was the first major work to contain statistical graphs. More to the point, it was the first time anyone had tried to visualise an entire nation’s economy.

Statistics and their graphic representation were quickly established as an essential, if specialised, component of modern government. There was no going back. Metrics are a self-fertilising phenomenon. Arguments over figures, and over the meaning of figures, can only generate more figures. The French civil engineer Charles Joseph Minard used charts in the 1840s to work out how to monetise freight on the newfangled railroads, then, in retirement, and for a hobby, used two colours and six dimensions of data to visualise Napoleon’s invasion and retreat during the Russian campaign of 1812.

And where society leads, science follows. John Snow founded modern epidemiology when his annotated map revealed the source of an outbreak of cholera in London’s Soho. English nurse Florence Nightingale used information graphics to persuade Queen Victoria to improve conditions in military hospitals.

Rightly, we care about how accurate or misleading infographics can be. But let’s not forget that they should be beautiful. The whole point of an infographic is, after all, to capture attention. Last year, the House of Illustration ran a tremendous exhibition of the work of Marie Neurath who, with her husband Otto, dreamt up a way of communicating, without language, by means of a system of universal symbols. ‘Words divide, pictures unite’ was the slogan over the door of their Viennese design institute. The couple’s aspirations were as high-minded as their output was charming. The Neurath stamp can be detected, not just in kids’ picture books, but across our entire designscape.

Infographics are prompts to the imagination. (One imagines at least some of the 50 million visitors to the Paris Expo remarking to each other, ‘So that’s how the negroes of Georgia live!’) They’re full of facts, but do they convey them more effectively than language? I doubt it. Where infographics excel is in eliciting curiosity and wonder. They can, indeed, be downright playful, as when Fritz Kahn, in the 1920s, used fast trains, street traffic, dancing couples and factory floors to describe, by visual analogy, the workings of the human body.

Du Bois’s infographics aren’t rivals to Kahn or the Neuraths. Rendered in ink, gouache, watercolour and pencil, they’re closer in spirit to the hand-drawn productions of Minard and Snow. They’re the meticulous, oh-so-objective statements of a proud, decent, politically besieged people. They are eloquent in their plainness, as much as in their ingenuity, and, given a little time and patience, they prove to be quite unbearably moving.

Cutting up the sky

Reading A Scheme of Heaven: Astrology and the Birth of Science by Alexander Boxer for the Spectator, 18 January 2020

Look up at the sky on a clear night. This is not an astrological game. (Indeed, the experiment’s more impressive if you don’t know one zodiacal pattern from another, and rely solely on your wits.) In a matter of seconds, you will find patterns among the stars.

We can pretty much apprehend up to five objects (pennies, points of light, what-have-you) at a single glance. Totting up more than five objects, however, takes work. It means looking for groups, lines, patterns, symmetries, boundaries.

The ancients cut up the sky into figures, all those aeons ago, for the same reason we each cut up the sky within moments of gazing at it: because if we didn’t, we wouldn’t be able to comprehend the sky at all.

Our pattern-finding ability can get out of hand. During his Nobel lecture in 1973 the zoologist Konrad Lorenz recalled how he once “… mistook a mill for a sternwheel steamer. A vessel was anchored on the banks of the Danube near Budapest. It had a little smoking funnel and at its stern an enormous slowly-turning paddle-wheel.”

Some false patterns persist. Some even flourish. And the brighter and more intellectually ambitious you are, the likelier you are to be suckered. John Dee, Queen Elizabeth’s court philosopher, owned the country’s largest library (it dwarfed any you would find at Oxford or Cambridge). His attempt to tie up all that knowledge in a single divine system drove him into the arms of angels — or at any rate, into the arms of the “scrier” Edward Kelley, whose prodigious output of symbolic tables of course could be read in such a way as to reveal fragments of esoteric wisdom.

This, I suspect, is what most of us think about astrology: that it was a fanciful misconception about the world that flourished in times of widespread superstition and ignorance, and did not, could not, survive advances in mathematics and science.

Alexander Boxer is out to show how wrong that picture is, and A Scheme of Heaven will make you fall in love with astrology, even as it extinguishes any niggling suspicion that it might actually work.

Boxer, a physicist and historian, kindles our admiration for the earliest astronomers. My favourite among his many jaw-dropping stories is the discovery of the precession of the equinoxes. This is the process by which the sun, each mid-spring and mid-autumn, rises at a fractionally different spot in the sky each year. It takes 26,000 years to make a full revolution of the zodiac — a tiny motion first detected by Hipparchus around 130 BC. And of course Hipparchus, to make this observation at all, “had to rely on the accuracy of stargazers who would have seemed ancient even to him.”

In short, he had a library card. And we know that such libraries existed because the “astronomical diaries” from the Assyrian library at Nineveh stretch from 652 BC to 61 BC, representing possibly the longest continuous research program ever undertaken in human history.

Which makes astrology not too shoddy, in my humble estimation. Boxer goes much further, dubbing it “the ancient world’s most ambitious applied mathematics problem.”

For as long as lives depend on the growth cycles of plants, the stars will, in a very general sense, dictate the destiny of our species. How far can we push this idea before it tips into absurdity? The answer is not immediately obvious, since pretty much any scheme we dream up will fit some conjunction or arrangement of the skies.

As civilisations become richer and more various, the number and variety of historical events increases, as does the chance that some event will coincide with some planetary conjunction. Around the year 1400, the French Catholic cardinal Pierre D’Ailly concluded his astrological history of the world with a warning that the Antichrist could be expected to arrive in the year 1789, which of course turned out to be the year of the French revolution.

But with every spooky correlation comes an even larger horde of absurdities and fatuities. Today, using a machine-learning algorithm, Boxer shows that “it’s possible to devise a model that perfectly mimics Bitcoin’s price history and that takes, as its input data, nothing more than the zodiac signs of the planets on any given day.”

The Polish science fiction writer Stanislaw Lem explored this territory in his novel The Chain of Chance: “We now live in such a dense world of random chance,” he wrote in 1975, “in a molecular and chaotic gas whose ‘improbabilities’ are amazing only to the individual human atoms.” And this, I suppose, is why astrology eventually abandoned the business of describing whole cultures and nations (a task now handed over to economics, another largely ineffectual big-number narrative) and now, in its twilight, serves merely to gull individuals.

Astrology, to work at all, must assume that human affairs are predestined. It cannot, in the long run, survive the notion of free will. Christianity did for astrology, not because it defeated a superstition, but because it rendered moot astrology’s iron bonds of logic.

“Today,” writes Boxer, “there’s no need to root and rummage for incidental correlations. Modern machine-learning algorithms are correlation monsters. They can make pretty much any signal correlate with any other.”

We are bewitched by big data, and imagine it is something new. We are ever-indulgent towards economists who cannot even spot a global crash. We credulously conform to every algorithmically justified norm. Are we as credulous, then, as those who once took astrological advice as seriously as a medical diagnosis? Oh, for sure.

At least our forebears could say they were having to feel their way in the dark. The statistical tools you need to sort real correlations from pretty patterns weren’t developed until the late nineteenth century. What’s our excuse?

“Those of us who are enthusiastic about the promise of numerical data to unlock the secrets of ourselves and our world,” Boxer writes, “would do well simply to acknowledge that others have come this way before.”

‘God knows what the Chymists mean by it’

Reading Antimony, Gold, and Jupiter’s Wolf: How the Elements Were Named, by Peter Wothers, for The Spectator, 14 December 2019

Here’s how the element antimony got its name. Once upon a time (according to the 17th-century apothecary Pierre Pomet), a German monk (moine in French) noticed its purgative effects in animals. Fancying himself as a physician, he fed it to “his own Fraternity… but his Experiment succeeded so ill that every one who took of it died. This therefore was the reason of this Mineral being call’d Antimony, as being destructive of the Monks.”

If this sounds far-fetched, the Cambridge chemist Peter Wothers has other stories for you to choose from, each more outlandish than the last. Keep up: we have 93 more elements to get through, and they’re just the ones that occur naturally on Earth. They each have a history, a reputation and in some cases a folklore. To investigate their names is to evoke histories that are only intermittently scientific. A lot of this enchanting, eccentric book is about mining and piss.

The mining:

There was no reliable lighting or ventilation; the mines could collapse at any point and crush the miners; they could be poisoned by invisible vapours or blown up by the ignition of pockets of flammable gas. Add to this the stifling heat and the fact that some of the minerals themselves were poisonous and corrosive, and it really must have seemed to the miners that they were venturing into hell.

Above ground, there were other difficulties. How to spot the new stuff? What to make of it? How to distinguish it from all the other stuff? It was a job that drove men spare. In a 1657 Physical Dictionary the entry for Sulphur Philosophorum states simply: ‘God knows what the Chymists mean by it.’

Today we manufacture elements, albeit briefly, in the lab. It’s a tidy process, with a tidy nomenclature. Copernicium, einsteinium, berkelium: neologisms as orderly and unevocative as car marques.

The more familiar elements have names that evoke their history. Cobalt, found in a mineral that used to burn and poison miners, is named for the imps that, according to the 16th-century German Georgius Agricola, ‘idle about in the shafts and tunnels and really do nothing, although they pretend to be busy in all kinds of labour’. Nickel is kupfernickel, ‘the devil’s copper’, an ore that looked like valuable copper ore but, once hauled above the ground, appeared to have no value whatsoever.

In this account, technology leads and science follows. If you want to understand what oxygen is, for example, you first have to be able to make it. And Cornelius Drebbel, the maverick Dutch inventor, did make it, in 1620, 150 years before Joseph Priestley got in on the act. Drebbel had no idea what this enchanted stuff was, but he knew it sweetened the air in his submarine, which he demonstrated on the Thames before King James I. Again, if you want a good scientific understanding of alkalis, say, then you need soap, and lye so caustic that when a drunk toppled into a pit of the stuff ‘nothing of him was found but his Linnen Shirt, and the hardest Bones, as I had the Relation from a Credible Person, Professor of that Trade’. (This is Otto Tachenius, writing in 1677. There is a lot of this sort of thing. Overwhelming in its detail as it can be, Antimony, Gold, and Jupiter’s Wolf is wickedly entertaining.)

Wothers does not care to hold the reader’s hand. From page 1 he’s getting his hands dirty with minerals and earths, metals and the aforementioned urine (without which the alchemists, wanting chloride, sodium, potassium and ammonia, would have been at a complete loss) and we have to wait till page 83 for a discussion of how the modern conception of elements was arrived at. The periodic table doesn’t arrive till page 201 (and then it’s Mendeleev’s first table, published in 1869). Henri Becquerel discovers radioactivity barely four pages before the end of the book. It’s a surprising strategy, and a successful one. Readers fall under the spell of the possibilities of matter well before they’re asked to wrangle with any of the more highfalutin chemical concepts.

In 1782, Louis-Bernard Guyton de Morveau published his Memoir upon Chemical Denominations, the Necessity of Improving the System, and the Rules for Attaining a Perfect Language. Countless idiosyncrasies survived his reforms. But chemistry did begin to acquire an orderliness that made Mendeleev’s towering work a century later — relating elements to their atomic structure — a deal easier.

This story has an end. Chemistry as a discipline is now complete. All the major problems have been solved. There are no more great discoveries to be made. Every chemical reaction we do is another example of one we’ve already done. These days, chemists are technologists: they study spectrographs, and argue with astronomers about the composition of the atmospheres around planets orbiting distant stars; they tinker in biophysics labs, and have things to say about protein synthesis. The heroic era of chemical discovery — in which we may fondly recall Gottfried Leibniz extracting phosphorus from 13,140 litres of soldiers’ urine — is past. Only some evocative words remain; and Wothers unpacks them with infectious enthusiasm, and something which in certain lights looks very like love.

Attack of the Vocaloids

Marrying music and mathematics for The Spectator, 3 August 2019

In 1871, the polymath and computer pioneer Charles Babbage died at his home in Marylebone. The encyclopaedias have it that a urinary tract infection got him. In truth, his final hours were spent in an agony brought on by the performances of itinerant hurdy-gurdy players parked underneath his window.

I know how he felt. My flat, too, is drowning in something not quite like music. While my teenage daughter mixes beats using programs like GarageBand and Logic Pro, her younger brother is bopping through Helix Crush and My Singing Monsters — apps that treat composition itself as a kind of e-sport.

It was ever thus: or was, once 18th-century Swiss watchmakers twigged that musical snuff-boxes might make them a few bob. And as each new mechanical innovation has emerged to ‘transform’ popular music, so the proponents of earlier technology have gnashed their teeth. This affords the rest of us a frisson of Schadenfreude.

‘We were musicians using computers,’ complained Pete Waterman, of the synthpop hit factory Stock Aitken Waterman in 2008, 20 years past his heyday. ‘Now it’s the whole story. It’s made people lazy. Technology has killed our industry.’ He was wrong, of course. Music and mechanics go together like beans on toast, the consequence of a closer-than-comfortable relation between music and mathematics. Today, a new, much more interesting kind of machine music is emerging to shape my children’s musical world, driven by non-linear algebra, statistics and generative adversarial networks — that slew of complex and specific mathematical tools we lump together under the modish (and inaccurate) label ‘artificial intelligence’.

Some now worry that artificially intelligent music-makers will take even more agency away from human players and listeners. I reckon they won’t, but I realise the burden of proof lies with me. Computers can already come up with pretty convincing melodies. Soon, argues venture capitalist Vinod Khosla, they will be analysing your brain, figuring out your harmonic likes and rhythmic dislikes, and composing songs made-to-measure. There are enough companies attempting to crack it; Popgun, Amper Music, Aiva, WaveAI, Amadeus Code, Humtap, HumOn, AI Music are all closing in on the composer-less composition.

The fear of tech taking over isn’t new. The Musicians’ Union tried to ban synths in the 1980s, anxious that string players would be put out of work. The big disruption came with the arrival of Kyoko Date. Released in 1996, she was the first seriously publicised attempt at a virtual pop idol. Humans still had to provide Date with her singing and speaking voice. But by 2004 Vocaloid software — developed by Kenmochi Hideki at the Pompeu Fabra University in Barcelona — enabled users to synthesise ‘singing’ by typing in lyrics and a melody. In 2016 Hatsune Miku, a Vocaloid-powered 16-year-old artificial girl with long, turquoise twintails, went, via hologram, on her first North American tour. It was a sell-out. Returning to her native Japan, she modelled Givenchy dresses for Vogue.

What kind of music were these idoru performing? Nothing good. While every other component of the music industry was galloping ahead into a brave new virtualised future — and into the arms of games-industry tech — the music itself seemed stuck in the early 1980s which, significantly, was when music synthesizer builder Dave Smith had first come up with MIDI.

MIDI is a way to represent musical notes in a form a computer can understand. MIDI is the reason discrete notes that fit in a grid dominate our contemporary musical experience. That maddening clockwork-regular beat that all new music obeys is a MIDI artefact: the software becomes unwieldy and glitch-prone if you dare vary the tempo of your project. MIDI is a prime example (and, for that reason, made much of by internet pioneer-turned-apostate Jaron Lanier) of how a computer can take a good idea and throw it back at you as a set of unbreakable commandments.

For all their advances, the powerful software engines wielded by the entertainment industry were, as recently as 2016, hardly more than mechanical players of musical dice games of the sort popular throughout western Europe in the 18th century.

The original games used dice randomly to generate music from precomposed elements. They came with wonderful titles, too — witness C.P.E. Bach’s A method for making six bars of double counterpoint at the octave without knowing the rules (1758). One 1792 game produced by Mozart’s publisher Nikolaus Simrock in Berlin (it may have been Mozart’s work, but we’re not sure) used dice rolls randomly to select beats, producing a potential 46 quadrillion waltzes.

All these games relied on that unassailable, but frequently disregarded truth, that all music is algorithmic. If music is recognisable as music, then it exhibits a small number of formal structures and aspects that appear in every culture — repetition, expansion, hierarchical nesting, the production of self-similar relations. It’s as Igor Stravinsky said: ‘Musical form is close to mathematics — not perhaps to mathematics itself, but certainly to something like mathematical thinking and relationship.’

As both a musician and a mathematician, Marcus du Sautoy, whose book The Creativity Code was published this year, stands to lose a lot if a new breed of ‘artificially intelligent’ machines live up to their name and start doing his mathematical and musical thinking for him. But the reality of artificial creativity, he has found, is rather more nuanced.

One project that especially engages du Sautoy’s interest is Continuator by François Pachet, a composer, computer scientist and, as of 2017, director of the Spotify Creator Technology Research Lab. Continuator is a musical instrument that learns and interactively plays with musicians in real time. Du Sautoy has seen the system in action: ‘One musician said, I recognise that world, that is my world, but the machine’s doing things that I’ve never done before and I never realised were part of my sound world until now.’

The ability of machine intelligences to reveal what we didn’t know we knew is one of the strangest and most exciting developments du Sautoy detects in AI. ‘I compare it to crouching in the corner of a room because that’s where the light is,’ he explains. ‘That’s where we are on our own. But the room we inhabit is huge, and AI might actually help to illuminate parts of it that haven’t been explored before.’

Du Sautoy dismisses the idea that this new kind of collaborative music will be ‘mechanical’. Behaving mechanically, he points out, isn’t the exclusive preserve of machines. ‘People start behaving like machines when they get stuck in particular ways of doing things. My hope is that the AI might actually stop us behaving like machines, by showing us new areas to explore.’

Du Sautoy is further encouraged by how those much-hyped ‘AIs’ actually work. And let’s be clear: they do not expand our horizons by thinking better than we do. Nor, in fact, do they think at all. They churn.

‘One of the troubles with machine-learning is that you need huge swaths of data,’ he explains. ‘Machine image recognition is hugely impressive, because there are a lot of images on the internet to learn from. The digital environment is full of cats; consequently, machines have got really good at spotting cats. So one thing which might protect great art is the paucity of data. Thanks to his interminable chorales, Bach provides a toe-hold for machine imitators. But there may simply not be enough Bartok or Brahms or Beethoven for them to learn on.’

There is, of course, the possibility that one day the machines will start learning from each other. Channelling Marshall McLuhan, the curator Hans Ulrich Obrist has argued that art is an early-warning system for the moment true machine consciousness arises (if it ever does arise).

Du Sautoy agrees. ‘I think it will be in the world of art, rather than in the world of technology, that we’ll see machines first express themselves in a way that is original and interesting,’ he says. ‘When a machine acquires an internal world, it’ll have something to say for itself. Then music is going to be a very important way for us to understand what’s going on in there.’

Art that hides in plain sight

Visiting Takis’s survey show at Tate Modern for the Spectator, 13 July 2019

Steel flowers bend in a ‘breeze’ generated by magnetic pendulums. This is the first thing you see as you enter Tate Modern’s survey show. And ‘Magnetic Fields’ (1969) is pretty enough: the work of this self-taught artist, now in his nineties, has rarely been so gentle, or so intuitive.

But there’s a problem. ‘I would like to render [electromagnetism] visible so as to communicate its existence and make its importance known,’ Takis has written. But magnetism hides in plain sight. A certain amount of interference is necessary before it will reveal itself.

Does the interference matter? Does the fact that gallery assistants have to activate this work every ten minutes spoil the ‘cosmicness’ of Takis’s art? The sculptor Alberto Giacometti thought so: ‘One day, during one of my exhibitions, he told me that he didn’t agree with my use of electricity for some of my works,’ Takis recalled in an interview in 1990. ‘He disliked the fact that if you switched off the power, the work would cease to function.’

Why Takis’s pieces should prompt such a finicky response isn’t immediately obvious. What do we expect of this stuff? Perpetual motion? One moment we wonder at the invisible force that can suspend delicate metal cones fractions of an inch above the surface of a canvas. The next moment, we’re peering where we shouldn’t, trying to figure out the circuitry that keeps a sphere swinging over a steel wire.

We’re presented with many wonders — objects rendered weightless, or put into permanent vibration. And as the show progresses (it’s surprisingly large, designed to unfold around corners and spring surprises at your back) the work gets less intuitive, and a lot louder. A pendulum, orbiting a strong, floor-mounted magnet, whips eccentrically and not at all gently about its centre of attraction. It’s like nothing in visible nature. There’s no ‘magnetic breeze’ here, no ‘force like gravity’, just the thing, the weirdness itself. Now we’re getting somewhere.

Born Panayiotis Vassilakis in 1925, Takis discovered his alchemical calling early. One memoir recalls how ‘as a small boy, he would bury pieces of broken glass and other such oddments in the ground to see what happened to them when he impatiently dug them out a couple of days later’. In 1954 he moved to Paris, where he fell in with Marcel Duchamp and Yves Tanguy. In London he inspired a group of young artists who went on to create the politically radical Signals London gallery. In America the beats admired him, the Massachusetts Institute of Technology gave him a fellowship, and the composer John Cage encouraged his shamanism. (‘I cannot think of my work as entirely my work,’ Takis writes. ‘In a sense, I’m only a transmitter.’)

Takis treads the same awkward line in visual art that Cage did in music. Cage promised us that behind the music of signs lay some sort of sonic essence. But his snark hunt proved rather dull. Takis’s own search ends more happily, if only because the eye, in its search for signs, doesn’t admit defeat nearly as quickly as the ear. Takis’s traffic signals, stripped of context and perched on tall poles, become eyes full of sadness and yearning. They still mean something. They’re still signs of something.

Made from oddments plucked from boxes of army and air-force surplus on Tottenham Court Road, some of Takis’s more engineered work has dated. We look at it as a sort of industrial archaeology. Its radicalism, its status as ‘anti-technology’, is hard to fathom.

But the simpler pieces need no translation. They are (suitably enough, for an artist whose works often screech and rattle) a sort of visual equivalent of music. They do not mean anything. They are meaning. They reflect harmonious relationships between energy and space and mass. Takis’s work is like his subject: it hides in plain sight.

“And it will no longer be necessary to ransack the earth…”

Visiting Raw Materials: Plastics at the Nunnery Gallery, Bow Arts, for the Spectator, 1 June 2019

Plastics — even venerable, historically eloquent plastics — hardly draw the eye. As this show’s insightful accompanying publication (a snip at £3) would have it, ‘Plastics have no intrinsic form or texture, thus they are not materials that can be true to themselves.’ They exist within inverted commas. They can be shell-like, horn-like, stony, metallic — they do not really exist on their own behalf.

Mind you, the first vitrine in Raw Materials: Plastics at the Nunnery Gallery in east London contains an object of rare beauty: a small, mottled, crazed, discoloured sphere that looks for all the world like the planet Venus, reduced to handy scale.

It’s a billiard ball, made of the first plastic: cellulose nitrate. Its manufacture had been keenly anticipated. In the US, a $10,000 prize had been offered for anything that could replace ivory in the manufacture of billiard balls (and no wonder: a single tusk yields only three balls).

Under various brand names (Celluloid, Parkesine, Xylonite), and in spite of its tendency to catch fire (colliding snooker balls would occasionally explode), cellulose nitrate saved the elephant. And not just the elephant: plastics pioneer John Wesley Hyatt reckoned that ‘Celluloid [has] given the elephant, the tortoise, and the coral insect a respite in their native haunts; and it will no longer be necessary to ransack the earth in pursuit of substances which are constantly growing scarcer.’

The whole point of plastic is that it has no characteristics of its own, only properties engineered for specific uses. Cheaper than jade. Less brittle than bone. It’s the natural material of the future, always more becoming than being. Hence the names: Xylonite. Bexoid. Halex. Lactoid.

Unable to nail the material in words, one writes instead about its history, sociology, industrial archaeology or ecological impact. On remote islands in the Pacific, thousands of albatross chicks are starving because the parents mistake floating plastic debris for food. Stories like this conjure up a vision of vast islands of discarded plastic coagulating in the Pacific Ocean, but there aren’t any. Instead, plastics eventually fragment into ever smaller pieces that are ingested by marine animals and carried to the sea bottom. In the Mariana Trench, all crustaceans tested had plastics in their guts. So plastics rise and fall through the food chain, creating havoc as they go — a bitter irony for a material that saved the elephant and the turtle, made fresh food conveyable and modern medicine possible, and all for less than 15 per cent of global oil consumption.

What can be gained from looking at the stuff itself? Raw Materials: Plastics transcends the limitations of its material by means of a good story. The first plastics were made in the Lea Valley, not from crude oil, but from plant materials, in a risky, artisanal fashion that bore, for a while, the hallmarks of older crafts including baking, woodcutting and metalwork. Fast-forward 140 years or so and, under the umbrella term ‘bioplastics’, plant-based and biodegradable synthetic products promise to turn the wheel of development full circle, returning plastics to their organic roots. (Designer Peter Marigold’s FORMCard plastic, used here in an excellent school art project, is a starch-based bioplastic made from potato skins.) Then, perhaps, we can break the bind in which we currently find ourselves: the one in which we’re poisoning the planet with plastic in our efforts not to further despoil it.

This is the third and for my money the most ambitious of the gallery’s ongoing series of small, thoughtful exhibitions about the materials, processes and industries that have shaped London’s Lea Valley. (Raw Materials: Wood ran in 2017; Raw Materials: Textiles last year.) The show is more chronicle than catalogue, but the art, scant as it is, punches above its weight.

I was struck, in particular, by France Scott’s ‘PHX [X is for Xylonite]’, a 13-minute collage of photogrammetry, laser scanning and 16mm film. It ought, by all logic, to be a complete mess and I still haven’t been able to work out why it’s so compelling. Is it because digital artefacts, like their plastic forebears, are themselves prisoners of contingency, aping the forms of others while stubbornly refusing to acquire forms of their own?

“The English expedition of 1919 is to blame for this whole misery”

Four books to celebrate the centenary of Eddington’s 1919 eclipse observations. For The Spectator, 11 May 2019.

Einstein’s War: How relativity triumphed amid the vicious nationalism of World War I
Matthew Stanley
Dutton

Gravity’s Century: From Einstein’s eclipse to images of black holes
Ron Cowen
Harvard University Press

No Shadow of a Doubt
Daniel Kennefick
Princeton University Press

Einstein’s Wife: The real story of Mileva Einstein-Maric
Allen Esterson and David C Cassidy; contribution by Ruth Lewin Sime.
MIT Press

On 6 November 1919, at a joint meeting of the Royal Astronomical Society and the Royal Society, held at London’s Burlington House, the stars went all askew in the heavens.

That, anyway, was the rhetorical flourish with which the New York Times hailed the announcement of the results of a pair of astronomical expeditions conducted in 1919, after the Armistice but before the official end of the Great War. One expedition, led by Arthur Stanley Eddington, assistant to the Astronomer Royal, had repaired to the plantation island of Principe off the coast of West Africa; the other, led by Andrew Crommelin, who worked at the Royal Greenwich Observatory, headed to a racecourse in Brazil. Together, in the few minutes afforded by the 29 May solar eclipse, the teams used telescopes to photograph shifts in the apparent location of stars as the edge of the sun approached them.

The possibility that a heavy body like the sun might cause some distortion in the appearance of the star field was not particularly outlandish. Newton, who had assigned “corpuscles” of light some tiny mass, supposed that such a massive body might draw light in like a lens, though he imagined the effect was too slight to be observable.

The degree of distortion the Eddington expeditions hoped to observe was something else again. 1.75 arc-seconds is roughly the angle subtended by a coin, a couple of miles away: a fine observation, but not impossible at the time. Only the theory of the German-born physicist Albert Einstein — respected well enough at home but little known to the Anglophone world — would explain such a (relatively) large distortion, and Eddington’s confirmation of his hypothesis brought the “famous German physician” (as the New York Times would have it) instant celebrity.

“The English expedition of 1919 is ultimately to blame for this whole misery, by which the general masses seized possession of me,” Einstein once remarked; but he was not so very sorry for the attention. Forget the usual image of Einstein the loveable old eccentric. Picture instead a forty-year-old who, when he steps into a room, literally causes women to faint. People wanted his opinions even about stupid things. And for years, if anyone said anything wise, within a few months their words were being attributed to Einstein.

“Why is it that no one understands me and everyone likes me?” Einstein wondered. His appeal lay in his supposed incomprehensibility. Charlie Chaplin understood: “They cheer me because they all understand me,” he remarked, accompanying the theoretical physicist to a film premiere, “and they cheer you because no one understands you.”

Several books serve to mark the centenary of the 1919 eclipse observations. Though their aims diverge, they all to some degree capture the likeness of Einstein the man, messy personal life and all, while rendering his physics a little bit more comprehensible to the rest of us. Each successfully negotiates the single besetting difficulty facing books of this sort, namely the way science lends itself to bad history.

Science uses its past as an object lesson, clearing all the human messiness away to leave the ideas standing. History, on the other hand, factors in as much human messiness as possible to show how the business of science is as contingent and dramatic as any other human activity.

In human matters, some ambiguity over causes and effects is welcome. There are two sides to every story, and so on and so forth: any less nuanced approach seems suspiciously moralistic. One need only look at the way various commentators have interpreted Einstein’s relationship with his first wife.

Einstein was, by the end of their failing marriage, notoriously horrible to Mileva Einstein-Maric; this in spite of their great personal and intellectual closeness as first-year physics students at the Federal Swiss Polytechnic. Einstein once reassured Elsa Lowenthal, his cousin and second-wife-to-be, that “I treat my wife as an employee I can not fire.” (Why Elsa, reading that, didn’t run a mile, is not recorded.)

Albert was a bad husband. His wife was a mathematician. Therefore Albert stole his theory of special relativity from Mileva. This shibboleth, bandied about since the 1970s, is a sort of evil twin of Whig history, distorted by teleology, anachronism and present-mindedness. It does no one any favours. The three separately authored parts of Einstein’s Wife: The real story of Mileva Einstein-Maric unpick the myth of Mileva’s influence over Albert, while increasing, rather than diminishing, our interest in and admiration of the woman herself. It’s a hard job to do well, without preciousness or special pleading, especially in today’s resentment-ridden and over-sensitive political climate, and the book is an impressive, compassionate accomplishment.

Matthew Stanley’s Einstein’s War, on the other hand, tips ever so slightly in the other direction, towards the simplistic and the didactic. His intentions, however, are benign — he is here to praise Einstein and Eddington and their fellows, not bury them — and his slightly on-the-nose style is ultimately mandated by the sheer scale of what he is trying to do, for he succeeds in wrapping the global, national and scientific politics of an era up in a compelling story of one man’s wild theory, lucidly sketched, and its experimental confirmation in the unlikeliest and most exotic circumstances.

The world science studies is truly a blooming, buzzing confusion. It is not in the least bit causal, in the ordinary human sense. Far from there being a paucity of good stories in science, there are a limitless number of perfectly valid, perfectly accurate, perfectly true stories, all describing the same phenomenon from different points of view.

Understanding the stories abroad in the physical sciences at the fin de siecle, seeing which ones Einstein adopted, why he adopted them, and why, in some cases, he swapped them for others, certainly doesn’t make his theorising easy. But it does give us a gut sense of why he was so baffled by the public’s response to his work. The moment we are able to put him in the context of co-workers, peers and friends, we see that Einstein was perfecting classical physics, not overthrowing it, and that his supposedly peculiar theory of relativity — as the man said himself — “harmonizes with every possible outlook of philosophy and does not interfere with being an idealist or materialist, pragmatist or whatever else one likes.”

In science, we need simplification. We welcome a didactic account. Choices must be made, and held to. Gravity’s Century by the science writer Ron Cowen is the most condensed of the books mentioned here; it frequently runs right up to the limit of how far complex ideas can be compressed without slipping into unavoidable falsehood. I reckon I spotted a couple of questionable interpretations. But these were so minor as to be hardly more than matters of taste, when set against Cowen’s overall achievement. This is as good a short introduction to Einstein’s thought as one could wish for. It even contrives to discuss confirmatory experiments and observations whose final results were only announced as I was writing this piece.

No Shadow of a Doubt is more ponderous, but for good reason: the author Daniel Kennefick, an astrophysicist and historian of science, is out to defend the astronomer Eddington against criticisms more serious, more detailed, and framed more conscientiously, than any thrown at that cad Einstein.

Eddington was an English pacifist and internationalist who made no bones about wanting his eclipse observations to champion the theories of a German-born physicist, even as jingoism reached its crescendo on both sides of the Great War. Given the sheer bloody difficulty of the observations themselves, and considering the political inflection given them by the man orchestrating the work, are Eddington’s results to be trusted?

Kennefick is adamant that they are, modern naysayers to the contrary, and in conclusion to his always insightful biography, he says something interesting about the way historians, and especially historians of science, tend to underestimate the past. “Scientists regard continuous improvement in measurement as a hallmark of science that is unremarkable except where it is absent,” he observes. “If it is absent, it tells us nothing except that someone involved has behaved in a way that is unscientific or incompetent, or both.” But, Kennefick observes, such improvement is only possible with practice — and eclipses come round too infrequently for practice to make much difference. Contemporary attempts to recreate Eddington’s observations face the exact same challenges Eddington did, and “it seems, as one might expect, that the teams who took and handled the data knew best after all.”

It was Einstein’s peculiar fate that his reputation for intellectual and personal weirdness has concealed the architectural elegance of his work. Higher-order explanations of general relativity have become clichés of science fiction. The way massive bodies bend spacetime like a rubber sheet is an image that saturates elementary science classes, to the point of tedium.

Einstein hated those rubber-sheet metaphors for a different reason. “Since the mathematicians pounced on the relativity theory,” he complained, “I no longer understand it myself.” We play about with thoughts of bouncy sheets. Einstein had to understand their behaviours mathematically in four dimensions (three of space and one of time), crunching equations so radically non-linear, their results would change the value of the numbers originally put into them in feedback loops that drove the man out of his mind. “Never in my life have I tormented myself anything like this,” he moaned.

For the rest of us, however, a little prophylactic exposure to Einstein’s actual work pays huge dividends. It sweeps some of the weirdness away and reveals Einstein’s actual achievement: theories that set all the forces above the atomic scale dancing with an elegance Isaac Newton, founding father of classical physics, would have half-recognised, and wholly admired.