Pluck

Reading Gunpowder and Glory: The Explosive Life of Frank Brock OBE by Harry Smee and Henry Macrory for the Spectator, 21 March 2020

Early one morning in October 1874, a barge carrying three barrels of benzoline and five tons of gunpowder blew up in the Regent’s Canal, close to London Zoo. The crew of three were killed outright, scores of houses were badly damaged, the explosion could be heard 25 miles away, and “dead fish rained from the sky in the West End.”

This is a book about the weird, if obvious, intersection between firework manufacture and warfare. It is, ostensibly, the biography of a hero of the First World War, Frank Brock. And if it were the work of more ambitious literary hands, Brock would have been all you got. His heritage, his school adventures, his international career as a showman, his inventions, his war work, his violent death. Enough for a whole book, surely?

But Gunpowder and Glory is not a “literary” work, by which I mean it is neither self-conscious nor overwrought. Instead Henry Macrory (who has already proved his literary chops with his 2018 biography of the swindler Whitaker Wright) has opted for what looks like a very light touch here, assembling and ordering the anecdotes and reflections of Frank Brock’s grandson Harry Smee about his family, their business as pyrotechnical artists, and, finally, about Frank, Harry’s illustrious forebear.

I suspect a lot of sweat went into such artlessness, and it’s paid off, creating a book that reads like fascinating dinner conversation. Reading its best passages, I felt I was discovering Brock the way Harry had as a child, looking into his mother’s “ancient oak chests filled with papers, medals, newspapers, books, photographs, an Intelligence-issue knuckleduster and pieces of Zeppelin and Zeppelin bomb shrapnel.”

For eight generations, the Brock family produced pyrotechnic spectaculars of a unique kind. Typical set piece displays in the eighteenth century included “Jupiter discharging lightning and thunder, Two gladiators combating with fire and sword, and Neptune finely carv’d seated in his chair, drawn by two sea horses on fire-wheels, spearing a dolphin.”

Come the twentieth century, Brock’s shows were a signature of Empire. It would take a writer like Thomas Pynchon to do full justice to “a sixty foot-high mechanical depiction of the Victorian music-hall performer, Lottie Collins, singing the chorus of her famous song ‘Ta-ra-ra-boom-de-ay’ and giving a spirited kick of an automated leg each time the word ‘boom’ rang out.”

Frank was a Dulwich College boy, and one of that generation lost to the slaughter of the Great War. A spy and an inventor — James Bond and Q in one — he applied his inherited chemical and pyrotechnical genius to the war effort — by making a chemical weapon. It wasn’t any good, though: Jellite, developed during the summer of 1915 and named after its jelly-like consistency during manufacture, proved insufficiently lethal.

On such turns of chance do reputations depend, since we remember Frank Brock for his many less problematic inventions. Dover flares burned for seven and a half minutes and lit up an area three miles in radius, as Winston Churchill put it, “as bright as Piccadilly”. U-boats, diving to avoid these lights, encountered mines. Frank’s artificial fogs, hardly bettered since, concealed whole British fleets, entire Allied battle lines.

Then there are his incendiary bullets.

At the time of the Great War a decent Zeppelin could climb to 20,000 feet, travel at 47 mph for more than 1,000 miles, and stay aloft for 36 hours. Smee and Macrory are well within their rights to call them “the stealth bombers of their time”.

Brock’s bullets tore them out of the sky. Sir William Pope, Brock’s adviser and a professor of chemistry at Cambridge University, explained: “You need to imagine a bullet proceeding at several thousand feet a second, and firing as it passes through a piece of fabric which is no thicker than a pocket handkerchief.” All to rupture a gigantic sac of hydrogen sufficiently to make the gas explode. (Much harder than you might think; the Hindenburg only crashed because its entire outer envelope was set on fire.)

Frank died in an assault on the mole at Zeebrugge in 1918. He shouldn’t have been there. He should have been in a lab somewhere, cooking up another bullet, another light, another poison gas. Today, he surely would be suitably contained, his efforts efficiently channeled, his spirit carefully and surgically broken.

Frank lived at a time when it was possible — and men, at any rate, were encouraged — to be more than one thing. That this heroic idea overreached itself — that rugby field and school chemistry lab both dissolved seamlessly into the Somme — needs no rehearsing.

Still, we have lost something. When Frank went to school there was a bookstall near the station which sold “a magazine called Pluck, containing ‘the daring deeds of plucky sailors, plucky soldiers, plucky firemen, plucky explorers, plucky detectives, plucky railwaymen, plucky boys and plucky girls and all sorts of conditions of British heroes’.”

Frank was a boy moulded thus, and sneer as much as you want, we will not see his like again.


Are you experienced?

Reading Wildhood by Barbara Natterson-Horowitz and Kathryn Bowers for New Scientist, 18 March 2020

A king penguin raised on South Georgia Island, near Antarctica. A European wolf. A spotted hyena in the Ngorongoro crater in Tanzania. A north Atlantic humpback whale born near the Dominican Republic. What could these four animals have in common?

What if they were all experiencing the same life event? After all, all animals are born, and all of them die. We’re all hungry sometimes, for food, or a mate.

How far can we push this idea? Do non-human animals have mid-life crises, for example? (Don’t mock; there’s evidence that some primates experience the same happiness curves through their life-course as humans do.)

Barbara Natterson-Horowitz, an evolutionary biologist, and Kathryn Bowers, an animal behaviorist, have for some years been devising and teaching courses at Harvard and the University of California at Los Angeles, looking for “horizontal identities” across species boundaries.

The term comes from Andrew Solomon’s 2012 book Far from the Tree, which contrasts vertical identities (between you and your parents and grandparents, say) with horizontal identities, which are “those among peers with whom you share similar attributes but no family ties”. The authors of Wildhood have expanded Solomon’s concept to include other species; “we suggest that adolescents share a horizontal identity,” they write: “temporary membership in a planet-wide tribe of adolescents.”

The heroes of Wildhood — Ursula the penguin, Shrink the hyena, Salt the whale and Slavc the wolf — are all, loosely speaking, “teens”, and like teens everywhere, they have several mountains to climb at once. They must learn how to stay safe, how to navigate social hierarchies, how to communicate sexually, and how to care for themselves. They need to become experienced, and for that, they need to have experiences.

Well into the 1980s, researchers were discouraged from discussing the mental lives of animals. The change in scientific culture came largely thanks to the video camera. Suddenly it was possible for behavioral scientists to observe, not just closely, but repeatedly, and in slow motion. Soon discoveries arose that could not possibly have been arrived at with the naked eye alone. An animal’s supposedly rote, mechanical behaviours turned out to be the product of learning, experiment, and experience. Stereotyped calls and gestures were unpacked to reveal, not language in the human sense, but languages nonetheless, and many were of dizzying complexity. Animals that we thought were driven by instinct (a word you’ll never even hear used these days), turned out to be lively, engaged, conscious beings, scrabbling for purchase in a confusing and unpredictable world.

The four tales that make up the bulk of Wildhood are more than “Just So” stories. “Every detail,” the authors explain, “is based on and validated by data from GPS satellite or radio collar studies, peer-reviewed scientific literature, published reports and interviews with the investigators involved”.

In addition, each offers a different angle on a wealth of material about animal behaviour. Examples of animal friendship, bullying, nepotism, exploitation and even property inheritance arrive in such numbers and at such a level of detail, it takes an ordinary, baggy human word like “friendship” or “bullying” to contain them.

“Level playing fields don’t exist in nature”, the authors assert, and this is an important point, given the book’s central claim that by understanding the “wildhoods” of other animals, we can develop better approaches “for compassionately and skillfully guiding human adolescents toward adulthood.”

The point is not to use non-human behaviour as an excuse for human misbehaviour. Lots of animals kill and exploit each other, but that shouldn’t make exploitation or murder acceptable. The point is to know which battles to pick. Making schoolchildren’s playtimes boring by quashing the least sign of competitiveness makes little sense, given the amount of biological machinery dedicated to judging and ranking in every animal species from humans to lobsters. On the other hand, every young animal, when it returns to its parents, gets a well-earned break from the “playground” — but young humans don’t. They’re now tied 24-7 to social media that prolong, exaggerate and exacerbate the ranking process. Is the rise in mental health problems among the affluent young triggered by this added stress?

These are speculations and discussions for another book, for which Wildhood may prove a necessary and charming foundation. Introduced in the first couple of pages to California sea otters, swimming up to sharks one moment then fleeing from a plastic boat the next, the reader can’t help but recognise, in the animals’ overly bold and overly cautious behaviour, the gawkiness and tremor of their own adolescence.

Pollen count

THEY are red, they have stalks that look like eels, and no leaves. But Karl, the boss of the laboratory – played by the unsettling David Wilmot – has his eye on them for the forthcoming flower fair. He tells visiting investors that these genetically engineered creations are “the first mood-lifting, antidepressant, happy plant”.

Ben Whishaw’s character, Chris, smirks: “You’ll love this plant like your own child.”

Chris is in love with Alice, played by Emily Beecham, who is in love with her creations, her “Little Joes”, even to the point of neglecting her own son, Joe.

Owning and caring for a flower that, treated properly, will emit pollen that can induce happiness, would surely be a good thing for these characters. But the plant has been bred to be sterile, and it is determined to propagate itself by any means necessary.

Little Joe is an exercise in brooding paranoia, and it feeds off some of the more colourful fears around the genetic modification of plants.

Kerry Fox plays Bella, whose disappointments and lack of kids seem to put her in the frame of mind to realise what these innocent-looking blooms are up to. “The ability to reproduce is what gives every living thing meaning!” she exclaims. Her colleagues might just be sceptical about this because she is an unhappy presence in the lab, or they may already have fallen under the sway of Little Joe’s psychoactive pollen.

Popular fears around GM – the sort that dominated newspapers and scuppered the industry’s experimental programmes in the mid-1990s – are nearly as old as the science of genetics itself.

At about the turn of the 20th century, agricultural scientists in the US combined inbred lines of maize and found that crop yields were radically increased. Farmers who bought the specially bred seed found that their yields tailed off in subsequent years, so it made sense to buy fresh seed yearly because the profits from bigger crops more than covered the cost of new seeds.

In the late 1990s, Monsanto, a multinational agribusiness, added “terminator” genes to the seed it was developing, to prevent farmers resowing the product of the previous year’s crop. This didn’t matter to most farmers, but the world’s poorest, who still rely on replanting last year’s seed, were vociferous in their complaints, and a global scandal loomed.

Monsanto chose not, in the end, to commercialise its terminator technologies, but found it had already created a monster: an urban myth of thwarted plant fecundity that provides Jessica Hausner’s Little Joe with its science fictional plot.

What does Little Joe’s pollen do to people? Is it a vegetal telepath, controlling the behaviour of its subjects? Or does it simply make the people who enjoy its scent happier, more sure of themselves, more capable of making healthy life choices? Would that be so terrible? As Karl says, “Who can prove the genuineness of feelings? Moreover, who cares?”

Well, we do, or we should. If, like Karl, we come to believe that the “soul” is nothing more than behaviour, then people could become zombies tomorrow and no one would notice.

Little Joe’s GM paranoia may set some New Scientist readers’ teeth on edge, but this isn’t, ultimately, what the movie is about. It is after bigger game: the nature of human freedom.

Read more: https://www.newscientist.com/article/mg24532733-200-little-joe-review-we-should-worry-about-these-mind-bending-plants/

All fall down

Talking to Scott Grafton about his book Physical Intelligence (Pantheon), 10 March 2020.

“We didn’t emerge as a species sitting around.”

So says University of California neuroscientist Scott Grafton in the introduction to his provocative new book Physical Intelligence. In it, Grafton assembles and explores all the neurological abilities that we take for granted — “simple” skills that in truth can only be acquired with time, effort and practice. Perceiving the world in three dimensions is one such skill; so is steadily carrying a cup of tea.

At UCLA, Grafton began his career mapping brain activity using positron emission tomography, to see how the brain learns new motor skills and recovers from injury or neurodegeneration. After a career developing new scanning techniques, and a lifetime’s walking, wild camping and climbing, Grafton believes he’s able to trace the neural architectures behind so-called “goal-directed behavior” — the business of how we represent and act physically in the world.

Grafton is interested in all those situations where “smart talk, texting, virtual goggles, reading, and rationalizing won’t get the job done” — those moments when the body accomplishes a complex task without much, if any, conscious intervention. A good example might be bagging groceries. Suppose you are packing six different items into two bags. There are 720 possible ways to do this, and — assuming that like most people you want heavy items on the bottom, fragile items on the top, and cold items together — more than 700 of the possible solutions are wrong. And yet we almost always pack things so they don’t break or spoil, and we almost never have to agonise over the countless micro-decisions required to get the job done.

The grocery-bagging example is trivial, but often, what’s at stake in a task is much more serious — crossing the road, for example — and sometimes the experience required to accomplish it is much harder to come by. A keen hiker and scrambler, Grafton studs his book with first-hand accounts, at one point recalling how someone peeled off the side of a snow bank in front of him, in what escalated rapidly into a ghastly climbing accident. “At the spot where he fell,” he writes, “all I could think was how senseless his mistake had been. It was a steep section but entirely manageable. Knowing just a little bit more about how to use his ice axe, he could have readily stopped himself.”

To acquire experience, we have to have experiences. To acquire life-saving skills, we have to risk our lives. The temptation, now that we live most of our lives in urban comfort, is to create a world safe enough that we don’t need to expose ourselves to such risks, or acquire such skills.

But this, Grafton tells me, when we speak on the phone, would be a big mistake. “If all you ever are walking on is a smooth, nice sidewalk, the only thing you can be graceful on is that sidewalk, and nothing else,” he explains. “And that sets you up for a fall.”

He means this literally: “The number one reason people are in emergency rooms is from what emergency rooms call ‘ground-level falls’. I’ve seen statistics which show that more and more of us are falling over for no very good reason. Not because we’re dizzy. Not because we’re weak. But because we’re inept.”

For more than 1.3 million years of evolutionary time, hominids have lived without pavements or chairs, handling an uneven and often unpredictable environment. We evolved to handle a complex world, and a certain amount of constant risk. “Very enriched physical problem solving, which requires a lot of understanding of physical relationships, a lot of motor control, and some deftness in putting all those understandings together — all the while being constantly challenged by new situations — I believe this is really what drives brain networks towards better health,” Grafton says.

Our chat turns speculative. The more we remove risks and challenges from our everyday environment, Grafton suggests, the more we’re likely to want to complicate and add problems to the environment, to create challenges for ourselves that require the acquisition of unusual motor skills. Might this be a major driver behind cultural activities like music-making, craft and dance?

Speculation is one thing; serious findings are another. At the moment, Grafton is gathering medical and social data to support an anecdotal observation of his: that the experience of walking in the wild not only improves our motor abilities, but also promotes our mental health.

“A friend of mine runs a wilderness programme in the Sierra Nevada for at-risk teenagers,” he explains, “and one of the things he does is to teach them how to get by for a day or two in the wilderness, on their own. It’s life-transforming. They come out of there owning their choices and their behaviour. Essentially, they’ve grown up.”

Over-performing human

Talking to choreographer Alexander Whitley for the Financial Times, 28 February 2020

On a dim and empty stage, six masked black-clad dancers, half-visible, their limbs edged in light, run through attitude after attitude, emotion after emotion. Above the dancers, a long tube of white light slowly rises, falls, tips and circles, drawing the dancers’ limbs and faces towards itself like a magnet. Under its variable cold light, movements become more expressive, more laden with emotion, more violent.

Alexander Whitley, formerly of the Royal Ballet School and the Birmingham Royal Ballet, is six years into a project to expand the staging of dance with new media. He has collaborated with filmmakers, designers, digital artists and composers. Most of all, he has played games with light.

The experiments began with The Measures Taken, in 2014. Whitley used motion-tracking technology to project visuals that interacted with the performers’ movements. Then, dissatisfied with the way the projections obscured the dancers, in 2018 he used haze and narrowly focused bars of light to create, for Strange Stranger, a virtual “maze” in which his dancers found themselves alternately liberated and constrained.

At 70 minutes Overflow, commissioned by Sadler’s Wells Theatre, represents a massive leap in ambition. With several long-time collaborators — in particular the Dutch artist-designers Children of the Light — Whitley has worked out how to reveal, to an audience sat just a few feet away, exactly what he wants them to see.

Whitley is busy nursing Overflow up to speed in time for its spring tour. The company begin with a night at the Lowry in Salford on 18 March, before performing at Sadler’s Wells on 17 and 18 April.

Overflow, nearly two years in the making, has consumed money as well as time. The company is performing at Stereolux in Nantes in April and will need more overseas bookings if it is to flourish. “There’s serious doubt about the status of the UK and UK touring companies now,” says Whitley (snapping at my cheaply dangled Brexit bait); “I hope there’s enough common will to build relationships in spite of the political situation.”

It is easy to talk politics with Whitley (he is very well read), but his dances are anything but mere vehicles for ideas. And while Overflow is a political piece by any measure — a survey of our spiritual condition under surveillance capitalism, for heaven’s sake — its effects are strikingly classical. It’s not just the tricksy lighting that has me thinking of the figures on ancient Greek vases. It’s the dancers themselves and their clean, elegant, tragedian’s gestures.

A dancer kneels, and takes hold of his head. He tilts it up into the light as it turns and tilts, inches from his face, and, in a shocking piece of trompe l’oeil — can he really be pulling his face apart?

Overflow is about our relationship to the machines that increasingly govern our lives. But there’s not a hint of regimentation here, or mechanisation. These dancers are not trying to perform machine. They’re trying to perform human.

Whitley laughs at this observation. “I guess, as far as that goes, they’re over-performing human. They’re caught up in the excitement and hyper-stimulation of their activity. Which is exactly how we interact with social media. We’re being hyperstimulated into excessive activity. Keep scrolling, keep consuming, keep engaging!”

It was an earlier piece, 2016’s Pattern Recognition, that set Whitley on the road to Overflow. “I’d decided to have the lights moving around the stage, to give us the sense of depth we’d struggled to achieve in The Measures Taken. But very few people I talked to afterwards realised or understood that our mobile stage lights were being driven by real-time tracking. They thought you could achieve what we’d achieved just through choreography. At which point a really obvious insight arrived: that interactivity is interesting, first and foremost, for the actor involved in the interaction.”

In Overflow, that the audience feels left out is no longer a technical problem: it’s the whole point of the piece. “We’re all watching things we shouldn’t be watching, somehow, through social media and the internet,” says Whitley. “That the world has become so revealed is unpleasant. It’s over-exposed us to elements of human nature that should perhaps remain private. But we’re all bound up in it. Even if we’re not doing it, we’re watching it.”

The movements of the ensemble in Overflow are the equivalent of emoji: “I was interested in how we could think of human emotions just as bits of data,” Whitley explains. In the 1980s a psychologist called Robert Plutchik proposed that there were eight basic emotions: joy, trust, fear, surprise, sadness, anticipation, anger, and disgust. “We stuck pins at random into this wheel chart he invented, choosing an emotion at random, and from that creating an action that somehow embodied or represented it. And the incentive was to do so as quickly and concisely as possible, and as soon as it’s done, choose another one. So the dancers are literally jumping at random between all these different human emotions. It’s not real communication, just an outpouring of emotional information.”

The solos are built using material drawn from each dancer’s movement diary. “The dancers made diary entries, which I then filmed, based on how they were feeling each day. They’re movement diaries: personal documents of their emotional lives, which I then chopped up and jumbled around and gave back to them as a video to learn.”

In Whitley’s vision, the digital realm isn’t George Orwell’s Big Brother, dictating our every move from above. It’s more like the fox and the cat in the Pinocchio story, egging a naive child into the worst behaviours, all in the name of independence and free expression. “Social media encourage us to act more, to feel more, to express more, because the more we do that, the more capital they can generate from our data, and the more they can understand and predict what we’re likely to do next.”

This is where the politics comes in: the way “emotion, which incidentally is the real currency of dance, is now the major currency of the digital economy”.

It’s been a job of work, packing such cerebral content into an emotional form like dance. But Whitley says it’s what keeps him working: “that sheer impossibility of pinning down ideas that otherwise exist almost entirely in words. As soon as you scratch the surface, you realise there’s a huge amount of communication always at work through the body, and drawing ideas from a more cerebral world into the physical, into the emotional, is a constant fascination. There are lifetimes of enquiry here. It’s what keeps me coming back.”

Nicolas, c’est moi

Watching Color Out of Space for New Scientist, 12 February 2020

Nicolas Cage’s efforts to clear his debts after 2012’s catastrophic run-in with the IRS continue with yet another relatively low-budget movie, Color Out of Space, a film no-one expects much of. (It’s in US cinemas now; by the time it reaches UK screens, on 28 February, it will already be available on Blu-ray.)

Have you ever watched a bad film and found yourself dreaming about it months afterwards? Color Out of Space is one of those.

To begin: in March 1927 the author H. P. Lovecraft wrote what would become his personal favourite story. In “The Colour Out of Space”, a meteor crashes into a farmer’s field in the Massachusetts hills. The farmer’s crops grow huge, but prove inedible. His livestock go mad. So, in the end, does the farmer, haunted by a colour given off by a visiting presence in the land: a glow that belongs on no ordinary spectrum.

This is Lovecraft’s riff on a favourite theme of fin-de-siècle science fiction: the existence of new rays, and with them, new ways of seeing. The 1890s and 1900s were, after all, radiant years. Victor Schumann discovered vacuum ultraviolet radiation in 1893. Wilhelm Röntgen discovered X-rays in 1895. Henri Becquerel discovered radioactivity in 1896. J. J. Thomson discovered that cathode rays were streams of electrons in 1897. Prosper-René Blondlot discovered N-rays in 1903 — only they turned out not to exist: an artefact of observational error and wishful thinking.

And this is pretty much what the local media assume has happened when Nathan Gardner, the not-very-effective head of a household that is downsizing after unspecified health problems and financial setbacks, describes the malevolent light he catches spilling at odd moments from his well. The man’s a drunk, is what people assume. A fantasist. An eccentric.

The film is yet another attempt to fuse American Gothic to a contemporary setting. Director Richard Stanley (who brought us 1990’s Hardware, another valuable bad movie) has written a script that, far from smoothing out the discrepancies between modern and pre-modern proprieties, manners, and ways of speaking, leaves them jangling against each other in a way that makes you wonder What On Earth Is Going On.

And what is going on, most of the time, is Nicolas Cage as Gardner. Has anyone before or since conveyed so raucously and yet so well the misery, the frustration, the rage, the self-hatred of weak men? Every time he gets into a fist-fight with a car interior I think to myself, Ah, Nicolas, c’est moi.

Even better, Cage’s on-screen wife here is Joely Richardson, an actress who packs a lifetime’s disappointments into a request to pass the sugar.

Alien life is not like earth life and to confront it is to invite madness, is the general idea. But with tremendous support from on-screen children Madeleine Arthur and Brendan Meyer, Cage and Richardson turn what might have been a series of uninteresting personal descents into a family tragedy of Jacobean proportions. If ever hell were other people, then at its deepest point you would find the Gardner family, sniping at each other across the dinner table.

Color Out of Space mashes up psychological drama, horror, and alien invasion. It’s not a film you admire. It’s a film you get into internal arguments with, as you try and sort all the bits out. In short, it does exactly what it set out to do. It sticks.

An embarrassment, a blowhard, a triumph

Watching Star Trek: Picard for New Scientist, 24 January 2020

Star Trek first appeared on television on 8 September 1966. It has been fighting the gravitational pull of its own nostalgia ever since – or at least since the launch of the painfully careful spin-off Star Trek: The Next Generation 21 years later.

The Next Generation was the series that gave us shipboard counselling (a questionable idea), a crew that liked each other (a catastrophically mistaken idea) and Patrick Stewart as Jean-Luc Picard, who held the entire farrago together, pretty much single-handed, for seven seasons.

Now Picard is back, retired, written off, an embarrassment and a blowhard. And Star Trek: Picard is a triumph, praise be.

Something horrible has happened to the “synthetics” (read: robots) who, in the person of Lieutenant Commander Data (Brent Spiner, returning briefly here) once promised so much for the Federation. Science fiction’s relationship with its metal creations is famously fraught: well thought-through robot revolt provided the central premise for Battlestar Galactica and Westworld, while Dune, reinvented yet again later this year as a film by Blade Runner 2049‘s Denis Villeneuve, is set in a future that abandoned artificial intelligence following a cloudy but obviously dreadful conflict.

And there is a perfectly sound reason for this mayhem. After all, any machine flexible enough to do what a robot is expected to do is going to be flexible enough to down tools – or worse. What Picard‘s take on this perennial problem will be isn’t yet clear, but the consequences of all the Federation’s synthetics going haywire are painfully felt: it has all but abandoned its utopian remit. It is now just one more faction in a fast-moving, galaxy-wide power arena (echoes of the Trump presidency and its consequences are entirely intentional).

Can Picard, the last torchbearer of the old guard, bring the Federation back to virtue? One jolly well hopes so, and not too quickly, either. Picard is, whatever else we may say about it, a great deal of fun.

There are already some exciting novelties, though the one I found most intriguing may turn out to be a mere artefact of getting the show off the ground. Picard’s world – troubled by bad dreams quite as much as it is enabled by world-shrinking technology – is oddly surreal, discontinuous in ways that aren’t particularly confusing but do jar here and there.

Is the Star Trek franchise finally getting to grips with the psychological consequences of its mastery of time and space? Or did the producers simply shove as much plot as possible into the first episode to get the juggernaut rolling? The latter seems more likely, but I hold out hope.

The new show bears its burden of twaddle. The first episode features a po-faced analysis of Data’s essence. No, really. His essence. That’s a thing, now. How twaddle became an essential ingredient on The Next Generation – and now possibly Picard – is a mystery: the original Star Trek never felt the need to saddle itself with such single-use, go-nowhere nonsense. But by now, like a hold full of tribbles, the twaddle seems impossible to shake off (Star Trek: Discovery, I’m looking at you).

Oh, but why cavil? Stewart brings a new vulnerability and even a hint of bitterness to grit his seamlessly fluid recreation of Picard, and the story promises an exciting and fairly devastating twist to the show’s old political landscape. Picard, growing old disgracefully? Oh, please make it so!

“I heard the rustling of the dress for two whole hours”

By the end of the book I had come to understand why kindness and cruelty cannot vanquish each other, and why, irrespective of our various ideas about social progress, our sexual and gender politics will always teeter, endlessly and without remedy, between “Orwellian oppression and the Hobbesian jungle”…

Reading Strange Antics: A history of seduction by Clement Knox, 1 February 2020

“So that’s how the negroes of Georgia live!”

Visiting W.E.B. Du Bois: Charting Black Lives, at the House of Illustration, London, for the Spectator, 25 January 2020

William Edward Burghardt Du Bois was born in Massachusetts in 1868, three years after the official end of slavery in the United States. He grew up among a small, tenacious business- and property-owning black middle class who had their own newspapers, their own schools and universities, their own elected officials.

After graduating with a PhD in history from Harvard University, Du Bois embarked on a sprawling study of African Americans living in Philadelphia. At the historically black Atlanta University in 1897, he established international credentials as a pioneer of the newfangled science of sociology. His students were decades ahead of their counterparts in the Chicago school.

In the spring of 1899, Du Bois’s son Burghardt died, succumbing to sewage pollution in the Atlanta water supply. ‘The child’s death tore our lives in two,’ Du Bois later wrote. His response: ‘I threw myself more completely into my work.’

A former pupil, the black lawyer Thomas Junius Calloway, thought that Du Bois was just the man to help him mount an exhibition to demonstrate the progress that had been made by African Americans. Funded by Congress and planned for the Paris Exposition of 1900, the project employed around a dozen clerks, students and former students to assemble and run ‘the great machinery of a special census’.

Two studies emerged. ‘The Georgia Negro’, comprising 32 handmade graphs and charts, captured a living community in numbers: how many black children were enrolled in public schools, how far family budgets extended, what people did for work, even the value of people’s kitchen furniture.

The other, a set of about 30 statistical graphics, was made by students at Atlanta University and considered the African American population of the whole of the United States. Du Bois was struck by the fact that the illiteracy of African Americans was ‘less than that of Russia, and only equal to that of Hungary’. A chart called ‘Conjugal Condition’ suggests that black Americans were more likely to be married than Germans.

The Exposition Universelle of 1900 brought all the world to the banks of the Seine. Assorted Africans, shipped over for the occasion, found themselves in model native villages performing bemused and largely made-up rituals for the visitors. (Some were given a truly lousy time by their bosses; others lived for the nightlife.) Meanwhile, in a theatre made of plaster and drapes, the Japanese geisha Sada Yacco, wise to this crowd from her recent US tour, staged a theatrical suicide for herself every couple of hours.

The expo also afforded visitors more serious windows on the world. Du Bois scraped together enough money to travel steerage to Paris to oversee his exhibition’s installation at the Palace of Social Economy.

He wasn’t overly impressed by the competition. ‘There is little here of the “science of society”,’ he remarked, and the organisers of the Exposition may well have agreed with him: they awarded him a gold medal for what Du Bois called, with justifiable pride, ‘an honest, straightforward exhibit of a small nation of people, picturing their life and development without apology or gloss, and above all made by themselves’.

At the House of Illustration in London you too can now follow the lines, bars and spirals that reveal how black wealth, literacy and land ownership expanded over the four decades since emancipation.

Du Bois’s exhibition also included what he called ‘the usual paraphernalia for catching the eye — photographs, models, industrial work, and pictures’. So why did he include so many charts, maps and diagrams?

The point about data is that it looks impersonal. It is a way of separating your argument from what people think of you, and this makes it a powerful weapon in the hands of those who find themselves mistrusted in politics and wider society. Du Bois and his community, let’s not forget, were besieged — by economic hardship, and especially by the Jim Crow laws that would outlive him by two years (he died in 1963).

Du Bois pioneered sociology, not statistics. Means of visualising data had entered academia more than a century before, through the biographical experiments of Joseph Priestley. His timeline charts of people’s lives and relative lifespans had proved popular, inspiring William Playfair’s invention of the bar chart. Playfair, an engineer and political economist, published his Commercial and Political Atlas in London in 1786. It was the first major work to contain statistical graphs. More to the point, it was the first time anyone had tried to visualise an entire nation’s economy.

Statistics and their graphic representation were quickly established as an essential, if specialised, component of modern government. There was no going back. Metrics are a self-fertilising phenomenon. Arguments over figures, and over the meaning of figures, can only generate more figures. The French civil engineer Charles Joseph Minard used charts in the 1840s to work out how to monetise freight on the newfangled railroads, then, in retirement, and for a hobby, used two colours and six dimensions of data to visualise Napoleon’s invasion and retreat during the Russian campaign of 1812.

And where society leads, science follows. John Snow founded modern epidemiology when his annotated map revealed the source of an outbreak of cholera in London’s Soho. English nurse Florence Nightingale used information graphics to persuade Queen Victoria to improve conditions in military hospitals.

Rightly, we care about how accurate or misleading infographics can be. But let’s not forget that they should be beautiful. The whole point of an infographic is, after all, to capture attention. Last year, the House of Illustration ran a tremendous exhibition of the work of Marie Neurath who, with her husband Otto, dreamt up a way of communicating, without language, by means of a system of universal symbols. ‘Words divide, pictures unite’ was the slogan over the door of their Viennese design institute. The couple’s aspirations were as high-minded as their output was charming. The Neurath stamp can be detected, not just in kids’ picture books, but across our entire designscape.

Infographics are prompts to the imagination. (One imagines at least some of the 50 million visitors to the Paris Expo remarking to each other, ‘So that’s how the negroes of Georgia live!’) They’re full of facts, but do they convey them more effectively than language? I doubt it. Where infographics excel is in eliciting curiosity and wonder. They can, indeed, be downright playful, as when Fritz Kahn, in the 1920s, used fast trains, street traffic, dancing couples and factory floors to describe, by visual analogy, the workings of the human body.

Du Bois’s infographics aren’t rivals to Kahn or the Neuraths. Rendered in ink, gouache, watercolour and pencil, they’re closer in spirit to the hand-drawn productions of Minard and Snow. They’re the meticulous, oh-so-objective statements of a proud, decent, politically besieged people. They are eloquent in their plainness, as much as in their ingenuity, and, given a little time and patience, they prove to be quite unbearably moving.