A lawyer scenting blood

Reading Unwired by Gaia Bernstein for New Statesman, 15 May 2023

In 2005, the journal Obesity Research published a study that, had we but known it, told us everything we needed to know about our coming addiction to digital devices.

The paper, “Bottomless Bowls: Why Visual Cues of Portion Size May Influence Intake” was about soup. Researchers led by Brian Wansink of Cornell University invited volunteers to lunch. One group ate as much soup as they wanted from regular bowls. The other ate from bowls that were bolted to the table and refilled automatically from below. Deprived of the “stopping signal” of an empty bowl, this latter group ate 73 per cent more than the others — and had no idea that they had over-eaten.

It’s a tale that must haunt the dreams of Aza Raskin, the man who invented, then publicly regretted, “infinite scroll”. That’s the way mobile phone apps (from Facebook to Instagram, Twitter to Snapchat) provide endless lists of fresh content to the user, regardless of how much content has already been consumed.

Gaia Bernstein, a law professor at Seton Hall, includes infinite scroll in her book’s catalogue of addicting smart-device features. But this is as much a story about what these devices don’t do. For instance, in his 2022 book Stolen Focus, Johann Hari wonders why Facebook never tells you which of your friends are nearby and up for a coffee. Well, the answer’s obvious enough: because lonely people, self-medicating with increasing quantities of social media, are Facebook’s way of making money.

What do we mean when we say that our mobile phones and tablets and other smart devices are addicting?

The idea of behavioural addiction was enshrined in DSM-5, the manual of mental disorders issued by the American Psychiatric Association, in 2013. DSM-5 is a bloated beast, and yet its flaky-sounding “Behavioral Addictions” — which, on the face of it, could make a mental disorder of everything we like to do — have proved remarkably robust, as medicine reveals how addictions, compulsions and enthusiasms share the same neurological pathways. You can addict humans (and not just humans) to pretty much anything. All you need to do is weaponise the environment.

And the environment, according to Bernstein’s spare, functional and frightening account, is most certainly weaponised. Teenagers, says Bernstein, spend barely a third of the time partying that they used to in the 1980s, and the number of teens who get together with their friends halved between 2000 and 2015. If ever there was a time to market a service to lonely people by making them more lonely, it’s now.

For those of us who want to sue GAMA (Google, Amazon, Meta, Apple) for our children’s lost childhood, galloping anxiety, poor impulse control, obesity, insomnia and raised suicide risk, the challenge is to demonstrate that it’s screentime that’s done all this damage to how they feel, and how they behave. And that, in an era of helicopter-parenting, is hard to do. danah boyd’s 2014 book It’s Complicated shows how difficult it’s going to be to separate the harms inflicted by little Johnny’s iPhone from all the benefits little Johnny enjoys. To hear boyd tell it, teenagers “obsessed” with social media are simply trying to recreate, for themselves and each other, a social space denied them by anxious parents, hostile authorities, and a mass media bent on exaggerating every conceivable out-of-doors danger.

The Covid pandemic has only exacerbated the stay-at-home, see-no-one trend among young people. Children’s average time online doubled from three to six hours during lockdown. It used to be that four per cent of children spent more than eight hours a day in front of a smart screen. Now over a quarter of them do.

Nor have we merely inherited this dismal state of affairs; we’ve positively encouraged it, stuffing our schools with technological gewgaws in the fond and (as it turns out) wildly naive belief that I.T. will improve and equalise classroom performance. (It doesn’t, which is why Silicon Valley higher-ups typically send their children to Waldorf schools, which use chalk, right up until the eighth grade.)

Bernstein, who regularly peppers an otherwise quite dry account with some eye-popping personal testimony, recalls meeting one mum whose son was set to studying history through a Roblox game mode called Assassin’s Creed Odyssey (set in ancient Greece). “Since then, whenever she asks him to get off Roblox, he insists it is homework.”

Bernstein believes there’s more to all this than a series of unfortunate events. She thinks the makers of smart devices knew exactly what they were doing, as surely as the tobacco companies knew that the cigarettes they manufactured caused cancer.

Bernstein reckons we’re at a legal tipping point: this is her playbook for making GAMA pay for addicting us to glass.

Here’s what we already know about how companies respond to being caught out in massive wrong-doing.

First, they ignore the problem. (In 2018 an internal Facebook presentation warned: “Our algorithm exploits the human brain’s attraction to divisiveness… If left unchecked [it would feed users] more and more divisive content to gain user attention & increase time on the platform.” Mark Zuckerberg responded by asking his people “not to bring something like that to him again”.)

Then they deny there’s a problem. Then they go to war with the science, disputing critical studies and producing their own. Then they fend off public criticism — and place responsibility on the consumer — by offering targeted solutions. (At least the filter tips added to cigarettes were easy to use. Most “parental controls” on smart devices are so cumbersome and inaccessible as to be unusable.) Finally, they offer to create a system of self-regulation — by which time, Bernstein reckons, you will have won, so long as you have proven that the people you’re going after intended, all along, to addict their customers.

You might, naively, imagine that this matter rests upon the science. It doesn’t, and Bernstein’s account of the screentime science wars is quite weak — a shallow confection built largely of single studies.

The scientific evidence is stronger than Bernstein makes it sound, but there’s still a problem: it’ll take a generation to consolidate. There are other, better ways to get at the truth in a timely manner; for instance, statistics, which will tell you that we have the largest ever recorded epidemic of teenage mental health problems, whose rising curves correlate with terrifying neatness with the launch of various social media platforms.

Bernstein is optimistic: “Justifying legal interventions,” she says, “is easier when the goal is to correct a loss of autonomy”, and this, after all, is the main charge she’s laying at GAMA’s door: that these companies have created devices that rob us of our will, leaving us ever more civically and psychologically inept, the more we’re glued to their products.

Even better (at least from the point of view of a lawyer scenting blood), we’re talking about children. “Minors are the Achilles heel,” Bernstein announces repeatedly, and with something like glee. Remember how the image of children breathing in their parents’ second-hand smoke broke big tobacco? Well, just extend the analogy: here we have a playground full of kids taking free drags of Capstans and Players No. 6.

Unwired is not, and does not aspire to be, a comprehensive account of the screen-addiction phenomenon. It exists to be used: an agenda for social change through legal action. It is a knife, not a brush. But it’ll be of much more than academic value to those of us whose parenting years were overshadowed by feelings of guilt, frustration and anxiety, as we fought our hopeless battles, and lost our children to TikTok and Fortnite.

On not being a horrible person

Reading The Human Mind by Paul Bloom for New Scientist, 11 May 2023

Inspired, he tells us, by The Origin of the Universe, John Barrow’s 1994 survey of what was then known about cosmology, the Canadian American psychologist Paul Bloom set about writing an introductory tome of his own: a brief yet comprehensive guide to the human mind.

Emulating Barrow’s superb survey has been hard because, as Bloom cheekily points out, “the mysteries of space and time turn out to be easier for our minds to grasp than those of consciousness and choice.”

The first thing to say — though hardly the most interesting — is that Bloom nevertheless succeeds, covering everything from perception and behaviour to language and development; there’s even a small but very worthwhile foray into abnormal psychology. It’s an account that is positive, but never self-serving. Failures to replicate some key studies, the field’s sometimes scandalous manipulation of statistics, and the once-prevailing assumption that undergrad volunteers could accurately represent the diversity of the entire human species are serious problems, dealt with seriously.

Of course Bloom does more than simply set out the contents of the stall (with the odd rotten apple here and there); he also explores psychology’s evolving values. He recalls his early behaviourist training, in a climate hostile to (then rather woolly) questions about consciousness. “If we were asked to defend our dismissal of consciousness,” he recalls, “we would point out that intelligence does not require sentience.”

Intelligence is no longer the field’s only grail, and consciousness is now front and centre in the science of the mind. This is not only a technical advance; it’s an ethical one. In 1789 Jeremy Bentham asked whether the law could ever refuse its protection to “any sensitive being”, and pointed out that “The question is not, Can [certain beings] reason?, nor Can they talk? but, Can they suffer?”

Suffering requires consciousness, says Bloom; understanding one enables us to tackle the other; so the shift in interest to consciousness itself is a welcome and humanising move.

This strong belief in the humanitarian potential of psychology allows Bloom to defend aspects of his discipline that often discomfort outside observers. He handles issues of environmental and genetic influences on the mind very well, and offers a welcome and robust defence of Alfred Binet’s 1905 invention, the measure of general intelligence or “intelligence quotient”. Bloom shows that the IQ test is as robust a metric as anything in social science. We know that a full half of us score less than 100 on that test; should this knowledge not fill us with humility and compassion? (Actually our responses tend to be more ambiguous. Bloom points out that Nazi commentators hated the idea of IQ because they thought Jews would score better than they would.)

Bloom is concerned to demonstrate that minds do more than think. The privileging of thinking over feeling and intuiting and suffering is a mistake. “A lot depends on what is meant by ‘rational,’” Bloom writes. If you’re stepping outside and it’s raining and you don’t want to get wet, it’s rational to bring an umbrella. But rationality defined in this manner is separate from goodness. “Kidnapping a rich person’s child might be a rational way to achieve the goal of getting a lot of money quickly,” Bloom observes, “so long as you don’t have other goals, such as obeying the law and not being a horrible person.”

Bloom’s ultimate purpose is to explain how a robustly materialistic view of the mind is fully compatible with the existence of choice and morality and responsibility. This middle-of-the-road approach may disappoint intellectual storm-chasers, but the rest of us can be assured of an up-to-the-minute snapshot of the field, full of unknowns and uncertainties, yes, and speculations, and controversies — but guided by an ever-more rounded idea of what it is to be human.

The mind unlocked

Reading The Battle for Your Brain by Nita Farahany for New Scientist, 19 April 2023

Iranian-American ethicist and lawyer Nita Farahany is no stranger to neurological intervention. She has sought relief from her chronic migraines in “triptans, anti-seizure drugs, antidepressants, brain enhancers, and brain diminishers. I’ve had neurotoxins injected into my head, my temples, my neck, and my shoulders; undergone electrical stimulation, transcranial direct current stimulation, MRIs, EEGs, fMRIs, and more.”

Few know better than Farahany what neurotechnology can do for people’s betterment, and this lends weight to her sombre and troubling account of a field whose speed of expansion alone should give us pause.

Companies like Myontec, Athos, Delsys and Noraxon already offer electromyography-generated insights to athletes and sports therapists. Control Bionics sells NeuroNode, a wearable EMG device for patients with degenerative neurological disorders, enabling them to control a computer, tablet, or motorised device. Neurable promises “the mind unlocked” with its “smart headphones for smarter focus.” And that’s before we even turn to the fast-growing interest in implantable devices; Synchron, Blackrock Neurotech and Elon Musk’s Neuralink all have prototypes in advanced stages of development.

Set aside the legitimate medical applications for a moment; Farahany is concerned that neurotech applications that used to let us play video games, meditate, or improve our focus have opened the way to a future of brain transparency “in which scientists, doctors, governments, and companies may peer into our brains and minds at will.”

Think it can’t be done? Think again. In 2017 a research team led by UC Berkeley computer scientist Dawn Song reported an experiment in which videogamers used a neural interface to control a video game. As they played, the researchers inserted subliminal images into the game and watched for unconscious recognition signals. This game of neurological Battleships netted them one player’s credit card PIN code — and their home address.

Now Massachusetts-based Brainwave Science is selling a technology called iCognative, which it claims can extract information from people’s brains. Suspects are shown pictures related to crimes and cannot help but recognise whatever they happen to recognise — a murder weapon, for example. Emirati authorities have already successfully prosecuted two cases using this technology.

This so-called “brain fingerprinting” technique is as popular with governments (Bangladesh, India, Singapore, Australia) as it is derided by many scientists.

More worrying are the efforts of companies, in the post-Covid era, to use neurotech in their continuing effort to control the home-working environment. So-called “bossware” programmes already take regular screenshots of employees’ work, monitor their keystrokes and web usage, and photograph them at (or not at) their desks. San Francisco bioinformatics company Emotiv now offers to help manage your employees’ attention with its MN8 earbuds. These can indeed be used to listen to music or participate in conference calls — and also, Emotiv claims, with just two electrodes, one in each ear, to record employees’ emotional and cognitive functions in real time.

It’ll come as no surprise if neurotech becomes a requirement in modern workplaces: no earbuds, no job. This sort of thing has happened many times already.

“As soon as [factory] workers get used to the new system their pay is cut to the former level,” complained Vladimir Lenin in 1912. “The capitalist attains an enormous profit, for the workers toil four times as hard as before and wear down their nerves and muscles four times as fast as before.”

Six years later, he approved funding for a Taylorist research institute. Say what you like about industrial capitalism, its logic is ungainsayable.

Farahany has no quick fixes to offer for this latest technological assault on the mind — “the one place of solace to which we could safely and privately retreat”. Her book left me wondering what to be more afraid of: the devices themselves, or the glee with which powerful institutions seize upon them.

“And your imaginations would again run out of room…”

Reading The Beetle in the Anthill and The Waves Extinguish the Wind by Arkady and Boris Strugatsky. For The Times, 18 April 2023 

In Arkady and Boris Strugatsky’s The Beetle in the Anthill, zoopsychologist Lev Abalkin and his alien companion, a sentient canine “bighead” called Puppen-Itrich, are sent to the ruined and polluted planet Hope to find out what happened to its humanoid population. Their search leads them through an abandoned city to a patch of tarmac, which Puppen insists is actually an interdimensional portal. Lev wonders what new world this portal might lead to.

‘“Another world, another world…” grumbles Puppen. “As soon as you made it to another world, you’d immediately begin to remake it in the image of your own. And your imaginations would again run out of room, and then you’d look for another world, and you’d begin to remake that one, too.”’

Futility sounds like a funny sort of foundation for an enjoyable book, but the Strugatskys wrote a whole series of them, and they amount to a singular triumph. Fresh translations of the final two “Noon universe” books are being published this month.

Arkady Strugatsky was born in Batumi, Georgia, in 1925. His kid brother Boris, born in Leningrad in 1933, outlived him by nearly twenty years, though without his elder brother to bounce ideas off, he found little to write about. The brothers dominated Soviet science fiction throughout the 1970s. Their earliest works toed the socialist-realist line and featured cardboard heroes who (to the reader’s secret relief) eventually sacrificed themselves for the good of Humanity. But their interest in people became too much for them, and they ended up writing angst-ridden masterpieces like Roadside Picnic (which everyone knows, because Andrei Tarkovsky’s film Stalker is based on it) and Lame Fate/Ugly Swans (which no-one knows, though Maya Vinokour’s cracking English translation came out in 2020). Far too prickly to be published in the Soviet Union, their best work circulated in samizdat and (often unauthorised) translation.

The stories and novels of their “Noon universe” series ask what humans and aliens, meeting among the stars, would get up to with each other. Ordinary conflict is out of the question, since spacefaring civilisations have access to infinite resources. (The Noon universe is a techno-anarchist utopia, as is Iain Banks’s Culture, as is Star Trek’s Federation, and all for the same unassailable reason: there’s nothing to stop such a strange and wonderful society from working.)

The Strugatskys assume that there’s only one really grand point to life as a technically advanced species — and that is to see to the universe’s well-being by nurturing sentience, consciousness, and even happiness.

To which you can almost hear Puppen grumble: Yes, but what sort of consciousness are you talking about? What sort of happiness are you promoting? In The Waves Extinguish the Wind (originally translated into English as The Time Wanderers), as he contemplates the possibility that humans are themselves being “gardened” by a superior race dubbed “Wanderers”, alien-chaser Toivo Glumov complains, “Nobody believes that the Wanderers intend to do us harm. That is indeed extremely unlikely. It’s something else that scares us! We’re afraid that they will come and do good, as they understand it!”

Human beings and the Wanderers (whose existence can only ever be inferred, never proved) are the only sentient species who bother with outer space, and stick their noses into what’s going on among people other than themselves. And maybe Puppen is right; maybe such cosmic philanthropy boils down, in the end, to nothing more than vanity and overreach.

By the time of these last two novels, the Wanderers’ interference in human affairs is glaring, though it’s still impossible to prove.

In The Beetle in the Anthill Maxim Kammerer — a former adventurer, now a prominent official — is set on the trail of Lev Abalkin, a rogue “progressor” who is heading back to Earth.

Progressors travel from planet to planet and go undercover in “backward” societies to promote their technical and social development. But why shouldn’t Abalkin come home for a bit? He’s spent fifteen years doing a job he never wanted to do, in the remotest outposts, and he’s just about had enough. “Damn it all,” Kammerer complains, “would it really be so surprising if he had finally run out of patience, given up on COMCON and Headquarters, abandoned his military discipline, and come back to Earth to sort things out?”

By degrees, Kammerer and the reader discover why Kammerer’s bosses are so afraid of Abalkin’s return: he may, quite unwittingly, be a “Wanderer” agent.

So an individual’s ordinary hopes and frustrations play out against a vast, unsympathetic realpolitik. This is less science fiction than spy fiction — The Spy Who Came in From the Cold against a cosmic backdrop. And it’s tempting, though reductive, to observe the whole Noon universe through a Cold War lens. Boris himself says in his afterword to The Beetle…:

“We were writing a tragic tale about the fact that even in a kind, gentle, and just world, the emergence of a secret police force (of any type, form, or style) will inevitably lead to innocent people suffering and dying.”

But the Noon universe is no bald political parable, and it’s certainly not satire. Rather, it’s an unflinching working-out of what Soviet politics would look like if it did fulfil its promise. It’s a philosophical solvent, stripping away our intellectual vanities — our ideas of manifest destiny, our “outward urge” and all the rest — to expose our terrible littleness, and tremendous courage, in the face of a meaningless universe.

In their final novel The Waves Extinguish the Wind — assembled from fictional documents, reports, letters, transcripts and the like — we follow a somewhat older and wiser Maxim Kammerer as he oversees the heartbreaking efforts of his protégé Toivo Glumov to prove the existence of the Wanderers once and for all. It’s an odyssey (involving peculiar disappearances, bug-eyed monsters and a bad-tempered wizard) that would be farcical, were it not tearing Glumov’s life to pieces.

Kammerer reckons Glumov is a fanatic. Does it even matter that humans are being tended and “progressed” by some superior race of gardener? “After all,” Kammerer says to his boss, Excellentz, “what’s the worst we can say about the Wanderers?” He’s thinking back to the planet called Hope, and that strange square of tarmac: “They saved the population of an entire planet! Several billion people!”

“Except they didn’t save the population of the planet,” Excellentz points out. “They saved the planet from its population! Very successfully, too… And where the population has gone — that’s not for us to know.”

Not our Battle of Britain

Watching Andrew Legge’s film Lola for New Scientist, 12 April 2023

Two sisters, orphans, play among the leavings of their parents’ experiments in radio, and by 1938 the one who’s a genius, Thomasina (Emma Appleton), is listening to David Bowie’s “Space Oddity” on a ceiling-high television set that can tune in to the future.

The politics of the day being what it is, Thom’s sister Martha (Stefanie Martini) decides that this invention (named Lola after their dead mother) cannot remain their personal plaything — it belongs to the world. With the help of Sebastian, a sympathetic army officer (Martha soon falls in love with him), the sisters begin collaborating with British intelligence to fox Nazi operations a day before they happen.

Drunk on success, Thom lets her ambition get the better of her, and starts sacrificing the civilians of tomorrow in order to draw out the Wehrmacht. When a horrified President Roosevelt catches wind of this, it spells the end of Churchill’s efforts to draw the US into the war against Hitler.

Good intentions, ambitious plans and unintended consequences usher the world into Hell in this often stunning piece of micro-budget science fiction. As high concept movie ideas go, Lola’s counterfactual 20th-century history is up there with Memento and Primer and Source Code.

Attentive readers will feel a “but” hovering here. For some reason the director and co-writer Andrew Legge took a day of rest after fleshing out this winning idea; he seems neither to have finished the script, nor given his actors much directorial guidance. Lola is more a short story narrated to a visual accompaniment than a fully fledged film. Thom and Mars are supposed to be nice 1930s gals transfigured by their access to glimpses of 1960s pop culture — but it’s impossible not to see them for what they are, personable young actors from the 2020s let loose to do their thing in front of the camera.

This makes Lola a good movie, rather than a great one — and it’s a shame. Some extra scriptwork and a spot of voice coaching would have added hardly anything to Lola’s admittedly tight budget. In 2009, Legge made The Chronoscope, a 20-minute foray into the same territory. Lola is more solemn than that short outing, but no more serious, as though Legge were intimidated, rather than inspired, by the possibilities offered by the feature format.

Elsewhere, the film’s resources are deployed with flair and ingenuity. The film is an historically and technologically impossible but highly convincing assembly of found footage and home movie. (Among Thom’s other incidental inventions is a hand-held camera that records sound.) Famous radio broadcasts of the period are repurposed to chilling effect. (Lola’s “Battle of Britain” is not our Battle of Britain.) The Zelig-like manipulations of newsreel footage are fairly crude in purely technical terms, but I defy you not to gasp at the sight of Nazi invaders waving their Swastika over a bombed-out London, or Adolf Hitler being driven in state down the Mall. And Neil Hannon (the maverick musical talent behind The Divine Comedy, not to mention Father Ted’s “My Lovely Horse” song) has a quite indecent amount of fun here, cooking up the beats of a counterfactual 1970s fascist Top 10.

These days the choice confronting British and Irish filmmakers is stark: do you want to make your movie as quickly as possible, on the lowest possible budget, get it seen, and generate interest? Or do you want to spend twenty years in development hell, working with overseas production companies who don’t know whether they can trust you, and — with many millions of dollars on the line — are likely to homogenise your project out of all recognition?

I wish Lola had impressed me less and involved me more. But in a business as precarious as this one, Legge’s choices make sense, and Lola is an effective and enjoyable industry calling card.

The Value of Psychotic Experience

Reading The Best Minds: A story of friendship, madness and the tragedy of good intentions by Jonathan Rosen. For the Telegraph, 3 April 2023

This is the story of the author’s lifelong friendship with Michael Laudor: his neighbour growing up in 1970s Westchester County; his rival at Yale; and his role-model as he abandoned a lucrative job for a life of literary struggle. Through what followed — debilitating schizophrenia; law school; and national celebrity as he publicised the plight of people living with mental illness — Laudor became something of a hero for Jonathan Rosen. And in June 1998, in the grip of yet another psychotic episode, he stabbed his pregnant girlfriend to death.

Her name was Caroline Costello and, says Rosen, she would not have been the first person to ascribe everything in Laudor’s disintegration, “from surface tremors to elliptical apocalyptic utterances, to the hidden depths of a complex soul.”

It will take Rosen over 500 pages to unpick Costello and Laudor’s tragedy, but he starts (and so, then, will we) with the Beats — that generation of writers who, he says, “released madness like a fox at a hunt, then rushed after — not to help or heal but to see where it led, and to feel more alive while the chase was on.”

As a young man, the poet Allen Ginsberg (whose Beat poem “Howl” gives this book its title) gave permission for his mother’s lobotomy. He spent the rest of his life atoning for this, “spinning the culture around him,” says Rosen, “into alignment with his mother’s psychosis”. By the summer of 1968 the Esalen Institute in Big Sur was sponsoring events under the heading “The Value of Psychotic Experience”.

To the amateur dramatics of the Beat generation (who declared mental illness a myth) add the hypocrisies of neo-Marxist critics like Paul de Man (who proved texts mean whatever we want them to mean) and Frantz Fanon (for whom violence was a cleansing force with a healing property) and the half-truths of anti-psychiatrists like Félix Guattari, who hid his clinic’s use of ECT to bolster his theory that schizophrenia was a disease of capitalist culture. Stir in the naiveties of the community health movement (that judged all asylums prisons), and policies ushered in by Jack Kennedy’s Community Mental Health Act of 1963 (“predicated,” says Rosen, “on the promise of cures that did not exist, preventions that remained elusive, and treatments that only work for those who were able to comply”); and you will begin to understand the sheer enormity of the doom awaiting Laudor and those he loved, once the 1980s had “backed up an SUV” over the ruins of America’s mental health provision.

This is a tragedy that enters wearing the motley of farce: on leaving Yale, Laudor became a management consultant, hired by the multinational firm Bain & Co. Rosen reckons Laudor got the job by talking with authority even when he didn’t know what he was talking about; “That, in fact, was why they had hired him.”

By the time Laudor enters Yale Law School, however, his progressive disintegration is clear enough to the reader (if not to the Dean of the school, Guido Calabresi, besotted with the way an understanding of mental health could, in Rosen’s words, “undermine the authority of courts, laws, facts, and judges, by exposing the irrational nature of the human mind”).

Soon Michael is being offered a million dollars for his memoir and another million for the movie rights. He’s the poster child for every American living with mental illness, and at the same time, he’s an all-American success story: proof that the human spirit can win out over psychiatric neglect.

Only it didn’t.

No one knows what to do about schizophrenia. The treatments don’t work, except sometimes they do, and when they do, no-one can really say why.

Rosen’s book is a devastating attack on a generation that refused to look this hard truth in the eye, and turned it, instead, into some sort of flattering sociopolitical metaphor, and in so doing, deprived desperate people of care.

Rosen’s book is the mea culpa of a man who now understands that “the revolution in consciousness I hoped would free my mind… came at the expense of people whose mental pain I could not begin to fathom.”

It’s the darkest of literary triumphs, and the most gripping of unbearable reads.

The monster comes from outside

Reading To Battersea Park by Philip Hensher for The Spectator, 1 April 2023

We never quite make it to Battersea Park. By the time the narrator and his husband reach its gates, it’s time for them, and us, to return home.

The narrator is a writer, living just that little bit too far away from Battersea Park, inspired by the eeriness of the Covid lockdown regime but also horribly blocked. All kinds of approaches to fiction beckon to him in his plight, and we are treated to not a few of them here.

Each section of this short novel embodies a literary device. We begin, maddeningly, in “The Iterative Mood” (“I would have”, “She would normally have”, “They used to…”) and we end in “Entrelacement”, with its overlapping stories offering strange resolutions to this polyphonous, increasingly surreal account of the lockdown uncanny. Every technique the narrator employs is an attempt to witness strange times using ordinary words.

Hensher didn’t just pluck this idea out of the void. Fiction has a nasty habit of pratfalling again and again at the feet of a contemporary crisis. Elizabeth Bowen’s The Heat of the Day (the Blitz) dribbles away into an underpowered spy thriller; Don DeLillo’s Falling Man (the September 11 attacks) only gets going in the last few dozen pages, when the protagonist quits New York for the poker-tournament circuit. Mind you, indirection may prove to be a winning strategy in itself. The most sheerly enjoyable section of To Battersea Park is a “hero’s journey” set in post-apocalyptic Whitstable. Hensher nails perfectly the way we distance ourselves from a crisis by romanticising it.

Milan Kundera wrote about this — about how “the monster comes from outside and is called History” — impersonal, uncontrollable, incalculable, incomprehensible and above all inescapable.

In To Battersea Park, Hensher speaks to the same idea, and ends up writing the kind of book Kundera wrote: one that appeals, first of all — almost, I would say, exclusively — to other writers.

In the middle of the book there’s a short scene in which a journalist interviews a novelist called Henry Ricks Bailey, and Bailey says:

“When people talk about novels, if they talk at all, they talk about the subject of those novels, or they talk about the life of the person who wrote it. This is a wonderful book, they say. It’s about a couple who fall in love during the Rwandan Genocide, they say… It’s as if all one had to do to write a novel is pick up a big box of stuff in one room and move it into the next.”

This (of course, and by design) borders on the infantile: the writer boo-hooing because the reader has had the temerity to beg a moral.

Hensher is more circumspect: he understands that the more you do right by events — the endless “and-then”-ness of everything — the less you’re going to be able to interest a reader, who has after all paid good money to bathe in causes and consequences, in “becauses” and “buts”.

To Battersea Park reveals all the ways we try to comprehend a world that isn’t good or fair, or causal, or even comprehensible. It’s about how we reduce the otherwise ungraspable world using conventions, often of our own devising. An elderly man fills half his house with a model railway. A dangerously brittle paterfamilias pumps the air out of his marriage. A blocked writer experiments with a set of literary devices. A horrified child sets sail in an imaginary boat. It’s a revelation: a comedy of suburban manners slowed to the point of nightmare.

That said, I get nervous around art that’s so directly addressed to the practitioners of that art. It’s a novel that teaches, more than it inspires, and a small triumph, in a world that I can’t help but feel is gasping for big ones.
“The white race cannot survive without dairy products”

Visiting Milk at London’s Wellcome Collection. For the Telegraph, 29 March 2023

So — have you ever drunk a mother’s milk? As an adult, I mean. Maybe you’re a body-builder, following an alternative health fad; maybe you’re a fetishist; or you happened to stumble into the “milk bar” operated now and again by performance artist Jess Dobkin, whose specially commissioned installation For What It’s Worth — an “unruly archive” of milk as product, labour and value — brings the latest exhibition at London’s Wellcome Collection to a triumphant, chaotic and decidedly bling climax.

Why is breast milk such a source of anxiety, disgust, fascination and even horror? (In Sarah Pucill’s 1995 video Backcomb, on show here, masses of dark, animated hair slither across a white tablecloth, upturning containers of milk, cream and butter.)

Curators Marianne Templeton and Honor Beddard reckon our unease has largely to do with the way we have learned to associate milk almost entirely with cow’s milk, which we now consume on an industrial scale. It’s no accident that, as you enter their show, an obligatory Instagram moment is provided by Julia Bornefeld’s enormous hanging sculpture, suggestive at once of a cow’s udders and a human breast.

Milk is also about Whiteness. In “Butter. Vital for Growth and Health”, an otherwise unexceptionable pamphlet from the National Dairy Council in Chicago (one of the hundred or so objects rubbing shoulders here with artworks and new commissions), there’s a rather peculiar foreword by Herbert Hoover, the man who was to become the 31st U.S. President. “The white race,” Hoover writes, “cannot survive without dairy products.”

Say what?

Hoover (if you didn’t know) was put in charge of the American Relief Administration after the First World War, and saw to the food supply for roughly 300 million people in 21 countries in Europe and the Middle East. Even after government funding dried up, the ARA still managed to feed 25 to 35 million people during Russia’s famine of 1921-22 — which remains the largest famine relief operation in world history.

So when Hoover, who knows a lot about famine, says dairy is essential to the white race, he’s not being malign or sectarian; he believes this to be literally true — and this exhibition goes a very long way to explaining why.

Large portions of the world’s population react to milk the way my cat does, and for the same reason — they can’t digest the lactose. This hardly makes dairy a “White” food unless, like Hoover, your terms of reference were set by eugenics; or perhaps because, like some neo-Nazis in contemporary USA, you see your race in terminal decline, and whole milk as the only honest energy drink available in your 7-11. (Hewillnotdivide.us, Luke Turner’s 2017 video of drunk, out-of-condition MAGA fascists, chugging the white stuff and ranting on about purity, is the least assuming of this show’s artistic offerings, but easily the most compelling.)

Milk also asks how dairy became both an essential superfood and arguably the biggest source of hygiene anxiety in the western diet. Through industry promotional videos, health service leaflets, meal plans and a dizzying assortment of other ephemera, Milk explains how the choice to distribute milk at scale to a largely urban population led to the growth of an extraordinary industry, necessarily obsessed with disinfection and ineluctably driven toward narrow norms and centralised distribution; an industry that once had us convinced that milk is not just good for people, but is in fact essential (and hard cheese (sorry) to the hordes who can’t digest it).

The current kerfuffle around dairy and its vegan alternatives generates far more heat than light. If one show could pour oil on these troubled waters (which I doubt), it isn’t this one. No one will walk out of this show feeling comfortable. But they will have been royally entertained.

The sirens of overstatement

Visiting David Blandy’s installation Atomic Light at John Hansard Gallery, University of Southampton, for New Scientist, 22 March 2023

The Edge of Forever, one of four short films by Brighton-based video and installation artist David Blandy, opens with an elegiac pan of Cuckmere Haven in Sussex. A less apocalyptic landscape it would be hard to imagine. Cuckmere is one of the most ravishing spots in the Home Counties. Still, the voiceover insists that we contemplate “a ravaged Earth” and “forgotten peoples” as we watch two children exploring their post-human future. The only sign of former human habitation is a deserted observatory (the former Royal Observatory at Herstmonceux Castle in Sussex). The children enter and study the leavings of dead technologies and abandoned ambitions, steeped all the while in refracted sunlight: Claire Barrett’s elegiac camerawork is superb.

The films in Blandy’s installation “Atomic Light” connect three different kinds of fire: the fire of the sun; the wildfires that break out naturally all over the earth, but which are gathering force and frequency as the Earth’s climate warms; and the atomic blast that consumed the Japanese city of Hiroshima on 6 August 1945.

There’s a personal dimension to all this, beyond Blandy’s vaunted concern for the environment: his grandfather was a prisoner of the Japanese in Singapore during the Second World War, and afterwards lived with the knowledge that, had upwards of 100,000 civilians not perished in the Hiroshima blast, he almost certainly would not have survived.

Bringing this lot together is a job of work. In Empire of the Swamp, a man wanders through the mangrove swamps at the edge of Singapore, while Blandy reads out a short story by playwright Joel Tan. The enviro-political opinions of a postcolonial crocodile are as good a premise for a short story as any, I suppose, but the film isn’t particularly well integrated with the rest of the show.

Soil, Sinew and Bone, a visually arresting game of digital mirrors composed of rural footage from Screen Archive South East, equates modern agriculture and warfare. That there is an historical connection is undeniable: the chemist Fritz Haber received the Nobel Prize in Chemistry in 1918 for his invention of the Haber–Bosch process, a method of synthesising ammonia from nitrogen and hydrogen. That ammonia, a fertiliser, can be used in the manufacture of explosives, is an irony familiar to any GCSE student, though it’s by no means obvious why agriculture should be left morally tainted by it.

Alas, Blandy can’t resist the sirens of overstatement. We eat, he says “while others scratch for existence in the baked earth.” Never mind that since 1970, hunger in the developing world has more than halved, and that China saw its hunger level fall from a quarter of its vast population to less than a tenth by 2016 — all overwhelmingly thanks to Haber-Bosch.

Defenders of the artist’s right to be miserable in the face of history will complain that I am taking “Atomic Light” far too literally — to which I would respond that I’m taking it seriously. Bad faith is bad faith whichever way you cut it. If in your voiceover you dub Walt Disney’s Mickey “this mouse of empire”, if you describe some poor soul’s carefully tended English garden as the “pursuit of an unnatural perfection wreathed in poisons”, if you use footage of a children’s tea party to hector your audience about wheat and sugar, and if you cut words and images together to suggest that some jobbing farmer out shooting rabbits was a landowner on the lookout for absconding workers, then you are simply piling straws on the camel’s back.

Thank goodness, then, for Sunspot, Blandy’s fourth, visually much simpler film, that juxtaposes the lives and observations of two real-life solar astronomers, Joseph Hiscox in Los Angeles and Yukiaki Tanaka in Tokyo, who each made drawings of the sun on the day the Hiroshima bomb dropped.

Here’s a salutary and saving reminder that, to make art, you’re best off letting the truth speak for itself.

Moral good taste

Reading Humanly Possible by Sarah Bakewell for the Telegraph, 15 March 2023

In 1362 Western literary culture took a swerve when Pietro Petroni, a monk from Siena, told Giovanni Boccaccio of his recent vision: that if Boccaccio didn’t stop collecting non-Christian books (never mind writing them), God was going to kill him. Boccaccio, somewhat rattled, told his friend, the poet Petrarch, and Petrarch, being a bibliophile, was unimpressed: if Boccaccio did decide to take Petroni’s advice, would he mind giving him first pick of the discards?

Humanly Possible is an anecdotal history — witty, warm-hearted (here and there, gratingly matey) — of the Western mind’s seven-hundred-year effort to ignore priestly and sectarian blarney, so as to nurture its own voice, its own conscience, its own good. Humanists believe that each of us has a spark of good will and that, fed on charity, education and civic effort, all these sparks can together enlighten society.

Bakewell acknowledges that such civilising efforts are traceable in many traditions, and have probably been going on for ever. (And I do mean forever. I remember the relief I felt once, walking through Athens’s Acropolis museum, as I left behind the dead-eyed, fatuously grinning statues of the Archaic period (550 BC) for the dignified, melancholy, humane creatures of the Classical era that followed.)

Bakewell’s is a story of spiritual and intellectual triumph, beginning in Italy around 1300 and ending, thanks to that bloody twentieth century — its two world wars and litany of totalitarian atrocity — with a hideous and disturbing twist.

Organising Europe’s humanists into a “tradition” is rather like herding cats. Bakewell’s organisational ability deserves applause. Here, bibliomanes like Boccaccio give way to physicians like Vesalius, then, via memoirists like Montaigne and philosophers like Paine and Hume, to novelists like E.M. Forster, with his impassioned plea that we “only connect!” with each other. It’s an epic, spine-tingling, seamless account.

Bakewell, who used to look after early printed books at the Wellcome Library in London, has a melting love of all those joyful 16th-century recitations of human excellence, and does a terrific job of communicating the ethical achievements behind such apparent fripperies as the Third Earl of Shaftesbury’s “Inquiry Concerning Virtue and Merit”, which would replace stern religiosity and unthinking obedience with nothing more than “a kind of moral ‘good taste’”.

I was reminded here of Kenneth Clark, in the 1969 TV series Civilisation, trying to explain why the paintings of Watteau (of all people) represent some sort of ethical high-water mark for western civilisation. In both cases, the point is well made, and well received, but at the same time, both assertions feel a bit underwhelming. All that cultural effort, all that struggle and invention, suffering and heroism, led up to — good taste? (Bakewell, fully aware of the problem, quotes the Cambodian filmmaker Rithy Panh: “There’s also a banality of good; and an everydayness of good.”)

The argument against humanism goes like this: an unbeliever, guided purely by their own conscience, is a person without morals. If society tolerated such people, breakdown would ensue.

And the awful thing is that this argument is not altogether wrong. Sometimes, breakdown is exactly what happens. Josef Stalin didn’t see to the deaths of 20 million of his own people by being an anti-humanist. Not at all: he was one of the best-read men of his generation, obsessed to the point of madness with constant intellectual and (though he wouldn’t have called it this) spiritual self-improvement. All fascism’s John-the-Baptist figures — from Maurice Barrès to Martin Heidegger — were humanists maddened by the alienations of heavy industry, looming automation and evident democratic failure.

It’s not that the humanist idea is flawed, so much as it is no defence against our self-fulfilling belief in our own badness (and that’s a switch that’s frighteningly easy to throw: Bakewell mentions Savonarola and “his bonfire of the vanities”, but, generously, turns a blind eye to Greta Thunberg).

In his autobiography, quoted here, the Austrian novelist Stefan Zweig talked of the humanists’ “beautiful error” — believing that “better learning, better reading and better reasoning would be enough to bring about a better world.”

Bakewell is too clear-eyed an historian, and too honest a writer, to gloss over the weakness of this pose. But she’s clear enough about the alternatives, too. They’re all species of harshness in Bakewell’s chatty and persuasive book — one form or another of force or war or slavery. We may feel jolly silly at times, waving Oscar Wilde’s love of curtain fabrics in the face of the world’s barbarity — but that might be as good a weapon as we’ll ever get.