Have they not seen rocks?

Watching Brian Cory Dobbs’s Blue Planet Red for New Scientist, 8 October 2025

Blue Planet Red purports to be a feature-length documentary about Mars. Writer-director Brian Cory Dobbs’s red planet is not the one you and I might recognise, but it certainly has some appeal: home to an advanced civilisation of pyramid-builders who either couldn’t save their homeworld from destruction, or who blew it up in an orgiastic nuclear conflict.

Dobbs presents his arguments for advanced Martian life straight to camera, with many a raised eyebrow and artful stutter and hesitation. I quite liked him. But I was not in the least surprised, after watching his documentary, to discover that his showreel consists mainly of woo (by which I mean, YouTube shorts about mobile phones, electromagnetic fields, and cancer).

By intention or not, Blue Planet Red is an historical document: the last hurrah of a generation of researchers, enthusiasts, oddballs and narcissists who came to maturity under the shadow of a two-kilometre-long mesa in Cydonia. Here, where the southern highlands of Mars meet its northern plains, NASA’s Viking orbiters snapped blurry images of what looked like a gigantic human face: the Face on Mars.

Let’s not spend too much time debunking here what has been debunked, so often and so convincingly, elsewhere. Improve the image resolution, and the Face disappears. Mars’s hexagonal craters are a commonplace of rocky planets, and imply some fluid subsurface (think the patterns porridge makes, boiling in a pan). Lightning bolts cannot leap from planet to planet. The presence of the xenon 129 isotope in the Martian atmosphere will imply ancient nuclear conflict only if you ignore the well-understood process by which a now-extinct isotope, iodine 129, would have decayed to xenon 129 in Mars’s rapidly cooling and ever-more inert and boring lithosphere. Is that a rock? Yes. Even the one that looks like a bone? Yes. Even the one that looks like a tumble-dryer? For the love of God, yes — have you not seen rocks?

Ron Levin, son of Gilbert Levin, the engineer who cooked up Viking’s Labeled Release experiment, wonders why NASA ignored two clear positive results and scotched its early claim that there was microbial life on Mars. Well, NASA didn’t ignore the results. Neither did it ignore the results of Viking’s Gas Chromatograph-Mass Spectrometer experiment, which found no evidence of any organic molecules in the Martian soil. Reconciling these results gave us our current understanding of Martian soil chemistry. By that measure, the Labeled Release experiment was a success: why be resentful?

More poignant, though no more convincing, are the idées fixes of Richard Brice Hoover (born 1943), who headed astrobiology research at NASA’s Marshall Space Flight Center until his retirement in 2011 and did more than most to establish the existence of extremophile life on Earth.

He’s convinced he’s found diatoms and other microfossils in meteorites, and such is his enthusiasm, he never quite gets around to explaining why each of these objects is lying on the top of the rock sample, instead of being embedded in its matrix.

John Brandenburg (born 1953) is a pretty well-regarded plasma scientist, if you can get him off the subject of Martian nuclear war. And what about Mark Carlotto, who’s spent forty years seeing civilisational remains on Mars where everyone else sees rocks? Drag him down to earth, and he’s a capable archaeologist, who really has traced the lines of a forgotten colonial settlement in the middle of Cape Ann – an island community north of Boston.

After the final Apollo moon landing in 1972, the initial excitement of the Space Race began to wane. The images the Viking orbiters sent back promised the next great discovery. Their blurry amalgams of groundbreaking yet ambiguous data were the perfect growth medium for fringe ideas, especially in the United States, where the Vietnam War and the Watergate scandal encouraged scepticism and paranoia.

Dobbs’s flashy retread of tall Martian tales thinks it’s about what happened 3.7 billion years ago, when a wet, warm planet turned into a dustbowl. For me, it’s much more about what happened to some squirrely enthusiasts, glued to monitors and magazines in 1972. Let’s lay our scorn aside a moment and look this generation in the eye. Fond hopes will not trip up fine minds in quite this way again.

“Look at me, top of the leader board!”

Reading Paul Mullen’s Running Amok for the Telegraph, 27 September 2025

Paul Mullen has spent years trying to understand the internal world of the lone mass killer: the sort of person who draws their weapon in a school, on a factory floor, or at a supermarket. In this pursuit, says Mullen – a forensic psychiatrist – we should remember, and admit, that everyone has the odd unpleasant impulse from time to time: it’s part of being human. So, he writes, when discussing the most sickening criminals, we mustn’t “endow perfectly normal mental mechanisms with a pathological, sinister significance”.

For example, many of us feel undervalued. Many of us feel in possession of skills and attributes that, in a better world, would surely bring us recognition. Who among us has not looked in the mirror and met a creature consumed by resentment or depression? Life can be crushing, and as Mullen says, “disappointed, egotistical, isolated and rigid men are ten-a-penny.” Pushed to the edge, they’re much more likely to put an end to themselves than go out in a blaze of vicious “glory”. (The male suicide rate in the UK last year was 17.4 deaths per 100,000 people, vastly larger than the rate of male deaths by homicide.)

Mullen is best known for his research into the link between common-or-garden jealousy and the obsessional, sometimes homicidal, behaviour of stalkers. He takes a similar tack in Running Amok, a devastating compendium of mass killings, arranged by locale and severity. Many lone mass killers, we learn, are persistent whiners – “querulants” is Mullen’s term-of-art – which led me to wonder what our burgeoning culture of complaint is doing to stoke their fires. From those who self-righteously pursue their grievances to others who seem to live fantasies of battling persecution, you wonder how thin the cognitive dividing-line can safely grow. And yet: whether or not the world is filling up with narcissistic whiners, most of them don’t turn to slaughter. So what leads a handful to make that change? Or, to put it another way, what actuates them even more than what, in truth, are the perfectly common means (guns, vehicles, knives), and motive (the desire for “a semblance of power and significance”)?

Mullen, who has met a wide range of criminals across his professional career, and was the first non-military defence expert to gain access to the detention centre at Guantanamo Bay, points the finger at the availability of an incident the would-be killer can emulate: what Mullen calls a “social script”. In his experience, mass killers are invariably fixated on reports of previous massacres; also on their fictional depiction. Rambo is a fine movie, intelligently written, but there’s a reason the DVD keeps turning up on the shelves of such people.

As societies change, so do the scripts they make available to the despondent, the despairing, the rejected and the humiliated. In the 1970s and early 1980s, homicidal losers used to fixate on a belief; now they’re more likely to kill in the name of a group. The 2016 Orlando killer Omar Mateen claimed allegiance to both Isis and Hizbullah: a neat trick, given how violently these groups are opposed to each other. In this shift from ideology to tribe, Mullen detects the influence of the internet, with its pseudo-communities of extremists desperate to represent some persecuted minority.

The other essential characteristic of these scripts is that they are self-perpetuating. Killers inspire killers. It’s why Mullen won’t mention the killers he’s writing about by name, a tactic that gives the reader the initial impression – quickly dispelled – that the author is only marginally acquainted with his subject matter. On the contrary, Mullen anatomises, with skill and a certain amount of garrulousness, what seems a desperately intractable problem, noting in particular the inflammatory influence of a predominantly on-line incel culture, the depredations of the attention economy, and the addictiveness of certain videogames. The violence or otherwise of these games is not at issue: much more important is their ability to offer the pathologically lonely a semblance of social validation: “Look at me, top of the leader board!” Internet tribalisms of all sorts service the lone killer’s need to belong — and not just to belong, but to crawl to the top of some specious hierarchy. “I’ve got the record, haven’t I?” was practically the first question Martin Bryant asked after shooting and killing 35 people and injuring 23 others in the Tasmanian tourist town of Port Arthur in 1996.

So much for sociology. Mullen would sooner engage with the extreme inner worlds of lone mass killers than explain them away with platitudes. Whatever maddened these people in the first place (and let’s face it, some people are just born miserable), by the time mass homicide seems like a solution to their problems, they are, by any common definition of the term, mad, and should be treated as such.

This is where Mullen turns to discuss, of all people, Queen Victoria. Across her long reign, she was the victim of eight assassination attempts. By the time she died, entirely peacefully, the Metropolitan Police had learned that the most effective strategy for avoiding or mitigating attacks on a permanently public target was, as far as possible, to dampen down publicity. Ever since, would-be regicides have been arrested without fanfare, and often ushered into psychiatric treatment. Thus, within the bounds of law, a security issue has been turned into a public-health one.

Mullen would like to see potential lone mass killers spotted and treated in much the same way. He proposes a Threat Assessment and Response Centre (targeted on random killings) modelled on the Met’s Fixated Threat Assessment Centre, which handles the security of known targets. Faced with a credible threat, the Centre should be given access to the suspect’s police and medical records and their internet history. Why? Because identification is ninety per cent of the battle. Treatment, by comparison, is startlingly simple: obsessives on the path to atrocity are, in Mullen’s experience, remarkably cooperative and frank with those who’ve managed to stop them.

At the time of writing, there have been just over 300 mass shootings in America this year, and while gun-control laws may have preserved Britain and other Western countries from that specific plague, a spate of vehicle-ramming attacks in Nice, Berlin, London, Barcelona, Stockholm and other European cities has left us, and our security services, in a state of hypervigilance.

So can we do anything? Mullen wants us to overcome our reticence and take seriously the threats made by miserable obsessives. False alarms will be raised, but psychologists aren’t witch-finders, Mullen assures us — in fact much of his time is spent avoiding the false attribution of madness in the people he meets.

I fear public awareness won’t do much good, however. Now that civic society has declared war on nuance and arrests people (or Graham Linehan, anyway) for jokes, how can any of us be expected to hear the signal over the noise?

Dig another hole

Reading The Secret History of Gold by Dominic Frisby for the Telegraph, 22 August 2025

We used to sift this soft, off-colour metal out of the beds of streams. Then, once the streams ran out of the stuff, we dug it out of the ground. These days, lest the smallest grain elude us, we gather the stuff by leaching gold out of the ore and into a solution of cyanide. Then we precipitate it back into a solid, melt it down and, says American investor Warren Buffett rather wryly, “dig another hole, bury it again and pay people to stand around guarding it. It has no utility. Anyone watching from Mars would be scratching their head.”

Well, not quite. Gold is useful. It’s a gift to artists: soft enough to work without fire (which is why it was the first metal we ever harnessed), and chemically so stable that it never tarnishes. No matter what we make of the stuff, though, sooner or later, as Renaissance goldsmith Benvenuto Cellini and countless anonymous Inca artists would attest, someone else may come along and melt it down. “Gold may last,” says Dominic Frisby – a curious chap, part financial journalist, part stand-up comedian – “but art made from gold rarely does.”

This is Frisby’s real subject in his affable, opinionated new book, The Secret History of Gold: gold is fungible. An ounce of gold is equal in value to every other ounce of gold. Since it doesn’t corrode, rust or tarnish, and since it resists most common acids, we can melt it and recast it and melt it again, and still end up with an ounce of gold. This makes the element about as honest a medium of exchange as the physical world has to offer – a point that besotted the Spanish conquistadors who melted down enough South American artwork into bullion to destroy their own empire’s balance of payments, and has been lost on few serious politicians since. “We have gold,” said Herbert Hoover in 1933, “because we cannot trust governments.”

Frisby himself is no fan of the State. His 2013 book Life After the State was subtitled “Why We Don’t Need Government”. His 2019 book about tax was called Daylight Robbery. Frisby’s is a sentimental conservatism, weaponised on stage in ditties such as the Brexit victory song “Seventeen Million F—-Offs” (a treasurable joy, whatever your politics), and reasoned out in books that offer up cogent entertainment, even if they don’t always convince.

The Secret History of Gold is another addition to that trend. As a history, his tales – whether topical, historical or mythological – are well-turned, comprehensive and occasionally surprising. King Midas’s “touch”, we learn, was a just-so story cooked up to explain the unreal amounts of gold discovered in the bed of the river Pactolus. Alexander the Great created the world’s first standardised currency by adopting consistent weights for gold and silver coins across his territories. A latter-day alchemist called Heinz Kurschildgen was prosecuted for fraud in 1922, after convincing several investors that he could turn base metal into gold; later, he convinced Heinrich Himmler that he could make petrol out of water too.
Intellectual property is now so frictionless that the business of “gathering tales” is something any fool can do by pressing a key. Frisby is perfectly entitled to rehash many of the tales from Timothy Green’s 2007 The Ages of Gold, since he recognises that debt in his end-notes. This isn’t inherently a problem – how else but by reading do books get made? – but the internet has made us all potentially that erudite. What matters, then, with books such as Frisby’s, is less how much you know than how much fun you have with what you’ve assimilated. Thankfully, Frisby entertains here, impressively and convincingly so. It’s just that it seems a bit silly for Frisby and his publishers to call a book such as this a “secret history”, when it’s simply combining accessible materials into a compelling new weave.
Each story in that weave, at least, does inform Frisby’s argument and obsession – that the world (or, failing that, Britain) might return to a gold standard. This is the business of tying a country’s currency to a fixed quantity of gold, so that its paper money can in theory be exchanged for the metal. Pegging currencies to the value of gold certainly makes life simple, or at least, it seems to, if you don’t pay too much attention to the prospecting and the mining, the pillaging, the counterfeiting and the fraud. Frisby wouldn’t dream of skirting such a rich source of entertainment, and his tales of German and Japanese gold-hunting during the Second World War are eye-popping. In Merkers salt mine, U.S. troops discovered a Nazi stash including over 100 tons of gold bullion, but General Tomoyuki Yamashita’s treasure hoard, meant for Japan’s post-war rebuilding, remains untraced and untraceable.
It’s true that the gold standard stops governments from recklessly printing money and inflating the economy. And such reckless inflation, Frisby argues, is exactly what has happened since the standard was abandoned, pretty much everywhere, again and again. Crippled by the costs of the First World War and the Great Depression, Britain was first to abandon the gold standard, in 1931. But 1971 was when the rot really set in. Saddled with rising inflation, increasing trade deficits and the cost of the Vietnam War, Richard Nixon’s US abandoned the standard and took the rest of the world with it down the path of perdition; government after government has since repeatedly devalued its currency on the world’s markets. Why else would houses cost 70 times more now than when I was born in 1965?
Frisby’s proposed cure is for the world to adopt cryptocurrency. Despite not being a material entity like gold, a bitcoin is pure money – a bearer asset. When I hand you a bitcoin, its value is, as Frisby explains, immediately transferred from me to you. What’s not to like about digital gold? Well, for starters, manufacturing these magic numbers – mining these bitcoins – requires a lot of IT infrastructure and no small amount of energy, so it puts production in the hands of just a few powerful nations, rather as the gold standard put production in the hands of just a few gold-rich territories.
Frisby’s arguments for pegging currencies to a digital standard might also carry more weight if he were a little more realistic with himself, and with us, about why we left the gold one. By abandoning gold for a currency backed by empty promises – a fiat currency system – governments no longer have to manage the amount of gold they have. This means they can concentrate on stabilising prices, by controlling interest rates. They might not do a brilliant job of it, but it’s what made the difference between how we experienced the Wall Street Crash of 1929 and the much bigger but infinitely less ruinous crash of 2008.
Until cryptocurrency has caught up, Frisby is inclined to pin all our current economic woes on Nixon’s 1971 decision to abandon the gold standard. As an economic thesis, that’s not even wrong, just hopelessly insufficient. It fails to acknowledge the benefits of free trade that the fiat system has enabled, despite its difficulties, and leaves us wondering just how it is that since 1971, extreme poverty and infant mortality have dropped by more than two-thirds worldwide, while the number of children in primary school has grown from 2.3 million to over 700 million.
But Frisby is an entertainer, and the more he entertains, the more he’s likely to convince. He didn’t really need to lumber his book with the whole “secret history” shtick, and his yarns, ripping though they are, sometimes just get in the way. At its best, this book sets Frisby up as a colourful and sly adversary to contemporary financial and political pieties and sometimes – I would humbly suggest – to common sense. But even at his most eccentric, in his enthusiasm and wit, he’s a worthy adversary. I’m not sure, despite this book’s flaws, that one could really ask for more.


This isn’t High Noon

Reading Sheepdogs by Elliot Ackerman for The Telegraph, 18 August 2025

Hey! It looks like you are trying to shoot someone at point-blank range with a small 9mm pistol. Would you like help?

If you are going to kill someone with a 9mm pistol, it is very important that you stare at the ground as you make your approach. Next, raise your head until you are focused on your target’s centre mass. Think heart and organs. Avoid their eyes and — no, don’t draw, this isn’t High Noon — have the gun in your hand in your pocket, and shoot through the fabric of your suit. Now go and rehearse, and remember: practice makes perfect!

Elliot Ackerman knows something about skill acquisition, task analysis and work breakdown structure. He also knows about the mechanics of a SIG Sauer P938 micro-compact single-action concealed carry. In a crisis, though, hardware will play second fiddle to the hours of practice. Sheepdogs is a bright and breezy thriller about prepared paramilitary types who know what they’re doing.

Ackerman’s background is such that even a confection like Sheepdogs begs to be read autobiographically. He served five tours of duty in Iraq and Afghanistan. He received the Silver Star, the Bronze Star for Valor, and the Purple Heart. He was also, for a little while, attached to the Ground Branch of the CIA’s Special Activities Division, and he has a whole lot of fun with that institution here, as “Uncle Tony”, a Division spook obsessed with Hyatt reward points, scrabbles about the globe looking for ways to pay the wages of off-the-books armies everywhere from Iraq to Somalia, Yemen to Ukraine.

Uncle Tony looks to have inspired the mess in which our heroes are here embroiled. Cheese (as in “the big cheese”, the most versatile pilot Afghanistan’s military ever produced, now working in a filling station) and former Marine Raider Skwerl (think “squirrel” — Marines can’t spell for shit — financially and reputationally ruined for whistleblowing on an intelligence FUBAR) are being paid to steal — sorry, repossess — that most reliable of thriller macguffins, a private jet.

But the handover in Marseille goes badly wrong, the jet’s owners seem to be stealing it from themselves, and Skwerl and Cheese soon find themselves out of the loop, out of pocket and decidedly out of luck, pursued back and forth across the Atlantic by a remarkably well-connected former Afghan security guard who’s out to avenge, well, something…

Ackerman has a lot of fun with that private plane, a Bombardier Challenger 600 that loses an aileron (a control flap) in a collision with a golf cart, and not long after has its leather and mahogany interior torn out by a famished grizzly bear. The business of hiding and fixing the plane brings in a couple of well-drawn side characters, the survivalist Just Shane and Ephraim, an excommunicated Amish handyman who whittles a replacement aileron out of wood (not as daft as it sounds). Cheese’s better half Fareeda (four months pregnant) and Skwerl’s much more frightening half Sinaed (a professional dominatrix) round out a cast just kooky and diverse enough — and small enough — to tick every box at Apple TV, who’ve paid seven figures to develop Sheepdogs as a series.

Announcements of the novel’s bright televisual future make it slightly tricky to review, since what makes perfect sense for the IP doesn’t necessarily play well on the page. Ackerman is determined not to create any monsters here; he’s much more interested in telling — in the gentlest manner imaginable — broader truths about modern warfare, its commercial imperatives and human toll.

After dozing through tosh like Citadel and The Night Agent, we’ll surely lap up a TV thriller created by someone who knows guns, and better still, understands the men who wield them. That said, I can’t but deplore a literary thriller that leaves all my favourite characters standing, and not just standing, chatting, and not just chatting, understanding each other.

Well, you don’t make an omelette without cracking eggs, I suppose. I can remember when, in 1987, a fine literary writer called James Lee Burke wrote a detective novel about Dave Robicheaux. I adored Burke’s early books, but nearly forty years and over two dozen outings later, I’m hardly going to sit here and say that palling up with Dave was a backward move, now, am I?

Besides, Ackerman’s literary career has been sliding about all over the place, from brilliant memoirs of combat in Afghanistan (and don’t get him started about that catastrophic US withdrawal in August 2021) to best-selling geopolitical thrillers with James Stavridis, a retired US admiral, to clotted oddities like 2023’s Halcyon, a family drama set in an alternate Gore-led America that has cured death. The thriller genre has its limitations, but one of the very best things it can do is give a point of focus to writers who would otherwise go off like a box of firecrackers.

The trouble with Sheepdogs — a thriller that lacks excitement, a comedy without much in the way of humour, and a story about the wages of war that eludes depth — is that it shows its writer still shuffling up to the starting line and sucking on the water bottle. I know I shouldn’t second-guess Ackerman’s intentions. But I hope there will be sequels, and that Sheepdogs becomes a long project for him. Keeping up with the small screen will do him good. Remember: practice makes perfect!

The Stirring Adventures of Relikh and Shovlin

Watching David Cronenberg’s The Shrouds for New Scientist, 6 August 2025

Myrna (Jennifer Dale) has likely had better blind dates. The edible flowers on her starter look funereal; her table-for-two is hemmed in by strange shrouds in tall vitrines; and as she makes small-talk with the owner Karsh (Vincent Cassel), it becomes increasingly apparent to her, and to us, that this restaurant is attached — financially, architecturally and intellectually — to a cemetery.

And not just any cemetery: its headstones have viewscreens. Because they’re swaddled in those natty (camera-riddled, internet-enabled) shrouds, you can come here to watch your dead loved ones decompose.

Over a career spanning more than half a century, David Cronenberg has mastered the art of delivering everything at the wrong speed. On paper, and in precis, his films look like satires. Their playfulness is self-evident. Just look at the characters’ surnames: Karsh’s is “Relikh”. Myrna’s is “Shovlin”, for heaven’s sake. And — again on paper — what’s to take seriously about this scenario, which takes pot-shots at internet-of-everything boosterists (who would surely network the dead if they could) and “grief tech” start-ups that, among other money-making wheezes, invite you to chat with posthumously fed, AI-enabled avatars of your deceased loved ones?

But Cronenberg does not write satires. He writes full-throated screenplays (and one novel) about what you and I might actually experience, were these oh-so-satirical scenarios to come to pass, stretching our sense of ourselves.

Karsh’s date with Myrna goes nowhere, but the tech entrepreneur does find solace — and more than solace — in Terry, his dead wife’s identical twin. Diane Kruger plays both the living sister and the dead one, and also voices Hunny, an untrustworthy digital assistant programmed by Terry’s loser ex-husband Maury (a wonderfully weaselly Guy Pearce). At night the dead sister Becca turns up, without a breast, without an arm, as the ravages of her disease take hold. Are these nighttime visitations flashbacks, or fantasies? Do they humanise Karsh, because he loves his wife, however disfigured, or do they damn him, because he very clearly loves his wife’s disfigurement? Karsh is caught between guilt, anger and desire, convinced Becca was unfaithful with her old professor and first lover, and at the same time that the professor was conducting illegal experiments on her; and at the same time that all of this is a smokescreen concealing some deeper, more political conspiracy involving China, or maybe Russia, or maybe Budapest, or maybe Iceland (and all the while Terry, who loves a good conspiracy, can’t help but encourage Karsh’s mounting mania).

David Cronenberg’s wife of 38 years, Carolyn Zeifman, died in 2017 after a long battle with cancer, and it’s tempting to watch The Shrouds as an act of cinematic over-sharing. All five of Elisabeth Kübler-Ross’s “stages of grief” — denial, anger, bargaining, depression, and acceptance — are explored in Cassel’s superb performance, weaponised by fantastical technology, or by paranoid technological fantasy, into a welter of unresolved plot macguffins. What if the strange growths on Becca’s dead bones are surveillance devices? What if the Chinese government is using our own corpses to spy on us? What if those growths are just cunningly camouflaged video assets; did Maury code them?

Imagine a restaurant full of exit signs and no exits, and the maître d’ shouting “Fire! Fire!” in your ear.

While The Shrouds may well be an expression of purely personal grief, 26 films in, it’s equally clear that grief is Cronenberg’s abiding theme, and the engine that’s been driving his entire artistic output. In his movies, we make what accommodations we can with reality, but by the last reel it’s clear reality just isn’t listening.

The Shrouds is a wordy film, whose characters calmly explain ever more unlikely technologies to each other, convince each other of ever more complex conspiracy theories, and assert themselves in ever more outlandish ways. Nothing happens because, you know, DEATH. Calm, slow, relentless, The Shrouds is one of those devastating chamber pieces great directors make sometimes when they have nothing left to prove, and everything still to say.

The twist is, there is no twist

Watching Danny Boyle’s 28 Years Later for New Scientist, 18 June 2025

Here’s a bit of screenplay advice to nail above the desk: make your plots simple and your characters complicated.

We can polish off the story of 28 Years Later in a couple of paragraphs. It’s the late-coming third instalment in a series that began in 2002, with 28 Days Later. A lab-grown neurotoxic virus has spread uncontrollable, orgiastic rage across continental Europe. The infection is eventually quarantined to mainland Britain. International fleets ensure that no-one leaves Blighty.

Twelve-year-old Spike (Alfie Williams, a relative newcomer and definitely a face to follow) lives in the relative safety of a small northern island, connected to the mainland by a causeway passable only at low tide. Rather young for the task (though his dad reckons he’s ready), Spike leaves for the mainland to be blooded. Amid trackless forests (perhaps not quite trackless enough after 28 years; otherwise the film’s mise-en-scène is superb and chilling) Spike kills a very slow zombie, misses a blisteringly fast one, and generally gives a good account of himself.

But it sits ill with Spike, once he’s home, to be cheered as a hero by all these drunken villagers, even as his mother lies bedridden with a mysterious illness, and his father Jamie (Aaron Taylor-Johnson) seeks distraction with another woman. So Spike sneaks his mum (Jodie Comer) off the island and sets out with her in search of the only doctor he’s ever heard of — a painted lunatic who spends his days in the woods burning corpses.

The twist—and let’s face it, you’re agog for the twist—is that there is no twist. Having established the rules of this world in 2002’s 28 Days Later, writer Alex Garland has simply and wonderfully stuck to his guns. There are flourishes: a vanishingly small number of zombies have survived the initial viral outbreak to breed and become an almost-viable competitor species. Some of them now grow very big indeed, thanks to the “steroid-like” effects of the original infection. But these aren’t new attractions so much as patches and fixes, and they’re delivered very much in the make-and-mend-and-keep-going spirit that hangs over Spike’s doughty little island village.

Nothing is quite as it seems — when is it ever? — and every once in a while, Boyle mischievously intercuts Laurence Olivier’s Henry V with Great War newsreel and 28 Weeks Later zombie outbreak footage to imply a deeper, darker significance to the village’s homespun defence league and its culling expeditions. There are nods to folk horror, to Apocalypse Now, to Alien 3 and to Predator. But this is not a tricksy movie, and its intent is clear: in this world so long steeped in horror, there’s going to be this human story, about loss and disillusion, about growing up and growing apart, about when to stand with others, and when to stand alone, and all conveyed through the credible words and reasonable actions of largely unexceptional human beings. The budget is modest (somewhere between $60 and $75 million). The casting is meticulous (see how Christopher Fulford plays Spike’s grandfather with an effortless friendliness that all the while implies some harrowing backstory). And don’t get me wrong: 28 Years Later is full of invention, laden with fan-pleasing call-backs and cineastic cap-tugging. But never once does it cheat. There’s not a single fatuous macguffin pulling us through. No dumb quest. No magical grail. No grand unmasking. Only the feeling spilling from Alfie Williams’s eyes as young Spike learns, line by line and scene by scene, what he must acquire, and what he must let go, if he’s to be a man in this world.

All credit to Days, whose fast and furious “infected” shocked and delighted us all in 2002; all credit, too, to 2007’s oddly overlooked Weeks, an ingenious sequel and quite as good an expansion on its original as Aliens was to Alien. But Years carries the crown, at least for now (there’s a second instalment coming).

How not to save an elephant

Reading Saabira Chaudhuri’s Consumed for the Telegraph, 16 May 2025 

What happens when you use a material that lasts hundreds of years to make products designed to be used for just a few seconds?

Saabira Chaudhuri is out to tell the story of plastic packaging. The brief sounds narrow enough, but still, Chaudhuri must touch upon a dizzying number of related topics, from the science of polymers to the sociology of marketing.

Consumed is an engaging read, fuelled by countless interviews and field trips. Chaudhuri, a reporter for the Wall Street Journal, brings a critical but never censorious light to bear on the values, ambitions and machinations of (mostly US) businesspeople, regulators, campaigners and occasional oddballs. Medical and environmental concerns are important elements, but hers is primarily a story about people: propelled to unexpected successes, and stymied by unforeseen problems, even as they unwittingly steep the world in chemicals causing untold damage to us all.

Plastic was once seen as the environmental option. With the advent of celluloid snooker balls and combs in the 1860s, who knows how many elephant and tortoise lives were saved? Lighter and stronger than paper (which is far more polluting and resource-intensive than we generally realise), what was not to like about plastic packaging? When McDonald’s rolled out its polystyrene clamshell container across the US in 1975, the company reckoned four billion of the things sitting in landfill was a good thing, since they’d “help aerate the soil”.

The idea was fanciful and self-serving, but not ridiculous, the prevailing assumption then being that landfills worked as giant composters. But landfill waste is not decomposed so much as mummified, so mass and volume count: around half of all landfill is paper, while plastic takes up only a few per cent of the space. This mattered when US landfill seemed in short supply (a crisis that proved fictitious — waste management companies were claiming a shortage so they could jack up their fees).

The historical (and wrong) assumption that plastics were environmentally inert bolstered a growing post-war enthusiasm for the use-and-discard lifestyle. In November 1963, Lloyd Stouffer, the editor of Modern Plastics magazine, addressed hundreds at a conference in Chicago: “The happy day has arrived when nobody any longer considers the plastics package too good to throw away.”

A 2015 YouTube video of a turtle found off the coast of Costa Rica with a plastic straw lodged in his nose highlighted how plastic, so casually disposed of, destroyed the living world, entangling animals, blocking their guts, and rupturing their internal organs.

Rather than abandon disposability, however, manufacturers were looking for ways to make plastics recyclable, or at least compostable, and this is where the trouble really began, and commercial imperatives took a truly Machiavellian turn. Recycling plastic is hard to do, and Chaudhuri traces the ineluctable steps in business logic that reduced much of that effort into nothing more than a giant marketing campaign: what she dubs “a get-out-of-jail-free card in a situation otherwise riddled with reputational risk.”

Recycling cannot, in any event, address the latest crisis to hit the industry, and the world. In 2024 the New England Journal of Medicine published an article linking the presence of microplastics to an increased risk of stroke or heart attack, substantiating the suspicion that plastic particles ranging in size from 5,000 micrometres to just 0.001 micrometres are harmful to health and the environment.

First, they turned up in salt, honey, teabags and beer, but in time, says Chaudhuri, “microplastics were found in human blood, breast milk, placentas, lungs, testes and the brain.”

Then there are the additives that lend a plastic its particular qualities (flexibility, strength, colour and so on); these disrupt endocrine function, contributing to declining fertility in humans — and not just humans — all around the world. The additives transmigrate and degrade in unpredictable ways (not least when being recycled), to the point where no-one has any idea which chemicals, or how many, we’re exposing ourselves to. How can we protect ourselves against substances we’ve never even seen, never mind studied?

But it’s the humble plastic sachet — developed in India to serve a market underserved by refuse collectors and low on running water — that provides Chaudhuri with her most striking chapter. The sachets are so cheap, they undercut bulk purchases; so tiny, no recycler can ever make a penny gathering them; and so smeared with product, no recycling process could ever handle them.

Chaudhuri’s general conclusions are solid, but it’s engaging business anecdotes like this one that truly terrify.

Go down singing

Watching Joshua Oppenheimer’s musical The End for New Scientist, 14 May 2025

Life on the planet’s surface has become nigh-on unbearable, but with money and resources enough, the finest feelings and highest aspirations of our culture can be perpetuated underground, albeit for only a chosen few. Michael Shannon plays an oil magnate who years ago brought his family to safety in an old mine. Here he rewrites his and his company’s history in a self-serving memoir dictated to his grown-up but crucially inexperienced son (George MacKay; I last encountered him in The Beast, which I reviewed for New Scientist) while his wife, the boy’s mother, played by Tilda Swinton, curates an art collection somehow (and perhaps best not ask how) purloined from the great collections of the world.

The mine — an actual working salt mine in Petralia Soprana, Sicily — is simultaneously a place of wonder and constriction. You can walk out of the bunker and wander around its galleries, singing as you go (did I mention this was a musical?), but were you to hike outside the mine, I don’t fancy your chances. It’s a premise familiar from any number of post-apocalyptic narratives, from Return to the Planet of the Apes (1975) to last year’s streaming hit Fallout.

When a rare surface-dweller (Moses Ingram) stumbles on their home, it looks as though she’ll be expelled, and more likely killed, to keep this Shangri-La a secret. But at the last moment the Boy cries, “I don’t want to do this!”

It turns out nobody else wants to, either, not even the Mother, who’s the most terrified of the bunch.

Clumsily, over two and a half hours, the family draw this stranger into their bubble of comforting lies. (The End is too long, but you could lay the same charge at most of the musicals on which it’s modelled — have you tried sitting through Oklahoma! recently?)

Lies — this is the film’s shocking premise — are necessary. Lies stand between us and despair. They create the bubbles in which kindness, generosity and love can be grown. Like the golden-age musicals of the 1950s to which it pays musical homage, The End tells an optimistic tale.

The young visitor resists assimilation at first, because she can’t forgive herself for abandoning her own family on the surface. Living as if she belonged to this new family would be to let herself off the hook for what she did to the old one.

Worn down by the young woman’s honesty, the family reveals its complicity in the end of the world. The father’s industry set fire to the sky. The mother finally admits she wants the planet’s surface to be uninhabitable because if it isn’t, the family she abandoned there might still be alive and suffering. The mother’s best friend, her son’s confidant, played by Bronagh Gallagher, sacrificed her own child years ago to ensure her own survival.

But then, bit by bit, song by song, this wounded and reconfigured family sews itself a new cocoon of lies and silences, taboos and songs (the songs are accomplished and astonishing), all to make life not just bearable, but possible. Of course the stranger ends up absorbed in this effort. Of course she ends up singing along to the same song. Wouldn’t you, in time?

Whoever these people used to be, and however much you point accusing fingers at their past, the fact is that these are all good people, singing their way back into the delusion they need to keep going, day after subterranean day.

True, the lies we tell today tell us tomorrow. But this unlikely, left-field musical — my tentative pick for best SF movie of 2025 — is prepared to forgive its compromised characters. We can only get through life by lying about it, so is it any wonder we make mistakes? Should the worst come to the worst, we should at least be permitted to go down singing.

Expendable

Watching Bong Joon Ho’s Mickey 17 for New Scientist, 2 April 2025

Mickey (Robert Pattinson) is an “expendable”. Put him in harm’s way, and if he dies, you can always print another. (Easily the best visual gag in this disconcertingly unfunny comedy is the way the 3D printer stutters and jerks when it gets to Mickey’s navel.)

And for human colonists on the ice planet of Niflheim (one for all you Wagner fans out there) there’s plenty of harm for Mickey to get in the way of. There’s the cold. There’s the general lack of everything, so that the settlers must count every calorie and weigh every metal shaving. Most troublesome of all are the weevil-like creatures that contrive to inhabit — and chomp through — the planet’s very ice and rock. What they’re going to do to the humans’ tin-can settlement is anyone’s guess.

Mickey’s been reprinted 16 times already, mostly because medical researchers have been vivisecting him in their effort to cure a plague. The one thing that doesn’t kill him, ironically enough, is falling into a crevasse and being swallowed by a weevil. Who saw that one coming? Certainly not the other colonists: when Mickey returns to camp, he finds he’s already been reprinted.

As science fiction macguffins go, this one’s nearly a century old, its seeds sown by Aldous Huxley’s Brave New World (1932) and John Campbell’s “Who Goes There?” (1938). Nor can we really say that Korean director Bong Joon Ho (celebrated for savage social satires like Parasite and Snowpiercer) “rediscovered” it. Actor Sam Rockwell turned in an unforgettable tour de force, playing two hapless mining engineers (or the same hapless mining engineer twice) in Duncan Jones’s Moon — and 2009 is not that long ago.

The point about macguffins is that they’re dead on arrival. They have no inner life, no vital force, no point. They stir to life only when characters get hold of them, and through them, reveal who they truly are. It’s hard to conceive of an idea more boring than invisibility. HG Wells’s invisible man, on the other hand, is (or slowly and steadily becomes) a figure out of nightmare — one that, going by the number of movie remakes, the culture cannot get out of its head.

What does Bong Joon Ho say with his “doubles” macguffin? It depends where you look. For the most satisfying cinematic experience, keep your eyes fixed on Robert Pattinson. Asked to play a man who’s died sixteen or seventeen times already, he turns in two quite independent performances, wildly different from each other and both utterly convincing. Mickey 17 is crushed by all his many deaths; Mickey 18 is rubbed raw to screaming by them. Consider the character of Connie Nikas in Benny and Josh Safdie’s Good Time (2017), or Thomas Howard in Robert Eggers’s The Lighthouse (2019): to say that Pattinson plays underdogs well is like saying that Frank Whittle knew a thing or two about motors, or that Hubert de Givenchy dreamt up some nice frocks.

Taken all in all, however, Mickey 17 is embarrassingly bad. It takes 2022’s bright, breezy, blackly comic novel by Edward Ashton, strips out its cleverness and gives us, in return, Mark Ruffalo’s unfunny Donald Trump impression as colony leader Kenneth Marshall, and Naomi Ackie (as Mickey’s — sorry, Mickeys’ — love interest) throwing a foul-mouthed hissy-fit out of nowhere (I swear you can see the confusion in the actor’s eyes).

Anyone who read Ashton’s book and watched Bong’s Snowpiercer might be forgiven for expecting Mickey 17 to be a marriage made in cinema heaven. For one brief moment in its overlong (2-hour 17-minute) run-time, a cruelly comic dinner party scene seems about to tip us into that other, much better film — a satire tied around power and hunger.

Then Tim Key turns up in a pigeon costume. Now, I adore Tim Key, but sticking him in a pigeon costume (for an entire movie, yet) in the hope that this will make him even funnier is as wrong-headed and insulting to the talent as, say, under-lighting Christopher Walken’s face to make him look even scarier.

When a film goes this badly awry, you really have to wonder what happened in the editing suite. My guess is that some bright spark from the studio decided the film was far too difficult for its audience. This would at least explain the film’s endless monologuing and its yawn-inducing pre-credits sequence, which loops back like a conscientious nursery-school worker to make sure the stragglers are all caught up.

Oh, enough! I’m done. Even the weevils were a disappointment. In the book they were maliciously engineered giant centipedes. How, I ask you, can a famously visual filmmaker not even embrace them?

Machine men with machine minds

Reading The Ideological Brain by Leor Zmigrod for the Telegraph, 9 March 2025

At the end of his 1940 film The Great Dictator, Charlie Chaplin, dressed in Adolf Hitler’s motley, looked directly at the camera and declared war on the “machine men with machine minds” who were, even then, marching roughshod across his world. Nor have they vanished. In fact, if you believe Leor Zmigrod, you may, without knowing it, be an inflexible, dogmatic robot yourself.

Zmigrod, a young Cambridge neuroscientist, studies “cognitive rigidity”. What does that look like in the real world? Jimmy McGill, hero of the TV series Better Call Saul, can give us a quick handle on one of its manifestations: “The fallacy of sunk costs. It’s what gamblers do. They throw good money after bad thinking they can turn their luck around. It’s like, ‘I’ve already spent this much money, or time, whatever. I got to keep going!’” And as Jimmy adds: “There’s no reward at the end of this game.”

Zmigrod’s ambition has been to revive the project of 18th-century French nobleman Antoine Destutt de Tracy, whose science of “ideologie” sought a rigorous method for discovering when ideas are faulty and unreliable. (This was my first warning signal. If a man in a white coat told me that my ideas were “objectively unreliable”, I would snatch up the biggest stick I could find. But let’s run with the idea.)

A happier lodestar for Zmigrod’s effort is Else Frenkel-Brunswik, an Austrian refugee in America who spent the 1950s testing children to see whether or not she could predict and treat “machine minds”. Frenkel-Brunswik found it easy to distinguish between the prejudiced and unprejudiced, the xenophobic and the liberal child. She found that an upbringing steeped in obedience and conformity not only fostered authoritarianism; it made children cognitively less flexible, so that they had a hard time dealing with ordinary, everyday ambiguities. Even sorting colours left them vexed and unhappy.

Zmigrod argues that the dogmatic mind’s information processing style isn’t restricted to dealing with ideological information. It’s “a more generalised cognitive impairment that manifests when the dogmatic individual evaluates any information, even at the speed of under a second.” Cognitive rigidity thus goes hand-in-hand with ideological rigidity. “This may seem obvious to some,” Zmigrod writes. “A rigid person is a rigid person. But in fact, these patterns are not obvious.”

But they are, and that’s the problem. Nearly 200 pages into The Ideological Brain, they’re even more obvious than they were when Zmigrod first said they weren’t. She reveals, with no little flourish, that in one of her studies, “obedient actions evoked neural activity patterns that were markedly different from [those produced by] free choices.” Well, of course. If the mind can differentiate between free action and obedience, and a mind is a property of a brain at work, then a good enough scanner is bound to be able to spot “differences in neural activity” at such times. Again and again, Zmigrod repackages news about improved brain-scanning techniques as revelations about the mind. It’s the oldest trick in the neuroscientific writer’s book.

Zmigrod began work on The Ideological Brain as Donald Trump became US president for the first time and Britain voted to leave the EU. Her prose is accomplished, and she takes us on an entertaining ramble past everything from demagoguery to dopamine uptake, but the fresh alarm of those days bleeds too consistently through her views. She’s unremittingly hostile towards belief systems of all kinds; when she writes of a 2016 study she conducted that it showed “the extreme Right and the extreme Left were cognitively similar to each other”, I wondered whether proving this truism in the lab really contributed to an understanding of the actual merits, or otherwise, of such ideologies. Zmigrod seems, sometimes, to mistrust belief per se. In her last chapter, we get her idea of the good life: “No pressure, no predestination, no ancestors on your shoulders or rituals to obey, no expectations weighing you down or obstructing your movement.” I had to read this several times before I could believe it. So her alternative to dogmatism is, what, Forrest Gump?

Zmigrod’s arguments assume that people are a unity. They’re not. The most thoughtful and open-minded people I have ever met in argument were members of MAGA militias. She assumes the workings of mind can be read off a scanner. I’m not at all against hard materialism as an idea, but imagine explaining the workings of a computer chip by describing the icons on a computer screen: that’s how Zmigrod describes thought.

Besides, the evident fact that brains age, and minds with them, is never a factor in Zmigrod’s argument. Variations in cognitive flexibility can be easily explained by the aging process, without any recourse to machines that go “bleep”. Is it so unreasonable to expect a young mind to be more liberal and open to exploration, and an old mind to be more conservative, more dedicated to wringing value from what it already knows? Few of us want to be old before our time; few want to be a 90-year-old adolescent.

“Repeating rules and rituals, rules and rituals, has stultifying effects on our minds,” Zmigrod insists. “With every reiteration and rote performance, the neural pathways underpinning our habits strengthen, whereas alternative mental associations – more original yet less frequently rehearsed – tend to decay.” I see this as a picture of learning; Zmigrod sees a slippery slope ending in extremism. You can look at the world that way if you want, but I can’t see that it’ll get you far.