A finite body in space

Reading Carlo Rovelli’s Anaximander and the Nature of Science for New Scientist, 8 March 2023

Astronomy was conducted at Chinese government institutions for more than 20 centuries, before Jesuit missionaries turned up and, somewhat bemused, pointed out that the Earth is round.

Why, after so much close observation and meticulous record-keeping, did seventeenth-century Chinese astronomers still think the Earth was flat?

The theoretical physicist Carlo Rovelli, writing in 2007 (this is an able and lively translation of his first book) can be certain of one thing: “that the observation of celestial phenomena over many centuries, with the full support of political authorities, is not sufficient to lead to clear advances in understanding the structure of the world.”

So what gave Europe its preternaturally clear-eyed idea of how physical reality works? Rovelli ties his several answers — covering history, philosophy, politics and religion — to the life and thought and work of Anaximander, who was born 26 centuries ago in the cosmopolitan city (population 100,000) of Miletus, on the coast of present-day Turkey.

We learn about Anaximander, born 610 BCE, mostly through Aristotle. The only treatise of his we know about is now lost, aside from a tantalising fragment that reveals Anaximander’s notion that there exist natural laws that organise phenomena through time. He also figured out where wind and rain came from, and deduced, from observation, that all animals originally came from the sea, and must have arisen from fish or fish-like creatures.

Rovelli is not interested in startling examples of apparent prescience. Even a stopped watch is correct twice a day. He is positively enchanted, though, by the quality of Anaximander’s thought.

Consider the philosopher’s most famous observation — that the Earth is a finite body of rock floating freely in space.

Anaximander grasps that there is a void beneath the Earth through which heavenly bodies (the sun, to take an obvious example) must travel when they roll out of sight. This is really saying not much more than that, when a man walks behind a house, he’ll eventually reappear on the other side.

What makes this “obvious” observation so radical is that, applied to heavenly bodies, it contradicts our everyday experience.

In everyday life, objects fall in one direction. The idea that space does not have a privileged direction in which objects fall runs against common sense.

So Anaximander arrives at a concept of gravity: he calls it “domination”. Earth hangs in space without falling because it does not have any particular direction in which to fall, and that is because there’s nothing around big enough to dominate it. You and I are much smaller than the Earth, and so we fall towards it. “Up” and “down” are no longer absolutes. They are relative.

The second half of Rovelli’s book (less thrilling, and more trenchant, perhaps to compensate for the fact that it covers more familiar territory) explains how science, evolving out of Anaximander’s constructive yet critical attitude towards his teacher Thales, developed a really quite unnatural way of thinking.

Thales, says Anaximander, was a wise man who was wrong about everything being made of water. The idea that we can be wise and wrong at the same time, Rovelli says, can come only from a sophisticated theory of knowledge “according to which truth is accessible but only gradually, by means of successive refinements.”

All Rovelli’s wit and intellectual dexterity are in evidence in this thrilling early work, and almost all his charm, as he explains how Copernicus perfected Ptolemy, by applying Ptolemy’s mathematics to a better-framed question, and how Einstein perfected Newton by pushing Newton’s mathematics past certain a priori assumptions.

Nothing is thrown away in such scientific “revolutions”. Everything is repurposed.

“What on Earth do you mean?”

How the thought acts of the Oxford don J L Austin live on | Aeon Essays

Reading A Terribly Serious Adventure: Philosophy at Oxford 1900-60 by Nikhil Krishnan for the Telegraph, 6 March 2023

Philosophy is a creature of split impulses. The metaphysicians (think Plato) wonder what things mean; and the analysts (think Socrates) try to pin down what the metaphysicians are on about. When they get over-excited (which is surprisingly often) the metaphysicians turn into theologians, and the analysts become pedants in the mould of Thomas Gradgrind, the schoolmaster in Dickens’s Hard Times, concerned only with facts and numbers.

The “analytic” (or “linguistic” or “ordinary language”) philosophy practised at Oxford University in the first half of the last century is commonly supposed to have been at once pedantic and amateurish, “made a fetish of science yet showed an ignorance of it, was too secular, too productively materialist, too reactionary and somehow also too blandly moderate. The critics can’t, surely, all be right,” complains Nikhil Krishnan, launching a spirited, though frequently wry defence of his Oxford heroes: pioneers like Gilbert Ryle and A.J. Ayer and John Langshaw Austin, troopers like Peter Strawson and Elizabeth Anscombe, and many fellow travellers: Isaiah Berlin and Iris Murdoch loom large in an account that weaves biography with philosophy and somehow attains — heaven knows how — a pellucid clarity. This is one of those books that leaves readers feeling a lot cleverer than they actually are.

The point of Oxford’s analytical philosophy was, in Gilbert Ryle’s formulation, to scrape away at sentences “until the content of the thoughts underlying them was revealed, their form unobstructed by the distorting structures of language and idiom.”

In other words, the philosopher’s job was to rid the world of philosophical problems, by showing how they arise out of misunderstandings of language.

At around the same time, in the other place (Cambridge to you), Ludwig Wittgenstein was far advanced on an almost identical project. The chief lesson of Wittgenstein, according to a review by Bernard Williams, was that philosophy cannot go beyond language: “we are committed to the language of human life, and no amount of speculative investment is going to buy a passage to outer space, the space outside language.”

There might have been a rare meeting of minds between the two universities had Wittgenstein not invested altogether too much in the Nietzschean idea of what a philosopher should be (ascetic, migrainous, secretive to the point of paranoia); so, back in Oxford, it was left to dapper, deceptively bland manager-types like John Austin to re-invent a Socratic tradition for themselves.

Krishnan is too generous a writer, and too careful a scholar, to allow just one figure to dominate this account of over half a century’s intellectual effort. It’s clear, though, that he keeps a special place in his heart for Austin, whose mastery of the simple question and the pregnant pause, demand for absolute accuracy and imperviousness to bluster must have served him frighteningly well when interrogating enemy captives in the second world war.

While Wittgenstein concocted aphorisms and broke deck chairs, Austin’s mild-mannered, quintessentially English scepticism acted as a mirror, in which his every colleague and student struggled to recognise themselves: “What on Earth do you mean?” he would say.

Are kitchen scissors utensils or tools?

Why can we speak of someone as a good batsman but not as the right batsman?

Can someone complain of a pain in the waist?

Austin’s was a style of philosophy that’s easy to send up, harder to actually do.

It drove people mad. “You are like a greyhound who doesn’t want to run himself,” A. J. Ayer once snapped, “and bites the other greyhounds, so that they cannot run either.”

But it’s not hard to see why this project — down-to-earth to the point of iconoclasm — has captured the imagination of philosopher and historian Nikhil Krishnan; he hails from India, whose long and sophisticated philosophical tradition is, he says, “honoured today chiefly as a piece of inert heritage.”

Krishnan’s biographical approach may be a touch emollient; where the material forces him to choose, he puts the ideas before the idiosyncrasies. But his historical sense is sharp as he skips, in sixty short years, across whole epochs and through two world wars. Oxford, under Krishnan’s gaze, evolves from Churchman’s arcadia to New Elizabethan pleasure-park with a sort of shimmering H G Wells Time Machine effect.

John Austin died in 1960 at only forty-eight; this and his lack of easily emulated Viennese mannerisms robbed him of much posthumous recognition. But by taking Austin’s critics seriously — and indeed, by stealing their thunder, in passage after passage of fierce analysis — Krishnan offers us a fresh justification of a fiercely practical project, in a field outsiders assume is supposed to be obscure.

A cherry is a cherry is a cherry

Life is Simple: How Occam’s Razor Sets Science Free and Shapes the Universe
by Johnjoe McFadden, reviewed for the Spectator, 28 August 2021

Astonishing, where an idea can lead you. You start with something that, 800 years hence, will sound like it’s being taught at kindergarten: Fathers are fathers, not because they are filled with some “essence of fatherhood”, but because they have children.

Fast forward a few years, and the Pope is trying to have you killed.

Not only have you run roughshod over his beloved eucharist (justified, till then, by some very dodgy Aristotelian logic-chopping); you’re also saying there’s no “essence of kinghood”, neither. If kings are only kings because they have subjects, then, said William of Occam, “power should not be entrusted to anyone without the consent of all”. Heady stuff for 1334.

How this progression of thought birthed the very idea of modern science is the subject of what may be the most sheerly enjoyable history of science of recent years.

William was born around 1288 in the little town of Ockham in Surrey. He was probably an orphan; at any rate he was given to the Franciscan order around the age of eleven. He shone at Greyfriars in London, and around 1310 was dispatched to Oxford’s newfangled university.

All manner of intellectual, theological and political shenanigans followed, mostly to do with William’s efforts to demolish almost the entire edifice of medieval philosophy.

It needed demolishing, and that’s because it still held to Aristotle’s ideas about what an object is. Aristotle wondered how single objects and multiples can co-exist. His solution: categorise everything. A cherry is a cherry is a cherry, and all cherries have cherryness in common. A cherry is a “universal”; the properties that might distinguish one cherry from another are “accidental”.

The trouble with Aristotle’s universals, though, is that they assume a one-to-one correspondence between word and thing, and posit a universe made up of a terrifying number of unique things — at least one for each noun or verb in the language.

And the problem with that is that it’s an engine for making mistakes.

Medieval philosophy relied largely on syllogistic reasoning, juggling things into logical-looking relations. “Socrates is a man, all men are mortal, so Socrates is mortal.”

So he is, but — and this is crucial — this conclusion is arrived at more by luck than good judgement. The statement isn’t “true” in any sense; it’s merely internally consistent.

Imagine we make a mistake. Imagine we spring from a society where beards are pretty much de rigueur (classical Athens, say, or Farringdon Road). Imagine we said: “Socrates is a man, all men have beards, therefore Socrates has a beard.”

Though one of its premises is wrong, the statement barrels ahead regardless; it’s internally consistent, and so, if you’re not paying attention, it creates the appearance of truth.

But there’s worse: the argument that gives Socrates a beard might actually be true. Some men do have beards. Socrates may be one of them. And if he is, that beard seems — again, if you’re not paying attention — to confirm a false assertion.

William of Occam understood that our relationship with the world is a lot looser, cloudier, and more indeterminate than syllogistic logic allows. That’s why, when a tavern owner hangs a barrel hoop outside his house, passing travellers know they can stop there for a drink. The moment words are decoupled from things, then they act as signs, negotiating flexibly with a world of blooming, buzzing confusion.

Once we take this idea to heart, then very quickly — and as a matter of taste more than anything — we discover how much more powerful straightforward explanations are than complicated ones. Occam came up with a number of versions of what even then was not an entirely new idea: “It is futile to do with more what can be done with less,” he once remarked. Subsequent formulations do little but gild this lily.

His idea proved so powerful, three centuries later the French theologian Libert Froidmont coined the term “Occam’s razor”, to describe how we arrive at good explanations by shaving away excess complexity. As McFadden shows, that razor’s still doing useful work.

Life is Simple is primarily a history of science, tracing William’s dangerous idea through astronomy, cosmology, physics and biology, from Copernicus to Brahe, Kepler to Newton, Darwin to Mendel, Einstein to Noether to Weyl. But McFadden never loses sight of William’s staggering, in some ways deplorable influence over the human psyche as a whole. For if words are independent of things, how do we know what’s true?

Thanks to William of Occam, we don’t. The universe, after Occam, is unknowable. Yes, we can come up with explanations of things, and test them against observation and experience; but from here on in, our only test of truth will be utility. Ptolemy’s 2nd-century Almagest, a truly florid description of the motions of the stars and planetary paths, is not and never will be *wrong*; the worst we can say is that it’s overcomplicated.

In the Coen brothers’ movie The Big Lebowski, an exasperated Dude turns on his friend: “You’re not *wrong*, Walter” he cries, “you’re just an asshole.” William of Occam is our universal Walter, and the first prophet of our disenchantment. He’s the friend we wish we’d never listened to, when he told us Father Christmas was not real.

An intellectual variant of whack-a-mole

Reading Joseph Mazur’s The Clock Mirage for The Spectator, 27 June 2020 

Some books elucidate their subject, mapping and sharpening its boundaries. The Clock Mirage, by the mathematician Joseph Mazur, is not one of them. Mazur is out to muddy time’s waters, dismantling the easy opposition between clock time and mental time, between physics and philosophy, between science and feeling.

That split made little sense even in 1922, when the philosopher Henri Bergson and the young physicist Albert Einstein (much against his better judgment) went head-to-head at the Société française de philosophie in Paris to discuss the meaning of relativity. (Or that was the idea. Actually they talked at complete cross-purposes.)

Einstein won. At the time, there was more novel insight to be got from physics than from psychological introspection. But time passes, knowledge accrues and fashions change. The inference (not Einstein’s, though people associate it with him) that time is a fourth dimension, commensurable with the three dimensions of space, is looking decidedly frayed. Meanwhile Bergson’s psychology of time has been pruned by neurologists and has put out new shoots.

Our lives and perceptions are governed, to some extent, by circadian rhythms, but there is no internal clock by which we measure time in the abstract. Instead we construct events, and organise their relations, in space. Drivers, thinking they can make up time with speed, acquire tickets faster than they save seconds. Such errors are mathematically obvious, but spring from the irresistible association we make (poor vulnerable animals that we are) between speed and survival.

The more we understand about non-human minds, the more eccentric and sui generis our own time sense seems to be. Mazur ignores the welter of recent work on other animals’ sense of time — indeed, he winds the clock back several decades in his careless talk of animal ‘instincts’ (no one in animal behaviour uses the ‘I’ word any more). For this, though, I think he can be forgiven. He has put enough on his plate.

Mazur begins by rehearsing how the Earth turns, how clocks were developed, and how the idea of universal clock time came hot on the heels of the railway (mistimed passenger trains kept running into each other). His mind is engaged well enough throughout this long introduction, but around page 47 his heart beats noticeably faster. Mazur’s first love is theory, and he handles it well, using Zeno’s paradoxes to unpack the close relationship between psychology and mathematics.

In Zeno’s famous foot race, by the time fleet-footed Achilles catches up to the place where the plodding tortoise was, the tortoise has moved a little bit ahead. That keeps happening ad infinitum, or at least until Newton (or Leibniz, depending on who you think got to it first) pulls calculus out of his hat. Calculus is an algebraic way of handling (well, fudging) the continuity of the number line. It handles vectors and curves and smooth changes — the sorts of phenomena you can measure only if you’re prepared to stop counting.

But what if reality is granular after all, and time is quantised, arriving in discrete packets like the frames of a celluloid film stuttering through the gate of a projector? In this model of time, calculus is redundant and continuity is merely an illusion. Does it solve Zeno’s paradox? Perhaps it makes it 100 times more intractable. Just as motion needs time, time needs motion, and ‘we might wonder what happens to the existence of the world between those falling bits of time sand’.

This is all beautifully done, and Mazur, having hit his stride, maintains form throughout the rest of the book, though I suspect he has bitten off more than any reader could reasonably want to swallow. Rather than containing and spotlighting his subject, Mazur’s questions about time turn out (time and again, I’m tempted to say) to be about something completely different, as though we were playing an intellectual variant of whack-a-mole.

But this, I suppose, is the point. Mazur quotes Henri Poincaré:

Not only have we not direct intuition of the equality of two periods, but we have not even direct intuition of the simultaneity of two events occurring in two different places.

Our perception of time is so fractured, so much an ad hoc amalgam of the chatter of numerous, separately evolved systems (for the perception of motion; for the perception of daylight; for the perception of risk, and on and on — it’s a very long list), it may in the end be easier to abandon talk of time altogether, and for the same reason that psychologists, talking shop among themselves, eschew vague terms such as ‘love’.

So much of what we mean by time, as we perceive it day to day, is really rhythm. So much of what physicists mean by time is really space. Time exists, as love exists, as a myth: real because contingent, real because constructed, a catch-all term for phenomena bigger, more numerous and far stranger than we can yet comprehend.

Beware the indeterminate momentum of the throbbing whole


Graham Harman (2nd from right) and fellow speculative materialists in 2007

 

In 1942, the Argentine writer Jorge Luis Borges cooked up an entirely fictitious “Chinese” encyclopedia entry for animals. Among its nonsensical subheadings were “Embalmed ones”, “Stray dogs”, “Those that are included in this classification” and “Those that, at a distance, resemble flies”.

Explaining why these categories make no practical sense is a useful and enjoyable intellectual exercise – so much so that in 1966 the French philosopher Michel Foucault wrote an entire book inspired by Borges’ notion. Les mots et les choses (The Order of Things) became one of the defining works of the French philosophical movement called structuralism.

How do we categorise the things we find in the world? In Immaterialism, his short and very sweet introduction to his own brand of philosophy, “object-oriented ontology”, the Cairo-based philosopher Graham Harman identifies two broad strategies. Sometimes we split things into their ingredients. (Since the enlightenment, this has been the favoured and extremely successful strategy of most sciences.) Sometimes, however, it’s better to work in the opposite direction, defining things by their relations with other things. (This is the favoured method of historians and critics and other thinkers in the humanities.)

Why should scientists care about this second way of thinking? Often they don’t have to. Scientists are specialists. Reductionism – finding out what things are made of – is enough for them.

Naturally, there is no hard and fast rule to be made here, and some disciplines – the life sciences especially – can’t always reduce things to their components.

So there have been attempts to bring this other, “emergentist” way of thinking into the sciences. One of the most ingenious was the “new materialism” of the German entrepreneur (and Karl Marx’s sidekick) Friedrich Engels. One of Engels’s favourite targets was the Linnaean system of biological classification. Rooted in formal logic, this taxonomy divides all living things into species and orders. It offers us a huge snapshot of the living world. It is tremendously useful. It is true. But it has limits. It cannot record how one species may, over time, give rise to some other, quite different species. (Engels had great fun with the duckbilled platypus, asking where that fitted into any rigid scheme of things.) Similarly, there is no “essence” hiding behind a cloud of steam, a puddle of water, or a block of ice. There are only structures, succeeding each other in response to changes in the local conditions. The world is not a ready-made thing: it is a complex interplay of processes, all of which are ebbing and flowing, coming into being and passing away.

So far so good. Applied to science, however, Engels’s schema turns out to be hardly more than a superior species of hand-waving. Indeed, “dialectical materialism” (as it later became known) proved so unwieldy, it took very few years of application before it became a blunt weapon in the hands of Stalinist philosophers who used it to demotivate, discredit and disbar any scientific colleague whose politics they didn’t like.

Harman has learned the lessons of history well. Though he’s curious to know where his philosophy abuts scientific practice (and especially the study of evolution), he is prepared to accept that specialists know what they are doing: that rigor in a narrow field is a legitimate way of squeezing knowledge out of the world, and that a 126-page A-format paperback is probably not the place to reinvent the wheel.

What really agitates him, fills his pages, and drives him to some cracking one-liners (this is, heavens be praised, a *funny* book about philosophy) is the sheer lack of rigour to be found in his own sphere.

While pillorying scientists for treating objects as superficial compared with their tiniest pieces, philosophers in the humanities have for more than a century been leaping off the opposite cliff, treating objects “as needlessly deep or spooky hypotheses”. By claiming that an object is nothing but its relations or actions they unknowingly repeat the argument of the ancient Megarians, “who claimed that no one is a house-builder unless they are currently building a house”. Harman is sick and tired of this intellectual fashion, by which “‘becoming’ is blessed as the trump card of innovators, while ‘being’ is cursed as a sad-sack regression to the archaic philosophies of olden times”.

Above all, Harman has had it with peers and colleagues who zoom out and away from every detailed question, until the very world they’re meant to be studying resembles “the indeterminate momentum of the throbbing whole” (and this is not a joke — this is the sincerely meant position statement of another philosopher, a friendly acquaintance of his, Jane Bennett).

So what’s Harman’s solution? Basically, he wants to be able to talk unapologetically about objects. He explores a single example: the history of the Dutch East India Company. Without toppling into the “great men” view of history – according to which a world of inanimate props is pushed about by a few arbitrarily privileged human agents – he is out to show that the company was an actual *thing*, a more-or-less stable phenomenon ripe for investigation, and not simply a rag-bag collection of “human practices”.

Does his philosophy describe the Dutch East India Company rigorously enough for his work to qualify as real knowledge? I think so. In fact I think he succeeds to a degree which will surprise, reassure and entertain the scientifically minded.

Be in no doubt: Harman is no turncoat. He does not want the humanities to be “more scientific”. He wants them to be less scientific, but no less rigorous, able to handle, with rigour and versatility, the vast and teeming world of things science cannot handle: “Hillary Clinton, the city of Odessa, Tolkien’s imaginary Rivendell… a severed limb, a mixed herd of zebras and wildebeest, the non-existent 2016 Chicago Summer Olympics, and the constellation of Scorpio”.

Immaterialism
Graham Harman
Polity, £9.99

Summa Technologiae by Stanislaw Lem

I reviewed this mix of prescience, philosophy and irony for New Scientist’s Culture Lab.

Here’s a more relaxed version for Lem initiates:

Stanislaw Lem

Image shamelessly ripped from Aleksander Jalosinski http://aleksanderjalosinski.pl

 

Halfway through his epic cybernetic rewiring of the Western cultural project, at the top of his rhetorical curve, and scant pages before the neologisms begin to gum and tack, tripping the reader’s feet (the second half is a slog), Polish satirist Stanislaw Lem recasts the entire universe as a boarding house inhabited by Mr Smith, a bank clerk, his puritanical aunt, and a female lodger.

The boarding house has a glass wall, and all the greats of science are about to look through that wall and draw truths about the universe from what they observe. Ptolemy notes how, when the aunt goes down to the cellar to fetch some vegetables, Mr Smith kisses the lodger. He develops a purely descriptive theory, “thanks to which one can know in advance which position will be taken by the two upper bodies when the lower one finds itself in the lowest position.”

Newton enters. “He declares that the bodies’ behaviour depends on their mutual attraction.”

So it goes on. Heisenberg notices some indeterminacy in their behaviour: “For instance, in the state of kissing, Mr Smith’s arms do not always occupy the same position.”

And on. And on. Mathematics comes unstuck in the ensuing complexity, where “a neural equivalent of an act of sneezing would be a volume whose cover would have to be lifted with a crane.”

Science is steadily pushing us into a Goethean cul-de-sac in which, the more accurate our theory, the closer it comes to the phenomenon itself, in all its ambiguity, strangeness, and inexplicability. At this point, Lem says, analysis must be abandoned in favour of creative activity — “imitological practice,” as he would have it, “considering the phenomenon itself its most perfect representation.”

There are nested ironies here, and it’s the devil’s work to unpick them all. Then again, any reader of Lem will have guessed this from the off, and will relish the opportunity afforded by this English translation – incredibly, for a book written in 1964 by a literary celebrity and reasonably well translated elsewhere, the first in the English language. Summa’s translator is Joanna Zylinska, a professor of new media and communications at Goldsmiths. Her work is diligent, imaginative, painstakingly precise; sometimes one wishes, in the later chapters, that she would be a little more slapdash and cut to the chase, but this is Lem’s fault, not hers.

Lem was a garrulous old sod who said Steven Soderbergh’s 2002 version of his novel Solaris should have been renamed “Love in Outer Space” and put up a sign outside his house warning of “ferocious dogs” (in truth, five friendly dachshunds). Though he had some important intellectual training, Lem ploughed his own furrow, conjuring with ideas that would not become common currency for another half-century (virtual reality, nanotechnology, artificial intelligence, the technological singularity…). When he succumbs to the autodidact’s anxiety, his prose is not pretty.

But then, Lem always worked at the edge of aesthetic possibility — which is to say, he was a science fiction writer. Science fiction is notorious for biting the hand that feeds it, for deliberately running counter to all expectation, and getting lost for decades at a time in the contested, often ugly territory where the humanities leave off and the sciences begin. Science fiction prides itself on crashing and burning, again and again, against the walls of narrative expectation and good taste. It’s the Gully Foyle of literature, fearsome and deranged and perilous in its promise: a Prometheus figure shoving fire in your face. “Catch this!”

This is what the Summa throws up: a vision of intelligence as cul-de-sac. Intelligence carries conscious beings to a point where their theories are no longer useful to them, where their hard-won objectivity drowns in a glut of complexity, and the only way forward is for them to grow into the fabric of the world.

Fermi’s paradox: “If we are alive and intelligent and making some noise, where, in all the cosmos, is everybody else?”

Lem’s answer: Look at the rocks. Intelligence is a stepping stone on a circular path back to brute is-ness.

So much for cosmic irony; there’s a local, political irony here too, which needs some more exploration.

You see, after the Soviet occupation of Eastern Poland, Lem was banned from Polytechnic study owing to his “bourgeois origin”. His father pulled strings to get him accepted on a course in medicine at Lwów University in 1940, but this brought him up against the quack theories of Stalin’s intellectual poster-boy, the agronomist Trofim Denisovich Lysenko. Lem satirized Lysenko in a science magazine and soon abandoned his medical studies.

A word about Lysenko. With the blood of millions already on his hands from collectivisation – not to mention the wholesale eradication of countless varieties of domesticated plant – Josef Stalin needed to feed what was left of his nation. He wanted food and he wanted it now. Enter Trofim Denisovich, peddling an idea of evolution already two centuries out of date. Lysenko said things change their form in response to the environment, and pass any changes directly to their offspring. No element of chance. No randomness in selection. No genetic code to learn. Giraffes have long necks because their parents stretch.

And there is no brake on this process, neither, according to Lysenko. No natural conservatism. Things want to change. They just need some kindly direction. Spin your wheel and stick in your thumbs: the living world is clay. Oats will turn to wild oats, pines to firs, sunflowers to zinnias. Animal cells will turn into plant cells. Plants into animals! Cells from soup! “How can there be hereditary diseases in a socialist society?” From the nonliving will come the living.

Fast forward twenty years, and we have the Summa, and the Summa says,

“We cannot therefore catalogue Nature, our finitude being one of the reasons for this. Yet we can turn Nature’s infinity against it, so to speak by working, as Technologists…”

And what, exactly, will this work look like? (Bear in mind here that Lysenko cited the brilliant fruit-tree specialist Ivan Michurin as his intellectual forebear):

“A scientist wants an algorithm, whereas the technologist is more like a gardener who plants a tree, picks apples, and is not bothered about “how the tree did it.” A scientist considers such a narrow, utilitarian and pragmatic approach a sin against the laws of Full Knowledge. It seems that those attitudes will change in the future.”

The Summa is not just Lem’s vision of the future; it is Lysenko’s.

Of course this (irony of ironies) doesn’t mean that the vision is merely mischievous, a bitter political joke (though I think it is that). Perhaps Lem thinks Lysenko was simply ahead of his time, reaching for a plasticity in nature that it will take another century of biological research to effect.

Predictably, from a writer who seems to be permanently dangling off the edge of everyone else’s intellectual curve, Lem’s minatory vision is being explored and independently invented in the oddest places. Never mind the blandishments of the Kurzweilians and the extropians: Lem calls them “homunculists”, an inspired expression of contempt. What about Ridley Scott’s movie Prometheus? What about that animate yet unliving black goo that can bring life to sterile planets, in all its savagery, appetite and guile? What about that unsmiling species of near-Gods who, having mastered birth (the sexism is deliberate and important), sets life at its own neck in the service of some unnamed Next Project? Lem would have hated it. But then, Lem was an inveterate ironist who describes the Summa itself, that most cherished project, as a “slightly modernised… version of the famous Ars Magna, which clever Lullus presented quite a long time ago, that is, in the year 1300, and which was rightly mocked by Swift in Gulliver’s Travels.”

It is not that the ironies get in the way. It’s that the world itself is ironical, and Lem, with his vision-of-the-future-that-is-no-future, is its John the Baptist. Even as you follow him, watch him rip out the signposts. Even as you beg for water, watch him defecate in each and every roadside well. Gawp in dismay as he assembles Potemkin villages on the barren skyline only to kick them into the dust. Then: walk on. (It’s not like you have any choice.) The path looks straight. You know it’s anything but. You know, God help you, that you will come by this place again.