Microphotography

The eye of a Metapocyrtus subquadrulifer beetle

Covering the Nikon Small World competition for New Scientist, 11 October 2018

Microphotography has come a long way since Nikon staged the first Nikon Small World competition in 1974. Finalists in 2018 harnessed a dizzying array of photographic techniques to achieve the spectacular results displayed here. A full-colour calendar of the winners is in the works, and people in the US can look forward to a national tour of the top images.

Yousef Al Habshi from the United Arab Emirates won first prize with the image above of the compound eyes and surrounding greenish scales of a weevil, Metapocyrtus subquadrulifer. It was made by stacking together 129 micrographs — photographs taken through a microscope. “I feel like I’m photographing a collection of jewelry,” said Al Habshi of his work with these beautiful Philippine beetles, which are more usually considered agricultural nuisances and targets for pest control.

fern sorus — structures that produce and contain spores

Rogelio Moreno from Panama won second prize for capturing the spore-containing structures of a fern (above). He used a technique called autofluorescence, in which ultraviolet light is used to pick out individual structures. Spores develop within a sporangium, and Moreno has successfully distinguished a group of these containers from the clustered structure called the sorus. Sporangia at different stages of development show up in different colours.

Spittlebug nymph in its bubble house

Saulius Gugis from the USA photographed this spittlebug in the process of making its “bubble house”. The foamy structure helps the insect hide from predators, insulate itself and stay moist. The photograph won third prize.

A spider embryo with the surface stained

Other highlights from the prize include a portrayal of the first stirrings of arachnid life by Tessa Montague at Harvard University. The surface of this spider embryo (Parasteatoda tepidariorum) is picked out in pink. The cell nuclei are blue and other cell structures are green.

The mango seed weevil

Looking for all the world like an extra from Luc Besson’s sci-fi film The Fifth Element, this magnificent mango seed weevil (Sternochetus mangiferae) earned Pia Scanlon, a researcher for the Government of Western Australia, a place among the finalists.

VR: the state of the art

 


For New Scientist

THEY will tell you, the artists and engineers who work with this gear, that virtual realities are digital environmental simulations, accessed through wearable interfaces, and made realistic – or realistic enough – to steal us away from the real world.

I can attest to that. After several days sampling some of the latest virtual environments available in the UK, I found that reality righted itself rather slowly.

Along the way, however, I came across a question that seemed to get to the heart of things. It was posed by Peter Saville, prime mover of Manchester’s uber-famous Factory Records, and physicist Brian Cox. They explained to an audience during Manchester’s International Festival how they planned to fit the story of the universe on to sound stages better known for once having housed legendary soap Coronation Street.

Would The Age of Starlight, their planned immersive visualisation of the cosmos, give audiences an enriched conception of reality, or would people walk home feeling like aliens, just arrived from another planet?

Cox enthused about the project’s educational potential. Instead of reading about woolly mammoths, he said, we will be able to “experience” them. Instead of reading about a mammoth, trying to imagine it, and testing that imagined thing against what you already know of the world, you will be expected to accept the sensory experience offered by whoever controls the kit. “We will be able to inject people with complex thoughts in a way that’s easier for them to understand!” Cox exclaimed. So, of course, will everyone else.

Institutions of learning, then, had best associate their virtual reality experiments with the most trustworthy figure they can find, such as David Attenborough. His First Life is the London Natural History Museum’s joyride through perilous Cambrian shallows, built on the most recent research.

“When the film starts, try to keep your arms to yourselves,” begged the young chap handing out headsets at the press launch, for all the world as though this were 1895 and we were all about to run screaming from Louis Lumière’s Arrival of a Train. The animator, given free rein, renders tiny trilobites on human scale. This is a good decision – we want to see these things, after all. But such messing around with scale inevitably means that when something truly monstrous appears, we are not as awed as we perhaps ought to be.

VR sets awkward challenges like this. From a narrative perspective, it is a big, exciting step away from film. Camera techniques like zooming and tracking ape the way the eye works; with VR, it is up to us what we focus on and follow. Manipulations have a dreamlike effect. We do not zoom in; we shrink. We do not pan; we fly.

Meanwhile, virtual reality is still struggling to do things everyone assumes it can do already. Accurately reading a user’s movements, in particular, is a serious headache. This may explain the excitement about the two-person game Taphobos, which solves the problem by severely limiting the player’s movements. Taphobos, a play on the Greek words for “tomb” and “fear”, traps you in a real coffin. With oxygen running out, the entombed player, equipped with an Oculus Rift headset, must guide their partner to the burial site over a radio link, using clues dotted around the coffin.

“This combination,” say the makers, master’s students in computing at the University of Lincoln, UK, “allows you to experience what it would be like if you were buried alive with just a phone call to the outside world.” Really? Then why bother? By the time you have addressed virtual reality’s many limitations, you can end up with something a lot like, well, reality.

London’s theatre-makers know this. At first, immersive entertainments such as Faust (2006) and The Masque of the Red Death (2007), pioneered by the theatre company Punchdrunk, looked like mere novelties. Now they are captivating bigger audiences than ever.

Traditional theatregoers may grow weary of running confused across gargantuan factories and warehouses, trying to find where the action is, but for gamers such bafflement is a way of life, and to play scenarios out in the real world is refreshing.

Until 27 September, London-based Secret Cinema offers a similar sort of immersion: inviting you to come battle the evil Empire through several meticulously realised sets as a warm-up to a screening of The Empire Strikes Back. It’s all played at a gentle, playful pace: something between a theatrical experience and a club night.

Right or wrong, VR promises to outdo these entertainments. It’s supposed to be better, more engaging than even the most carefully tailored reality. That’s a very big claim indeed.

More likely, VR may be able to present the world in a way that would otherwise be inaccessible to our unaugmented senses. The first tentative steps in this direction were apparent at this year’s Develop games conference in Brighton, where the Wellcome Trust and Epic Games announced the winner of their first Big Data Challenge. Launched in March, the competition asked whether game designers could help scientists to better visualise incomprehensibly large data sets.

Among the front runners was Hammerhead, a team taking on the enormous task of designing a decent genomics browser. Such browsers have barely changed in a decade: where once they held barely a dozen data fields, they now need hundreds, because studying the behaviour of different genes under different conditions is a multidimensional nightmare. Martin Hemberg of the Sanger Institute, who set the challenge, explained: “Genomics is very data-intensive. Trying to integrate all this and make sense is a huge challenge. We need better visualisation tools.”

Hammerhead’s proposal promises something close to SF writer William Gibson’s original conception of cyberspace: a truly navigable and manipulable environment made of pure information. Not surprisingly, it will take more than the challenge’s modest $20,000 to realise such a vision.

Instead, the prize was handed to two London studios, Lumacode and Masters of Pie, who collaborated on a tool that is already proving itself as it takes the 14,500 family health records in the Avon Longitudinal Study of Parents and Children, and spits them out in real time so researchers can follow their hunches. It even boasts privacy tools to facilitate the work of hundreds of researchers worldwide.

On current evidence, today’s VR is going to change everything by just a little bit. It will disconcert us in small ways. It will not give us everything we want. But reality doesn’t either, come to that. We can afford to be patient.