The hallucination machine

Imagination is a commonplace idea. To imagine is to form an image in the mind, to contrive, devise or represent something. Most would affirm that a vivid imagination is an asset, a function exercised by competent designers, poets, artists, authors and inventors. Here I am continuing a theme from an earlier post, Creative cognition.

Now consider “hallucination.” According to the OED, to hallucinate is “to be deceived, suffer illusion, entertain false notions, blunder, mistake.” The word derives from the Latin ālūcinārī, to wander as a mental activity, but via the French halluciner it has come to mean a condition in which one is amazed.

Hallucination has ready associations with drug culture and certain pathologies. It conjures up something transgressive, deviant and possibly shocking. (For some reason I am imagining the “orgasmatron” in Barbarella or Wilhelm Reich’s “orgone accumulator.”) A machine that might induce hallucinations is more provocative as an idea than one that encourages the play of the imagination — which could simply be a toy, a camera, a TV or a computer.

Controlled hallucination

Whether we call it imagination or hallucination, we draw on this faculty in everyday situations of dreaming, daydreaming, conjuring up mental images, recalling, remembering, obsessing, fixating, dreading, expecting, enjoying, meditating and so on. But we also do it when trying to solve problems, think things through, and rehearse scenarios in our minds. As part of his research into consciousness, Anil Seth describes perception in these terms as controlled hallucination.

our conscious experiences of the world and the self are forms of brain-based prediction – ‘controlled hallucinations’ (Kindle loc 164).

With Keisuke Suzuki he developed a “hallucination machine,” a demonstrator that they also describe as a Deep-Dream Virtual Reality Platform. It works with a neural network algorithm of the kind used by Google for detecting features in an image. But the deep dream platform runs it “in reverse.”

The ‘deep dream’ algorithm involves taking an artificial neural network that has been trained to recognise objects in images, and running it backwards. Networks like this consist of many layers of simulated neurons, with the connections arranged so that it resembles, in some ways, the bottom-up pathway through a biological visual system. (loc 2,114)

They trained the neural network system on over a thousand images of dogs of various breeds, as well as other objects. Run in the usual way, the algorithm would have no trouble identifying whether there is a dog in a picture, and even identifying its breed.

The standard way these networks are used is to present them with an image and then ask what the network ‘thinks’ is in the image. The deep dream algorithm reverses the procedure, telling the network that a particular object is present, and updating the image instead. In other words, the algorithm is projecting a perceptual prediction onto and into an image (loc 2,120).
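
To make the reversal concrete, here is a minimal sketch of deep-dream-style gradient ascent, written in PyTorch. It is illustrative only, not the researchers’ code: the pretrained model, the class index and the step size are assumptions, and it presumes a recent version of torchvision. The network’s weights stay fixed; instead, the pixels of the input image are repeatedly nudged in whatever direction makes the network more confident that the target object is present.

    # Illustrative sketch only (not the authors' implementation): run a trained
    # classifier "in reverse" by gradient ascent on the image pixels, so that the
    # network comes to "see" a chosen class in the image.
    import torch
    from torchvision import models

    # Pretrained ImageNet classifier; its weights are frozen, only the image changes.
    model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT).eval()
    for p in model.parameters():
        p.requires_grad_(False)

    # Start from a random image (a photograph could be used instead) and make
    # the pixels themselves the thing being optimised.
    # (ImageNet input normalisation is omitted for brevity.)
    img = torch.rand(1, 3, 224, 224, requires_grad=True)
    target_class = 207  # an ImageNet dog class, chosen arbitrarily for this example

    for step in range(100):
        score = model(img)[0, target_class]   # how strongly the net "sees" the target
        score.backward()
        with torch.no_grad():
            # Gradient ascent: amplify whatever in the image raises the class score.
            img += 0.05 * img.grad / (img.grad.abs().mean() + 1e-8)
            img.clamp_(0, 1)                  # keep pixel values in a valid range
            img.grad.zero_()

Applied frame by frame to a live video feed, an update of this kind is what fills an ordinary scene with the dog-like shapes described next.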

There’s a YouTube video that shows what a nightmare world would look like to someone fixated on dogs. Dogs are there in the ambiguous shapes of half-shadows, foliage, and even human faces. The researchers claim to have demonstrated that the processes by which we perceive and identify elements of our world are linked to those by which we remember and imagine a world.

normal perception – in the here and now – is indeed a form of controlled hallucination (loc. 2,142)

Control is important. It implies corroboration, agreement, multiple evidence sources and some negotiation with the circumstances. Without that we would be on a continuous LSD trip, and Seth draws on his experience with the drug.

The key roles of projection and anticipation are common ideas developed in Phenomenology, in particular by Martin Heidegger and Maurice Merleau-Ponty. The work of Seth and his team interests me not least as it brings laboratory observations, measurements, mathematical models, computer programs, the philosophy of science and the methods of analytical philosophy to bear on understanding the phenomena of perception and consciousness.

They also develop insights about the role of the human body in perception, identity, concepts of selfhood and abductive reasoning. Not least, there’s a pithy account of what constitutes reality.

You could even say that we’re all hallucinating all the time. It’s just that when we agree about our hallucinations, that’s what we call reality (loc.1,480).

References

  • Seth, Anil. Being You: A New Science of Consciousness. London: Faber, 2021.
  • Suzuki, Keisuke, Warrick Roseboom, David J. Schwartzman, and Anil K. Seth. “A Deep-Dream Virtual Reality Platform for Studying Altered Perceptual Phenomenology.” Scientific Reports 7, no. 1 (2017): 15982. https://doi.org/10.1038/s41598-017-16316-2

Note

  • Featured image is an artwork by Colette Miller painted on the window of the Burj Khalifa, Dubai.
