
Begin Program: The Reality Of Building a Holodeck Today

How close is current technology to creating fully immersive photonic playgrounds?


Star Trek: Lower Decks (StarTrek.com)

Since the moment we first saw Commander William T. Riker step out of an air-conditioned corridor on the U.S.S. Enterprise-D and into a lush green forest — still aboard the Enterprise — in “Encounter at Farpoint,” the holodeck has become a Starfleet fixture and a signature innovation of the Star Trek universe.

An early incarnation of the holodeck first appeared in an episode of Star Trek: The Animated Series in 1974, where it was simply called a ‘recreation room.’ But it was The Next Generation that fully brought the storytelling possibilities of the 24th-century holodeck to life: a space where crewmembers can run training scenarios, relax inside recreational ‘holonovels,’ and hang out with AI-powered approximations of their favorite historical figures.

Here in the 21st century, the technology required to create interactive holographic projections still appears to be light-years away. But holodeck-like experiences, capable of transporting us out of our everyday world and into convincing alternate realities, are still possible.

The holodeck, a concept of Gene Roddenberry’s probably inspired by the work of inventor Gene Dolgoff, works by blending transporter, replicator, and holography tech to create ‘solid’ photonic people and places out of ‘holomatter.’ The holodeck can also manipulate depth perception and spatial awareness to give people the sense of being in sprawling, fully explorable environments, like the town of Sainte Claire in Voyager’s “The Killing Game” — all without anyone running into the holodeck’s solid and immovable walls.

Star Trek: The Next Generation (StarTrek.com)

Since the holodeck first appeared, people have wondered what it would take to create one in real life. “The true holodeck of the future would need to be made of some kind of infinitely configurable organic matter,” says Verity McIntosh, Senior Lecturer in Virtual and Extended Realities at the University of the West of England.

Walking up a flight of stairs, McIntosh explains, would require the floor beneath you to rise up to create steps; the same matter would have to organize itself into the rough texture of wood as you brush past a fence, and reconfigure itself to feel like fur as a cat passes by. “It could reconstitute itself in real-time to form the shapes, textures, and properties of the virtual world,” she says.

The tech used to construct the holodeck, though, isn’t simply highly advanced; it’s physically impossible. If you were to attempt to build your own, you would have to violate the laws of physics, which, to echo the feelings of many chiefs of engineering, is easier said than done. So until a Q pops by to leapfrog us into a new technological epoch, we can focus our energies instead on what’s readily achievable through virtual reality.

VR has already been shown to have many applications beyond entertainment. It can help with pain management and support surgical training. Therapists have developed VR-based interventions to treat PTSD and phobias and to help manage stress. VR also gives us a meaningful way to connect with our friends, family, and colleagues when we can’t see each other in person — which is why some believe it could come to define the future of working, learning, and socializing.

VR might also have an important role to play in our ongoing exploration of the cosmos. NASA has a Virtual Reality Lab where astronauts can prepare for spacewalks and train for delicate procedures, and VR has the potential to improve the wellbeing and mental health of astronauts embarking on longer missions as we travel further into space.

If VR is going to be our real-world answer to the holodeck, how does it measure up? Studies suggest that VR experiences are far more satisfying the more immersed in them we feel, which is why the goal of virtual reality developers — and what the holodeck perfectly nails — is greater immersion. Key to this is our sensation of presence: fooling our senses into believing that we are physically present within another space.

People who work in VR have spent years figuring out what it takes to achieve this, seeking out answers to questions like ‘how should light behave,’ ‘how wide does our field of vision need to be,’ and ‘how quickly do frames need to refresh to avoid motion sickness?’ But the biggest question is how to bridge the gap between a virtual world and our five senses.

Star Trek: The Next Generation (StarTrek.com)

“In some respects, we are already there,” says McIntosh. “We now have a huge range of experiences in VR. Many of these track our head, hands, sometimes body and even eye movements so accurately that the 3D worlds rendered in real-time trick our brains into believing that we are physically present.”

McIntosh explains that incredibly sophisticated ‘holophonic’ sounds are able to replicate the position, movement, and dynamic quality of real-world sounds — to the level that people genuinely can’t tell what’s real and what’s virtual. “I have often witnessed people lifting and replacing their headphones to try to work out if the sounds they are hearing are ‘in or out’ of the story world,” she tells us. You don’t get more present than that.

There still remain several significant obstacles between today’s VR and the hyper-real experiences of the holodeck. The first is the fidelity of our vision. In other words, how real can we make what we see inside a VR headset look?

Most people are amazed by their first VR experience, while recognizing that it still doesn’t come close to our visual experience of the physical world: “Even the fanciest headsets cannot yet encompass our natural field of view. The resolution achieved is nowhere near that of the human eye,” McIntosh explains.

The color range and black levels of most VR headsets are worse than those of most TVs, and the demands of rendering images more than 60 times per second for each eye can make a virtual world today “look more like PlayStation 2 games circa 2002,” says McIntosh. We cannot yet forget that we’re still looking at a screen.

Another major challenge for VR is reproducing the sensation of touch. Unless somebody is able to crack the code to create ‘holomatter,’ how do we reach out and touch a virtual object?

We spoke to David Parisi, Associate Professor of Emerging Media at the College of Charleston and author of the book Archaeologies of Touch, who told us about ‘sensory capture.’ This is when tech is used to “envelop and enclose the senses, through a headset and headphones and bodysuit, with sensory input being taken over by a computer-generated apparatus.” The more complete this takeover of our senses is, the more complete our sense of immersion.

Imagine being able to stand beneath a virtual thunderstorm and feel the rain on your arms because you’re wearing a long-sleeved shirt that creates small haptic vibrations to simulate the droplets as they land. Or working at a virtual canvas and feeling the paintbrush between your fingers because you’re wearing gloves that apply the perfect amount of pressure to your fingertips. Haptic technology such as this, which helps to ‘envelop’ our senses, already exists.

The Teslasuit uses electrical impulses to simulate touch all over the body. Parisi explains that this kind of tech is a big step forward but currently isn’t ideal. “It’s uncomfortable for anything more than short bursts,” he says.

Star Trek: Lower Decks (StarTrek.com)

Whether it’s a full-body suit or a single headset, discomfort continues to be a common complaint in the VR space. The tech we currently use to step into virtual worlds is still fairly restrictive, and it isn’t yet inclusive of all body types. Physical irritations that arise from the tech also pull us out of the virtual experience. It’s hard to convince yourself you’re playing pool at a bar in the south of France if your face is throbbing from having a screen strapped to it.

Looking ahead, both Parisi and McIntosh imagine tech that can incorporate aspects of the physical world to create a hybrid ‘mixed reality’ experience — removing the need to wear lots of cumbersome equipment. For now, though, our brains already seem to be doing a lot of the “mental gymnastics” that help us make VR experiences feel meaningfully realistic.

“We seem incredibly well adapted to pick up on minimal visual cues,” McIntosh says. “We infer stereoscopy, depth, motion, and material qualities from some pretty basic optical illusions.

“People who have tried VR often talk about what they experienced as though it actually happened to them: ‘I did this,’ ‘I went there,’ ‘they whispered in my ear,’” McIntosh continues. “Maybe this is enough for now.” Our present-day VR tech may prove to be the historical equivalent of the VHS recorder, but the value it delivers is real and measurable. This means we could be a lot closer than we think to significant breakthroughs in the medium.

A holodeck that looks, feels, and functions like the ones we’re familiar with isn’t going to appear in our lifetimes — if at all. But the possibilities represented by virtual reality mean we may soon experience fully immersive virtual worlds and applications, which will not only benefit us here on Earth but could also be crucial to our success as we move further into outer space.


Becca Caddy (she/her) is a London-based journalist specializing in tech, science and the future. Her first book, Screen Time, was published in January 2021. You can follow her on Twitter @beccacaddy.