
Interview: Lightform’s exciting new tech can turn anything around you into a screen


Back when he was at Disney Imagineering eight years ago, Brett Jones saw a demo of projected augmented reality that he thought was one of the coolest things he’d ever seen. It featured an entire enchanted forest set covered in projection, with lightning and rain, fluttering butterflies and a running waterfall.

The problem, though, was that the demo cost millions of dollars, and only Disney could build it. That planted an idea in Jones’ head: to help make the technology more ubiquitous.

Fast forward to today, and he’s on the road to doing so. Jones is the co-founder and CEO of Lightform, a company developing a computer made specifically for AR. The company emerged from stealth in recent weeks with almost $3 million in seed funding from Lux Capital and Seven Seas Partners, with participation from several prominent angel investors and the National Science Foundation.

The device Lightform is developing is a combined computer and 3D scanning device. When connected to a video projector, it lets you quickly scan complex scenes and turn essentially any object into a screen: AR without a headset, as the company puts it.

The basic technology behind the company’s computer is known as projection mapping. The Lightform team’s experience with projection mapping ranges from large-scale entertainment experiences to Ph.D. research experiments, everywhere from Disney Imagineering to Microsoft Research.

“The three founders, they were completing Ph.D.’s in human computer interaction and computer vision at the University of Illinois Urbana-Champaign,” said Lightform design director Phil Reyneri. “During that time, they were also doing research internships at Disney Research, Microsoft Research. They had a couple of projects at Microsoft specifically where they were working with the Kinect, and they were doing projects that involved turning an entire room into a projection mapping kind of display. So Lightform is really — it’s basically the continuation of some of that work.”

In a company blog post, Jones explains that Lightform’s device uses computer vision to 3D scan its environment, then transmits the data to the company’s desktop app. The app, he continues, uses that data to automatically generate effects and filters. And when things move, Lightform uses vision to keep content aligned.
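Keeping projected content aligned with real objects comes down to knowing the geometric relationship between the scanned scene and the projector. The following is a minimal sketch of that core step, mapping a scanned 3D point into projector pixel coordinates with a standard pinhole model; the function name, parameters, and numbers are illustrative assumptions, not Lightform’s actual pipeline or API.

```python
def project_point(point_3d, focal_px, center_px, rotation, translation):
    """Map a scanned 3D point into 2D projector pixel coordinates.

    rotation (3x3 list of lists) and translation (length-3 tuple) describe
    the calibrated pose of the projector relative to the scan; focal_px and
    center_px are the projector's intrinsics in pixels.
    """
    # Rigid transform into the projector's coordinate frame: p' = R @ p + t
    x = sum(rotation[0][i] * point_3d[i] for i in range(3)) + translation[0]
    y = sum(rotation[1][i] * point_3d[i] for i in range(3)) + translation[1]
    z = sum(rotation[2][i] * point_3d[i] for i in range(3)) + translation[2]

    # Perspective divide, then scale by focal length and shift to the
    # principal point to get pixel coordinates.
    u = focal_px * x / z + center_px[0]
    v = focal_px * y / z + center_px[1]
    return (u, v)
```

With a calibration like this, re-scanning the scene when an object moves yields a new pose, and the same projection step re-aligns the content, which is one plausible reading of how “vision keeps content aligned.”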

The company’s philosophy, Reyneri says, is that projected light is more interesting than a flat screen because it can be overlaid on the existing environment. That gives designers the opportunity to blend digital content with existing materials and structures, and the company’s goal is to essentially “democratize” the medium so it can be used across film, art, education, events, signage, home entertainment and many other use cases.

“We basically think there’s a couple of interesting technical things with projection mapping that we think are cool,” Reyneri said. “I think, eventually, what we see is a confluence of lighting and display technologies, where as LED and laser light source projectors continue to shrink in size, you might start to see them become embedded throughout environments and interiors. And if those projectors had the ability to understand the environment, they could create ubiquitous displays — basically turning any surface into a display whenever, wherever. And it could be contextually aware, based on who’s in the room and what’s going on.

“I think that longer term is an interesting application of projection as opposed to screens, because again you can combine projection with the existing built environment. For now, we’re making kind of the first step there.”

Lightform is now moving its computer into production. The company has been building prototypes in-house and plans to open preorders this summer.

“Projection mapping is a subset of augmented reality,” Reyneri said. “We’re using computer vision to align pixels with the real world. Our delivery method is just projected light as opposed to putting a transparent screen in front of your face. What we’re building now and what we’re excited about is a platform for which you can take scanned data from the environment, both 3D and 2D, and quickly create compelling content.”

Andy is a reporter in Memphis who also contributes to outlets like Fast Company and The Guardian. When he’s not writing about technology, he can be found hunched protectively over his burgeoning collection of vinyl, as well as nursing his Whovianism and bingeing on a variety of TV shows you probably don’t like.
