Why Is News Corp Investing In Augmented Reality?

Trefis Team, Contributor

Recently, News Corp Australia made its first seed investment in augmented reality company Plattar, a cloud-based platform that allows publishers to create, manage and distribute augmented reality content. The Plattar platform comprises a template-driven app builder and a content management system for augmented reality experiences, and it can deploy content to any device. Augmented reality is touted as the next major component in consumer engagement, especially in real estate and gaming, and News Corp’s investment appears aimed at tapping the technology’s potential. According to a report by Manatt Digital Media, augmented reality is expected to generate $80 billion in revenue (excluding hardware) by 2020 and become the backbone of immersive journalism, where readers can experience a story and be part of it. We believe News Corp’s digital real estate business can get a boost from augmented reality, and that an immersive consumer experience can drive the company’s revenues in the future.

Digital Real Estate A Key Growth Driver For News Corp

According to our estimates, the Digital Real Estate segment accounts for more than 15% of News Corp’s valuation, and we expect revenues of this division to increase from around $0.92 billion in 2016 to more than $1.31 billion by the end of our forecast period.


This segment drives revenues by selling online advertising services on its residential real estate and commercial property sites. We believe the introduction of augmented reality on such websites can attract more consumers. The trial conducted by the company with Plattar for its digital real estate subsidiary REA Group succeeded in simplifying the property search process by letting users find their ideal property via a visual search. To use the technology, the end user downloads the app and scans the print listing with their smartphone. The app then displays interactive 3D images, a location map and information about the property, improving and simplifying the search for the desired property.
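To make the described workflow concrete, here is a minimal sketch of a scan-to-overlay flow in Python, using entirely hypothetical listing data and a stub marker decoder; it is not Plattar’s or REA Group’s actual implementation.

```python
# Minimal sketch of the scan-to-overlay flow described above, using
# hypothetical listing data and a stub marker decoder.
from dataclasses import dataclass


@dataclass
class Listing:
    listing_id: str
    address: str
    model_url: str  # 3D model of the property (hypothetical asset URL)
    map_url: str    # interactive location map


# Hypothetical catalog keyed by the marker printed on the paper listing.
CATALOG = {
    "marker-1234": Listing("L-1234", "12 Example St",
                           "https://assets.example/l1234.glb",
                           "https://maps.example/l1234"),
}


def decode_marker(scanned_image: bytes) -> str:
    """Stand-in for a real marker/QR decoder run on the phone camera image."""
    return "marker-1234"  # pretend the print listing encodes this ID


def build_overlay(scanned_image: bytes) -> dict:
    """Return the content the AR view would anchor to the scanned listing."""
    listing = CATALOG[decode_marker(scanned_image)]
    return {"model": listing.model_url, "map": listing.map_url,
            "address": listing.address}


print(build_overlay(b"fake-scan"))
```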

Augmented and virtual reality technology is expected to change the real estate market in the future. Recently, the luxury home real estate company Sotheby’s started using virtual reality headsets to showcase high-end homes in the U.S. Other property companies are also launching virtual reality property tours at their offices, as the ability to see a property without actually visiting it is proving extremely useful for investors looking to buy in locations away from their city or country. We believe News Corp’s investment in this area will boost its digital real estate business and give the company a competitive edge in the future.


New Magic Leap AR Demo Shows How You’ll Wake Up in the Future

Joe Carmichael

On Tuesday, augmented reality trailblazer Magic Leap released a new video demo of its very secretive product. If you can bring yourself to believe it — and to believe the fine print — the video is sure to blow your socks off.

The fine print: “Shot directly through Magic Leap technology on April 8, 2016. No special effects or compositing were used in the creation of this video (except for this text).” Assuming, against all reason, that that’s true, you’ll soon be getting accustomed to a new morning routine. Instead of rolling over, groggy, and poking mindlessly at your smartphone’s clunky screen, you’ll don a Magic Leap headset and gesticulate your way through a mixed reality tour of your messages, priorities, internet destinations, and even a jellyfish smack.

Or so Magic Leap believes: the video is optimistically called “A New Morning,” and the description proclaims: “Welcome to a new way to start your day.” If I get to wake up to what’s in this video, I’ll gladly proselytize.

The notable but possibly misleading element of the fine print is the word technology. It is not necessarily the case, but the use of such an ambiguous term could be meant to conceal that the video was not shot through a Magic Leap headset, especially when you consider how difficult it is to film a presentable video through a pair of glasses.

Regardless, we’ll take the AR company at its word: if you buy this product, whenever it’s released, you’ll get what’s advertised.


To build such an AR interface, Magic Leap undoubtedly snagged some powerful minds to work there. Let’s just hope that those powerful minds are also virtuous — Magic Leap’s patent applications give us reason to be wary of its future tech.

And let’s also hope that it can get a product out to the general public before too long: all we’ve gotten are patents and videos — we’ve yet to see much that’s concrete.

What Marketers Need to Understand About Augmented Reality

 


Imagine being able to see how a couch would fit in your living room before actually buying it — or being able to see which sunglasses suit your face or which lipstick looks good on you without physically trying anything on.

Each of these scenarios is already possible. These are real examples from Ikea, Ray-Ban, and Cover Girl of how companies are currently using augmented reality (AR).

AR has been piquing marketers’ interest in recent years, as it has the potential to change a range of consumer experiences, from how people find new products to how they decide which ones to buy. AR technology enhances the physical environment you see by overlaying virtual elements, such as information or images, on top of it, either through displays such as HoloLens and Google Glass or through the camera view on your smartphone.

In order for the potential of AR to be realized, though, companies have to resist the urge to hastily create AR apps (that risk appearing gimmicky), and instead focus on better understanding how consumers will interact with the technology. Based on research I have been conducting on consumer responses to AR over the past four years, I have found that designing and implementing valuable AR apps requires the following: a better idea of how consumers would use such technology; more collaboration among computer scientists, designers, and marketers; and a strategy for integrating the applications into the existing consumer journey.

When I started working on AR as the topic for my PhD, almost no established knowledge about it existed in the marketing field. However, computer science and human-computer interaction research have been tackling AR for years, and borrowing insights from those fields can greatly help marketers understand what this technology will mean in commercial contexts.

Companies first have to understand how AR differs from other digital technologies. While it is similar in some aspects (e.g., applications are frequently used on smartphones, the content is composed of text or images, and the apps are usually highly interactive), there is something inherently different about AR: the ability to overlay virtual content on the physical world and have the two interact in real time.

I conducted a lab experiment with 60 participants to investigate how such augmentation influences consumer responses. The study is forthcoming in the Journal of Marketing Management. Participants had to look for their preferred model of sunglasses or furniture, either using an AR app (Ikea or Ray-Ban) or an app that allowed a similar activity but without AR features. The results consistently showed that when participants perceived an element of the environment to be augmented in real-time (for example, seeing a pair of sunglasses simulated on their face or seeing a virtual chair in an office), that created an immersive experience for them, significantly more so than if the sunglasses were just stuck on their online photo or if they saw furniture in a virtual room.

I also found that the augmented experience resulted in positive attitudes toward the application and willingness to use the app again and talk about it to others. But these effects didn’t seem to extend to the products themselves or the brands, just the technology.

However, another study showed that this might change depending on how the app is integrated into the consumer journey. Working with professor Yvonne Rogers from the UCL Interaction Centre and AR designers Ana Moutinho and Russell Freeman from the AR agency Holition, we conducted one of the first studies of how consumers use AR to “try on” make-up in a store. The app we used allows people to put on virtual lipstick or eye shadow that moves with their faces.

We found that using this AR mirror in the store helped the consumers decide what to buy. The majority of them enjoyed the playful experience that allowed them to experiment with looks that would be much harder with physical testers. More importantly, when the AR app was integrated in a familiar retail setting as a part of the shopping experience, people not only thought highly of the technology, but they also positively related to the products. They were more likely to buy them and view the app as a convenient tool for shopping, not just for playing around.

Another study that we conducted online showed that when participants frequently used a similar AR make-up app on their phones over a five-day period, they also reported positive reactions towards both the technology and the products. They perceived the app to be not only enjoyable but also useful for shopping for make-up, which again translated into their intentions to purchase the tried-on products.

Basically, if the AR experience is just a one-off episode, which was true of the lab study, the augmentation will most likely direct people’s attention towards the technology.  But if it is well integrated in an environment or in a process, it has the capacity to positively impact purchase activities and have a more far-reaching influence.

It is important to note that because shop assistants invited consumers to use the virtual mirror and showed them how to use it, it remains unclear whether customers might have a different experience without any help.

Marketers should remember that AR is not about creating a completely new reality; it’s about enhancing what already exists. When the virtual is well fitted with the physical and interacts with it, that’s when AR magic happens. As opposed to virtual reality, which immerses you in a different world (e.g., Oculus Rift), AR intertwines virtual elements that might be missing in a specific situation within physical reality (the best recent example being HoloLens’ holoportation feature). This is one of the reasons why people like Snapchat’s AR feature, where users can play with different visual effects to transform ordinary videos into shareable stories.

The crucial part of the AR experience is whether the technology adds real value. Simply overlaying something virtual on a phone screen doesn’t always cut it and can appear gimmicky. Having an ad pop up on your smartphone camera view from scanning a brand’s logo might be fun, but people would tire of it pretty quickly. Similarly, an app that overlays information and promotions on your phone screen when you point the camera to different stores on a street or products in a shop sounds useful, but marketers have to ask: Are consumers really going to walk down the street holding their tablets or smartphones in the air? Do they want to shop by scanning every product?

The answer at this point is probably no, even for digital natives. People will only change their behavior if they perceive the value to be worth the effort of adding another information layer into an already saturated digital space. So it’s important to think about the contexts in which they may be willing to do this: for example, exploring a cultural event, an urban environment, or a historical site with an AR app (similar to how people use headsets in art museums), or wanting to learn more about an expensive product or a brand they really care about.

The real mission for commercial AR is integrating the technology so that it enhances the customer experience — makes it easier, more fun, and more convenient. We don’t want to live in a world where tangible, physical elements are replaced with digital replicas. The idea of Google Glass failed because we don’t want to walk around constantly seeing everything augmented. (The way Microsoft has been positioning HoloLens may be a different story, because it is designed for specific occasions, such as meeting rooms or workshops.) So rather than thinking of how to overlay as many places as possible with additional virtual content, the key to understanding AR is defining the specific activities where it can create real value.


Ana Javornik is a research associate at University College London Interaction Centre and Holition. She is also a PhD candidate at Università della Svizzera Italiana.

HBO, Discovery believe holograms are the future


More media companies are betting that virtual and augmented reality technologies aren’t a passing fad, and that these mediums could be the future of entertainment.

On Tuesday, HBO and Discovery Communications announced that they have taken an equity stake in OTOY. The investment is intended to advance OTOY’s holographic or augmented reality technology in hopes that the networks can present the content through their TV and digital channels.

Los Angeles-based OTOY develops immersive entertainment experiences, whether that’s through holograms, virtual reality or other technologies. Financial terms of the deal were not disclosed.

“We are now entering an era where it’s not just cord-cutting and it’s not just video on demand,” OTOY CEO Jules Urbach told CNBC. “The screen itself is going away.”

Though details are scarce, one confirmed beneficiary of the OTOY and HBO deal will be Jon Stewart. In November, Stewart and HBO signed a four-year production deal with OTOY. At the time, OTOY said Stewart would co-develop new technology that would help him rapidly produce short-form content multiple times a day, with additional projects in the works.

Urbach said that HBO and Discovery’s investment goes beyond Stewart’s project. He pointed out that both media and technology companies are looking at immersive digital content as the next evolution of media.

In October 2014, Google invested $542 million in Magic Leap, a digital visual technology company that is working on creating eyeglasses that would allow people to see holographic images in real-world settings. It’s seen as a competitor to Microsoft’s HoloLens, another augmented reality device.

During this year’s Tribeca Film Festival, almost 30 exhibitors presented VR content in a variety of ways that could encourage mainstream adoption of the medium. A report from Greenlight VR and Road to VR estimated that 136 million VR headsets would be sold in the U.S. in 2025.

Urbach said OTOY’s vision is to develop technology that would allow people to see these kinds of experiences without needing additional VR headsets or even a desktop computer. For example, someone may someday be able to pull up a service like HBO Now or a sports game on their mobile phone and project a hologram of the content onto their coffee table, he said.

“It is like ‘Star Trek’ or ‘Star Wars’ depending on if you want Princess Leia in front of you, or if you want to go in the holodeck and the experience be around you. … It’s really not sci-fi anymore,” Urbach said.

OTOY’s roots in the tech and entertainment industries run deep. In addition to HBO and Discovery Communications, it also counts Autodesk and Yuri Milner’s Digital Sky Technologies as major investors. Its board of advisors includes former Google CEO Eric Schmidt, former IBM CEO Samuel Palmisano, Mozilla co-founder and former CEO Brendan Eich and, perhaps most important for making inroads in the entertainment industry, William Morris Endeavor IMG co-founder and co-CEO Ari Emanuel.

The Los Angeles-based company’s camera technology, which can create high-quality computer graphics in almost real time, has been used in films like “The Social Network,” “Spider-Man 3” and “The Curious Case of Benjamin Button,” and won a Scientific and Engineering Academy Award in 2010.

The Economist explains: The difference between virtual and augmented reality

If computing companies have their way, then 2016 will be the year in which virtual reality (VR) and augmented reality (AR), two closely related but very different technologies, become widely popular. Firms such as Facebook, Sony and Microsoft are getting ready to launch a raft of high-tech headsets designed either to layer computerised information on top of the real world, or to replace it entirely with a simulated, computer-generated alternative. What is the difference?

Start with a film analogy. If virtual reality is “The Matrix”, then augmented reality is “The Terminator”. As the name suggests, the point of VR is to persuade users that they have entered an entirely new reality. The headsets, such as Sony’s Morpheus or Facebook’s Oculus Rift, block out the surrounding world and, making use of an old trick called stereoscopy, show slightly different images to each of the user’s eyes. That fools the brain into creating an illusion of depth, transforming the pair of images into a single experience of a fully three-dimensional world. Motion trackers, either mounted on the headset or externally, keep track of the user’s head, updating the view as it moves; optional hand controllers allow the user to interact with virtual objects. The result is a reasonably convincing illusion of being somewhere else entirely. Augmented reality, by contrast, does not dispense with the real world, but uses computers to improve it in various ways. In “The Terminator”, Arnold Schwarzenegger’s killer robot sees a constant stream of useful information laid over his view of the world, a bit like the heads-up displays used by fighter pilots.
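As a rough illustration of the stereoscopy-plus-head-tracking idea described above, the sketch below computes a separate camera position for each eye from a tracked head pose; the interpupillary distance and axis conventions are illustrative assumptions, not any vendor’s implementation.

```python
# Sketch of the stereoscopy trick: render the same scene from two cameras
# offset by roughly the distance between the eyes, updated by the head tracker.
import numpy as np

IPD = 0.064  # interpupillary distance in metres (a typical value, assumed here)


def eye_positions(head_position, head_yaw_rad):
    """Return left/right eye positions for a tracked head position and yaw."""
    # The eyes sit half the IPD to either side of the head along its right axis.
    right_axis = np.array([np.cos(head_yaw_rad), 0.0, -np.sin(head_yaw_rad)])
    half = (IPD / 2.0) * right_axis
    return head_position - half, head_position + half


head = np.array([0.0, 1.7, 0.0])                     # standing height, metres
left, right = eye_positions(head, np.radians(15.0))  # head turned 15 degrees
print(left, right)  # each eye gets its own camera, producing the depth illusion
```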

So AR and VR are close cousins, and rely on similar technology. But the two technologies have one fundamental difference. VR is immersive: the headsets must, by necessity, block out the external world. Putting one on is tricky enough to ensure that glancing away, as one might do when watching television, is not really possible. The first wave of applications, therefore, is in video games and films, where users (the companies hope) will prove willing to lock themselves into their virtual worlds.

AR, by design, maintains its users’ connection with the real world, and that means that a headset is not necessary. Heads-up displays are an early example of AR, but there are others: VeinViewer, for instance, is a medical device that projects images of a patient’s veins onto the skin, to help doctors aim injections. Many existing smartphone apps also make use of AR. Word Lens, for instance, translates between languages by looking at the world through a smartphone camera, recognising text, and then presenting the user with a real-time image in which that text has been replaced by its equivalent in another language. Nonetheless, the biggest AR product to be launched this year is indeed a headset, specifically Microsoft’s HoloLens. It aims to liberate computing from a fixed screen, overlaying its users’ view with useful additions (painting your email across a nearby wall, for instance, or putting weather information on a breakfast table). The firm must hope it does better than another famous AR headset, Google’s Glass, which, after years of development and months of public tinkering, was finally sent back to the drawing board last year.
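The Word Lens description amounts to a three-stage pipeline: detect text in the camera frame, translate it, and draw the translation back over the same spot. The sketch below mimics that shape with stub OCR and translation functions; a real app would substitute actual computer-vision and translation components.

```python
# Sketch of a camera-to-overlay translation pipeline in the spirit of Word Lens.
# The OCR and translation steps are placeholder stubs, not real library calls.
from dataclasses import dataclass


@dataclass
class TextRegion:
    text: str
    box: tuple  # (x, y, width, height) in camera-frame pixel coordinates


def detect_text(frame):
    """Stub OCR step: find text and where it sits in the camera frame."""
    return [TextRegion("SALIDA", (40, 120, 200, 50))]


def translate(text, target="en"):
    """Stub translation step with a tiny hard-coded dictionary."""
    return {"SALIDA": "EXIT"}.get(text, text)


def augment(frame):
    """Replace each detected text region with its translation, at the same spot."""
    return [{"box": r.box, "text": translate(r.text)} for r in detect_text(frame)]


print(augment(frame=None))  # the app would draw these strings over the original boxes
```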

Mark Zuckerberg says augmented reality glasses are ‘what we’re trying to get to’

Mark Zuckerberg is optimistic about the future of virtual and augmented reality. At his Facebook F8 conference keynote, Zuckerberg said that the company was working on “a whole new set of social experiences” across VR platforms, echoing an announcement the company made earlier this year. “Virtual reality has the potential to be the most social platform, because you actually feel like you’re right there with another person,” he said, referencing an Oculus Rift “toybox” demo that lets two people play together in VR. But in the coming decade, Zuckerberg sees a progression that many people have predicted: that virtual reality will merge with augmented reality and become part of everyday life.

Over the next 10 years, the form factor’s just going to keep on getting smaller and smaller, and eventually we’re going to have what looks like normal-looking glasses that can do both virtual and augmented reality. And augmented reality gives you the ability to see the world but also to be able to overlay digital objects on top of that.

So that means that today, if I want to show my friends a photo, I pull out my phone and I have a small version of the photo. In the future, you’ll be able to snap your fingers and pull out a photo and make it as big as you want, and with your AR glasses you’ll be able to show it to people and they’ll be able to see it.

As a matter of fact, when we get to this world, a lot of things that we think about as physical objects today, like a TV for displaying an image, will actually just be $1 apps in an AR app store. So it’s going to take a long time to make this work. But this is the vision, and this is what we’re trying to get to over the next 10 years.

Palmer Luckey, inventor of the Oculus Rift headset that Facebook acquired in 2014, has previously predicted that augmented and virtual reality headsets will merge into a single piece of hardware that people carry around or wear like a pair of glasses. Granted, that’s going to be harder than it might sound. Right now, virtual and augmented reality headsets use fundamentally different visual technology, and it’s difficult for a pair of small glasses to block out the outside world the way a VR headset can.

But the photo sharing technology Zuckerberg’s describing is already plausible on an early augmented reality headset like the Microsoft HoloLens. So is the idea of selling apps to simulate physical objects, although we hope he wasn’t being literal about paying an extra dollar to simply simulate a video screen. Judging by the ambivalent response to early augmented reality headset Google Glass, it may actually be tougher to sort out the social norms than the pure technology — an issue Facebook should be more than a little familiar with.

Report: Google’s main focus in the long run is augmented reality, not VR

Chance Miller

According to a new report from The Information, behind closed doors Google is much more interested in augmented reality than it is in virtual reality. Google of course has publicly marketed its Cardboard VR product heavily, but that’s apparently not the end-goal for the company — augmented reality is.

Google recently formed a new virtual reality unit within the company, headed by Clay Bavor. Within this group, however, it’s reportedly common knowledge that augmented, or “mixed,” reality has a much bigger market in the long run. This market would center on digital information and images being overlaid on a real-world view. Think of products like Glass or HoloLens.

Google reportedly doesn’t believe that the general public will be willing to invest in virtual reality solutions that force them to wear massive headsets, such as the Oculus Rift or Gear VR. The company thinks there’s a much bigger long-run profit opportunity in augmented reality than in virtual reality.

Google, of course, has its augmented reality Project Tango initiative and earlier this year announced the first consumer-ready device in partnership with Lenovo. While unveiling the device, Lenovo and Google showed off how they could instantly map out the stage and reconfigure it with furniture such as a couch and a refrigerator. Project Tango pairs a complex setup of cameras and sensors with real-time mapping technology to model the 3D space around users.
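A rough sketch of that mapping idea: each depth frame yields 3D points in the device’s camera frame, and the tracked pose transforms them into a shared world map. The numbers and pose below are synthetic placeholders, not Tango’s actual pipeline.

```python
# Sketch of real-time mapping: depth frames (3D points in the camera frame) are
# transformed by the tracked device pose and accumulated into one world map.
import numpy as np


def to_world(points_cam, rotation, translation):
    """Transform an Nx3 array of camera-frame points into world coordinates."""
    return points_cam @ rotation.T + translation


world_map = []                                # accumulated point cloud of the room
pose_rotation = np.eye(3)                     # device orientation from motion tracking
pose_translation = np.array([0.0, 1.4, 0.0])  # device position in metres

depth_frame = np.random.rand(5, 3)            # stand-in for one depth-sensor frame
world_map.append(to_world(depth_frame, pose_rotation, pose_translation))

print(np.vstack(world_map).shape)             # grows frame by frame into a 3D model
```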

Personally, I happen to agree with Google on this one. I’m not a huge fan of virtual reality products, but something like Project Tango offers incredibly useful real-world applications that could push it towards mainstream popularity. I wrote on 9to5Mac earlier this year that Apple should focus primarily on augmented reality as opposed to virtual reality for this very same reason.

What do you think? Should Google focus more on augmented reality or Oculus-like virtual reality products?

Augmented Reality, Not VR, Will Be the Big Winner for Business

Sometimes exponential technologies hide in plain sight. Their innovation speed, market size, and number of practical uses seem to be progressing linearly, until suddenly they tilt upwards and turn on the boosters.

A case can be made that augmented reality (AR) in enterprises is just such an exponential technology. A growing number of companies are busy developing and releasing AR systems for enterprise settings.

Augmented and virtual reality analyst Digi-Capital’s numbers give a good indication of just how big AR is expected to be in a few short years. According to Digi-Capital, AR companies will generate $120 billion in revenue by 2020, compared with the $30 billion expected for their ‘cousin’ companies in virtual reality.

Part of AR’s explosive growth can be traced to a wide array of uses in business settings. The tech is a fundamental component in the hardware and software revolution known as Factory 4.0.

First Systems Are Go

While virtual reality is about creating closed, fully immersive digital experiences, augmented reality systems overlay sensory information, such as visuals and sounds, on the real world around you.

The best-known example is Google Glass—a kind of partial AR experience where a square display appears in a user’s field of view. The device never became the success with end-users that Google was hoping for.

However, with 20-20 hindsight (if you’ll pardon the terrible pun) Google Glass was partly responsible for kickstarting a wave of innovative new AR startups. Unlike Google, these companies focused solely on AR’s potential for enterprises.

One example is the Canadian company NGrain, whose solutions have been implemented in several major companies, including Lockheed Martin and Boeing.

Lockheed has used AR systems in relation to its F-35 and F-22 aircraft.

Using smart glasses or tablets, engineers and service personnel can overlay graphics that show data like maintenance records or assembly instructions on top of a piece of real-world equipment. The system can also compare a digital 3D model of an aircraft with an actual aircraft to identify any potential damage.
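The model-versus-reality comparison can be pictured as a nearest-neighbor check between measured surface points and a reference 3D model, flagging points that deviate beyond a tolerance. The sketch below uses synthetic data and illustrates only the general idea, not NGrain’s software.

```python
# Sketch of comparing measured surface points against a reference 3D model and
# flagging points that deviate beyond a tolerance. Data here is synthetic.
import numpy as np


def flag_deviations(measured, reference, tol=0.005):
    """Return measured points whose nearest reference point is more than `tol` metres away."""
    # Distance from each measured point to its nearest reference point.
    dists = np.linalg.norm(measured[:, None, :] - reference[None, :, :], axis=2).min(axis=1)
    return measured[dists > tol]


reference = np.random.rand(200, 3)                # points sampled from the CAD model
measured = np.vstack([reference[:100],            # mostly as designed...
                      reference[:5] + 0.02])      # ...plus a few displaced spots
print(flag_deviations(measured, reference))       # candidate damage locations
```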

The introduction of AR let Lockheed Martin engineers work up to 30 percent faster.

Meanwhile, at Boeing, several teams are looking at using AR systems to perform running quality control of parts for several aircraft, including the 787 Dreamliner. AR systems also allow maintenance crews to carry out full airplane checks much quicker than was previously possible.

“Traditionally, these tasks are carried out consulting manuals and using paper and pen,” says Barry Po, director of product management at NGrain. “Using AR-systems lets you overlay any needed information, while you have both hands free, and our visual inspection and damage assessment software can make it much quicker to identify potential issues. The result is that the time it takes to do a full plane check can go from several days to a matter of hours.”

Other AR systems have been used to deliver on-the-job training.

Using images, instructional illustrations, and video, augmented reality units can show a new employee how to complete a job task without needing an introductory course.

Further, data has shown workers using AR technology learn up to 95% quicker and tend to make fewer mistakes than workers trained using traditional methods.

Pipes Are Being Laid to Drive Broader Adoption

While AR in enterprises has shown impressive results, most of these come from initial pilot projects, using a limited number of devices.

AR is also facing a number of challenges, including a lack of industry standards, which can make integrating AR units and software into current enterprise IT ecosystems difficult.

“Traditional software systems like ERP or WMS are not necessarily ready to integrate fully with the new technologies, like AR, that make up Factory 4.0,” Pete Wassel, CEO of Augmate, says.

AR companies have often run successful trials, instigated by a company CTO, but then hit a wall when attempting a full rollout.

Enterprise IT departments have often — and often understandably so — balked at the idea of introducing camera-equipped AR units that come with a host of potential security risks and integration headaches.

It is a situation that Augmate, along with other companies, has been working to solve.

Augmate is creating the backbone, or pipe systems, that make integration of AR into existing IT ecosystems smooth and safe. Its software systems have generated a lot of interest, not only within the AR industry, but also from individual enterprises and companies within the Internet of Things space.

AR’s Stealth Mode About to End

Enterprises are quickly becoming aware of the potential of AR, with two-thirds of companies recently interviewed by Tech Pro Research saying they were considering integrating AR solutions.

At the same time, the number of use case scenarios for AR is growing rapidly.

Training, maintenance, warehouse management, emergency response at breakdowns, co-worker location, damage assessment, work order creation, assembly, product design, and marketing and sales are all being augmented.

The same goes for industry-specific tasks in a number of fields.

For example, in health care AR can assist with information during surgery, with medical inspections and specific medical procedures, or simply by calling up and immediately displaying a patient’s relevant medical history hands-free on a pair of smart glasses.

One of the biggest use cases across industries is remote maintenance and inspection. Using AR systems, experts will be able to give advice to on-site personnel in any number of situations. This would not only eliminate having to fly key personnel around the world but also dramatically improve response times.

“It makes it possible to create what I call ‘John Madden’ guides, where experts are able to draw instructions and point things out in real time,” Pete Wassel says.

Companies and startups have been working on AR solutions for many of these specific tasks, and many are nearing full release, after spending time in either beta or stealth mode.

At the same time, the hardware capabilities of AR devices (field of vision, battery life, sturdiness, and ease of use) are improving rapidly. Motion-sensor and eye-tracking technology are also improving, allowing for more hands-free use.

In short, it is a perfect scenario for rapid growth in enterprise AR.

A Future Beyond the Factory

While the coming years are likely to see the use of AR technology in enterprises explode, its enterprise heyday will likely end when it is supplanted by another exponential technology.

“Technology moves in cycles. I would think that AR in enterprises will have a good run of maybe 15 years,” Pete Wassel says. “After that, robots and AI will start to outcompete human workers and become the new dominant exponential technologies in enterprises.”

But by then, AR will likely have diffused beyond enterprises and become part of our daily lives.

As a species, we build knowledge on what was discovered by previous generations. We quickly realized it was impractical to rely on memory alone to do this, so we invented the printed word.

Our accumulated knowledge grew to lexical levels and then to whole libraries. Computers and the Internet are, of course, powerful new methods of storing and recalling information.

Each iteration increases the amount of information stored and makes it more readily accessible.

Augmented reality looks like another step, seamlessly integrating the physical world with our stores of information. Imagine having the ability to call up information about or perform a range of other actions on every object around you through a layer of AR.

This is the true promise of AR beyond its near-term enterprise sweet spot.

The ability to right-click on reality.

How virtual, augmented reality helps NASA explore space

By Jason Henry, Los Angeles News Group

Before astronaut Scott Kelly ended his year in space, he accomplished an unprecedented technological feat. He called mission control using Skype and streamed his first-person perspective through an augmented reality headset that NASA sent to the International Space Station in December.

“We messed around with it for like two hours and immediately I sensed this is a capability we could use right now,” Kelly said during a news conference in March.

The Microsoft HoloLens that Kelly used is just one tool being tinkered with at the Jet Propulsion Laboratory in La Cañada Flintridge that might change space exploration. JPL’s Ops Lab, a team of roughly a dozen, is experimenting with virtual and augmented reality technologies to allow NASA to take direct control of its robotic explorers, to enable humans to see distant worlds with their own eyes and to teach astronauts how to perform complex tasks on the fly.


A screen view from OnSight, a software tool developed by NASA’s Jet Propulsion Laboratory in collaboration with Microsoft. OnSight uses real rover data to create a 3-D simulation of the Martian environment where mission scientists can meet to discuss rover operations. (Image courtesy NASA/JPL-Caltech)

This isn’t a technology relegated to the distant future; these innovations are happening today, said Jeff Norris, the founder and leader of the Ops Lab, during a presentation at the Vision Summit in Hollywood last month.

“Imagine a spacecraft designer studying full-scale holograms of a spacecraft years before a piece of metal is cut,” he said. “They could discover and correct problems before they could endanger a launch or a mission.”

Norris said he sees a future where augmented displays are integrated into spacesuit helmets. The technology will play an important role in NASA’s mission to send humans to Mars by the 2030s, he said.

“We think we’re going to use it to design the spacecraft that takes the astronauts to Mars; we think we’re going to use these technologies to assist astronauts on board the spacecraft on the way and when they arrive, to increase their autonomy so they can perform tasks that they need to on Mars without having to be helped as much on the ground,” Norris said. “That’s how we think all these things are coming together to enable the next chapter in humanity’s exploration of space.”

In Kelly’s case, his Skype test showed that scientists on Earth and the astronauts at the space station could connect live using the HoloLens, despite the station’s limited connection to the Internet.

The HoloLens, Microsoft’s foray into augmented reality, is the backbone of two projects at JPL called Sidekick and OnSight. Other experiments use Microsoft’s Kinect and the Oculus Rift, according to NASA officials.

‘SIDEKICK’ MAKES SPACE TASKS EASIER

The HoloLens headset overlays projections on top of users’ surroundings to “augment” their vision. NASA hopes to use the consumer technology through Project Sidekick to create interactive guides for astronauts. The headset, through Skype, even lets experts on Earth walk astronauts through unfamiliar tasks. They can draw and share objects in the astronaut’s field of vision, such as an arrow to direct the headset wearer’s eyes or numbers showing the order in which to perform each step.
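One way to picture what gets shared is a small annotation message anchored in the wearer’s room coordinates; the format below is a hypothetical illustration, not NASA’s or Microsoft’s actual protocol.

```python
# Hypothetical annotation message a remote expert might share into the wearer's
# view: a shape type, a 3D anchor in room coordinates, and an optional label.
import json
from dataclasses import dataclass, asdict


@dataclass
class Annotation:
    kind: str      # "arrow", "circle", "step_number", ...
    anchor: tuple  # (x, y, z) position in the wearer's room, in metres
    label: str = ""


def share(annotations):
    """Serialize annotations for sending over the call to the headset."""
    return json.dumps([asdict(a) for a in annotations])


print(share([Annotation("arrow", (0.4, 1.2, -0.8), "1"),
             Annotation("circle", (0.1, 1.0, -0.6), "2")]))
```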

NASA also wants to build standalone interactive manuals for the headset that use holographic animations, according to Victor Luo, a senior technical lead and human interfaces engineer at JPL. These could potentially replace the bulky paper manuals used on the station today.

“As we walk through the procedures, the application is hand-holding us, showing us animations and diagrams, everything we need to know as we’re doing it,” Luo said at the Vision Summit.

Luo tested the display in an underwater space station analog in the Atlantic Ocean. Tasks expected to last an entire afternoon were completed in less than an hour, he said. The team certified the HoloLens for space in NASA’s Weightless Wonder, a jet that climbs and dives to reduce gravity.

Initially they expected Sidekick to reach the space station this summer, but a rocket explosion delayed its delivery until December. Still, Kelly was able to perform some initial tests before his return to Earth on March 1.

It’s not just astronauts who benefit from the technology, however.

TAKING MARS INTO THE OFFICE

A steady stream of images flows in from Mars every day. They come from rovers and orbiters mapping the planet from the ground and from above.

OnSight combines the data from NASA’s robotic explorers to create a virtual map of Mars. Scientists can use OnSight to virtually meet up on the Red Planet and even plot out movements for NASA’s rovers.

“We want to bring the surface of Mars into their offices,” Norris said. “Let them explore the Red Planet as geologists have explored Earth.”

Studying images on a computer screen lacks the depth of seeing a site with your own eyes, Norris said. The Ops Lab tested early versions of OnSight by giving a headset to the science teams from the Curiosity and Opportunity rover missions, then asking them to complete a task using data sent back by Curiosity. They compared the results against a control group that used the standard operational tools for the rover.

“What we found was there is a dramatic, measurable and statistically significant effect on their understanding of the vicinity of the data acquired by the Mars Rover when they were wearing a head mounted display,” Norris said.

Even with zero experience with the headsets, scientists performed as well as, or better than, their counterparts using the existing tools, according to a graph Norris showed during his presentation.

A pilot group is now using the headgear to operate Curiosity, according to JPL. OnSight recently helped Abigail Fraeman, a Curiosity science team member at JPL, and her team identify a point they would like to study between two Martian rock formations.

“OnSight makes the whole process of analyzing data feel a lot more natural to me,” Fraeman said in a statement. “It really gives me the sense that I’m in the field when I put it on. Thinking about Martian geology is a lot more intuitive when I can stand in the scene and walk around the way I would if I were in the field.”

SEE MARS FOR YOURSELF

NASA and JPL have unveiled “Destination: Mars,” an exhibit opening at the Kennedy Space Center in Florida this summer that will let guests visit Mars using an adaptation of OnSight.

The tour across several sites on Mars, reconstructed using real imagery from Curiosity, is guided by holographic versions of astronaut Buzz Aldrin and Curiosity Rover driver Erisa Hines.

“This experience lets the public explore Mars in an entirely new way. To walk through the exact landscape that Curiosity is roving across puts its achievements and discoveries into beautiful context,” said Doug Ellison, visualization producer at JPL.

Separately, NASA is working with developers to create a free video game, “The Mars 2030 Experience,” using the Unreal Engine 4 for consumer virtual reality headsets, according to an announcement on Epic Games’ website.

Augmented reality mapping out tech’s next mind-bending trip