IS APPLE GOING TO CHALLENGE MICROSOFT HOLOLENS IN THE AUGMENTED REALITY MARKET?

Daryl Deino

Microsoft’s HoloLens, one of the first self-contained augmented reality headsets, started shipping to developers on March 30. While most people who have used the device believe it could represent the future of technology, they also feel that the current beta version is somewhat limited.

Mashable likes the build and design of the HoloLens.

“I noticed immediately that the final 1.3-pound (579 grams) well-balanced HoloLens Development Edition was the most comfortable HoloLens headset yet and, not for nothing, with its smooth matte-black finish, curves and sunglass-like visor, it’s a pretty cool-looking device. Microsoft told me that it’s made dozens of adjustments since the last time I tried it. It showed.”

The limited field of view on the HoloLens has frustrated many users. [Photo by Ted S. Warren-Pool/Getty Images]

The author, Lance Ulanoff, isn’t as bothered as others by the HoloLens’ limited field of view. However, WinBeta explains why it could be a major issue.

“Whatever the reasons, the field of view is narrower than you’ll probably want. The thing that struck me most about the field of view problems was not that the screen was too small, but what happens when you want to change the field of view,” says author Kip Kniskern.

Kniskern and others say that to understand the limited field of view, you should imagine holding a smartphone about eight inches from your face. When a holographic image is too large for that space, it gets clipped at the edges, which sours the immersive experience.

Reddit poster Theadmira1 received his HoloLens and says he loves it so far.

“Finally got our hands on the HoloLens and wanted to share some footage directly from the lens. I’m abso-f*****g-lutely in love with this thing. This is the PC to VR’s Atari and I’ve been waiting since the first time I put on the DK1. The FOV sucks, but I feel the tracking, wireless, and OS issues would be harder to overcome, and [they] already have those areas dialed, so it’s really only a matter of time.”

If for some reason Microsoft fails to bring augmented reality to the mainstream, perhaps Apple can deliver a more complete augmented reality experience. According to MacRumors, Apple is likely working on both virtual and augmented reality projects.

“Apple is investigating multiple ways virtual and augmented reality could be implemented into future iOS devices or new hardware products. It isn’t yet known when a VR or AR product will launch, but Apple’s focus on the technology has ramped up over the past several months.”

The article adds that Apple has hired hundreds of employees who specialize in augmented and virtual reality, including computer science professor Doug Bowman, who once led Virginia Tech’s Center for Human-Computer Interaction. He not only specializes in three-dimensional user interface design but has also written a book on the subject.

Judging by the initial response to the HoloLens, it will take at least three or four more years before a complete consumer-oriented augmented reality device hits shelves. Are you excited about augmented reality? Do you think Microsoft’s HoloLens has more promise than virtual reality devices such as the Oculus Rift and the HTC Vive? Let us know in the comments section.

[Photo by Justin Sullivan/Getty Images]

 


How virtual, augmented reality helps NASA explore space

By Jason Henry, Los Angeles News Group

Before astronaut Scott Kelly ended his year in space, he accomplished an unprecedented technological feat. He called mission control using Skype and streamed his first-person perspective through an augmented reality headset that NASA sent to the International Space Station in December.

“We messed around with it for like two hours and immediately I sensed this is a capability we could use right now,” Kelly said during a news conference in March.

The Microsoft HoloLens that Kelly used is just one tool being tinkered with at the Jet Propulsion Laboratory in La Cañada Flintridge that might change space exploration. JPL’s Ops Lab, a team of roughly a dozen, is experimenting with virtual and augmented reality technologies to allow NASA to take direct control of its robotic explorers, to enable humans to see distant worlds with their own eyes and to teach astronauts how to perform complex tasks on the fly.

A screen view from OnSight, a software tool developed by NASA’s Jet Propulsion Laboratory in collaboration with Microsoft. OnSight uses real rover data to create a 3-D simulation of the Martian environment where mission scientists can meet to discuss rover operations. (Image courtesy NASA/JPL-Caltech)

This isn’t a technology relegated to the distant future; these innovations are happening today, said Jeff Norris, the founder and leader of the Ops Lab, during a presentation at the Vision Summit in Hollywood last month.

“Imagine a spacecraft designer studying full-scale holograms of a spacecraft years before a piece of metal is cut,” he said. “They could discover and correct problems before they could endanger a launch or a mission.”

Norris said he sees a future where augmented displays are integrated into spacesuit helmets. The technology will play an important role in NASA’s mission to send humans to Mars by the 2030s, he said.

“We think we’re going to use it to design the spacecraft that takes the astronauts to Mars; we think we’re going to use these technologies to assist astronauts on board the spacecraft on the way and when they arrive, to increase their autonomy so they can perform tasks that they need to on Mars without having to be helped as much on the ground,” Norris said. “That’s how we think all these things are coming together to enable the next chapter in humanity’s exploration of space.”

In Kelly’s case, his Skype test showed that scientists on Earth and the astronauts at the space station could connect live using the HoloLens, despite the station’s limited connection to the Internet.

The HoloLens, Microsoft’s foray into augmented reality, is the backbone of two projects at JPL called Sidekick and OnSight. Other experiments use Microsoft’s Kinect and the Oculus Rift, according to NASA officials.

‘SIDEKICK’ MAKES SPACE TASKS EASIER

The HoloLens headset overlays projections on top of the user’s surroundings to “augment” their vision. NASA hopes to use the consumer technology through Project Sidekick to create interactive guides for astronauts. The headset, through Skype, even lets experts on Earth walk astronauts through unfamiliar tasks. They can draw and share objects in the astronaut’s field of vision, such as an arrow to direct the headset wearer’s eyes or numbers to show the order in which to perform each step.

NASA also wants to give the headset standalone interactive manuals that use holographic animations, according to Victor Luo, a senior technical lead and human interfaces engineer at JPL. These would potentially replace the bulky paper manuals used on the station today.

“As we walk through the procedures, the application is hand-holding us, showing us animations and diagrams, everything we need to know as we’re doing it,” Luo said at the Vision Summit.

Luo tested the display in an underwater space station analog in the Atlantic Ocean. Tasks expected to last an entire afternoon were completed in less than an hour, he said. The team certified the HoloLens for space aboard NASA’s Weightless Wonder, a jet that climbs and dives to produce brief periods of reduced gravity.

Sidekick was initially expected to reach the space station in the summer of 2015, but a rocket explosion delayed its delivery until December. Still, Kelly was able to perform some initial tests before his return to Earth on March 1.

It’s not just astronauts who benefit from the technology, however.

TAKING MARS INTO THE OFFICE

A steady stream of images flows in every day from Mars. They come from rovers and orbiters mapping out the planet from the ground and from above.

OnSight combines the data from NASA’s robotic explorers to create a virtual map of Mars. Scientists can use OnSight to virtually meet up on the Red Planet and even plot out movements for NASA’s rovers.

“We want to bring the surface of Mars into their offices,” Norris said. “Let them explore the Red Planet as geologists have explored Earth.”

Studying images on a computer screen lacks the depth of seeing the terrain with your own eyes, Norris said. The Ops Lab tested early versions of OnSight by giving a headset to the science teams from the Curiosity and Opportunity rover missions, then asking them to complete a task using data sent back by Curiosity. They compared the results against a control group that used the standard operational tools for the rover.

“What we found was there is a dramatic, measurable and statistically significant effect on their understanding of the vicinity of the data acquired by the Mars Rover when they were wearing a head mounted display,” Norris said.

Even with zero experience with the headsets, scientists performed as well as, or better than, their counterparts using the existing tools, according to a graph Norris showed during his presentation.

A pilot group is now using the headgear to operate Curiosity, according to JPL. OnSight recently helped Abigail Fraeman, a Curiosity science team member at JPL, and her team identify a point they would like to study between two Martian rock formations.

“OnSight makes the whole process of analyzing data feel a lot more natural to me,” Fraeman said in a statement. “It really gives me the sense that I’m in the field when I put it on. Thinking about Martian geology is a lot more intuitive when I can stand in the scene and walk around the way I would if I were in the field.”

SEE MARS FOR YOURSELF

NASA and JPL have unveiled “Destination: Mars,” an exhibit opening at the Kennedy Space Center in Florida this summer that will let guests visit Mars using an adaptation of OnSight.

The tour across several sites on Mars, reconstructed using real imagery from Curiosity, is guided by holographic versions of astronaut Buzz Aldrin and Curiosity Rover driver Erisa Hines.

“This experience lets the public explore Mars in an entirely new way. To walk through the exact landscape that Curiosity is roving across puts its achievements and discoveries into beautiful context,” said Doug Ellison, visualization producer at JPL.

Separately, NASA is working with developers to create a free video game, “The Mars 2030 Experience,” using the Unreal Engine 4 for consumer virtual reality headsets, according to an announcement on Epic Games’ website.

HoloLens Round Two: Augmented Reality At Build 2016

by Brett Howse on March 31, 2016 11:50 AM EST

Last year at Build I got my first chance to try HoloLens. That experience was very interesting, not only because of the potential of augmented reality, but because of the entire circus surrounding the device. The HoloLens sessions were at a different location, and the groups brought over had to lock up everything electronic. We could only take photos of a unit in a display case. Naturally, with Microsoft announcing yesterday that HoloLens has started shipping to developers, this year’s experience could never be so secret.

So when we got to the demo location and were given keys for a locker, I was a bit taken aback. But it wasn’t anything as sinister this time, only a way to make sure there were no backpacks on the floor as tripping hazards, because this year’s untethered experience was really untethered.

That comes a bit later though. This year’s demo involved building and deploying a 3D app using Unity and Visual Studio, and each person doing the demo also got a coach to help solve any issues along the way. The HoloLens unit was slightly different this year, but looking at it, it was remarkably similar to last year’s demo version. The one big change this year was very welcome: instead of having a person physically measure your inter-pupillary distance (the distance between your pupils), the calibration is now handled through software when you first put the headset on. There is a quick calibration routine that sets your eye position based on some air tap gestures. It was very quick and easy, and the headset walks you through everything required with voice and visual cues.

Then we sat down to build our apps. Since this was a demo for press, all of the coding was done ahead of time and we just had to walk through adding scripts in Unity to set up the demo. We then built the apps and deployed them to the HoloLens, which Visual Studio treats as a remote machine reached by its IP address.
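
For readers curious what those Unity scripts look like, here is a minimal sketch of a HoloLens air-tap handler using the UnityEngine.VR.WSA.Input API that shipped in the Unity builds of this era. It is an illustration, not code from Microsoft’s demo; the class name and the two-meter placement distance are my own.

```csharp
using UnityEngine;
using UnityEngine.VR.WSA.Input; // Unity 5.x HoloLens input namespace

// Hypothetical example: moves this GameObject to wherever the
// user is gazing when they perform the air-tap gesture.
public class TapToPlace : MonoBehaviour
{
    private GestureRecognizer recognizer;

    void Start()
    {
        recognizer = new GestureRecognizer();
        recognizer.SetRecognizableGestures(GestureSettings.Tap);
        recognizer.TappedEvent += OnTapped;
        recognizer.StartCapturingGestures();
    }

    // Unity 5.x delegate signature: the head ray tells us where
    // the user was looking when the tap occurred.
    private void OnTapped(InteractionSourceKind source, int tapCount, Ray headRay)
    {
        // Place the hologram two meters along the gaze direction
        // (distance chosen arbitrarily for this sketch).
        transform.position = headRay.origin + headRay.direction * 2.0f;
    }

    void OnDestroy()
    {
        recognizer.TappedEvent -= OnTapped;
        recognizer.Dispose();
    }
}
```

Attaching a script like this to a scene object is essentially the “adding scripts in Unity” step described above.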

The demo app featured an energy ball which, when locked to a location in space, would open up and show some nifty effects. The experience was very basic compared to what I would expect of retail apps, but this was a simple demo and it worked well.
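
Locking a hologram to a spot in the room like that is done with a world anchor, which hands the object’s pose over to the headset’s tracking system. A minimal sketch, again assuming the Unity 5.x UnityEngine.VR.WSA API; the class itself is hypothetical:

```csharp
using UnityEngine;
using UnityEngine.VR.WSA; // WorldAnchor lives here in Unity 5.x

// Hypothetical example: pin or un-pin a hologram in the room.
public class Lockable : MonoBehaviour
{
    public void LockInPlace()
    {
        // Adding a WorldAnchor hands this object's pose to the
        // HoloLens tracking system, which keeps it fixed in space
        // even as the device's map of the room improves.
        if (GetComponent<WorldAnchor>() == null)
            gameObject.AddComponent<WorldAnchor>();
    }

    public void Unlock()
    {
        // The anchor must be destroyed before the object can move again.
        var anchor = GetComponent<WorldAnchor>();
        if (anchor != null)
            DestroyImmediate(anchor);
    }
}
```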

The really interesting bit came later, when we linked our HoloLens units with those of the other people in our pods of six. That way all six people could interact with a single energy ball. Everyone also got to choose an avatar, which would float over their heads.

That experience was pretty amazing. With very little setup, the holograms were truly linked to a single point that all six people could see. As part of this demo, my coach suggested I walk around the (very large) room and then look back. This was probably the most amazing part of the demo. After walking a hundred feet or more away, and around some tables and pillars, I looked back and the hologram was still floating exactly where I left it. The ability to lock things to a location is the one part that needs to be perfect for this experience to work, and Microsoft nailed it. In addition, my pod mates were all around the room with avatars floating over their heads.
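
Sharing a hologram among six headsets like this comes down to serializing a world anchor on one device and importing it on the others, which resolve it against their own scans of the same room. The sketch below shows the export/import flow with Unity’s WorldAnchorTransferBatch; the anchor id is illustrative, and the transport that moves the bytes between devices (left as a stub here) is up to the app:

```csharp
using UnityEngine;
using UnityEngine.VR.WSA;         // WorldAnchor
using UnityEngine.VR.WSA.Sharing; // WorldAnchorTransferBatch

// Hypothetical example: one device exports its anchor as bytes;
// the others import those bytes and lock the same object to the
// matching point in the shared physical room.
public class SharedAnchor : MonoBehaviour
{
    private const string AnchorId = "energy-ball"; // illustrative id

    public void ExportAnchor()
    {
        // Assumes this object has already been anchored in place.
        var batch = new WorldAnchorTransferBatch();
        batch.AddWorldAnchor(AnchorId, GetComponent<WorldAnchor>());
        WorldAnchorTransferBatch.ExportAsync(batch, OnExportData, OnExportDone);
    }

    private void OnExportData(byte[] data)
    {
        // Transport is up to the app: sockets, a sharing server, etc.
    }

    private void OnExportDone(SerializationCompletionReason reason) { }

    public void ImportAnchor(byte[] data)
    {
        WorldAnchorTransferBatch.ImportAsync(data, (reason, batch) =>
        {
            if (reason == SerializationCompletionReason.Succeeded)
                batch.LockObject(AnchorId, gameObject); // pin to the same spot
        });
    }
}
```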

So with a night to think about it, here are my thoughts after using the HoloLens a year later. The field of view is still very small, and clearly not something Microsoft was able to address before shipping to developers. I would describe it as something like a mid-sized television, in the 27-inch range, sitting a few feet away from you. My experience was better this time because there were fewer software issues, but the small field of view can certainly take some getting used to.

The hardware itself was very easy to put on and adjust, and it was fairly well balanced; I never felt like the unit was heavier at the front where the lenses are. The adjustment is done with a wheel on the back, much like a welding helmet if you’ve ever seen one of those. The right side has buttons for volume, and the left side has buttons for brightness. I had to crank up the audio quite a bit because of the loud room we were in, and although the audio was spatial, it was hard to get a sense of that with the commotion going on during the demos. I don’t wear glasses, but it looked like there would be no issues wearing them with the device, and several other attendees who do wear glasses seemed to put the headset on and use it without trouble.

The experience of AR is much different from VR. Because you are interacting with things in real space, you can easily move around without fear of tripping or walking into a wall. VR is able to offer much more powerful graphics and immersion right now, but you are largely bound to a single location. The use cases for AR seem, to me, to be not necessarily the same as those for VR, and the two should easily be able to coexist.

While doing my demo, I asked my coach how to close the app we were running, and he showed me the “bloom” gesture, which closes it. Once I did that, I was in another mode where I could see how the HoloLens mapped out the physical world with polygons by tapping my finger in a direction. This was amazing, and the HoloLens did a great job of picking up everything in my area, including the people, with no issues.
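
That room-scanning mode is a view of the spatial mapping system, which apps can tap into through Unity’s SurfaceObserver to receive the same polygon meshes as surfaces are discovered. A rough sketch follows; the scan volume, polling interval, and mesh density are arbitrary choices for illustration:

```csharp
using System;
using UnityEngine;
using UnityEngine.VR.WSA; // SurfaceObserver, SurfaceData, WorldAnchor

// Hypothetical example: build renderable meshes of the room as the
// HoloLens spatial mapping system discovers surfaces.
public class RoomMapper : MonoBehaviour
{
    public Material wireframe; // assigned in the Unity editor
    private SurfaceObserver observer;

    void Start()
    {
        observer = new SurfaceObserver();
        // Watch a 10 m cube around the startup position for changes
        // (volume and polling interval chosen arbitrarily).
        observer.SetVolumeAsAxisAlignedBox(Vector3.zero, Vector3.one * 10f);
        InvokeRepeating("Scan", 0f, 3f);
    }

    void Scan()
    {
        // Asks the system to report surface changes since the last call.
        observer.Update(OnSurfaceChanged);
    }

    void OnSurfaceChanged(SurfaceId id, SurfaceChange change, Bounds bounds, DateTime updateTime)
    {
        if (change != SurfaceChange.Added) return; // updates/removals omitted in this sketch

        var go = new GameObject("Surface-" + id.handle);
        go.AddComponent<MeshRenderer>().material = wireframe;
        var data = new SurfaceData(
            id,
            go.AddComponent<MeshFilter>(),
            go.AddComponent<WorldAnchor>(),
            go.AddComponent<MeshCollider>(),
            1000f, // triangles per cubic meter
            true); // bake a collider so physics can hit the room
        observer.RequestMeshAsync(data, OnMeshReady);
    }

    void OnMeshReady(SurfaceData data, bool success, float elapsedSeconds) { }
}
```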

I then did another bloom and was back at the start screen. On the demo units this was pretty sparse, but I was able to go into settings and play around. I didn’t see anything special in there, but the process of interacting with the menus was very simple and very easy to get used to. From a UI perspective, the HoloLens did very well.

At the end of our demo we shot some orbs, which opened up a hole in the floor. Peering down into it, it really felt like something you didn’t want to step into. The holograms tend to be a bit translucent, but this one in particular was much more solid. There’s a lot of untapped potential here, and I hope to get a chance to do some of the other demos to get a better feel for it. The headset did seem to be near the edge of its processing power on the final demo, though, which had a lot of fairly simple polygons moving around and six people interacting. There were a lot of things to keep track of, as well as quite a few holograms flying around.

Microsoft then said that all of the code we used in the demo, along with all of the code used in last year’s demos, is available on GitHub to give devs quicker access to it.

I think the Hololens is still one of the most interesting pieces of tech I’ve used in a long time. There is a lot of potential here for education, training, and even tasks like painting your house and trying different color samples. There are quite a few applications where this would work very well.

The hardware, though, still needs a bit of work. It is a bit bulky, and the lenses would not stay anchored to the spot in front of my eyes where I set them, so I had to readjust. The field of view is also not very large, which could be because the onboard processing power can’t match that of tethered VR experiences.

I look forward to seeing where this goes in the future. A lot of the pieces are already well done and on the software side, the experience is very good. With a bit better hardware, which will almost certainly come with time, this is going to be a very powerful tool from Microsoft.

MICROSOFT SHOWS HOLOLENS’ AUGMENTED REALITY IS NO GIMMICK
