Report: Google’s main focus in the long run is augmented reality, not VR

Chance Miller

According to a new report from The Information, behind closed doors Google is much more interested in augmented reality than it is in virtual reality. Google of course has publicly marketed its Cardboard VR product heavily, but that’s apparently not the end-goal for the company — augmented reality is.

Google has recently formed a new virtual reality unit within the company, headed by Clay Bavor. Within this group, however, it’s reportedly common knowledge that augmented, or “mixed,” reality has a much bigger market in the long run. This market would center around digital information and images being overlaid on a real-world view. Think products like Glass or HoloLens.

Google reportedly doesn’t believe the general public will be willing to invest in virtual reality solutions that force them to wear massive headsets, such as the Oculus Rift or Gear VR. The company thinks there’s a much bigger long-term profit opportunity in augmented reality than in virtual reality.

Google of course has its augmented reality Project Tango initiative, and earlier this year it announced the first consumer-ready device in partnership with Lenovo. While unveiling the device, Lenovo and Google showed off how they could instantly map out the stage and reconfigure it with furniture such as a couch and a refrigerator. Project Tango pairs a complex setup of cameras and sensors with mapping software to model the 3D space around users in real time.

Personally, I happen to agree with Google on this one. I’m not a huge fan of virtual reality products, but something like Project Tango offers incredibly useful real-world applications that could push it towards mainstream popularity. I wrote on 9to5Mac earlier this year that Apple should focus primarily on augmented reality as opposed to virtual reality for this very same reason.

What do you think? Should Google focus more on augmented reality or Oculus-like virtual reality products?


Augmented Reality, Not VR, Will Be the Big Winner for Business

Sometimes exponential technologies hide in plain sight. Their innovation speed, market size, and number of practical uses seem to be progressing linearly, until suddenly they tilt upwards and turn on the boosters.

A case can be made that augmented reality (AR) in enterprises is just such an exponential technology. A growing number of companies are busy developing and releasing AR systems for enterprise settings.

Augmented and virtual reality analyst Digi-Capital’s numbers give a good indication of just how big AR is expected to be in a few short years. According to Digi-Capital, AR companies will generate $120 billion in revenue by 2020, compared to the $30 billion expected for their ‘cousin’ companies in virtual reality.

Part of AR’s explosive growth can be traced to a wide array of uses in business settings. The tech is a fundamental component in the hardware and software revolution, known as Factory 4.0.

First Systems Are Go

While virtual reality is about creating closed, fully immersive digital experiences, augmented reality systems overlay sensory information, such as visuals and sounds, on the real world around you.

The best-known example is Google Glass—a kind of partial AR experience where a square display appears in a user’s field of view. The device never became the success with end-users that Google was hoping for.

However, with 20/20 hindsight (if you’ll pardon the terrible pun), Google Glass was partly responsible for kickstarting a wave of innovative new AR startups. Unlike Google, these companies focused solely on AR’s potential for enterprises.

One example is the Canadian company NGrain, whose solutions have been implemented in several major companies, including Lockheed Martin and Boeing.

Lockheed has used AR systems in relation to its F-35 and F-22 aircraft.

Using smart glasses or tablets, engineers and service personnel can overlay graphics that show data like maintenance records or assembly instructions on top of a piece of real-world equipment. The system can also compare a digital 3D model of an aircraft with an actual aircraft to identify any potential damage.

The introduction of AR let Lockheed Martin engineers work up to 30 percent faster.

Meanwhile, at Boeing, several teams are looking at using AR systems to perform running quality control of parts for several aircraft, including the 787 Dreamliner. AR systems also allow maintenance crews to carry out full airplane checks much quicker than was previously possible.

“Traditionally, these tasks are carried out consulting manuals and using paper and pen,” says Barry Po, director of product management at NGrain. “Using AR systems lets you overlay any needed information while you have both hands free, and our visual inspection and damage assessment software can make it much quicker to identify potential issues. The result is that the time it takes to do a full plane check can go from several days to a matter of hours.”

Other AR systems have been used to deliver on the job training.

Using images and instructional illustrations and video, augmented reality units can show a new employee how to complete a job task without needing an introduction course.

Further, data has shown workers using AR technology learn up to 95% quicker and tend to make fewer mistakes than workers trained using traditional methods.

Pipes Are Being Laid to Drive Broader Adoption

While AR in enterprises has shown impressive results, most of these come from initial pilot projects, using a limited number of devices.

AR is also facing a number of challenges, including a lack of industry standards, which can make integrating AR units and software into current enterprise IT ecosystems difficult.

“Traditional software systems like ERP or WMS are not necessarily ready to integrate fully with the new technologies, like AR, that make up Factory 4.0,” Pete Wassel, CEO of Augmate, says.

AR companies have often run successful trials, instigated by a company CTO, but then hit a wall when attempting a full rollout.

Enterprise IT departments have often — and often understandably so — balked at the idea of introducing camera-equipped AR units that come with a host of potential security risks and integration headaches.

It is a situation that Augmate, along with other companies, has been working to solve.

Augmate is creating the backbone, or pipe systems, that make integration of AR into existing IT ecosystems smooth and safe. Its software systems have generated a lot of interest, not only within the AR industry, but also from individual enterprises and companies within the Internet of Things space.

AR’s Stealth Mode About to End

Enterprises are quickly becoming aware of the potential of AR, with two-thirds of companies recently interviewed by Tech Pro Research saying they were considering integrating AR solutions.

At the same time, the number of use case scenarios for AR is growing rapidly.

Training, maintenance, warehouse management, emergency response at breakdowns, co-worker location, damage assessment, work order creation, assembly product design, and marketing and sales are all being augmented.

The same goes for industry-specific tasks in a number of fields.

For example, in health care AR can assist with information during surgery, medical inspections, in relation to specific medical procedures, or simply to call up and immediately display a patient’s relevant medical history hands-free on a pair of smart glasses.

One of the biggest use cases across industries is remote maintenance and inspection. Using AR systems, experts will be able to give advice to on-site personnel in any number of situations. This would not only eliminate having to fly key personnel around the world but also dramatically improve response times.

“It makes it possible to create what I call ‘John Madden’ guides, where experts are able to draw instructions and point things out in real time,” Pete Wassel says.
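A guide annotation like the ones Wassel describes can be pictured as a small stream of drawing events sent from the expert’s device to the on-site worker’s headset. This is a purely illustrative Python sketch — the event schema, field names, and coordinate convention are invented for the example, not taken from any actual AR product:

```python
import json
import time

def make_annotation(kind, points, label=None):
    """Package one drawing gesture (e.g. an arrow or a circle) as a
    JSON event that could be streamed to a remote headset."""
    return json.dumps({
        "type": kind,          # e.g. "arrow", "circle", "freehand"
        "points": points,      # coordinates normalized to the camera view
        "label": label,        # optional text shown next to the drawing
        "timestamp": time.time(),
    })

# The expert points out a component in the worker's camera view:
event = make_annotation("arrow", [(0.4, 0.5), (0.55, 0.62)], "check this valve")
payload = json.loads(event)
```

In a real system, events like this would be timestamped against a shared spatial anchor so the drawing stays pinned to the equipment rather than to the screen.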

Companies and startups have been working on AR solutions for many of these specific tasks, and many are nearing full release, after spending time in either beta or stealth mode.

At the same time, the hardware capabilities— field of vision, battery time, sturdiness, and ease of use — of AR devices are improving rapidly. Also, motion sensor and eye tracking technology are improving, allowing for more hands-free use.

In short, it is a perfect scenario for rapid growth in enterprise AR.

A Future Beyond the Factory

While the coming years are likely to see the use of AR technology in enterprises explode, its enterprise heyday will likely end when it’s supplanted by another exponential technology.

“Technology moves in cycles. I would think that AR in enterprises will have a good run of maybe 15 years,” Pete Wassel says. “After that, robots and AI will start to outcompete human workers and become the new dominant exponential technologies in enterprises.”

But by then, it will likely have diffused beyond enterprises and become part of our daily lives.

As a species, we build knowledge on what was discovered by previous generations. We quickly realized it was impractical to rely on memory alone to do this, so we invented the printed word.

Our accumulated knowledge grew to fill volumes and then whole libraries. Computers and the Internet are, of course, powerful new methods of storing and recalling information.

Each iteration increases the amount of information stored and makes it more readily accessible.

Augmented reality looks like another step, seamlessly integrating the physical world with our stores of information. Imagine having the ability to call up information about or perform a range of other actions on every object around you through a layer of AR.

This is the true promise of AR beyond its near-term enterprise sweet spot.

The ability to right-click on reality.

How virtual, augmented reality helps NASA explore space

By Jason Henry, Los Angeles News Group

Before astronaut Scott Kelly ended his year in space, he accomplished an unprecedented technological feat. He called mission control using Skype and streamed his first-person perspective through an augmented reality headset that NASA sent to the International Space Station in December.

“We messed around with it for like two hours and immediately I sensed this is a capability we could use right now,” Kelly said during a news conference in March.

The Microsoft HoloLens that Kelly used is just one tool being tinkered with at the Jet Propulsion Laboratory in La Cañada Flintridge that might change space exploration. JPL’s Ops Lab, a team of roughly a dozen, is experimenting with virtual and augmented reality technologies to allow NASA to take direct control of its robotic explorers, to enable humans to see distant worlds with their own eyes and to teach astronauts how to perform complex tasks on the fly.

A screen view from OnSight, a software tool developed by NASA’s Jet Propulsion Laboratory in collaboration with Microsoft. OnSight uses real rover data to create a 3-D simulation of the Martian environment where mission scientists can meet to discuss rover operations. (Image courtesy NASA/JPL-Caltech)

This isn’t a technology relegated to the distant future; these innovations are happening today, said Jeff Norris, the founder and leader of the Ops Lab, during a presentation at the Vision Summit in Hollywood last month.

“Imagine a spacecraft designer studying full-scale holograms of a spacecraft years before a piece of metal is cut,” he said. “They could discover and correct problems before they could endanger a launch or a mission.”

Norris said he sees a future where augmented displays are integrated into spacesuit helmets. The technology will play an important role in NASA’s mission to send humans to Mars by the 2030s, he said.

“We think we’re going to use it to design the spacecraft that takes the astronauts to Mars; we think we’re going to use these technologies to assist astronauts on board the spacecraft on the way and when they arrive, to increase their autonomy so they can perform tasks that they need to on Mars without having to be helped as much on the ground,” Norris said. “That’s how we think all these things are coming together to enable the next chapter in humanity’s exploration of space.”

In Kelly’s case, his Skype test showed that scientists on Earth and the astronauts at the space station could connect live using the HoloLens, despite the station’s limited connection to the Internet.

The HoloLens, Microsoft’s foray into augmented reality, is the backbone of two projects at JPL called Sidekick and OnSight. Other experiments use Microsoft’s Kinect and the Oculus Rift, according to NASA officials.


The HoloLens headset overlays projections on top of users’ surroundings to “augment” their vision. NASA hopes to use the consumer technology through Project SideKick to create interactive guides for astronauts. The headset, through Skype, even lets experts on Earth walk astronauts through unfamiliar tasks. They can draw and share objects into the astronaut’s field of vision, such as an arrow to direct the headset wearer’s eyes or numbers to show the order in which to perform each step.

NASA wants to use standalone interactive manuals for the headset that use holographic animations, according to Victor Luo, a senior technical lead and human interfaces engineer at JPL. It would potentially replace bulky paper manuals used on the station today.

“As we walk through the procedures, the application is hand-holding us, showing us animations and diagrams, everything we need to know as we’re doing it,” Luo said at the Vision Summit.

Luo tested the display in an underwater space station analog in the Atlantic Ocean. Tasks expected to last an entire afternoon were completed in less than an hour, he said. The team certified the HoloLens for space in NASA’s Weightless Wonder, a jet that climbs and dives to reduce gravity.

Initially they expected Sidekick to reach the space station this summer, but a rocket explosion delayed its delivery until December. Still, Kelly was able to perform some initial tests before his return to Earth on March 1.

It’s not just astronauts who benefit from the technology, however.


A steady stream of images flows in every day from Mars. They come from rovers and orbiters mapping out the planet from the ground and above.

OnSight combines the data from NASA’s robotic explorers to create a virtual map of Mars. Scientists can use OnSight to virtually meet up on the Red Planet and even plot out movements for NASA’s rovers.

“We want to bring the surface of Mars into their offices,” Norris said. “Let them explore the Red Planet as geologists have explored Earth.”

Studying images from a computer screen lacks the depth of seeing it with your own eyes, Norris said. The Ops Lab tested early versions of OnSight by giving a headset to the science teams from the Curiosity and Opportunity rover missions, then asking them to complete a task using data sent back by Curiosity. They compared the results against a control group that used the standard operational tools for the rover.

“What we found was there is a dramatic, measurable and statistically significant effect on their understanding of the vicinity of the data acquired by the Mars Rover when they were wearing a head mounted display,” Norris said.

Even with zero experience with the headsets, scientists performed as well as, or better than, their counterparts using the existing tools, according to a graph Norris showed during his presentation.
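As a rough illustration of the kind of between-group comparison behind a claim of a “statistically significant effect,” here is a minimal Welch’s t-statistic computed on invented scores. The article reports no actual numbers, so the data below is made up purely to show the shape of the analysis:

```python
from statistics import mean, stdev
from math import sqrt

def welch_t(a, b):
    """Welch's t-statistic for two independent samples
    (does not assume equal variances)."""
    na, nb = len(a), len(b)
    return (mean(a) - mean(b)) / sqrt(stdev(a) ** 2 / na + stdev(b) ** 2 / nb)

# Hypothetical terrain-understanding scores, NOT NASA's actual data:
headset = [82, 88, 91, 85, 90, 87]   # scientists using the head-mounted display
control = [74, 79, 76, 81, 75, 78]   # scientists using the standard rover tools

t = welch_t(headset, control)
# A large positive t favors the headset group; the statistic would then be
# compared against a t-distribution to obtain a p-value.
```

In practice one would also report degrees of freedom and a p-value, but the core of the claim — that the headset group’s mean clearly exceeds the control group’s relative to the spread — reduces to a comparison like this.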

A pilot group is now using the headgear to operate Curiosity, according to JPL. OnSight recently helped Abigail Fraeman, a Curiosity science team member at JPL, and her team identify a point they would like to study between two Martian rock formations.

“OnSight makes the whole process of analyzing data feel a lot more natural to me,” Fraeman said in a statement. “It really gives me the sense that I’m in the field when I put it on. Thinking about Martian geology is a lot more intuitive when I can stand in the scene and walk around the way I would if I were in the field.”


NASA and JPL have unveiled “Destination: Mars,” an exhibit opening at the Kennedy Space Center in Florida this summer that will let guests visit Mars using an adaptation of OnSight.

The tour across several sites on Mars, reconstructed using real imagery from Curiosity, is guided by holographic versions of astronaut Buzz Aldrin and Curiosity Rover driver Erisa Hines.

“This experience lets the public explore Mars in an entirely new way. To walk through the exact landscape that Curiosity is roving across puts its achievements and discoveries into beautiful context,” said Doug Ellison, visualization producer at JPL.

Separately, NASA is working with developers to create a free video game, “The Mars 2030 Experience,” using the Unreal Engine 4 for consumer virtual reality headsets, according to an announcement on Epic Games’ website.

Hololens Round Two: Augmented Reality At Build 2016

by Brett Howse on March 31, 2016 11:50 AM EST

Last year at Build I got my first chance to try Hololens. That experience was very interesting, not only because of the potential of augmented reality, but because of the entire circus surrounding the device. The Hololens sessions were at a different location, and the groups brought over had to lock up everything electronic. We could only take photos of a unit in a display case. Naturally, with Microsoft announcing yesterday that Hololens would start shipping to developers, this year’s experience could never be so secret.

So when we got to the demo location, and were given keys for a locker, I was a bit taken aback. But it wasn’t anything as sinister this time, only a way to make sure there were no backpacks on the floor as tripping hazards, because this year’s untethered experience was really untethered.

That comes a bit later though. This year’s demo involved building and deploying a 3D app using Unity and Visual Studio, and each person doing the demo also got a coach to help solve any issues along the way. The Hololens unit was slightly different this year, but looking at it, it was remarkably similar to last year’s demo version. The one big change this year was very welcome. Instead of having a person physically measure the inter-pupillary distance on your head (the distance between your pupils), the experience is now handled through software when you first put the headset on. There is a quick calibration that you can run, and it sets your eye position based on some air tap gestures. It was very quick and easy, and the headset walks you through everything required with voice and visual cues.

Then we sat down to build our apps. Since this was a demo for press, all of the coding was done ahead of time and we just had to walk through adding scripts in Unity to set up the demo. Then we built them and deployed to a remote machine using the IP address of the Hololens.

The demo app was of an energy ball which, when locked to a location in space, would open up and show some nifty effects. The experience was very basic compared to what I would expect of the retail apps, but this was a simple demo and it worked well.

The very interesting bit was later on, when we linked our Hololens units with the other people in our pods of six people. This way all six people could interact with a single energy ball. People also got to choose an avatar which would float over their heads.

That experience was pretty amazing. With very little setup, the holograms were truly linked to a single point that all people could see. As part of this demo, my coach suggested I walk around the (very large) room and then look back. This was probably the most amazing part of the demo. After walking a hundred feet or more away, and around some tables and pillars, I looked back and the hologram was still floating exactly where I left it. The ability to really lock things to a location is really the one part that needs to be perfect for this experience to work, and they really nailed it. In addition, my pod mates were all around the room with avatars floating over their heads.

So with a night to think about it, here are my thoughts after using the Hololens a year later. The field of view is still very small, and clearly not something they were able to address before they shipped to developers. I would describe it as something like a mid-sized television, in the 27-inch range, sitting a few feet away from you. My experience was better this time because there were fewer software issues, but the small field of view can certainly take some getting used to.

The hardware itself was very easy to put on and adjust, and it was fairly well balanced in that I never felt like the unit was heavier on the front where the lenses are. The adjustment is done with a wheel on the back, much like a welding helmet if you’ve ever seen one of those. The right side has buttons for volume, and the left side has buttons for brightness. I had to crank up the audio quite a bit because of the loud room we were in, and although the audio was spatial, it was hard to get a sense of that with the commotion going on during the demos. Although I don’t wear glasses, it looked like there would be no issues wearing glasses with it, and several of the other attendees seemed to have no issues putting the device on and using it with them.

The experience of AR is much different than VR. Because you are interacting with things in real space, you can easily move around without fear of tripping or walking into a wall. VR is able to offer much more powerful graphics and immersion right now, but you are largely bound to a single location. The use cases for AR seem, to me, to be not necessarily the same as VR and both should easily be able to co-exist.

While doing my demo, I asked my coach how to close the app we were running, and he showed me a “bloom” gesture which closes it. Once I did that, I was in another mode for the Hololens where I could see how it mapped out the physical world with polygons by tapping my finger in a direction. This was amazing and the Hololens did a great job on picking up everything in my area, including the people, with no issues.

I then did another bloom and was back at the start screen. On the demo units, this was pretty sparse, but I was able to go into settings and play around. I didn’t see anything special in there other than the process of interacting with the menus was very simple and was very easy to get used to. From a UI aspect, the Hololens did very well.

At the end of our demo we did some shooting of orbs which opened up a hole in the floor. Peering down into it, it really felt like this was something you didn’t want to step into. The holograms tend to be a bit translucent, but this one in particular was much more solid. There’s a lot of untapped potential here, and I hope to get a chance to do some of the other demos they have here to get a better feel for that. The headset itself seemed to be near the edge of its processing power on the final demo, though, which had a lot of not-very-complex polygons moving around and the six people interacting. There were a lot of things to keep track of, as well as quite a few holograms flying around.

Microsoft then said that all of the code we used in the demo, and all of the code used in the demos last year, is available on GitHub to give devs quicker access to code.

I think the Hololens is still one of the most interesting pieces of tech I’ve used in a long time. There is a lot of potential here for education, training, and even tasks like painting your house and trying different color samples. There are quite a few applications where this would work very well.

The hardware, though, still needs a bit of work. It is a bit bulky, and the lenses would not stay anchored to the spot in front of me where I set them, so I had to readjust. The field of view is also not very large, and this could be because the onboard processing power can’t match the tethered experiences of VR.

I look forward to seeing where this goes in the future. A lot of the pieces are already well done and on the software side, the experience is very good. With a bit better hardware, which will almost certainly come with time, this is going to be a very powerful tool from Microsoft.

Meta Unveils Incredible Augmented Reality Headset at TED


Redwood City-based Meta showed its latest AR glasses live on stage at TED in Vancouver.

The Meta 2 was demonstrated live by CEO Meron Gribetz with a person-to-person “call” showing a hand-off of a 3D model from a holographic person. Gribetz’ perspective was shown through the glasses as he reached out and took a model of a brain — a 3D hologram — from the hands of a colleague he saw projected in front of him.

“We’re all going to be throwing away our external monitors,” Gribetz said.


Gribetz’ talk focused on the idea that “you” are the operating system. His roughly 100-person company is attempting to tap into a more natural way of interacting with information and the people around us, rather than sitting behind a computer terminal or hunched over a little rectangle of light. Instead, Gribetz sees everyone wearing tiny strips of glass in a few years.

“Living inside of Windows scares me,” he said of the current paradigm. “We’re trying to build a zero learning curve computer.”

Gribetz’ vision sounds similar to enormously well-funded Florida startup Magic Leap, which has only shown a video of its technology in action. The startups are attempting to layer digital information on top of our view of the real world, leading to entirely new ways of interacting with other people and processing information. It’s an enormously hard problem to solve though and requires huge advances in new display technologies that look good in a variety of lighting conditions, better movement tracking and lower power consumption. However, the potential for a wearable AR device you can take with you out in the real world is larger than a VR device that might be restricted to use at home.

The Meta demonstration live on stage was interspersed with videos showing footage “shot through Meta 2 technology.” The language is similar to the note at the bottom of the single public Magic Leap video, which says “shot directly through Magic Leap technology.” The disclaimers are likely there because it’s difficult to accurately depict through a traditional video what you can see when wearing an AR headset. For example, Microsoft has been criticized for the way it depicts HoloLens. The device features a limited field of view and comes at a high price, so gaming uses for the technology are likely very limited despite promotional videos showing sprawling mixed reality landscapes and games that take over entire living rooms. It’s unclear what field of view Meta 2 is capable of showing.




The Digital Art Collective Blending Fashion With Augmented Reality


One vision for the future of fashion. (Normals)

In the not-so-distant future, fashion models may wear nothing at all. The real couture would be the threads of code at play in devices—like Microsoft HoloLens or whatever rises from the ashes of Google Glass—worn by journalists, fashionistas, and celebrities, projecting elaborate outfits onto the models’ bodies.

At least, that’s the vision of Cedric Flazinski, co-founder of the French art collective Normals and co-developer of Apparel, a new app that designs digital clothing based on your social media data.

Once you give the app access to your Twitter feed, it uses the API to generate three shirts that represent your personality. Certain words, emojis, punctuation, and sentiments guide the design. Say you tweet a lot about yourself—your digital chest might inflate. If you do a lot of mansplaining and tweet more authoritatively, the shoulders will grow. If your posts are cute and kind, your shirt might have a lot of cats and birds. Based on the number of animals on one of my shirts, my tweets are more twee than I realized.
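Based on that description, the mapping from tweets to garment geometry might look something like the rule set below. This is a speculative Python sketch of the idea, not Normals’ actual code; the feature words, weights, and parameter names are all invented:

```python
def shirt_params(tweets):
    """Derive hypothetical garment parameters from a list of tweet
    strings, following the rules described in the article:
    self-mentions inflate the chest, authoritative words widen the
    shoulders, and cute words add animal motifs."""
    text = " ".join(tweets).lower()
    self_refs = text.count(" i ") + text.count("my ")
    assertive = sum(text.count(w) for w in ("must", "should", "actually"))
    cute = sum(text.count(w) for w in ("cat", "bird", "cute"))
    return {
        "chest_inflation": min(1.0, 0.1 * self_refs),  # talking about yourself
        "shoulder_width": min(1.0, 0.1 * assertive),   # authoritative tone
        "animal_motifs": cute,                         # cats and birds
    }

params = shirt_params(["I must say my cat is cute", "Birds are great"])
```

The real app presumably uses proper sentiment analysis over the Twitter API’s JSON payloads rather than keyword counting, but the core idea — a deterministic function from social data to design parameters — is the same.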

“It’s more of what you’d call psychosocial profiling. It’s kind of related to the Jungian archetypes. It’s how you address others and relate to others, because this is the point of connection to fashion that we find online,” Flazinski said. “We’re not trying to better the world of fashion. We’re only reacting to more and more augmented reality. Big actors like Apple, Google and Microsoft are really interested in allowing everyone to see digital information in the environment.”

As those tech companies develop better augmented reality technology, Flazinski believes smartphones will be replaced by devices that could make many physical visuals obsolete. “What’s going to happen to the manufactured environment? What’s going to happen to fashion? What’s going to happen to signs we see in public if we can simply display them digitally? So this is where that project starts—as a piece of fiction—and this is where we’d like people to sort of reconsider the future of fashion.”

The craggy blobs you see in the app are a far cry from Alexander Wang. But soon the app will be able to pull more data from Facebook and other sources, and the designs will likely get more sophisticated. “Right now it’s more a theoretical piece, of course,” Flazinski said. “But we’re ready, if we got some funding, to organize a proper fashion show around augmented reality. That would be amazing.”

A rendering of a future version of the Apparel app

If New York Fashion Week is any indication of what the future holds, then Flazinski’s vision of fashion might not be that far off. When he saw recent photos from the event one thing stood out as prominently as the fashion: Most people were watching the shows through their phones as they took pictures and videos to share on social media.

TED Ahead: Augmented and Virtual Reality Takes Off


  • Steve Rosenbaum, CEO, Waywire Networks; author, ‘Curate This’; speaker on curation and storytelling

Today, when we think of video we think of television. And when we think of computers, we think of desktops, laptops, or maybe mobile devices. But a new technology is coming that melds video and computing into a new kind of reality. Augmented, virtual, and beyond. It’s what comes After Television.

Last year, Chris Milk gave a TED Talk about his journey into virtual reality and his dream of becoming Evel Knievel. His talk was captivating and, for many in the room, the first time that the future of VR clicked. Now he’s back – a year later – with a new TED Talk. This is rare and exciting. TED’s process of choosing speakers is rigorous; second TED Talks are rare, and one just a year later is almost unheard of. But the area Milk is working in is red hot, and his company Vrse has been collaborating to bring VR projects to New York Times readers, along with a free edition of Google Cardboard.


Milk stretches virtual reality into a new canvas for storytelling. So what did he share with the TED programmers that convinced them he had really new things to say? Well, that’s going to be one of the great questions of TED2016, which takes place Feb. 15 to 19 in Vancouver. Will VR be the big new thing? It very well may be.

I expect amazing things from Milk, but the buzz that’s building for Meron Gribetz’s talk is almost deafening. Meron Gribetz is the founder and CEO of Meta, the first company to produce and sell augmented reality (AR) glasses with natural gestural hand recognition. Last year, the AR firm Magic Leap was slated to give a TED talk and pulled out. That never happens, leading critics to wonder if it had hit a snag. But now the buzz is back, as Magic Leap has just raised another $793.5 million – bringing its total funding to $1.39 billion.


But back to Meta. Gribetz’s first encounter with AR came during his service in an elite technological unit of the Intelligence Corps of the Israel Defense Forces.

Tech explorer Robert Scoble, one of the first to try Meta’s glasses, explained that he’s still under NDA until Meta premieres at TED. But that didn’t stop him from declaring, in an emotional video, that he thinks Meta is the most important new product since the original Macintosh.

Said Scoble: “The biggest product demonstration – the most interesting that I’ve ever had in my life. The most important product since the Apple II. If Magic Leap is even second to what I saw today, it’s so f*cking undervalued, compared to the $1.3 billion in Magic Leap. I can’t even explain how undervalued it is. In the next five years, we’re going to be wearing glasses instead of using computer monitors. We’re going to be wearing glasses instead of using mobile phones. And this is in the next five years. It’s coming, and it’s coming more quickly than I expected. The markets that are going to come in the next five to six years are going to be absolutely stunning. We’re talking about augmented reality glasses, and I have just seen a ghost. The iPhone was an improvement over a product we had seen. This is a new product category. I’m emotional because I haven’t seen a product like this since the Macintosh. That’s been 30 years. When you are in it and wear it, and walk around, and look at the world, your head starts exploding. This changes computing fundamentally.”

So all eyes are on Meta and its first public demonstration of augmented reality.

“There is no other future of computing other than this technology, which can display information from the real world and control objects with your fingers at low latency and high dexterity,” Gribetz told CNET. “It’s the keyboard and mouse of the future.”

Magic Leap files for a big pile of patents, including for a sci-fi contact lens

  • The secretive company has applied for 97 patents, providing a small hint at the futuristic tech it’s developing.

    A futuristic tech company that Google has invested millions in has patented what seems like the coolest contact lens ever.

    Magic Leap, which is still keeping quiet about what exactly it’s building at its Florida headquarters, has applied for 97 patents, including one for an augmented or virtual reality contact lens. The company has only hinted at what it’s cooking up via a whimsical video released in March showing a mixed reality experience in which virtual objects or elements are projected into the real world.

    The group of patents, published between August 20 and 27, includes devices and techniques for capturing and manipulating light and images. Several discuss “outside light,” which refers to the light from real-world objects that reaches the eye and creates what the brain interprets as an image. Glasses or a contact lens could interfere and change what the eye sees. And because Magic Leap is in the business of augmented reality, it could use some of the technology it seeks to patent to dim some of the light from real objects to make the virtual ones it projects appear more real.

    Also, as Re/Code notes, the patents discuss displaying light at different focal points. A computer or phone screen displays everything on a single plane, but the 3-D world around us doesn’t, forcing our eyes to adjust their focus. Technology that accounts for this shift in focal points would likely be critical to a contact lens or any headset the company builds.

    Here are the signs that point to Apple’s next big innovation in computing

    Lisa Eadicicco

    Apple could be working on its own augmented-reality technology, which would be a first for the company, according to Piper Jaffray’s Gene Munster.

    Apple usually focuses on mainstream consumer products such as smartphones, tablets, and laptops.

    Munster has picked up on a few clues within the industry that indicate Apple could be working on some type of augmented-reality device. For example, the company acquired a German augmented-reality startup called Metaio earlier this year, a move that was reported back in May.

    Munster notes that Metaio owns 171 worldwide patents related to augmented-reality technology, which would put Apple in 11th place for the number of augmented-reality and head-mounted-display patents held. Google, Microsoft, Sony, and Samsung remain ahead of Apple in that regard.

    Apple also acquired a company called PrimeSense back in 2013, and we have yet to see its technology appear in any of Apple’s products. PrimeSense makes cameras that can sense motion, and it is best known for making the cameras in Microsoft’s Kinect accessory for the Xbox.

    Munster initially predicted that PrimeSense’s tech could be used for motion detection inside the long-fabled Apple television, but that ended up being a bust. Now, however, Munster acknowledges those types of sensors could benefit an augmented-reality headset, since the motion-detection cameras could be used for indoor navigation.

    The company also recently poached an audio engineer from Microsoft who worked on its HoloLens augmented-reality headset. According to LinkedIn, an Apple engineer hired in July named Nick Thompson previously worked as the audio hardware engineering lead at Microsoft for the HoloLens.

    Audio is important in augmented reality, Munster says, because positional audio can make the experience more convincing (i.e., feeling like a certain object is in front of you or behind you, etc.).

    [Image: Apple patent for a heads-up display, via the United States Patent and Trademark Office]

    If Apple is working on an augmented-reality project, it would be doing so at a time in which almost every other major technology company is exploring the space. Microsoft, for instance, has been showing off its HoloLens headset at recent events.

    Google initially hyped its augmented-reality efforts with Google Glass back in 2012, and it is said to be working on an enterprise-focused version of the headset too. The company has also invested heavily in Magic Leap, a secretive company that is working on its own augmented-reality platform, which is supposed to be mind-blowing.

    An augmented-reality device would be a first for Apple, though the company has filed AR-related patents in the past. One, for instance, covers “interactive holograms,” while another describes a head-mounted display.
