Mark Zuckerberg says augmented reality glasses are ‘what we’re trying to get to’

Mark Zuckerberg is optimistic about the future of virtual and augmented reality. At his Facebook F8 conference keynote, Zuckerberg said that the company was working on “a whole new set of social experiences” across VR platforms, echoing an announcement the company made earlier this year. “Virtual reality has the potential to be the most social platform, because you actually feel like you’re right there with another person,” he said, referencing an Oculus Rift “toybox” demo that lets two people play together in VR. But in the coming decade, Zuckerberg sees a progression that many people have predicted: that virtual reality will merge with augmented reality and become part of everyday life.

Over the next 10 years, the form factor’s just going to keep on getting smaller and smaller, and eventually we’re going to have what looks like normal-looking glasses that can do both virtual and augmented reality. And augmented reality gives you the ability to see the world but also to be able to overlay digital objects on top of that.

So that means that today, if I want to show my friends a photo, I pull out my phone and I have a small version of the photo. In the future, you’ll be able to snap your fingers and pull out a photo and make it as big as you want, and with your AR glasses you’ll be able to show it to people and they’ll be able to see it.

As a matter of fact, when we get to this world, a lot of things that we think about as physical objects today, like a TV for displaying an image, will actually just be $1 apps in an AR app store. So it’s going to take a long time to make this work. But this is the vision, and this is what we’re trying to get to over the next 10 years.

Palmer Luckey, inventor of the Oculus Rift headset that Facebook acquired in 2014, has previously predicted that augmented and virtual reality headsets will merge into a single piece of hardware that people carry around or wear like a pair of glasses. Granted, that’s going to be harder than it might sound. Right now, virtual and augmented reality headsets use fundamentally different visual technology, and it’s difficult for a pair of small glasses to block out the outside world the way a VR headset can.

But the photo sharing technology Zuckerberg’s describing is already plausible on an early augmented reality headset like the Microsoft HoloLens. So is the idea of selling apps to simulate physical objects, although we hope he wasn’t being literal about paying an extra dollar to simply simulate a video screen. Judging by the ambivalent response to early augmented reality headset Google Glass, it may actually be tougher to sort out the social norms than the pure technology — an issue Facebook should be more than a little familiar with.


Samsung patent reveals ‘smart’ contact lens with built-in camera

Move over, Google Glass: in Samsung’s sci-fi vision of the future, an internet-connected contact lens could overlay search results and discreetly take photos

Samsung is exploring the development of a contact lens that can project images directly into the user’s eye, take photographs and connect wirelessly to a smartphone, a patent application has revealed.

The South Korean patent authority has published the 29-page application, filed by the consumer electronics firm two years ago, the technology blog SamMobile reported, offering a rare insight into a science fiction vision of a future technology that could be closer than we think.

The lens could overlay internet-connected services directly into the user’s line of sight, in an example of what is known as augmented reality. It could also discreetly – even covertly – take photographs. The device would be controlled by eye movements or blinking, according to the patent, and it would connect with a smartphone.

It is not clear whether the patent sought in the application, which was written in Korean and filed in September 2014, has been granted, or whether Samsung has begun incorporating the technology into a product.

But wearable technology and augmented reality tools are actively being developed by firms in need of new device ideas. Google launched its Glass headset in February 2013, yet its combination of conspicuous, clunky design and features including map directions, phone calls and video recording failed to inspire mainstream interest.

Virtual reality device makers, including Samsung, face a similar uphill struggle persuading a mainstream audience that their new headsets – which can seem unsightly and alienating – aren’t only for geeks.

The Samsung patent says the quality of the Glass augmented-reality experience “can be insufficient”, according to a translation.

According to the application, Samsung is exploring on-eye navigation instructions and the ability to search online for more information based on what a user happens to be looking at in the real world. Executives do not, however, seem to have acknowledged one of the main complaints against Google Glass: that people found the idea of “Glassholes” walking around with a potentially always-on, inconspicuous camera to be creepy.

Google has previously disclosed ambitions to build a connected contact lens, although the device was pitched as a way to measure glucose levels for diabetes patients.

HoloLens Round Two: Augmented Reality At Build 2016

by Brett Howse on March 31, 2016 11:50 AM EST

Last year at Build I got my first chance to try HoloLens. That experience was very interesting, not only because of the potential of augmented reality, but also because of the entire circus surrounding the device. The HoloLens sessions were at a different location, and the groups brought over had to lock up everything electronic. We could only take photos of a unit in a display case. Naturally, with Microsoft announcing yesterday that HoloLens would start shipping to developers, this year’s experience could never be so secret.

So when we got to the demo location, and were given keys for a locker, I was a bit taken aback. But it wasn’t anything as sinister this time, only a way to make sure there were no backpacks on the floor as tripping hazards, because this year’s untethered experience was really untethered.

That comes a bit later though. This year’s demo involved building and deploying a 3D app using Unity and Visual Studio, and each person doing the demo also got a coach to help solve any issues along the way. The HoloLens unit was slightly different this year, but looking at it, it was remarkably similar to last year’s demo version. The one big change this year was very welcome. Instead of having a person physically measure the inter-pupillary distance on your head (the distance between your pupils), the experience is now handled through software when you first put the headset on. There is a quick calibration that you can run, and it sets your eye position based on some air tap gestures. It was very quick and easy, and the headset walks you through everything required with voice and visual cues.

Then we sat down to build our apps. Since this was a demo for press, all of the coding was done ahead of time and we just had to walk through adding scripts in Unity to set up the demo. Then we built them and deployed to a remote machine using the IP address of the HoloLens.
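
For readers curious what that deploy-by-IP step can look like outside of Visual Studio, here is a minimal, hypothetical sketch that side-loads a built app package to the headset over Wi-Fi through the Windows Device Portal. The IP address, credentials, and package name are placeholders, and the package-manager endpoint is an assumption based on the Device Portal’s REST API rather than anything Microsoft showed in the demo.

```python
# Hypothetical sketch: side-loading a built .appx package to a HoloLens over Wi-Fi.
# Assumes the Windows Device Portal is enabled on the headset and that the
# package-manager endpoint below matches the portal's REST API on your OS build.
import requests

DEVICE_IP = "192.168.1.42"            # placeholder: the HoloLens IP from its settings
PACKAGE_PATH = "EnergyBallDemo.appx"  # placeholder: the package built from Unity/VS
PORTAL_USER = "portal-user"           # placeholder Device Portal credentials
PORTAL_PASS = "portal-pass"

url = f"http://{DEVICE_IP}/api/app/packagemanager/package"
with open(PACKAGE_PATH, "rb") as pkg:
    resp = requests.post(
        url,
        params={"package": PACKAGE_PATH},  # file name the portal should register
        files={PACKAGE_PATH: pkg},         # upload the package as multipart form data
        auth=(PORTAL_USER, PORTAL_PASS),   # Device Portal login
    )
resp.raise_for_status()
print("Deployment request accepted:", resp.status_code)
```

In the press demo, of course, Visual Studio’s remote-machine deployment target handled all of this for us; the sketch just illustrates that the untethered headset is reachable like any other networked Windows device.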

The demo app was of an energy ball which, when locked to a location in space, would open up and show some nifty effects. The experience was very basic compared to what I would expect of the retail apps, but this was a simple demo and it worked well.

The very interesting bit came later on, when we linked our HoloLens units with those of the other people in our pod of six. This way all six people could interact with a single energy ball. People also got to choose an avatar which would float over their heads.

That experience was pretty amazing. With very little setup, the holograms were truly linked to a single point that all people could see. As part of this demo, my coach suggested I walk around the (very large) room and then look back. This was probably the most amazing part of the demo. After walking a hundred feet or more away, and around some tables and pillars, I looked back and the hologram was still floating exactly where I left it. The ability to lock things to a location is the one part that needs to be perfect for this experience to work, and they nailed it. In addition, my pod mates were all around the room with avatars floating over their heads.

So with a night to think about it, here are my thoughts after using the HoloLens a year later. The field of view is still very small, and clearly not something they were able to address before shipping to developers. I would describe it as something like a mid-sized television, in the 27-inch range, sitting a few feet away from you. My experience was better this time because there were fewer software issues, but the small field of view can certainly take some getting used to.

The hardware itself was very easy to put on and adjust, and it was fairly well balanced in that I never felt like the unit was heavier at the front where the lenses are. The adjustment is done with a wheel on the back, much like a welding helmet if you’ve ever seen one of those. The right side has buttons for volume, and the left side has buttons for brightness. I had to crank up the audio quite a bit because of the loud room we were in, and although the audio was spatial, it was hard to get a sense of that with the commotion going on during the demos. Although I don’t wear glasses, it looked like there would be no issues wearing them with the device, and several of the other attendees who do wear glasses seemed to have no trouble putting it on and using it.

The experience of AR is much different from that of VR. Because you are interacting with things in real space, you can easily move around without fear of tripping or walking into a wall. VR is able to offer much more powerful graphics and immersion right now, but you are largely bound to a single location. The use cases for AR seem, to me, not necessarily the same as those for VR, and the two should easily be able to co-exist.

While doing my demo, I asked my coach how to close the app we were running, and he showed me a “bloom” gesture which closes it. Once I did that, I was in another mode for the HoloLens where I could see how it mapped out the physical world with polygons by tapping my finger in a direction. This was amazing, and the HoloLens did a great job of picking up everything in my area, including the people, with no issues.

I then did another bloom and was back at the start screen. On the demo units, this was pretty sparse, but I was able to go into settings and play around. I didn’t see anything special in there, other than that the process of interacting with the menus was very simple and very easy to get used to. From a UI perspective, the HoloLens did very well.

At the end of our demo we did some shooting of orbs, which opened up a hole in the floor. Peering down into it, it really felt like this was something you didn’t want to step into. The holograms tend to be a bit translucent, but this one in particular was much more solid. There’s a lot of untapped potential here, and I hope to get a chance to do some of the other demos they have here to get a better feel for that. The headset itself seemed to be near the edge of its processing power on the final demo, though, which had a lot of fairly simple polygons moving around, plus the six people interacting. There were a lot of things to keep track of, as well as quite a few holograms flying around.

Microsoft then said that all of the code we used in the demo, along with all of the code used in last year’s demos, is available on GitHub to give devs quicker access to it.

I think the HoloLens is still one of the most interesting pieces of tech I’ve used in a long time. There is a lot of potential here for education, training, and even tasks like painting your house and trying different color samples. There are quite a few applications where this would work very well.

The hardware, though, still needs a bit of work. It is a bit bulky, and the lenses would not stay anchored to the spot in front of me where I set them, so I had to readjust. The field of view is also not very large, and this could be because the onboard processing is not as powerful as that of tethered VR experiences.

I look forward to seeing where this goes in the future. A lot of the pieces are already well done and on the software side, the experience is very good. With a bit better hardware, which will almost certainly come with time, this is going to be a very powerful tool from Microsoft.

Meta Unveils Incredible Augmented Reality Headset at TED

by UPLOADVR • FEBRUARY 17TH, 2016

Redwood City-based Meta showed its latest AR glasses live on stage at TED in Vancouver.

The Meta 2 was demonstrated live by CEO Meron Gribetz with a person-to-person “call” showing a hand-off of a 3D model from a holographic person. Gribetz’ perspective was shown through the glasses as he reached out and took a model of a brain — a 3D hologram — from the hands of a colleague he saw projected in front of him.

“We’re all going to be throwing away our external monitors,” Gribetz said.


Gribetz’ talk focused on the idea that “you” are the operating system. His roughly 100-person company is attempting to tap into a more natural way of interacting with information and the people around us, rather than sitting behind a computer terminal or hunched over a little rectangle of light. Instead, Gribetz sees everyone wearing tiny strips of glass in a few years.

“Living inside of Windows scares me,” he said of the current paradigm. “We’re trying to build a zero learning curve computer.”

Gribetz’ vision sounds similar to that of the enormously well-funded Florida startup Magic Leap, which has only shown a video of its technology in action. Both startups are attempting to layer digital information on top of our view of the real world, leading to entirely new ways of interacting with other people and processing information. It’s an enormously hard problem to solve, though, requiring huge advances in new display technologies that look good in a variety of lighting conditions, better movement tracking and lower power consumption. However, the potential of a wearable AR device you can take with you out into the real world is larger than that of a VR device that might be restricted to use at home.

The Meta demonstration live on stage was interspersed with videos showing footage “shot through Meta 2 technology.” The language is similar to the note at the bottom of the single public Magic Leap video, which says “shot directly through Magic Leap technology.” The disclaimers are likely there because it’s difficult to accurately depict through a traditional video what you can see when wearing an AR headset. For example, Microsoft has been criticized for the way it depicts HoloLens. The device features a limited field of view and comes at a high price, so gaming uses for the technology are likely very limited despite promotional videos showing sprawling mixed reality landscapes and games that take over entire living rooms. It’s unclear what field of view Meta 2 is capable of showing.


Apple Acquires Augmented Reality Company Metaio


Apple has acquired Metaio, an augmented reality startup that launched way back in 2003 as an offshoot of a project at Volkswagen. The company’s site said it stopped taking new customers, and now a legal document shows Apple has bought it. The document confirms a transfer of shares of the startup to Apple on May 21st/22nd.

When asked by TechCrunch, Apple responded with the standard reply it gives as confirmation of acquisitions: “Apple buys smaller technology companies from time to time, and we generally do not discuss our purpose or plans.”


The company began showing signs that something was clearly amiss when it canceled its user conference in San Francisco earlier this month and later took down its Twitter accounts. It also posted on its website a couple of days ago that it was ending purchases of its products and subscriptions. Email tech support ends June 30th.

Our senior writer Josh Constine went to Metaio’s San Francisco office yesterday to investigate, and a nervous employee refused to speak with him and shut the door in his face.

Notice on Metaio website that it's no longer accepting purchases.

A source told TechCrunch that clients who use Metaio are “flipping out” after seeing the shutdown message on the website and not hearing a word from the company about what’s going on…until now.

Metaio hadn’t taken traditional Silicon Valley venture capital, but had raised some money from Atlantic Bridge and Westcott.

The company is well established. Many impressive projects have been produced using its tools, including this one with Ferrari that gives a potential buyer an AR tour of the car (as though the actual car isn’t cool enough).

And this one for travelers in Berlin to see what the scene they are looking at would have looked like when the Berlin Wall was up. The program uses historical footage that you can see by pointing your smartphone or tablet at a particular place.

Metaio boasts a big community of developers, with 1,000 customers and 150,000 users worldwide in 30 countries. All of them are wondering what’s up right now.


AUGMENTED REALITY IS COMING TO YOUR WINDSHIELD

A HOLOGRAPHIC HEADS-UP DISPLAY SHIPS LATER THIS YEAR

Augmented Reality in a Contact Lens


Photos: University of Washington

A new generation of contact lenses built with very small circuits and LEDs promises bionic eyesight

By Babak A. Parviz

Posted 1 Sep 2009 | 20:35 GMT

The human eye is a perceptual powerhouse. It can see millions of colors, adjust easily to shifting light conditions, and transmit information to the brain at a rate exceeding that of a high-speed Internet connection.

But why stop there?


MINI Augmented Vision


Exclusive prototype of augmented reality eyewear underlines the innovative flair and creativity of the MINI brand.

Munich, Germany, April 9, 2015. MINI is revealing the shape of things to come at the Auto Shanghai show with a pioneering innovation. “MINI Augmented Vision gives an insight into how intelligent connectivity between a MINI car and eyewear into which relevant content is projected might work in the future,” explains Dr. Jörg Preißinger, project manager for MINI Augmented Vision at BMW Group research and technology. “Working with several Qualcomm companies, we have created an interlinked system and augmented reality eyewear with a characteristic MINI design that revolutionise the experience both in and outside the vehicle. This prototype with its customised, interactive functions succeeds in fusing augmented reality with the brand’s trademark sense of lifestyle.” Using see-through technology, the AR eyewear shows relevant information in the driver’s direct field of vision but without concealing other road users, thereby serving to increase safety and comfort while driving. The following functions will be projected into the field of view with MINI Augmented Vision:

Continue reading