Over the past couple of years, I’ve mentioned in conversations with fellow engineers and technologists that I believe augmented reality (AR) has great practical potential to improve how we live and work.
Last week, I got to experience that potential myself for the first time, in a practical way, when I wanted to quickly get walking directions to a local taco shop on my Android phone.
Google Maps presented me with the option to get walking directions via AR.
I gotta say, the experience was phenomenal, despite multiple heads-up messages mentioning that the feature was still in preview mode.
Recognition of my position and orientation on the street was a breeze: quick and very smooth. (I’m assuming it combines location data with visual cues matched against Street View imagery?)
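I have no insight into how Google actually does this, but just to make my guess concrete, here’s a minimal Kotlin sketch of the general idea: fusing a coarse GPS fix with a more precise, visually derived pose. Everything here (the `PoseEstimate` type, the `fuse` function, the weights) is made up purely for illustration.

```kotlin
import kotlin.math.atan2
import kotlin.math.cos
import kotlin.math.sin

// A pose estimate: latitude/longitude plus a compass heading in degrees,
// with a rough confidence weight (higher = more trusted).
data class PoseEstimate(val lat: Double, val lon: Double, val headingDeg: Double, val weight: Double)

// Fuse a coarse GPS-based estimate with a visually derived one (e.g. from
// matching the camera feed against reference imagery). Positions are blended
// by weight; headings are averaged as unit vectors to handle the 0/360 wrap.
fun fuse(gps: PoseEstimate, visual: PoseEstimate): PoseEstimate {
    val total = gps.weight + visual.weight
    val lat = (gps.lat * gps.weight + visual.lat * visual.weight) / total
    val lon = (gps.lon * gps.weight + visual.lon * visual.weight) / total

    val x = gps.weight * cos(Math.toRadians(gps.headingDeg)) +
            visual.weight * cos(Math.toRadians(visual.headingDeg))
    val y = gps.weight * sin(Math.toRadians(gps.headingDeg)) +
            visual.weight * sin(Math.toRadians(visual.headingDeg))
    val heading = (Math.toDegrees(atan2(y, x)) + 360.0) % 360.0

    return PoseEstimate(lat, lon, heading, total)
}

fun main() {
    // A noisy GPS fix with a shaky compass heading, and a tighter visual fix.
    val gps = PoseEstimate(lat = 37.7749, lon = -122.4194, headingDeg = 350.0, weight = 1.0)
    val visual = PoseEstimate(lat = 37.77492, lon = -122.41938, headingDeg = 10.0, weight = 3.0)
    println(fuse(gps, visual)) // fused heading lands near 5 degrees, not 180
}
```

Whatever the real pipeline looks like, the snappy lock-on I saw suggests the visual signal gets weighted heavily once it’s available.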
The app also suggested I put the phone down to focus on what’s in front of me, instead of trying to walk with the phone held up, pointing straight ahead. When I followed the app’s instructions, the interface switched back from a viewfinder-like state (with AR overlay arrows and an endpoint bubble for my destination) to the regular maps experience.
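That switch felt like it was driven purely by how the phone was tilted. Again, I’m only guessing at the mechanism, but a toy version of that kind of tilt-based mode switch might look like this; the `NavMode` names and the thresholds are mine, not anything Google has documented.

```kotlin
// The two navigation modes the UI toggles between.
enum class NavMode { AR_VIEWFINDER, MAP_2D }

// Decide which mode to show from the phone's pitch (degrees above horizontal:
// ~90 when held upright facing the street, ~0 when held flat like a map).
// A hysteresis band keeps the UI from flickering right at the threshold.
fun nextMode(current: NavMode, pitchDeg: Double): NavMode = when {
    pitchDeg > 60.0 -> NavMode.AR_VIEWFINDER   // phone raised: show camera + arrows
    pitchDeg < 30.0 -> NavMode.MAP_2D          // phone lowered: fall back to the map
    else -> current                            // in between: keep whatever we had
}

fun main() {
    var mode = NavMode.MAP_2D
    // Simulated pitch readings as the phone is raised, then lowered again.
    for (pitch in listOf(10.0, 45.0, 75.0, 50.0, 20.0)) {
        mode = nextMode(mode, pitch)
        println("pitch=$pitch -> $mode")
    }
}
```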
Try it out yourself!
(Side story: while I was using the app to turn the last corner, phone held up, a passerby paused to let me scan the surroundings with my phone. When I noticed him pausing, I apologized and suggested he continue on. He told me to go ahead and finish taking my selfie, to which I replied that I was actually using Google Maps’ AR experience to navigate. His reply was, “Wow, sounds intense.” My guess is we’ll be seeing more folks on the street mistaking AR navigators for people taking selfies, which is the more familiar sight nowadays.)