Intuitive Design Delights End Users

https://www.autocar.co.uk/car-news/motor-shows-geneva-motor-show/honda-bucks-industry-trend-removing-touchscreen-controls

This is a great example of a scenario where User Experience (UX) strongly matters and goes well beyond the User Interface (UI): intuitive physical controls let drivers operate them and perceive configuration changes by touch alone, which is especially important for keeping their eyes on the road.

User testing matters. User context matters.

Focus on making life easier for your users

I’m a big believer in the success of technology that makes a user’s or community’s day-to-day life easier. In other words, “practical tech” tends to win out in mass adoption over tech that has potential but remains confined to abstract use cases, or to use cases that carry value only for a niche set of users.

A good example of this in recent years has been the rollout of Augmented Reality (AR) and Virtual Reality (VR) tech and applications in both the enterprise and consumer spaces. Both carried the promise of improving our experience at home and at work.

Thus far, AR has delivered far more practical applications to everyday life, while VR has realized much fewer of them.

Google’s recent decision to focus on its AR efforts shows its teams’ understanding of where they can have an impact on the everyday lives of their users.

https://venturebeat.com/2019/10/15/google-discontinues-daydream-vr/

Emergence of AR Applications

Over the past couple of years, I’ve mentioned in conversations with fellow engineers and technologists that I believe augmented reality (AR) has great practical potential to improve how we live and work.

Last week, I got to experience that myself for the first time in a practical way, when I wanted to quickly get walking directions to a local taco shop on my Android phone.

Google Maps presented me with the option to get walking directions via AR.

I gotta say, the experience was phenomenal, despite multiple heads-up notices mentioning the feature was still in preview mode.

Recognition of my position and orientation on the street was a breeze: quick and very smooth. (I’m assuming it was using location data in combination with visual cues matched against Street View data?)
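For the curious, here’s a rough sketch of how that kind of hybrid localization could work conceptually. This is purely my speculation, not Google’s actual implementation; the function names, the stubbed similarity scores, and the idea of matching the camera frame against pre-indexed Street View panoramas are all assumptions for illustration.

```python
# Purely speculative sketch: fuse a coarse GPS fix with a visual
# match against pre-indexed Street View panoramas. None of these
# names correspond to a real API; similarity scores are stubbed.

def coarse_gps_fix():
    """Return an approximate (lat, lon) from the phone's GPS."""
    return (37.7749, -122.4194)  # placeholder coordinates

def match_against_panoramas(camera_frame, candidates):
    """Pick the candidate panorama whose (stubbed) visual-feature
    similarity to the camera frame is highest; a real matcher
    would compare extracted image features instead."""
    best = max(candidates, key=lambda p: p["similarity"])
    return best["position"], best["heading_deg"]

def localize(camera_frame):
    # 1. A cheap GPS fix narrows the search to nearby panoramas.
    approx = coarse_gps_fix()
    # 2. Hypothetical index lookup of panoramas near that fix.
    candidates = [
        {"position": approx, "similarity": 0.92, "heading_deg": 48.0},
    ]
    # 3. The visual match refines position and recovers orientation,
    #    which GPS alone can't provide reliably at street level.
    return match_against_panoramas(camera_frame, candidates)

position, heading_deg = localize(camera_frame=None)
```

The appeal of this kind of fusion is that each signal covers the other’s weakness: GPS is cheap but coarse and says nothing about which way you’re facing, while visual matching is precise but far too expensive to run against the whole world without a coarse fix to narrow the search.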

The app also suggested I put the phone down and focus on what’s in front of me, instead of trying to walk with the phone in my hand, pointed straight ahead. When I followed the app’s instructions, the interface changed back from a viewfinder-like state (with AR overlay arrows and an endpoint bubble for my destination) to a regular maps experience.
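That kind of mode switch could be driven by something as simple as the device’s pitch angle from its motion sensors. Here’s a minimal sketch of that idea; the threshold value and all the names are my own guesses for illustration, not how Google Maps actually implements it.

```python
from enum import Enum

class NavMode(Enum):
    AR_VIEWFINDER = "ar"   # camera view with AR arrow overlays
    MAP_VIEW = "map"       # regular 2D map experience

# Hypothetical threshold: phone held roughly upright (camera facing
# forward) vs. tilted down toward the ground. 45 degrees is a guess.
UPRIGHT_PITCH_THRESHOLD_DEG = 45.0

def choose_mode(pitch_deg: float) -> NavMode:
    """Pick the navigation UI mode from device pitch.

    pitch_deg: ~0 when the phone lies flat, ~90 when held upright.
    """
    if pitch_deg >= UPRIGHT_PITCH_THRESHOLD_DEG:
        return NavMode.AR_VIEWFINDER
    return NavMode.MAP_VIEW

# Example: the user lowers the phone, so the UI falls back to the map.
assert choose_mode(80.0) is NavMode.AR_VIEWFINDER
assert choose_mode(10.0) is NavMode.MAP_VIEW
```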

Try it out yourself!


(Side story: as I was using the app to turn the last corner, with the phone held up, a passerby paused to let me observe the surroundings with my phone. When I noticed him pausing, I apologized and suggested he continue on. He suggested I finish taking my selfie, to which I replied that I was using the Google Maps AR experience to navigate. His reply: “Wow, sounds intense.” My guess is we’ll be seeing more folks on the street mistaking AR navigators for people taking selfies, which is the more common notion nowadays.)