Leap Motion ‘Virtual Wearable’ Prototype is a Potent Glimpse at the Future of Your Smartphone

Because VR can take over our entire reality, it can be great for entertainment. The hope for AR, by contrast, is that the tech will be a transient and beneficial addition to reality rather than taking over your world completely. Figuring out how that works means first understanding how we can interact with AR information at a basic level, like doing the same kinds of simple, information-driven tasks that you do hundreds of times per day on your smartphone. Leap Motion, a maker of hand-tracking software and hardware, has been experimenting with exactly that, and is teasing some very interesting results.

Smartphones are essential to our everyday lives, but the valuable information inside of them is constrained by small screens, unable to interact directly with us or with the world around us. There’s widespread belief in the immersive computing sector that AR’s capacity to co-locate digital information with the physical world makes it the next big step for the smartphone.

Leap Motion has shown lots of cool stuff that can be done with their hand-tracking technology, but most of it is seen through the lens of VR. The company’s VP of Design, Keiichi Matsuda, however, has recently begun teasing prototypes for how the tech can be applied to AR, and the results are nothing short of a glimpse of what our smartphones will eventually become. Matsuda calls this prototype the ‘virtual wearable’:

[Embedded tweet: video of the ‘virtual wearable’ prototype]

The video is shot through an unidentified AR headset which is using Leap Motion’s camera-based hand-tracking module to understand the position of the user’s hands and fingers. Matsuda has envisioned some interesting affordances which uniquely work with the limitations of Leap Motion: the ‘flick tab’ menus are a smart stand-in for what most of us would think to represent as simple buttons; the visual lack of resistance helps reduce the expectation of tactile feedback. The footage also shows impressive occlusion, where the system understands the shape of the user’s hands and appropriately renders ‘clipping’ to make the AR menu feel like it really exists in the same plane as the user’s hands.
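The article doesn’t say how the occlusion is actually implemented, but the basic idea behind such ‘clipping’ can be illustrated with a per-pixel depth test: wherever the tracked hand is closer to the camera than the virtual menu, the menu’s pixels are hidden. The following is a toy sketch of that idea only; the function name, array layout, and depth values are all invented for the example and are not from Leap Motion’s system:

```python
import numpy as np

def composite_with_hand_occlusion(ui_rgb, ui_depth, hand_depth):
    """Per-pixel depth test: hide virtual-UI pixels wherever the
    tracked hand is closer to the camera than the UI surface."""
    # hand_depth uses np.inf where no hand pixel was detected
    occluded = hand_depth < ui_depth          # hand is in front of the UI
    out = ui_rgb.copy()
    out[occluded] = 0                         # 'clip' the UI behind the hand
    return out, occluded

# Toy 2x2 frame: the hand covers only the top-left pixel
ui_rgb     = np.full((2, 2, 3), 255, dtype=np.uint8)
ui_depth   = np.full((2, 2), 0.5)             # UI plane 0.5 m from the camera
hand_depth = np.array([[0.3, np.inf],
                       [np.inf, np.inf]])     # hand at 0.3 m, top-left only
out, mask = composite_with_hand_occlusion(ui_rgb, ui_depth, hand_depth)
print(mask.sum())  # 1 pixel occluded
```

A real renderer would do this in the depth buffer on the GPU rather than per-pixel on the CPU, but the test is the same: hand depth versus UI depth.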


SEE ALSO
Exclusive: Designing ‘Lone Echo’ & ‘Echo Arena’s’ Virtual Touchscreen Interfaces

The design also draws on existing, well-established touchscreen interface affordances, like a line indicating the grab point of a sliding ‘drawer’ menu; it’s easy to see how this approach could effectively convey, and let users act on, the same sort of basic notification-style information that we frequently deal with on our smartphones.
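The footage doesn’t reveal the drawer’s exact behavior beyond the grab line, but this kind of affordance typically follows the finger while dragging and then snaps open or shut on release, just as it does on touchscreens. Purely as an illustration, a toy version of that release rule might look like the following; every name and threshold here is hypothetical:

```python
def drawer_on_release(pull_mm, snap_threshold_mm=40.0, travel_mm=80.0):
    """Hypothetical sliding-drawer rule: while dragging, the menu would
    track the finger; on release it snaps fully open if pulled past the
    threshold, and otherwise snaps shut.
    Returns 1.0 (fully open) or 0.0 (fully shut)."""
    pull = max(0.0, min(pull_mm, travel_mm))  # clamp to the drawer's travel
    return 1.0 if pull >= snap_threshold_mm else 0.0

print(drawer_on_release(55.0))  # long pull -> 1.0 (snaps open)
print(drawer_on_release(10.0))  # short pull -> 0.0 (snaps shut)
```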

Another video from Matsuda shows what the underlying hand-model, as tracked by Leap Motion, looks like to the system behind the scenes:

[Embedded tweet: video of the tracked hand model]

Leap Motion shared a sketch showing an expanded vision of the ‘virtual wearable’ concept:

Matsuda found his way to Leap Motion following the creation of two excellent short films which envision a future where AR is completely intertwined with our day-to-day lives: Augmented (hyper)Reality and its follow-up, HYPER-REALITY (both are well worth a watch). Now, as Leap Motion’s VP of Design, he’s turning his ideas into (augmented) reality.


Leap Motion designers Barrett Fox and Martin Schubert have also recently published a series of guest articles on Road to VR which are worth checking out.

The post Leap Motion ‘Virtual Wearable’ Prototype is a Potent Glimpse at the Future of Your Smartphone appeared first on Road to VR.
