Project Crane

MFA Thesis

Interaction Design
What’s the best input device for future XR devices? It turns out that, in the smartphone era, it is your hand. What if your hand is still the best controller in this alternate reality? How should interfaces and interactions evolve to accommodate our hands?

The Bad Controllers


Let’s be honest: people get confused when they put on a headset. Ironically, many current VR/AR interfaces actually reuse a lot of traditional GUI elements. We have been using them for years, yet we still get confused when we see them again in a headset, paired with big, bulky controllers.

Although using 2D/2.5D interfaces in XR is not the end of the world, it is not the best practice for the “end game”. Using a pointer to aim at a virtual screen several meters away from you can be exhausting. To design for a volumetric medium, we have to look at existing volumetric interaction patterns.

Home UI on the Lenovo Mirage Solo (2018), running Google Daydream

The Good Controllers


As a VR headset evangelist, I have seen this behavior in so many first-time users: when they see virtual 6DoF objects in VR, all of them, without exception, try to touch things with their hands and forget about the controllers they are holding. This is, in fact, an attempt to perform volumetric interaction. However, due to current technical limits, the virtual world usually doesn’t respond well to such input. Instead, we are immediately handed controller tutorials and forced to learn how to use controllers that mimic our hands.

What if, during the onboarding of a headset, the world could respond to your instinctive grab? What if that even became the main interaction pattern of the whole system? Wouldn’t it be amazing?
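
As a thought experiment, here is a minimal sketch of what “responding to your grab” could look like with the WebXR Hand Input API: detect a pinch by measuring the distance between the thumb and index fingertips. The joint names and getJointPose() come from the actual spec; the 2 cm threshold and the function itself are my own illustration, not any shipping system’s code.

```typescript
// Minimal pinch ("grab") detection using the WebXR Hand Input API.
// Assumption: a pinch threshold of ~2 cm between thumb and index tips.
const PINCH_THRESHOLD_M = 0.02;

function isPinching(
  hand: XRHand,
  frame: XRFrame,
  refSpace: XRReferenceSpace
): boolean {
  const thumb = hand.get('thumb-tip');
  const index = hand.get('index-finger-tip');
  if (!thumb || !index) return false;

  // getJointPose() belongs to the WebXR Hand Input module; it may be
  // unavailable on devices without hand tracking, hence the optional call.
  const thumbPose = frame.getJointPose?.(thumb, refSpace);
  const indexPose = frame.getJointPose?.(index, refSpace);
  if (!thumbPose || !indexPose) return false;

  // Distance between the two fingertip positions, in meters.
  const a = thumbPose.transform.position;
  const b = indexPose.transform.position;
  const dist = Math.hypot(a.x - b.x, a.y - b.y, a.z - b.z);
  return dist < PINCH_THRESHOLD_M;
}
```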

It has been proven that, for mobile devices, our hands are the best input devices in most cases. What if that is also true for XR?
Just a random image here, sitting next to my boring reasoning. He looks so happy and cool tho! Unfortunately, the Gear VR will disappoint him very soon...


Learning from Everyday Objects


So what will these fancy, mysterious volumetric interactions look like? It turns out we are already doing them every day: opening a drawer, unfolding a piece of paper, browsing through the vinyl collection in a record store.

These interactions can be found in classic industrial design objects. All of those handles, bars, and stacks are “inviting” us to use them. Our bodies react naturally to their invitation without going through any user manual.

They share three similarities (a rough code sketch follows the list):
1. They usually have some sort of “handle”, so they look juicy and you want to grab them.
2. They usually don’t require complicated movements; most of the time it is just a one-way pull or push.
3. They give good haptic feedback, so you know you are actually interacting with them.
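
To make the second point concrete, here is a minimal sketch of constraining a grabbed virtual handle to a single pull axis, the way a physical drawer does. The Vec3 and DrawerHandle types are illustrative stand-ins, not any engine’s API:

```typescript
// A stand-in vector type for illustration; not a real engine's math library.
type Vec3 = { x: number; y: number; z: number };

const dot = (a: Vec3, b: Vec3): number => a.x * b.x + a.y * b.y + a.z * b.z;

interface DrawerHandle {
  restPosition: Vec3; // where the handle sits when the drawer is closed
  pullAxis: Vec3;     // unit vector: the single direction it may move
  maxTravel: number;  // how far the drawer can open, in meters
  offset: number;     // current travel along pullAxis
}

// Project the grabbing hand's position onto the one allowed axis and clamp:
// however the hand wanders in 3D, the drawer only slides its one way.
function updateDrawer(handle: DrawerHandle, handPos: Vec3): void {
  const delta = {
    x: handPos.x - handle.restPosition.x,
    y: handPos.y - handle.restPosition.y,
    z: handPos.z - handle.restPosition.z,
  };
  const travel = dot(delta, handle.pullAxis);
  handle.offset = Math.min(Math.max(travel, 0), handle.maxTravel);
}
```

This is exactly why a drawer never needs a manual: the constraint does the explaining.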

Crane


Crane is the design language I created to support these volumetric interactions. Wanna hear about the process and the reasoning behind it? Let’s grab a coffee and have a chat.




So... What’s Next?


The philosophy of Crane was laid out in late 2018, and as a prophecy it has been pretty accurate so far: bare-hand tracking is becoming a standard feature of XR devices.

Around the same time, Microsoft released the HoloLens 2, which supports hand tracking. Check out the part where Julia explains how the button works (6:07).
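
For flavor, here is a toy sketch of how such a directly pressable volumetric button could behave: the fingertip pushes the button face in along its press axis, and it fires once pushed past a threshold. This is my own assumption of the mechanics, not code from Microsoft:

```typescript
interface PressableButton {
  pressDepth: number; // depth at which the button fires, in meters
  depth: number;      // how far the face is currently pushed in
  pressed: boolean;
}

// fingertipDepth: how far the fingertip has pushed past the button's resting
// face, measured along the press axis (0 = just touching the surface).
function updateButton(btn: PressableButton, fingertipDepth: number): void {
  btn.depth = Math.min(Math.max(fingertipDepth, 0), btn.pressDepth);
  const nowPressed = btn.depth >= btn.pressDepth;
  if (nowPressed && !btn.pressed) {
    onPress(); // fire once, on the frame the threshold is crossed
  }
  btn.pressed = nowPressed;
}

// Illustrative press handler; a real system would drive UI logic and feedback.
function onPress(): void {
  console.log('button pressed');
}
```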



A year later, at Oculus Connect 6 (I was there too!), Facebook announced that they were bringing hand tracking to the Oculus Quest using its onboard cameras, although it still relies on indirect manipulation (a hand-generated cursor floating on the UI).
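
That indirect pattern is easy to picture in code: cast a ray from the hand and place a cursor where it hits the flat UI panel. A minimal sketch, with illustrative vector helpers rather than Oculus’s actual API:

```typescript
type Vec3 = { x: number; y: number; z: number };

const dot = (a: Vec3, b: Vec3): number => a.x * b.x + a.y * b.y + a.z * b.z;
const sub = (a: Vec3, b: Vec3): Vec3 => ({ x: a.x - b.x, y: a.y - b.y, z: a.z - b.z });
const add = (a: Vec3, b: Vec3): Vec3 => ({ x: a.x + b.x, y: a.y + b.y, z: a.z + b.z });
const scale = (v: Vec3, s: number): Vec3 => ({ x: v.x * s, y: v.y * s, z: v.z * s });

// Intersect a ray cast from the hand with a flat panel (a point on the panel
// plus its normal). Returns the cursor position, or null when the ray is
// parallel to the panel or the panel is behind the hand.
function cursorOnPanel(
  rayOrigin: Vec3, rayDir: Vec3,
  panelPoint: Vec3, panelNormal: Vec3
): Vec3 | null {
  const denom = dot(rayDir, panelNormal);
  if (Math.abs(denom) < 1e-6) return null;

  const t = dot(sub(panelPoint, rayOrigin), panelNormal) / denom;
  if (t < 0) return null;

  return add(rayOrigin, scale(rayDir, t));
}
```

The cursor floats on the UI even though your hand is nowhere near it, which is precisely what makes the interaction feel indirect.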



