It’s been a couple of months now since the experimental hand tracking feature was added to the Oculus Quest. Whilst there has not been much in the way of integration in mainstream apps, there have been some striking demo experiences available via SideQuest.
If you are new to SideQuest, don’t be scared – it’s pretty straightforward. SideQuest is a free platform that allows you to load content (including apps) onto your Quest without going through the official Oculus Store. This might seem shady but in truth it’s more about giving smaller developers the opportunity to share builds, betas and projects with a wider audience. It’s also being used to get Quest-ready projects in the hands of users whilst developers wait through the glacial process of being accepted to the official store. Engage is one key example here as SideQuest is the only way to install it right now. In fact the Engage site has a great step-by-step guide to installing SideQuest (and Engage obviously) so click here to access that and get SideQuest set up for yourself.
Once you get into SideQuest, you’ll find that there is heaps of content you can install quickly and easily. It’s tagged into categories for searchability and you will find a Hand Tracking category full of some very unique, unusual and often quite weird experiences. Here’s a little montage of some of the recent Hand Tracking experiences I have tested out…
Once you get over the nightmarish hand-fingers and goop ball visuals, I want to bring things back to an educational mindset and think a little about how hand tracking will benefit students and potentially elevate educational VR experiences.
The integration of hand tracking could really be beneficial to VR in education for a number of reasons. I remember when Suzanne Lee was a guest on #CPDinVR Live from Dubai recently and spoke about her work using VR with patients with dementia. One thing Suzanne highlighted was how useful hand tracking was going to be for her work, as the patients she worked with would generally forget the various buttons and control mechanics they needed to use from one day to the next. The same logic can be applied to using VR with students, since hand tracking would remove the friction of the user interface (if it’s done well) and make it easier for students to engage with VR experiences more independently.
The integration of hand tracking also brings us one step closer to the goal of haptic VR experiences wherein virtual objects can be touched and interacted with as if they were real. This will lead to more kinaesthetic learning experiences for students as they interact within the virtual world. Artists will be able to pick up a brush and paint. Scientists will be able to wield tools and instruments. Historians will be able to hold and manipulate artefacts. The possibilities are vast, and the increased sense of presence should deepen the learning that happens alongside them. Just imagine if hand tracking were integrated into VR apps like Hold the World - letting you hold the fossils directly, or HoloLab Champions - measuring the chemicals with even greater control and precision!
The addition of hand tracking also elevates the user’s ability to emote more naturally using gestures. This will have implications within multi-user educational experiences, as students will now be able to do more than simply raise an arm or point it vaguely in the direction they wish others to focus on. It opens up a world of smaller details, where learners can highlight minute elements within an experience for closer inspection, evaluation, analysis and more.
Hand tracking may seem like a small step but it is definitely a step in the right direction, one that I think will herald many new educational experiences within virtual spaces. I’m excited to see what developers come up with and what comes next.