Yesterday I went hands-on with the IBM Watson Sandbox app on the HTC Vive. Harnessing Watson, IBM's platform combining AI and analytical software, this tech demo offers a unique glimpse at what could well become the future of user interfaces: the human voice. The app allows you to create in VR using just speech. Say “dog” and one appears. Want a gorilla, a burger, a table or a dragon? All of these and more can be spawned simply by speaking their names. In fact, the app allows users to create around 200 objects using over 450 recognised words. These objects can even be customised by adding size and colour (e.g. a small blue dragon).
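To give a rough idea of what a voice-driven creation UI has to do, here is a minimal sketch of mapping a transcribed phrase like “small blue dragon” to an object plus attributes. The vocabularies, scale values and function names below are my own illustrative assumptions, not Watson's actual grammar or API:

```python
# Hypothetical sketch: turn a transcribed phrase into a spawnable object.
# Word lists and scale factors are invented for illustration only.

SIZES = {"small": 0.5, "tiny": 0.25, "big": 2.0, "large": 2.0, "giant": 4.0}
COLOURS = {"red", "blue", "green", "yellow", "purple"}
OBJECTS = {"dog", "gorilla", "burger", "table", "dragon", "cube"}

def parse_command(phrase):
    """Pick out an object noun plus optional size/colour adjectives."""
    scale, colour, obj = 1.0, None, None
    for word in phrase.lower().split():
        if word in SIZES:
            scale = SIZES[word]      # adjective sets the spawn scale
        elif word in COLOURS:
            colour = word            # adjective sets the tint
        elif word in OBJECTS:
            obj = word               # noun chooses what to spawn
    return {"object": obj, "scale": scale, "colour": colour}

print(parse_command("small blue dragon"))
# {'object': 'dragon', 'scale': 0.5, 'colour': 'blue'}
```

The real app presumably does something far richer (speech recognition, synonym handling, 3D asset lookup), but the core idea of treating adjectives as attribute modifiers on a spawned noun is the same.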
Take a look:
You can also find out more about the project here.
It’s pretty amazing and something I recommend trying for yourself. It’s definitely a glimpse of the future in terms of AI empowering VR users to create at all levels – whether the technology is harnessed as part of app development or as the core user interface. Here’s a little clip from my own test-run with the app:
I think that using the human voice as a UI really could be a game-changer in schools. Students who may be too young to read or interact with a complex on-screen UI could begin to access digital content in VR that was previously inaccessible to them. Students could verbalise stories and watch them come to life, or suggest a solution to a problem and quickly ascertain whether it is viable. With less time spent explaining controls, the technology could take its rightful place in the backseat behind pedagogy.
Though the Sandbox is technically just a demo, there’s definitely potential to bring it into a classroom setting for some engaging experiences, trials and exploration. Here are a few simple suggestions for how you could use the app with students:
Challenge them to explore the app and see how many different objects they can create.
Have students experiment with the use of adjectives to customise their creations. For example, which synonyms for “small” produce the same effect?
Set the task of creating as many animals as possible, then ordering them by size or grouping them by species.
Let students create a scene using objects of their choosing and then screenshot it. They could then produce some writing that either describes the scene or the process involved with its creation.
Create a set of coloured cubes and use them to model a mathematical concept like fractions or ratio.
If you want to see just how Watson can be harnessed within a full VR experience, check out the Star Trek: Bridge Crew app on the Vive. It uses the same technology to let you issue voice commands to your crew. Take a look at this clip where the team behind the project explain the role that Watson will play in the next generation of VR: