Apologies to those who have been following this series and wondering what happened to Part 4. Traditional Christmas chaos kicked in at school and then I travelled home for the holidays for the first time in eight years so the amount of time I could spend on writing was limited. We're back now though and ready to see this thing through to a decisive conclusion.
For those who are just diving into the series, here are the links to the previous rounds:

Round 1: Ease of Implementation - here
Round 2: Range of Content - here
Round 3: Financial Implications - here

Ok so let’s ring the bell for Round 4!
ROUND 4: Student Interaction
We’re three rounds in and AR has taken the lead after a decisive victory in Round 3, where the focus was cost. For this round I’m going to switch gears and examine how students interact and engage with each platform. Much has been written over the last five years about the need for students to be active in their learning. The possibilities availed by mobile technology in the classroom meant that students finally had a means to easily create rather than just consume digital content. This has been linked to theoretical models like Bloom's Taxonomy, Puentedura's SAMR model and Dale’s Cone of Experience. So how do AR and VR fit in the scheme of things? In this round we'll look at the types of experiences available and how students can interact with them to both consume and create content.
Let’s look at VR first this time.
Virtual Reality

As detailed in my Depths of VR model, the majority of lower-level VR experiences are generally quite passive in the way students interact with them. Whether looking around panoramic images or viewing 360 video content, the simple fact is that students are still consuming content. Now whilst theorists might suppose that this is shallow learning, there is something to consider in the actual nature of this content consumption. As I explained in a recent interview with Caitlin Krause, VR content consumption is inherently less passive than other forms of consumption. Users have a freedom that is not availed to them via traditional media in that they can choose exactly how they engage with the content by altering their focus. If they are watching a traditional video, their POV is determined by the director. In VR they can choose to focus on a specific element, like a certain character or object within the scene. This leads to a more personal experience with the content. I saw this in action recently during our Rainforest project using the brilliant Under The Canopy film. Some students saw things that others didn’t and it affected their experience.
More interactive experiences will also offer students choices in how they navigate through a virtual world. This could mean simple point-and-click navigation as with the ViewMaster apps or full movement within a room-scale VR experience on a higher-end headset. Either way, the students are again being given more freedom to explore. An analogy I drew recently during a presentation of my “Why VR?” article is that of the school field trip. All teachers (especially of younger students) have at some point experienced the stress that is an inherent part of a field trip, since you have a duty of care for students and often find yourself in a large, unfamiliar space, surrounded by strangers. You spend half the time taking headcounts! You’re also forced to keep students together as a group to ensure that they are safe. In a virtual museum, or an experience like the awesome Describing Egypt (which lets you explore Egyptian tombs for free via the web), students can explore autonomously, with the teacher taking a facilitator role.
Less passive means more meaningful, and the thread that runs through my Depths of VR model is definitely autonomy. As we head towards the more advanced forms of VR afforded by platforms like the Vive and Oculus, the way students interact with learning content is transformed radically. Students can use apps like Organon to pull a human body apart and put it back together, or fix the ISS using Home: A VR Spacewalk. They can explore the depths of the oceans, interacting with wildlife in Operation Apex, or build three-dimensional mindmaps in Noda. These are truly transformational learning experiences - indeed, it is here that the term experiential learning really comes into focus. Just as a pilot would use a simulation to learn to fly a plane, students can develop skills by engaging with tasks that may not be readily possible in reality.
There are also a range of VR apps that allow students to create content. In some cases these are experiences where the creation is external – the user is creating a VR world which they can then navigate and interact with. CoSpaces is an excellent example of this but there are more advanced platforms like Sansar which allow older students access to more advanced 3D design tools to design detailed, immersive worlds. Then there’s the organic link between VR and 360 photography and videography. Students with access to a 360 camera can use platforms like Thinglink’s Teleport 360, InstaVR and many more to stitch media together and embellish with interactivity to create virtual tours.
Beyond these we still have the platforms that actually allow users to create content within VR. Artists can break the laws of physics using Tilt Brush or Masterpiece VR to create works of art that are impossible in reality. Similarly, DT departments can harness platforms like Gravity Sketch to experience the 3D design process in a totally new way, before exporting work to be produced in the real-world. Google’s Poly recently opened up even more doors for students in this regard by providing a platform for virtual creators to share and remix each other’s work. Here's a clip from a recent project with a Year 12 student who used Tilt Brush to design and visualise an upcoming exhibition:
Overall, virtual reality offers a diverse and rich range of ways for students to interact with learning content. They can both consume and create media like never before and each experience is uniquely personalised by the sheer fact that they control the way that they perceive the virtual world.
Augmented Reality

The current state of AR means that most students interact with content via a mobile device. Whether using “old-school” AR with triggers or the newer ARKit-enabled, trigger-less applications, experiences are typically ones of content consumption. As with VR though, this is a new form of content consumption that is innately elevated above traditional media, since students can control how they view and interact with content. Better apps will provide additional information to students as they view AR too, with my old favourite ZooKazam being a prime example of this.
I actually have a separate article in the works called More than Models which will explore the nature of educational AR experiences and the need to ensure that they provide rich learning experiences. Suffice it to say that some educational AR apps simply produce on-screen models. Whilst these can still be viewed from any angle, and some may even be animated, the depth of learning afforded by them is somewhat limited unless very carefully framed by a well-considered lesson structure.
Then we have the type of AR experiences that are destined to permeate everyday life more readily – those that overlay digital content on top of the real-world view to enrich the viewer’s understanding. Peaks AR allows students to view statistics about mountains in situ, the Virtuali-Tee opens up the wearer’s chest to reveal the organs beneath, whilst platforms like NeoBear turn traditional globes into interactive banks of information. This type of digital experience is already becoming more and more embedded in all aspects of society, from apps that provide directions to those that provide reviews of the restaurants and services available around you. For students it represents an excellent way to engage with physical learning content in their classrooms, unlike anything previous generations have had available to them.
Moving from the consumption of content to the creation of it is a tricky beast when looking at AR. External creation is definitely feasible, with an increasing number of platforms offering 3D models that can be “augmentified” and made to appear in the real world. Does this really offer students a richer learning experience than if they just viewed and manipulated the same model on the screen of a tablet though? A good example to consider here is Kubity (which I wrote about late last year) as it allows students to pull their SketchUp models into either an AR or VR experience. The AR experience means that their 3D model can sit on a tabletop in their classroom. The VR experience, however, allows them to actually move around inside the model, getting a sense of the scale and design as if it were real. It’s clear that in this instance VR offers a far more powerful way for them to interact with their content.
This isn’t to say that there aren’t some excellent AR-focused creation tools out there. HP Reveal (formerly Aurasma), Blippar and Zappar all offer educators and students the ability to augment their own physical content, for example. This can mean bringing images of student work to life around the classroom or creating AR-enriched tours of a school. Metaverse is another good example of a newer platform for building AR experiences and offers a very student-friendly creation studio that allows students to link AR content dynamically.
Actually creating from within an AR experience is definitely more limited though. This is in part due to the physical limitations of the medium right now – you need to be holding your device for the most part, which limits your ability to interact beyond simple taps. World Brush is an interesting new ARKit app that essentially offers a Tilt Brush-like experience but in AR. Whilst fun to use though, the depth of the app is nowhere near that of Google’s VR counterpart, and the need to interact via the screen leaves the user one step removed from their creation.
In conclusion, well-designed educational AR can definitely offer students valuable opportunities to consume and create learning content. The scope of these experiences does not yet match what the VR format offers though.
For the sheer range and depth of experiences availed to students within virtual worlds, I think this one has to go to VR. Long-term, we may see a shift - especially once AR moves more fully towards wearable tech - but for now, VR definitely allows students to interact and engage with learning content in more meaningful ways overall. It's a more visceral experience and resonates on a more personal level for each learner.
So that's things tied up at two rounds each going into the final round. Join me next week for the finale of the series.
In the meantime, please do share with colleagues and on social media and feel free to shoot me any feedback, questions or comments.