Connecting on virtual reality through the UCISA bursary

David Vince
Senior Product Development Manager, Learning and Teaching Innovation
The Open University

Six months on from Realities 360 2018, San Jose

This year the UCISA bursary scheme enabled me to attend Realities 360. The conference, only in its second year, attracted an international audience of hundreds of colleagues working in education and interested in augmented, virtual and mixed reality. Having searched, without success, for an event closer to the UK, I simply wouldn’t have been able to attend without UCISA’s support.
At the time of the conference, I was in the early stages of a project exploring the affordances of VR in education. The Open University is a distance university: our students are geographically dispersed and study asynchronously. This poses some unique challenges for us, particularly when introducing new or emergent technologies like VR. At that point, we had undertaken two small-scale VR pilots to refine our VR production process. One of these pilots was a presentation practice tool for law students, which gave them the opportunity to present virtually and take questions from a virtual audience. Students could practise applying the law to near real-world problems and receive analytics data to aid their reflection before undertaking the task in person.
I soon learned that VR’s uniquely immersive properties demand a new approach to design and construction compared with established media. I’ve used my experience of the conference to learn from others’ practice and refine our production processes. I’m now leading the project at a phase where we need to transition from a discrete R&D project to operating at greater scale.
One of the key takeaways from Realities 360 was seeing how others were approaching the design and creation of VR experiences. It helped me to better understand immersion as a new form of narrative. To go beyond the affordances of established media, there is a need to consider how users might interact with virtual objects (i.e. the interface being used – not forgetting voice, gesture and haptic interfaces), as well as how those objects behave.
Sharing my conference experience with my immediate team has led us to consider how we can enhance the design of our VR experiences. There’s a gap in the evaluation of VR in education, and we’re exploring how analytics might infer when students are becoming more proficient at tasks, reducing the need to interrupt their experience with text-based questions; a rough sketch of that kind of inference follows below.
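Purely as an illustration (this is not our implementation), proficiency on a repeated VR task might be inferred from behavioural signals as simple as falling completion times and error counts across attempts. The data structure, field names and window size below are assumptions made for the sketch.

    # Illustrative sketch only: infer improving proficiency on one VR task
    # from simple per-attempt analytics (completion time and error count).
    # The Attempt fields and the window size are assumptions for the example.
    from dataclasses import dataclass


    @dataclass
    class Attempt:
        duration_s: float  # time taken to complete the task, in seconds
        errors: int        # e.g. wrong selections or hints needed


    def looks_more_proficient(attempts: list[Attempt], window: int = 3) -> bool:
        """Crude heuristic: the student's recent attempts are, on average,
        faster and no more error-prone than their earliest attempts."""
        if len(attempts) < 2 * window:
            return False  # not enough evidence yet
        first, last = attempts[:window], attempts[-window:]

        def avg(values):
            return sum(values) / len(values)

        return (avg([a.duration_s for a in last]) < avg([a.duration_s for a in first])
                and avg([a.errors for a in last]) <= avg([a.errors for a in first]))


    # Example: six attempts at the same task, getting quicker and cleaner
    history = [Attempt(310, 4), Attempt(290, 3), Attempt(250, 3),
               Attempt(200, 1), Attempt(185, 1), Attempt(170, 0)]
    print(looks_more_proficient(history))  # True

In practice we would want richer signals (gaze, interaction traces) and a proper evaluation, but even a heuristic like this can suggest when an interrupting question is no longer needed.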
What I learned at the conference has been shared internally at our annual university-wide Learn About Fair. This has enabled us to connect with faculty staff who see the potential for using VR in their disciplines. It’s also helped us to attract support from a senior stakeholder!
Last week, my team presented at the ‘Immersive Environments’ event organised by UCISA’s Digital Education Group. This gave us the opportunity to share one output of the project: a VR suitability toolkit intended to support the design and creation of pedagogically viable VR.
Undoubtedly, the biggest benefit of the bursary has been the opportunity to connect with, and learn from, colleagues both nationally and internationally. This has given us a forum to share our experience, develop a support network, and learn how others are solving some of the technical challenges and issues of scale associated with producing VR.
Interested in finding out more about a UCISA bursary? Then visit UCISA Bursary Scheme.

What’s the reality with Virtual Reality?

David Vince
Senior Product Development Manager, Learning and Teaching Innovation
The Open University

Realities 360

As a senior product development manager in the Learning Innovation team at the Open University, my role is to work with colleagues to enhance teaching and learning through developing new products (i.e. tools and platforms) and supporting processes.
Earlier this year, I received a UCISA bursary enabling me to attend Realities 360. It bills itself as a hands-on event for early adopters and learning technologists to investigate Augmented Reality (AR), Virtual Reality (VR) and other simulations for learning first-hand – technologies which fall under the umbrella term Extended Reality (XR).

Here are my reflections from Realities 360:
  1. What’s the problem VR can solve?
VR technology is still emergent. So, how do we use this new technology to do something that existing tools, tech and media don’t already enable, without risk of being accused of delivering ‘technology-driven’ (as opposed to ‘pedagogy-driven’) solutions? My personal take is that neither extreme is desirable on its own; in fact, the two need to be mutually supportive, which leads nicely on to the following…
  2. Human-centred design
Find your problem. Opt for a user-centric approach. IDEO have a design kit to get you started on developing empathy with users and gaining better insights into their needs and context. If your product has value to your users, they’re more likely to adopt it.
  3. Start small, pilot, evaluate and (re)iterate
It’s easy to be critical of emergent technologies. Best practice hasn’t yet emerged, so we’re all learning: start small, learn and then (re)iterate.
  4. ‘Just because you can, doesn’t mean you should’
This is something that has been said within our own team, but it’s also something Linas Mockus and Joseph Scott, Instructional Designers at Penn State World Campus (Penn State’s online campus), pointed out twice in their presentation entitled ‘Is online education ready for VR and 360 video?’. Linas and Joseph plan to make their research findings public. In the meantime, you might want to take a look at the news pages of Penn State’s website.
Higher education has been slow to adopt VR but there seemed to be plenty of like-minded colleagues from higher education in this session. At present, AR/VR simulation conferences seem to have a bias towards the training sector but there’s an obvious need for mechanisms for educators to share practice and learn from each other.
  5. xAPI might be your new best friend
VR experiences generate a lot of data because they’re computer mediated. Some of this is structured data, such as responses to in-experience questions; however, there’s also unstructured data, such as what users are looking at, or the meaning of their responses (e.g. via sentiment analysis). The ‘x’ in xAPI is short for “experience”, and xAPI gives a deeper level of behavioural insight by taking things that aren’t structured and giving them structure – recording who did something (the actor), what was done (the verb), what it was done to (the object) and a host of contextual data. A minimal sketch of such a statement follows this list.
xAPI is well worth considering to get better insight into what your learners are doing and to gauge whether learning has taken place, by designing in the activities and tasks that you set out to monitor. This will improve the experience and reduce reliance on the in-experience questions I’ve seen so many of over the past few days.
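To make the actor–verb–object idea concrete, here is a minimal sketch (not production code from our project) of an xAPI statement describing a completed VR task, posted to a Learning Record Store with Python’s requests library. The LRS URL, credentials and activity IDs are purely illustrative placeholders.

    # Minimal sketch of an xAPI statement for a VR task, posted to a
    # Learning Record Store (LRS). The LRS URL, credentials and activity
    # IDs below are placeholders, not real endpoints.
    import requests

    LRS_STATEMENTS_URL = "https://lrs.example.ac.uk/xAPI/statements"  # hypothetical LRS
    LRS_AUTH = ("lrs_key", "lrs_secret")  # placeholder Basic auth credentials

    statement = {
        # Who did it (the actor)
        "actor": {
            "objectType": "Agent",
            "name": "A. Student",
            "mbox": "mailto:a.student@example.ac.uk",
        },
        # What was done (the verb, identified by an IRI)
        "verb": {
            "id": "http://adlnet.gov/expapi/verbs/completed",
            "display": {"en-GB": "completed"},
        },
        # What it was done to (the object/activity)
        "object": {
            "id": "https://vle.example.ac.uk/xapi/activities/vr-presentation-practice",
            "definition": {"name": {"en-GB": "VR presentation practice task"}},
        },
        # Result and context carry the extra behavioural and situational detail
        "result": {"success": True, "duration": "PT4M30S"},
        "context": {"platform": "VR headset"},
    }

    # xAPI requires the version header; the statement is sent as JSON
    response = requests.post(
        LRS_STATEMENTS_URL,
        json=statement,
        auth=LRS_AUTH,
        headers={"X-Experience-API-Version": "1.0.3"},
    )
    response.raise_for_status()
    print("Stored statement id(s):", response.json())

A real deployment would use an agreed vocabulary of verbs and activity IDs, and would typically batch statements, but the actor–verb–object shape above is the core of every xAPI statement.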
Thanks UCISA for the bursary enabling me to attend Realities 360. During my time here, I’ve met colleagues travelling from as far away as South Africa who, like me, haven’t found conferences closer to home that fit the bill.
This post first appeared on The Open University’s Learning Innovation blog.
Interested in applying for a UCISA bursary? Then visit UCISA Bursary Scheme.