Built a VR museum using Unity and the Google VR SDK that lets users explore cutting-edge VR in the education industry. It is a VR experience built completely from scratch, featuring VR locomotion, interactivity, and appropriately scaled objects.
A mobile VR experience
This is a VR project I made as part of the Udacity VR Developer Nanodegree. I was asked to research a topic for the project, which could include one or more VR companies, one or more VR applications (how VR is used), or one or more industries with the potential to be impacted by VR.
I decided to focus on the education industry and developed Night at the Museum, a Unity app that uses the Google VR SDK to create an immersive experience in which you enter a virtual reality museum with ‘information booths’ that provide both visual and audio feedback.
STATEMENT OF PURPOSE
Night at the Museum is a mobile VR application for new VR users which showcases the impact of VR in the education industry in a fun and creative way through a series of five display points.
It is important to understand the primary end user before we start building an application. The end users for ‘Night at the Museum’ are people who are new to VR but have experience interacting with 3D games and videos, or people who are already interested in learning about the education industry in VR. They are likely to be in the age range of mid-twenties to mid-forties and to own a next-gen smartphone.
This is an early sketch of the VR experience, showing the basic elements of the interaction. The idea is that a user stands inside the museum and can move between five information booths that provide both visual and audio feedback within the space. The user travels by clicking on waypoints and can play/pause videos with the controller.
I arranged the 5 information booths and their content as follows:
– Google Maps VR, island.
– AMD VR in Education, futuristic classroom.
– Engage VR Platform, robot laboratory.
– Apollo 11 Experience, space booth.
– Google Expeditions, city.
For this app, I created most of the 3D models in Google Blocks. With Blocks, you can create real, volumetric objects in VR and then export them for Unity. I also used some models from poly.google.com, imported them directly into Blocks, and built my individual scenes directly in VR using an Oculus headset.
When designing VR content it’s critical to go into the VR experience early and often. Scale, movement, stereoscopic rendering, spatial audio, and color need to be tested on the VR device as a flat screen can’t convey all the information you need to test and experience.
USER TEST 1: THE INTERFACE
For my first user test, I focused on the goals of scale, lighting, distance, and comfort, which were represented in the following questions:
Does the scale feel appropriate?
Is the experience comfortable?
Is the mood well established?
The results of the test were:
People described the scale as appropriate and comfortable.
The mood was exciting and inviting, but the lighting and textures of the models looked too bright and pixelated.
Based on these results, I went back to Blocks and changed the scale of the objects, reimported them into Unity, and updated the UV settings and object scale until user tests showed that the lighting issues were resolved.
USER TEST 2: MOTION MECHANICS AND INSTRUCTIONS
For this test, I focused on navigation and instructions. Because people are actually moving around in a physical space, ergonomics and physical comfort are real factors. As the app uses waypoints as its movement mechanic, it was very important to test whether all the points were visible and accessible to the user. I also asked users about text distances and legibility to ensure correct treatment, scale, and distance of the signs.
The following questions were used in this test:
How do you feel about the start screen?
How do you feel about the instructions?
How do you feel about the speed of the movement?
Would you describe yourself as feeling sick in any way?
How do you feel about the waypoint placement?
Did you notice anything disorienting from the movement?
The findings of this test were: the text was easy to read, and people felt the speed was appropriate and didn’t feel sick or disoriented, but some waypoints were not always accessible or were hard to click.
Based on the user tests, I realized I was running into a known issue tied to the raycaster range. The ray cast from the camera was too short and didn’t hit the collider applied to the waypoint, which prevented the reticle from reacting. I fixed the bug in the GvrReticlePointer.cs script and tested again until all the waypoints were accessible to users. I also decided to add more waypoints to give the user a bit more freedom to explore.
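The fix itself is small: the GVR SDK caps the reticle’s raycast distance inside GvrReticlePointer.cs, and raising that cap lets the ray reach farther colliders. A sketch of the kind of change involved (the exact field name and default value vary between GVR SDK versions, so treat this as illustrative):

```csharp
// GvrReticlePointer.cs (Google VR SDK) -- illustrative excerpt only.
// The reticle's ray was capped at a maximum distance shorter than the
// farthest waypoint, so the ray never reached the waypoint's collider
// and the reticle never reacted to it.

// Before: waypoints farther than ~10 meters were unreachable.
// public const float RETICLE_DISTANCE_MAX = 10.0f;

// After: raised to cover the whole museum floor. The right value
// depends on the size of your scene; pick one that reaches the
// farthest interactive object.
public const float RETICLE_DISTANCE_MAX = 30.0f;
```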
USER TEST 3: FINAL USER TEST AND ITERATION
For this user test, I asked users to go through the experience from start to finish and test all the waypoints and player controls. One of the bugs we found was that some videos were displayed backward and needed to be rotated 180 degrees.
BREAKDOWN OF THE FINAL PIECE
SCENE SETTING, MOOD AND LIGHTING
The environment helps establish the mood for the scene and, in turn, shapes how users feel. For this scene, I wanted users to feel interested in the content of the museum, which I accomplished primarily through lighting and sound.
I added lights on top of each of the viewing stations to showcase the objects, and I added individual sounds to each booth.
MOBILE MOVEMENT MECHANIC
For ‘Night at the Museum’, I used Waypoint movement which allows us to control where the player will be when viewing the exhibits and is unlikely to induce simulator sickness. The movement is activated when the player focuses and clicks on a waypoint sphere.
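A waypoint like this can be wired up in Unity with a small component on each sphere. This is a minimal sketch under my own naming, not the project’s actual code; the click event would come through the GVR event system or a Unity EventTrigger:

```csharp
using UnityEngine;

// Hypothetical waypoint component: when the user gazes at the sphere
// and clicks, the player rig is teleported to stand on this waypoint.
public class Waypoint : MonoBehaviour
{
    [SerializeField] private Transform player;       // the player camera rig
    [SerializeField] private float eyeHeight = 1.6f; // approximate standing eye height

    // Hooked up to the pointer-click event on this waypoint's collider.
    public void OnWaypointClicked()
    {
        // Teleport rather than smoothly translate: an instant position
        // change avoids the artificial continuous motion that tends to
        // induce simulator sickness in mobile VR.
        Vector3 target = transform.position;
        player.position = new Vector3(target.x, target.y + eyeHeight, target.z);
    }
}
```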
AUDIO, VIDEO CONTROLS AND FEEDBACK
In this final stage, I set the mood with audio. I added a general audio atmosphere and GVR Audio Players for each individual scene. I also implemented the video players and coded the controllers that provide feedback and let the user interact with each of the videos. I made sure that if a user plays a video, the atmospheric audio stops. I also coded a function that stops the video and resets the video controllers if the user leaves the individual scene without stopping the video.
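The interplay between a booth’s video and the shared atmospheric audio can be sketched as follows. The component and method names here are hypothetical (the project used GVR audio sources rather than a plain AudioSource), but the logic matches what is described above:

```csharp
using UnityEngine;
using UnityEngine.Video;

// Hypothetical controller for one booth: starting a video pauses the
// museum's atmospheric audio, and leaving the booth without stopping
// the video resets it and restores the atmosphere.
public class BoothVideoController : MonoBehaviour
{
    [SerializeField] private VideoPlayer video;      // this booth's video player
    [SerializeField] private AudioSource atmosphere; // shared ambient audio

    public void OnPlayClicked()
    {
        atmosphere.Pause(); // atmospheric audio stops while a video plays
        video.Play();
    }

    public void OnPauseClicked()
    {
        video.Pause();
        atmosphere.UnPause();
    }

    // Called when the user clicks a waypoint that leaves this booth.
    public void OnPlayerLeftBooth()
    {
        if (video.isPlaying)
        {
            video.Stop();    // stop playback and reset to the first frame
            video.frame = 0;
            atmosphere.UnPause();
        }
    }
}
```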
This VR project was more complex than previous ones. I learned a lot about VR platforms and applications and took this a step further by investigating VR companies/applications that are impacting the Education industry right now.
I also experimented with creating objects directly in a VR app using Oculus and Google Blocks which was challenging but very exciting.
VR is in its early stages, and new development is happening every day. Learning how to stay abreast of industry developments and keeping an eye on the horizon will be critical for VR content creators.
If I wanted to take this experience further, I would add more rooms and work more on the objects that are inside the scenes, as well as the museum interior. Thanks for reading!