Travelling is an experimental virtual reality experience in which the user is transported through a fantasy landscape. They can control their speed, plant trees, and grow apples on the trees they have planted.

Video demonstration and user testing

Try it out at https://aubergene.github.io/travelling/

Development

The app was designed and built for the Oculus Quest using the A-Frame JavaScript library, which is built on the new WebXR web standard and on Three.js, a WebGL rendering library. I used VS Code as my text editor and live-server to host the files locally. I was then able to access the app through the VR mode of the Quest’s browser.

Background and inspiration

For my individual VR project I wanted to create an experience inspired by Rez, a game released by Sega for the Dreamcast and PlayStation 2 in 2001. I first played Rez many years ago and it really stuck with me; I feel it has many elements with great potential that could be developed further.

Rez has a fairly minimalist aesthetic, using low-polygon 3D models often rendered as wireframes with blending effects and trails. To me the style captures the zeitgeist of the cyberpunk ideas around the time of the new millennium.

Rez (PlayStation 2 - 2001)

The player is represented by a humanoid avatar and progresses through the game on rails, encountering enemies which can be destroyed. This is the primary mechanism of interaction and is performed by holding the X button, moving the on-screen cursor over the enemy and then releasing the button to shoot. You can select up to eight enemies simultaneously, and there is also a power-up called Overdrive which automatically targets and shoots enemies.

As the form of interaction is quite simple, the player can relax and just drift through the game, especially in the mode which gives the player unlimited health, called Traveling, from which this project takes its name.

In 2016 Rez was re-released as Rez Infinite, which supports VR and was made available for the PlayStation 4 and later for the HTC Vive, Oculus Rift and Google Daydream HMDs.

I tried out Rez Infinite on the Google Daydream and I enjoyed it a lot. As it is a port of the original game, the graphics are practically identical; however, VR gives it a very different mode of interaction.

One benefit of playing in VR is that it is much easier, since previously you had to control the cursor using a joystick. In VR you can simply point at the enemies, which is a much more natural gesture and one you can perform very quickly and easily.

However, a big problem with Rez Infinite is that the player’s movement is on rails, and this hasn’t been updated since the original, which means the user’s viewpoint is suddenly rotated without their control when moving through the scenes. I am a fairly seasoned user of VR, but I found this so disconcerting that I kept my free hand on a stationary object to help me cope with the in-game direction changes. It would be much better if the user stayed on a straight trajectory (as happens for much of the early stages), or if the changes in direction followed a smooth curve, as they currently snap too quickly to the new angle.

Design & Implementation

My approach to design was very iterative. I knew I wanted certain elements, such as having the user follow a fixed path, and for the experience to feel smooth and natural. I wanted there to be interactive elements within the world that the user could control with their hands. However, I didn’t have a particular visual style in mind, since I was completely new to 3D modelling and wasn’t sure what I’d be able to achieve or what tools I could use to get there.

Hello World! of A-Frame

I started with the Hello World example in A-Frame and gradually adapted it as I came to understand how it worked. I began by adding VR controls with laser pointers and then learning how to write code that reacts to objects as the pointer intersects them.
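As a rough sketch of the pattern (this is not my actual scene; the hoverable component name, colours and A-Frame version are just for illustration), laser-controls adds a pointer to each hand, and entities it intersects receive standard cursor events:

```html
<html>
  <head>
    <script src="https://aframe.io/releases/1.0.4/aframe.min.js"></script>
    <script>
      // Register components before the scene so entities pick them up on load.
      AFRAME.registerComponent('hoverable', {
        init: function () {
          var el = this.el;
          // laser-controls includes a cursor, so intersected entities receive
          // mouseenter / mouseleave / click events from the pointer.
          el.addEventListener('mouseenter', function () { el.setAttribute('color', '#ffb7c5'); });
          el.addEventListener('mouseleave', function () { el.setAttribute('color', '#ffffff'); });
        }
      });
    </script>
  </head>
  <body>
    <a-scene>
      <a-entity laser-controls="hand: left" raycaster="objects: .interactive"></a-entity>
      <a-entity laser-controls="hand: right" raycaster="objects: .interactive"></a-entity>
      <a-box class="interactive" hoverable color="#ffffff" position="0 1.5 -3"></a-box>
    </a-scene>
  </body>
</html>
```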

I then worked on adding objects to the world dynamically. I initially used the pool component for creating objects, since it is memory-efficient; however, I later found that it imposed various constraints, and that memory and performance weren’t actually an issue because my world was quite simple.
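The two approaches look roughly like this (a sketch rather than my actual code; the mixin, pool size and positions are illustrative):

```html
<a-scene pool__tree="mixin: tree; size: 16">
  <a-assets>
    <a-mixin id="tree" geometry="primitive: cone" material="color: #f4a7b9"></a-mixin>
  </a-assets>
</a-scene>

<script>
  var sceneEl = document.querySelector('a-scene');
  sceneEl.addEventListener('loaded', function () {
    // Pool approach: reuse pre-allocated entities.
    var pooled = sceneEl.components.pool__tree.requestEntity();
    pooled.setAttribute('position', {x: 0, y: 0, z: -5});
    // ...and later: sceneEl.components.pool__tree.returnEntity(pooled);

    // Simpler approach I settled on: create entities on demand.
    var tree = document.createElement('a-entity');
    tree.setAttribute('mixin', 'tree');
    tree.setAttribute('position', '2 0 -5');
    sceneEl.appendChild(tree);
  });
</script>
```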

I used the A-Frame Environment Component to create a basic world. Because it has a lot of configuration parameters, I really came to like it and ended up using it as the basis for the environment. Ideally, with more time, I would have investigated its code further so that I could have even greater control over the environment.

I found that colour was very important to the mood of the experience. Initially I used a black background with highly saturated coloured objects, somewhat in the style of Rez. However, I found it really difficult to make this feel good: the lighting was complex and it just didn’t feel immersive, nor did I fully get the place illusion.

I tried out other environments with the plugin and found that the trees felt much nicer; they also gave me the idea of letting the user plant trees in the environment. I picked a calming pink colour palette and set the user up so they started on one side of the world and would progress towards the other. As the speed of travel was slow, I didn’t try to stop the user from reaching the edge of the world, but ideally I would have a much longer generative world without limits.
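The whole environment is driven by a single component attribute; my settings were along these lines (the values here are illustrative rather than my final ones):

```html
<script src="https://unpkg.com/aframe-environment-component/dist/aframe-environment-component.min.js"></script>

<a-scene>
  <a-entity environment="preset: forest;
                         skyType: gradient;
                         skyColor: #ffc0cb;
                         horizonColor: #ffe4ec;
                         ground: hills;
                         groundColor: #e8a7b8;
                         dressing: trees;
                         dressingAmount: 50;
                         dressingColor: #d98aa3"></a-entity>
</a-scene>
```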

I tried modelling trees, plants and rocks in Tilt Brush on the Quest, which was really fun, and I was happy with the results; however, I had problems importing them as assets into my app. In the end I settled on using existing assets created by Google, as they worked well, along with a basic geometric shape I had created in Blender.
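Loading a model as an asset is straightforward once it is in a format A-Frame understands, such as glTF; a sketch (the file path and id are placeholders):

```html
<a-scene>
  <a-assets>
    <!-- Placeholder path; in the app this points at the downloaded asset. -->
    <a-asset-item id="treeModel" src="assets/tree.glb"></a-asset-item>
  </a-assets>

  <a-entity gltf-model="#treeModel" position="0 0 -5" scale="0.5 0.5 0.5"></a-entity>
</a-scene>
```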

WebXR

The idea of VR for the web has been around for a while, with VRML being an early incarnation. I was vaguely aware of A-Frame before starting the course and became more interested as I researched it. I am already quite proficient in JavaScript, so I wanted to make use of my skills and knowledge there. I also thought it would be more likely that I would continue to use a web-based VR platform once my course had finished.

I was worried that it was a risk to try to learn a new framework in a short time, but I quickly found that a lot of the concepts we had learned in Unity carried over to A-Frame, and writing JavaScript was easier for me.

A-Frame was also a good choice for development because I had bought an Oculus Quest. Although I had managed to build for it using Unity, there was no live view available, so each change required compiling and uploading, which was very slow. A-Frame could be served from my development machine, and I could then use the Oculus browser to visit the page, which reloaded immediately whenever I saved changes.

I used VS Code as my editor and Firefox and Chrome for testing on my development machine. I also used the A-Frame Inspector which made debugging much easier.

Keita Ikeda testing Travelling on the Oculus Quest

User testing

I spent a full day user testing Travelling with fellow students and recorded six of them. I made some adaptations to the app as the day progressed based on their feedback and fixed some bugs.

I didn’t have any specific goals that I wanted the users to achieve; I was happy just to see how they enjoyed the experience and what thoughts they had.

I would initially set up the VR scene and enable recording within the headset and on a video camera. Unfortunately, I had technical issues recording some of the sessions, so I wasn’t able to sync HMD footage for all the participants. I let users explore the space unassisted to see what they did and whether they saw the instructions, and would then guide them through a second run.

Evaluation

I had tried the app out myself as I went along and felt it had improved a lot, but I was really interested to see what other people thought of it.

When I had been using Tilt Brush (a 3D drawing application by Google), it had a very nice tutorial which introduced the idea of displaying help messages for the current tool when you looked at the back of the controller. I managed to place instructions in the same location within my app, along with text telling the user to “Turn your hands towards you to see instructions”.

Google Tilt Brush - turn for hidden help info

During testing I found that nobody understood this instruction; however, once it was shown to them, everyone seemed to like the idea of instructions being placed in this location. Perhaps this will develop into a standard way of finding help in VR. Many actions aren’t intuitive in a 2D environment either, such as double-clicking, but they make sense once you have learned the behaviour. I think users would have found the instructions if I had included an animation showing them how to rotate the controllers, along with a prompt to “look on the back of the controllers”.

Turning controls within my app
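In A-Frame the help text can simply be made a child of the controller entity so that it follows the hand; something along these lines (the offsets, rotation and wording are illustrative, not my exact values):

```html
<a-entity laser-controls="hand: right">
  <!-- The child text inherits the controller's pose. Rotating it to face away
       from the user means it only becomes readable when the hand is turned over. -->
  <a-text value="Hold the trigger to plant a tree"
          position="0 0.01 0.05"
          rotation="-90 0 0"
          width="0.15"
          align="center"></a-text>
</a-entity>
```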

The users had a lot of feedback regarding the virtual environment and interaction. The most commonly requested feature was the ability to add more types of trees or plants, followed by being able to control their colour. The users reported that the world seemed plausible even though the tree model was very basic, and they liked the simplicity.

There was mixed feedback about the control of movement. The controls were not very intuitive; I would have liked to improve them but found the coding difficult. I made it so that the user could only progress forwards, partly to avoid them going in the wrong direction and off the edge of the world early in the experience.

It would be quite easy to allow two-way travel, but I think forwards-only movement is a key part of the experience, as it keeps the world very simple for the user. Perhaps a timeline-scripted version of the app could have resting points where the user is given more time to interact with their immediate environment.
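The forwards-only behaviour boils down to a small component on the camera rig that moves it every frame and never lets the speed drop below zero; a sketch of the idea (the component name, axis and default speed are illustrative):

```html
<script>
  // Sketch of the forwards-only travel idea, not my exact component.
  AFRAME.registerComponent('travel', {
    schema: {speed: {default: 0.5}},  // metres per second
    tick: function (time, deltaMs) {
      // Drift the rig along one axis every frame.
      this.el.object3D.position.z -= this.data.speed * (deltaMs / 1000);
    },
    // Called from controller event handlers; the clamp to zero is what
    // keeps the experience forwards-only.
    changeSpeed: function (delta) {
      this.data.speed = Math.max(0, this.data.speed + delta);
    }
  });
</script>

<a-entity id="rig" travel>
  <a-entity camera look-controls></a-entity>
  <a-entity laser-controls="hand: left"></a-entity>
  <a-entity laser-controls="hand: right"></a-entity>
</a-entity>
```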

Conclusion

Overall I am really pleased with the outcome of the app. I had a lot of very positive feedback from my test group, and everyone played with it for around seven minutes, which was far longer than I was expecting.

There were a lot of features I wanted to add, mostly around interaction, but the most requested feature was simply more variety in the trees and plants you could create. The user testing was very helpful, as I probably wouldn’t have concentrated on that otherwise.

I found A-Frame really good to use, and it is a tool I will continue to use. It has a nice, simple system that feels familiar from writing HTML, and it produced great results.

I’ve learned a lot about VR from the course, and the theory side is very interesting. I didn’t realise before how nuanced interaction in VR is: on the one hand we are very sensitive to physical sensations relating to rendering, such as frame lag and the lack of 6DOF, but on the other hand users will easily accept variations in scale. I was also very interested in how important our perception of shadows is to VR, and I would like to investigate that further and continue reading about VR research.

Addendum

Creative Commons references for the libraries and assets used within this game

Models

Libraries

Sounds