Duration 8 weeks
Team Size 19
My Role(s) UI/UX Designer, Scrum Master
Platform PC
Engine Unreal Engine 4
Status Released in July 2018


Bioside was the final project in my 2nd Year of University.

Recognition: NHTV - Best Visuals Y2

Bioside is a VR game that combines shooting and Zero-G movement into an action-packed experience. The main goal was to create innovative movement in virtual reality while merging in elements of the shooter genre. To that end, we took inspiration from other VR games like Lone Echo and Robo Recall to create interesting movement while still delivering intense combat.

My Contribution

During the project, I worked as a UI/UX Designer and Scrum Master.

My tasks on this project included:
  • Created a game concept with the team based on a project brief
  • Pitched the game concept
  • Researched and balanced movement to prevent motion sickness
  • Researched, designed, and implemented UI/UX elements such as weapon holsters and the bloody-screen effect
  • Learned how to implement and balance haptic feedback in UE4
  • Facilitated agile development in the team using Scrum and JIRA
Whiteboard concept

All concepts we brainstormed and pitched within the team.

Player Interface

UI Mockup

A visual mockup of the UI, created in Affinity Designer. In the game, the elements are placed in 3D space to ensure the player can read important information with ease.



A special challenge was creating 3D UI elements in UE4, since the built-in UI tools only support transparency in 2D. To create a simple bloody-screen effect, one of the artists modeled a sphere with a hole in it, placed in front of the camera.

Wireframe and in-game visual of the bloody-screen sphere

Motion Sickness

Since our main inspiration came from the already successful "Lone Echo Multiplayer" and "Robo Recall", our team was able to start from a solid base.

To reduce motion sickness further, I researched the topic and took an in-depth look at other titles to find out which techniques worked well and which did not, then applied those lessons to our game.

Our initial assumption was that a "helmet visual" would give the player a point of reference when moving through the Zero-G environment. However, it turned out that the player's movement needed two different speed limits: one for when the player moves through a physical action, like grabbing a wall and pulling themselves along, and one for when the player uses the thrusters via the joysticks on the VR controllers. With this change, plus multiple balancing passes to find the speed limits at which motion sickness sets in, the game became playable even by professors at our university who were more prone to motion sickness.
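The dual speed limits described above can be sketched as a simple clamp keyed on the movement source. The concrete caps here are invented for illustration; the project's actual limits came out of the repeated balancing passes:

```cpp
#include <algorithm>

// Illustrative sketch of per-source speed limits. Values are
// assumptions, not the project's tuned numbers.
enum class MoveSource { PhysicalPull, Thruster };

float ClampSpeed(float Speed, MoveSource Source)
{
    // Pulling off a wall is a short, self-initiated burst, so a higher
    // cap felt acceptable; sustained thruster flight needed a lower one.
    const float MaxSpeed = (Source == MoveSource::PhysicalPull) ? 8.0f : 4.0f;
    return std::min(Speed, MaxSpeed);
}
```

Keeping the two limits separate means each can be re-balanced independently as playtesters report discomfort.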