
How to Develop Oculus Rift Games

Oculus Development Guide

Getting Started

Choose an Engine

To develop for the Rift or Gear VR, you need to use a game engine (even non-gaming VR apps are made with game engines). While you can technically write your own game engine, the time and expertise needed to do so are immense, and thus the vast majority of VR apps & games are made using either Unity or Unreal Engine.

Both engines are free to use; however, Unity requires a $125-per-month subscription once you make more than $200K per year in revenue (you are very unlikely to ever hit this).

The first step you’ll want to take to develop for VR is to choose between these two engines. Unity is easier for beginners and has more resources and support available; Unreal Engine is considered more powerful, provided you’re able to take advantage of that power.

Once you’ve decided, download and install your engine of choice.

It is then highly recommended that you work through the engine’s basic non-VR tutorial and build a simple non-VR game first, so that you become familiar with the engine’s UI and workflow.

Unity Integration

While Unity has basic support for VR built in, if you’re using Unity you’ll want to add both of the following plugins to your project to make Oculus development vastly easier and to take advantage of all the hardware and software features:

  • Oculus Utilities for Unity
  • Oculus Platform SDK (Unity package)

These are plugins added to each individual Unity VR project, not extensions to the Unity application itself.

(Unreal Engine has this all built in)

Native SDK

If you are using a custom open source engine or writing an engine yourself, you’ll want to use the native Oculus SDK instead: Rift | Gear VR | Platform SDK

Example Projects

Unity Samples


Basic VR Integration (Hello World)

Unity

NOTE: You must first import the Oculus Utilities and Platform SDK as shown above.

  1. Create or open your Project & Scene
  2. Create a terrain or floor if there is not already one
  3. Import both the Oculus Utilities and Oculus Platform SDK .unitypackage files (see above)
  4. Edit > Project Settings > Player > Other Settings
    • tick ‘Virtual Reality Supported’
    • change ‘Stereo Rendering Method’ to ‘Single Pass’
  5. Delete the ‘Main Camera’ object in the Scene Hierarchy
  6. In the Project tab, open OVR > Prefabs and drag OVRCameraRig onto the Scene
  7. Click on the OVRCameraRig in the Scene. In its properties, do the following:
    • change the Position Y value to 0 (this is very important – no matter where you move the camera in your scene, you must keep the position Y value at 0)
    • tick ‘Use Recommended MSAA Level’
    • tick ‘Enable Adaptive Resolution’
    • change ‘Max Adaptive Resolution’ to 1.2
    • change ‘Tracking Origin Type’ to Floor Level
    • tick ‘Reset Tracker On Load’

That’s it! If you click the Run/Preview button and put the Rift on, you’ll be in your Unity scene in VR!

Unreal Engine

  1. Create or open your Project & Map
  2. Add a Camera Actor to the map
  3. Select the added Camera Actor and, in its properties, tick ‘Lock to HMD’
  4. Add a ‘Set Tracking Origin’ blueprint from the ‘Head Mounted Display’ function library, and set Origin to ‘Floor’
  5. Open Edit > Editor Preferences > Engine > Rendering:
    • tick ‘Forward Shading’
    • change ‘Anti-Aliasing Method’ to MSAA (the recommended anti-aliasing method for VR when Forward Shading is enabled)
    • tick ‘Mobile Multi-View’
  6. In your Unreal Engine installation folder, open Engine/Config/BaseEngine.ini, and set the following values
    [Oculus.Settings]
    PixelDensityMin=0.7
    PixelDensityMax=1.2
    PixelDensityAdaptive=true
    
    [GearVR.Settings]
    bEnableDirectMultiview=True
    
  7. Save and close BaseEngine.ini

That’s it! If you click the arrow beside the Play button, choose VR Preview, and put the Rift on, you’ll be in your Unreal Engine map in VR!


Guides

Basic Input

Your user could be on Rift (using Touch controllers, the Oculus Remote, or a gamepad) or on Gear VR (using the headset touchpad, the Gear VR Controller, or a gamepad).

You can handle all of these input devices in a common way using OVRInput: a unified, virtual input device that exposes generic buttons and axes from every possible physical input device.

https://developer.oculus.com/documentation/game-engines/latest/concepts/unity-ovrinput/
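
For example, a minimal Unity (C#) script using OVRInput might look like the sketch below. It assumes the Oculus Utilities are imported and an OVRCameraRig (which includes OVRManager) is in the scene, since OVRManager drives OVRInput’s per-frame update; the exact button/axis mappings for each device are listed in the documentation above.

    using UnityEngine;

    // Minimal OVRInput sketch: the same calls work for Touch, Remote, gamepad, and Gear VR inputs
    public class BasicVRInput : MonoBehaviour
    {
        void Update()
        {
            // Button.One is the "primary" button (e.g. A on the right Touch controller or a gamepad)
            if (OVRInput.GetDown(OVRInput.Button.One))
            {
                Debug.Log("Primary button pressed");
            }

            // Primary thumbstick (or touchpad) as a 2D axis, for your own movement/aiming code
            Vector2 stick = OVRInput.Get(OVRInput.Axis2D.PrimaryThumbstick);

            // Analog index trigger, ranging from 0 to 1
            float trigger = OVRInput.Get(OVRInput.Axis1D.PrimaryIndexTrigger);

            if (trigger > 0.9f)
            {
                Debug.Log("Trigger squeezed, thumbstick at " + stick);
            }
        }
    }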


Touch Controllers

There are 3 options for how to bring the Oculus Touch controllers into your app:

  • Hand Presence – the player sees virtual hands whose thumbs and fingers react to match their real digits. You can use a finger as a pointer, or implement a grabbing system that lets the user manipulate virtual objects or pick up and use virtual tools (e.g. First Contact, Dead and Buried, The Unspoken, Robo Recall)
  • Controller Representation – the player sees models of the Touch controllers themselves, with or without hands holding them (e.g. Tilt Brush, Oculus Medium, Virtual Desktop)
  • Tool-Only Representation – the player sees neither their hands nor the controllers, just the tool/weapon they would be holding (e.g. Space Pirate Trainer, Fruit Ninja VR)

Picking which of these to use is the first step in adding Touch support to your app. Hand Presence provides the best, most immersive experience but is the hardest to develop for; Controller Representation is well suited to functional/utility apps; Tool-Only Representation is the easiest to develop for.

Hand Presence

TBD

Controller Representation

TBD

Tool-Only Representation

Unity

The OVRCameraRig prefab has the transforms LeftHandAnchor and RightHandAnchor as grandchildren, representing the position & orientation of each controller.

To have an object follow the position of a controller, simply attach it to the relevant anchor by dragging it onto that transform in the Scene Hierarchy. You can test this with even a basic object such as a sphere.

NOTE: The attached object must have a local Position of (0, 0, 0), or it will be offset from the controller.
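
If you’d rather set this up from a script than by dragging in the editor, a minimal sketch might look like the following. The anchor comes from the OVRCameraRig prefab described above; the toolPrefab field is a hypothetical placeholder for your own tool or weapon model.

    using UnityEngine;

    public class AttachToolToHand : MonoBehaviour
    {
        // Assign these in the Inspector
        public Transform rightHandAnchor;   // OVRCameraRig > TrackingSpace > RightHandAnchor
        public GameObject toolPrefab;       // hypothetical: the tool/weapon model to place in the hand

        void Start()
        {
            // Parent the tool to the hand anchor so it follows the controller
            GameObject tool = Instantiate(toolPrefab, rightHandAnchor);

            // Zero the local transform, otherwise the tool will be offset from the controller
            tool.transform.localPosition = Vector3.zero;
            tool.transform.localRotation = Quaternion.identity;
        }
    }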

Haptic Feedback

TBD


VR Audio

VR audio is just as important as VR visuals: a sound needs to seem as if it truly comes from the direction and distance of the object emitting it. Oculus provides the Oculus Audio SDK for this purpose, which goes far beyond the quality of traditional spatialized game sound by modelling 3D direction, distance, and reflections.

Unity

Firstly, you’ll need to download the Audio SDK and import OculusNativeSpatializer.unitypackage into your project.

https://developer3.oculus.com/documentation/audiosdk/latest/concepts/ospnative-unity-spatialize/
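
After importing the package, you select the Oculus spatializer as Unity’s Spatializer Plugin (Edit > Project Settings > Audio, as described in the link above) and enable spatialization on each AudioSource. A minimal sketch using only standard Unity AudioSource properties is shown below; for the full Oculus feature set (reflections and so on) you would also add the spatializer component that ships with the package, per the linked docs.

    using UnityEngine;

    public class SpatializedSound : MonoBehaviour
    {
        public AudioClip clip;   // assign any clip in the Inspector

        void Start()
        {
            AudioSource source = gameObject.AddComponent<AudioSource>();
            source.clip = clip;
            source.spatialBlend = 1f;    // fully 3D, not 2D
            source.spatialize = true;    // route through the selected spatializer plugin
            source.loop = true;
            source.Play();               // the sound now appears to come from this object's position
        }
    }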


Store Features

Cloud Saves

You should save all user options and game save data to Oculus’ cloud storage. With this, it doesn’t matter if a user reinstalls the game, reinstalls their OS, has their hard drive fail, or gets a new device: their saves (and/or options) will always be available.

https://developer.oculus.com/documentation/platform/latest/concepts/dg-cc-cloud-storage/

Achievements

Achievements are a fun way to give users challenges to complete within your game, increasing replayability.

https://developer.oculus.com/documentation/platform/latest/concepts/dg-achievements/
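
A minimal Unity (C#) sketch of unlocking an achievement with the Platform SDK might look like the following. It assumes the Oculus Platform package is imported and your App ID is configured; "FIRST_WIN" is a hypothetical achievement API name that you would define in the developer dashboard.

    using UnityEngine;
    using Oculus.Platform;

    public class AchievementExample : MonoBehaviour
    {
        void Start()
        {
            // Initialize the Platform SDK (uses the App ID configured in your Oculus settings)
            Core.Initialize();
        }

        void Update()
        {
            // Pump Platform SDK callbacks once per frame
            Request.RunCallbacks();
        }

        public void OnPlayerWonFirstMatch()
        {
            // "FIRST_WIN" is a hypothetical achievement API name created in the dashboard
            Achievements.Unlock("FIRST_WIN");
        }
    }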

Leaderboards

Leaderboards can spur competition and increase replayability of singleplayer games.

https://developer3.oculus.com/documentation/platform/latest/concepts/dg-cc-leaderboards/
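
Writing a score is similarly small. A sketch, assuming the Platform SDK is initialized as in the achievements example above and that "HIGH_SCORE" is a hypothetical leaderboard API name from the dashboard (verify the exact WriteEntry parameters against the linked docs):

    using UnityEngine;
    using Oculus.Platform;

    public class LeaderboardExample : MonoBehaviour
    {
        public void SubmitScore(long score)
        {
            // "HIGH_SCORE" is a hypothetical leaderboard API name defined in the dashboard
            Leaderboards.WriteEntry("HIGH_SCORE", score);
        }
    }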


Social/Multiplayer

Basic Networking

TBD

Matchmaking

https://developer3.oculus.com/documentation/platform/latest/concepts/dg-cc-matchmaking-a-simple/

Avatars

Oculus Avatar SDK Download

Voice Chat

https://developer3.oculus.com/documentation/platform/latest/concepts/dg-cc-voip/

Parties/Rooms Support

TBD


Best Practices

  • Your app/game should have either no camera motion (other than the user’s real life head tracking movement), or should only have movement via a vehicle with a full 3D cockpit (a rigid helmet can count, like an EVA spacesuit). This means that your game will not cause motion sickness, and will be playable by everyone.
  • On Rift, your app/game should support either the Touch controllers, Oculus Remote, or gamepad (or all 3), not a mouse and keyboard. On Gear VR, you should support both the touchpad and the controller.
  • Never artificially control head pitch or roll, and only very rarely artificially control head yaw. Don’t even allow the player to do these things through controllers (let them do it naturally with their head!). This only applies to the player’s head; you are free to move the vehicle itself in any way in a cockpit-based game.
  • Menus and UI should be objects in world space, never locked to the player’s face. This applies even to loading screens and pause screens.
  • Do not use traditional cutscenes or lock the player’s head; simply have the scene occur in-engine (and lock the player’s vehicle stationary in a cockpit game if you need to).
  • Unless your game is abstract, in the clouds, or a cockpit game, use floor level tracking, not eye level. This is critical for presence!

Performance

Performance in VR is incredibly important to a degree that it never was in non-VR content.

These requirements are not optional. If you do not meet them, you will not be able to get onto the store, and you will give your users a poor experience that could even make them sick. Nearly all of your customers will be using PCs with the recommended or minimum specifications.

Rift

Your app should run smoothly at:

  • 45 FPS absolute minimum (and that means minimum, not average!) on the Rift minimum specs
  • 90 FPS 1% minimum on the Rift recommended specs

Recommended & Minimum PC Specs

Gear VR

Your app should:

  • Run smoothly at 60 FPS almost all of the time on a Galaxy S6 level device
  • Run for at least 45 minutes without triggering the overheat warning

Getting on the Store

So you’ve finished your app (or got it to a stage where you feel it’s ready for “Early Access”), and now want to get it out to VR users. The best way to do this, by far, is to get it onto the Oculus Store: this is where Oculus users find, purchase, install, and launch their VR content.

You can submit your app files for review (or manage your current store apps) at dashboard.oculus.com

Sections

There are 3 fundamental sections of the Oculus Store. Choosing the right one for your app/game is essential:

  • Main Store – this is for complete, polished apps/games with a decent production value that meets modern consumer expectations
  • Gallery – this is for less polished apps/games, especially interesting experiments and 1-man indie content with low production value
  • Early Access – this is for apps/games that are still in active development and will one day be suitable for the Main Store, but are not there yet

In the Main Store, there are then 3 categories of apps: Games, Entertainment, and Apps. Entertainment encompasses VR experiences and stories that are not games, and Apps is for apps with a functional purpose, rather than simply entertainment.

Comfort Rating

TBD

Passing Review

To pass the Oculus Store review and be approved to distribute/sell on the store, you TBD

In-App Purchases

TBD


FAQs

Do I need a “development kit” to create content?

The Rift has everything you need as a developer to make content for Rift, and the Gear VR has everything you need to make content for Gear VR (other than the PC of course). No development kits are needed.

How do I decide whether to develop for PC or mobile?

Mobile VR gives you access to the largest audience (5 million devices vs 500K), while PC VR lets you create a totally new kind of experience unlike anything else.

The best solution is to develop for both: create your app in a way that it can run on mobile VR while also taking advantage of the power and feature set of high-end PC VR. However, if your app relies heavily on hand presence in its fundamental mechanics, this may simply not be possible.

Why are there multiple different Unity Packages?

  • Oculus Utilities helps with things like VR rendering, tracking, input, and other core functionality
  • Oculus Platform handles store features such as cloud saves, achievements, and leaderboards, and helps with multiplayer through features such as networking, matchmaking, voice chat, and more
  • Oculus Audio lets you create spatialized VR audio sources
  • Oculus Avatars requires Oculus Platform, and gives you virtual hands for the player’s Touch controllers, as well as Avatar models for other users in multiplayer/social.

I developed an app/game for HTC Vive, what should I know about porting to Oculus Touch?

Porting your already existing HTC Vive app/game to Oculus Touch is a great idea, as you could potentially double your sales/userbase (or even more than that), for a relatively small amount of effort compared to creating a whole new app. Following this page as a guide to the Oculus SDK stack is all you should need, but it is highly recommended that you actually purchase the Oculus hardware to be able to test, instead of developing blindly.

  • Gripping Differences – you probably aren’t using the HTC Vive controllers’ grip buttons to pick up virtual objects, and you may even be using the trigger for this instead. The Oculus Touch controllers have 2 separate triggers per hand, one of which is a grip trigger that is very comfortable to use. The grip trigger should be used for picking up objects, while the index trigger should be used for using/firing; this is the most important difference when developing for the two controllers (see the input sketch after this list)
  • Oculus Store – whereas your app is probably currently on Steam, most Oculus users get their content from the Oculus Store (for example, 90% of Oculus users that purchased the game Grav|Lab chose to get it from Oculus Store). It is absolutely vital that you launch on the Oculus Store if you want to reach Oculus users.
  • Steamworks – if you currently use Steamworks to handle your multiplayer components, it cannot be used on the Oculus Store. The Oculus Store equivalent is called Oculus Platform, and guides for it are listed above.
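
As a concrete illustration of the grip-versus-index-trigger mapping mentioned above, a minimal Unity (C#) sketch using OVRInput might look like this; the commented-out Grab/Fire calls are hypothetical placeholders for your own game logic.

    using UnityEngine;

    public class TouchTriggerMapping : MonoBehaviour
    {
        void Update()
        {
            // Grip (hand) trigger: use this for picking up and holding objects
            float grip = OVRInput.Get(OVRInput.Axis1D.PrimaryHandTrigger);

            // Index trigger: use this for using/firing the held object
            float fire = OVRInput.Get(OVRInput.Axis1D.PrimaryIndexTrigger);

            if (grip > 0.5f)
            {
                // Grab();   // hypothetical grab logic
            }

            if (fire > 0.5f)
            {
                // Fire();   // hypothetical fire logic
            }
        }
    }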