Angad V Raghavan

The 101 of Designing for the Oculus Quest

Updated: Aug 14

This article is part of our series on “Designing for the Oculus Quest”, where we give our readers tips, tricks and to-do lists for creating realistic, high-fidelity VR applications for the Oculus Quest. We strongly believe that in the accelerating landscape that is xR, it is important to pay it forward. We acknowledge that we would not be where we are were it not for the largesse of so many developers & designers in the xR community. These articles are our attempt at paying it back in kind.



That the Oculus Quest is a game-changing device in VR is now widely acknowledged by most observers. It is the first wireless, standalone, all-in-one VR headset that comes this close to matching Rift/Vive standards (though not quite), and it offers the same interactive fidelity (two 6-DoF hand-held controllers) that was, until now, the preserve of wired headsets.


The key point is fairly fundamental – in the Oculus Quest, all the processing happens on-device (i.e., inside the headset itself, powered by a Qualcomm Snapdragon 835 chip). So, without a high-end, GPU-fueled machine to “outsource” all the processing to, there are certain restrictions on the environments/FX you can deploy in your applications.


Yes, with the arrival of the Oculus Link, you have a little more flexibility; but that still requires a VR-ready laptop at the other end, and frankly buries the real reason why the Quest is an excellent device – its self-reliance. So, while the Link is a real boon for Quest users waiting to try out games/applications typically designed for wired headsets, from a designer’s standpoint it is best to operate under the assumption that you are building a native Quest application, and that your end-user may not have access to a VR-ready laptop.


So, without further ado, let’s jump right to it – how do we design high-quality, premium VR experiences for the Oculus Quest? What performance parameters do we measure to ensure the end-user experience is seamless? We answer all this, and a bit more, as you read on.



It's all about the Draw Call.


A draw call is essentially an instruction that tells your graphics processor how to render (or “draw”) objects, textures & effects on your screen. The CPU takes this data, prepares the instruction for the GPU, and allocates resources for the GPU to “compute” and “render”. Depending on the complexity of your scene (or rather, each frame of your scene), the load the CPU has to carry increases, clogging computational resources – which manifests as frame drops (low FPS) in your in-game experience.
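
To make the relationship concrete, here is a minimal, purely conceptual Python sketch – not engine code, and the data structure is invented for illustration – of why material and mesh counts drive the draw call count:

```python
# Conceptual sketch only -- not a real engine API. Each visible mesh
# contributes roughly one draw call per material slot it uses.
def estimate_draw_calls(visible_meshes):
    return sum(len(mesh["materials"]) for mesh in visible_meshes)

frame = [
    {"name": "car_body",  "materials": ["paint", "glass", "chrome"]},
    {"name": "warehouse", "materials": ["concrete"]},
    {"name": "tree_01",   "materials": ["bark", "leaves"]},
]

print(estimate_draw_calls(frame))  # -> 6 draw calls for this frame
```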



Factors affecting Draw Calls.


Number of Materials – The number of draw calls depends on the number of materials visible per frame: the higher the number of materials, the greater the number of draw calls. It gets worse if you change the underlying shader program as well.


Number of Meshes – In addition to the number of materials per frame, the number of separate meshes also determines the number of draw calls in the scene. Hence, it is generally recommended that you merge as many meshes as possible. Use this as good practice while designing a scene; merge “non-essential” and static meshes in your scene wherever possible.


Additional Textures – The process is more expensive if you are changing materials while also using multiple textures. It is better to reuse the same material and switch between different textures within it – a noticeably less expensive operation.
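
As a rough illustration of what “switching textures within the same material” looks like in practice, here is a hedged sketch using UE4’s Python editor scripting. The asset paths and the “BaseColorTex” parameter name are our own illustrative assumptions – your material must expose an equivalent texture parameter:

```python
import unreal

# Sketch: swap the texture on a Material Instance without touching the
# parent material. Asset paths and the parameter name are hypothetical.
instance = unreal.EditorAssetLibrary.load_asset("/Game/Materials/MI_Car")
texture = unreal.EditorAssetLibrary.load_asset("/Game/Textures/T_Paint_Red")

unreal.MaterialEditingLibrary.set_material_instance_texture_parameter_value(
    instance, "BaseColorTex", texture)
unreal.MaterialEditingLibrary.update_material_instance(instance)
```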


Factors that don’t affect Draw Calls.


Texture Size: The cost of using different texture sizes doesn’t have a significant impact on processor time. That said, as good practice when designing a scene, prioritize optimizing the texture resolutions you use. Your primary assets (the “hero” of your app) can have a texture resolution of up to 4K (the limit for the Oculus Quest), whereas your secondary meshes may be restricted to 1K.


Texture Filtering & Compression Algorithms: Contrary to what certain sources may tell you, changing the compression algorithm (ASTC, ETC1, etc.) or the texture filtering algorithm (Point, Bilinear, Trilinear, etc.) has almost zero impact on the draw call count.


Altering Material Colour: If the only change you need to effect is the colour, then use the same material and just alter its colour parameter. This draws from the same principle of reducing the number of unique materials in the frame/scene.
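
A hedged sketch of the same idea, again via UE4’s Python editor scripting (the asset path and the “TintColor” parameter name are illustrative assumptions):

```python
import unreal

# Sketch: change only the colour parameter on a Material Instance, so the
# same parent material is reused. Path and parameter name are hypothetical.
instance = unreal.EditorAssetLibrary.load_asset("/Game/Materials/MI_Car")

unreal.MaterialEditingLibrary.set_material_instance_vector_parameter_value(
    instance, "TintColor", unreal.LinearColor(r=0.8, g=0.1, b=0.1, a=1.0))
unreal.MaterialEditingLibrary.update_material_instance(instance)
```

At runtime, the equivalent is a Dynamic Material Instance with its vector parameter set from Blueprint or C++.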



Best Practices for Level Design


1. Mesh Optimization:

  • Polycount - In a scene built for the Oculus Quest, it is important to optimize for total polycount (the sum-total of the polygonal faces on all meshes). Ideally, a single frame should have no more than 75,000 polygons per view. If push comes to shove, this number could be taken up to 100,000 polygons per view, but make sure you draw a hard line there.

  • Prioritizing your "Hero Asset" - While designing an environment, focus should be given to the "hero asset" of the experience: if you’re building a car-configurator, for instance, the main car mesh takes up that role. The other assets in the scene should be very low poly. Distant objects such as trees and buildings (in an outdoor scene) should be replaced with image planes. However, while doing so, make sure you stick to the principles of avoiding aliasing! The frame here shows you how the level of detail is prioritized among the meshes in the scene. The “hero assets” in the scene (in this case, the LNG fuel trains) have a higher polycount than the secondary meshes (such as the warehouse in the distance).

A snapshot from the AutoVRse Fire-Response Training Demo
  • Reducing Draw Calls by merging meshes: As mentioned earlier, the number of distinct meshes also determines the number of draw calls generated in a scene. To limit this, it’s important to merge as many meshes as possible (see the Blender sketch at the end of this section).

  • Criteria for grouping meshes: The designer needs to check how close meshes are to each other in a particular frame of view. Based on the proximity of these meshes, they can be grouped and merged into a single mesh.

A snapshot of the "secondary-meshes" and "props" from the same experience.

Notice how the warehouses in this scene are placed. Being at a slight distance, with the user rarely, if ever, moving close to them, they will likely appear “grouped together” in the user’s FoV. Therefore, it makes sense to merge them into a single mesh anyway. However, be careful not to overcompensate by merging any and all secondary meshes. Meshes located at a distance from each other may never actually fall within a single frame; if they are merged regardless, then every time the user looks at “part” of the newly-merged mesh, the headset will still have to render the “full” mesh.
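
Here is a minimal Blender Python sketch of this merging workflow. It assumes the meshes to be grouped (say, the warehouses) are currently selected with one of them active, and it then reports the scene’s total triangle count against the budget discussed earlier:

```python
import bpy

# Merge the currently selected meshes (e.g. the distant warehouses) into
# the active object. Assumes the meshes are selected and one is active.
bpy.ops.object.join()

# Report the scene's total triangle count against the ~75k-per-view budget.
total_tris = 0
for obj in bpy.context.scene.objects:
    if obj.type == 'MESH':
        obj.data.calc_loop_triangles()
        total_tris += len(obj.data.loop_triangles)

print(f"Scene triangles: {total_tris} (target: <= 75,000 per view)")
```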




2. Material Optimization:

In addition to merging the meshes, it is important that the now-merged mesh has only a single material slot. A texture atlas is a single super-texture (an albedo, normal or roughness map, etc.) that has the textures of the various parts of the mesh mapped onto their respective UV shells. It is created by merging the material slots of several meshes.

Texture atlases like this can be created in Blender or Substance Painter.
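
There is no single one-click API for this, but for the bake step, a hedged Blender Python sketch might look like the following. It assumes the merged mesh is the active object, already unwrapped into a single UV layout; the atlas name and resolution are illustrative:

```python
import bpy

# Sketch: bake an albedo atlas for the merged mesh. Assumes the mesh is
# the active object with a single UV layout, and that Cycles is available.
bpy.context.scene.render.engine = 'CYCLES'
obj = bpy.context.active_object

# The atlas image the bake will write into (name/size are illustrative).
atlas = bpy.data.images.new("T_Atlas_Albedo", width=2048, height=2048)

# Every material slot needs an Image Texture node pointing at the atlas,
# set as the active node, so the bake knows where to write.
for slot in obj.material_slots:
    slot.material.use_nodes = True
    tree = slot.material.node_tree
    node = tree.nodes.new('ShaderNodeTexImage')
    node.image = atlas
    tree.nodes.active = node

# Bake base colour only, without baking lighting into the atlas.
bpy.ops.object.bake(type='DIFFUSE', pass_filter={'COLOR'}, use_clear=True)

atlas.filepath_raw = "//T_Atlas_Albedo.png"
atlas.file_format = 'PNG'
atlas.save()
```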

3. Lighting:

  • The lighting that you will see inside a packaged build on the Quest is different from what you see in the scene in the UE4 viewport. The Android ES 3.1 previewer will give a slightly more accurate representation, but even that cannot give you the exact picture. Therefore, while we would advise always shifting the render previewer from UE4’s default shader model to the Android previewer, the best step would be to buy a Link Cable and keep testing the environment regularly using the “Launch” option. See for yourself the difference between the two images here: the image on the left is the scene rendered with UE4’s default shader, and the one on the right is the scene rendered in Android preview mode.

A comparison between the Default Shader/Preview Mode vs. Android ES 3.1

  • After every significant tweak to the lighting, make sure you build the lighting at “Preview” quality and test it on the Quest (opt for “Production” only for the final 1-2 rounds of testing; “Preview” is more than enough for rapid iteration while designing the level). Often, until you actually build the lighting, a lot of textures have a washed-out, flattened, smooth look.

  • Direction of Normals: In contrast to the default UE4 shader, in Android preview, if the normals on a mesh are inverted or facing the wrong direction, you will see that the mesh is textured in a default, flat black material after you build the lighting. So, prior to exporting a mesh out of Blender and into UE4, the direction of the normals must be checked. Do note, using a two-sided material here is not the solution and doesn’t fix this issue.

Screen-grab of the face-orientation on the "hero asset", taken in Blender.

Here, you see a screenshot of the Face Orientation view-mode in Blender. The blue colour indicates that the normals of the meshes are facing outwards, meaning it is safe to assume that the outer surfaces of the meshes will be rendered.
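
If the overlay does show inward-facing (red) faces, the fix is Blender’s Recalculate Outside (Shift+N in Edit Mode), or the scripted equivalent below – a short sketch assuming the affected mesh is the active object:

```python
import bpy

# Sketch: recalculate all normals on the active mesh to face outwards
# before exporting the FBX to UE4.
bpy.ops.object.mode_set(mode='EDIT')
bpy.ops.mesh.select_all(action='SELECT')
bpy.ops.mesh.normals_make_consistent(inside=False)
bpy.ops.object.mode_set(mode='OBJECT')
```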


4. Creating LODs for Meshes:


  • LOD (Level of Detail) refers to multiple versions of a single mesh with different polycounts. LODs of a mesh switch based on the distance of the currently-possessed camera from the mesh. So, for instance, at a distance of, say, 10 metres from the mesh, a particular LOD will be displayed, and at a distance of 50 metres, a different LOD will take its place. By creating LODs, you can drastically improve the performance of the system by rendering only the required number of polygons for a mesh, based on the camera’s proximity to it. The concept is simple – if something is far off and not in focus, why would you waste processing time/power rendering it in full detail?

  • Naming Convention: LOD0 – the LOD with the highest detail and polycount (the original mesh); LOD1 – the second level of optimization, with a reduced polycount and less detail; LOD2 – whatever LOD1 is, but even more optimized; LOD3 – the mesh with the lowest resolution and detail.

  • Automatically generating LODs in UE4: With the mesh selected, open the Static Mesh Editor. In the Details tab, under LOD Settings, select the type of LOD Group. Once a group is selected, a specific number of LODs is created based on the group chosen.

Selecting LOD Group Type in the Static Mesh Editor

  • By default, Auto Compute LOD Distance is checked. To manually override this and have UE4 switch LODs at specific distances, uncheck this setting.


  • To manually change the settings of each LOD, check Custom under LOD Picker.


  • Now you can adjust settings such as Percent Triangles and Screen Size (which controls the distance from the camera at which an LOD is displayed).
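
The same setup can also be scripted. Here is a hedged sketch using UE4’s Python editor scripting (it requires the Editor Scripting Utilities plugin; the asset path, triangle percentages and screen sizes are illustrative assumptions):

```python
import unreal

# Sketch: generate three LODs for a static mesh with manually set screen
# sizes. The asset path is hypothetical.
mesh = unreal.EditorAssetLibrary.load_asset("/Game/Meshes/SM_FuelTrain")

options = unreal.EditorScriptingMeshReductionOptions()
options.auto_compute_lod_screen_size = False  # we set screen sizes by hand
options.reduction_settings = [
    unreal.EditorScriptingMeshReductionSettings(
        percent_triangles=1.0, screen_size=1.0),    # LOD0: full detail
    unreal.EditorScriptingMeshReductionSettings(
        percent_triangles=0.5, screen_size=0.5),    # LOD1: half the triangles
    unreal.EditorScriptingMeshReductionSettings(
        percent_triangles=0.25, screen_size=0.25),  # LOD2: a quarter
]

unreal.EditorStaticMeshLibrary.set_lods(mesh, options)
unreal.EditorAssetLibrary.save_loaded_asset(mesh)
```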



5. Creating Custom LODs:


  • Modelling the LOD Meshes - Since custom meshes are going to be used as LODs, the mesh variations can be created in any 3D modelling software that you are comfortable with (although we would point you in the direction of Blender!). Once the variations are created, they need to be exported individually (a scripted version of this step is sketched at the end of this section).

  • Import the original mesh into UE4 and open the Static Mesh Editor. Under LOD Settings, click on the LOD Import drop-down and select Re-import LOD Level 1. Repeat the same procedure to import all the remaining LODs.


  • Once the LODs are imported, play around with the reduction settings as per your requirements.
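
For the modelling step itself, here is a hedged Blender Python sketch that generates LOD variants with the Decimate modifier and exports each as its own FBX (the ratios and output paths are illustrative; FBX export applies the modifier by default):

```python
import bpy
import os

# Sketch: create LOD1-LOD3 variants of the active mesh via the Decimate
# modifier and export each one as its own FBX, ready for re-import in UE4.
source = bpy.context.active_object
ratios = {"LOD1": 0.5, "LOD2": 0.25, "LOD3": 0.1}

for lod_name, ratio in ratios.items():
    # Duplicate the source mesh so the original (LOD0) stays untouched.
    lod = source.copy()
    lod.data = source.data.copy()
    lod.name = f"{source.name}_{lod_name}"
    bpy.context.collection.objects.link(lod)

    # Reduce the polycount with a Decimate modifier (applied on export).
    mod = lod.modifiers.new(name="Decimate", type='DECIMATE')
    mod.ratio = ratio

    # Export just this LOD mesh to its own FBX file next to the .blend.
    bpy.ops.object.select_all(action='DESELECT')
    lod.select_set(True)
    bpy.context.view_layer.objects.active = lod
    bpy.ops.export_scene.fbx(
        filepath=os.path.join(bpy.path.abspath("//"), f"{lod.name}.fbx"),
        use_selection=True)
```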


So, having gone through this article, you should have a fair idea of the central tenets of level design for the Oculus Quest. However, at no point do we claim that this is an exhaustive list. As we keep working on innovative applications and projects, we will be back here to update anything new that we learn in the fast-evolving field of VR design & development.

As a closing note, we strongly encourage you to refer to the links below, which helped us glean a basic understanding of LODs specifically – we hope you enjoy them too!

References


