This post covers mobile (untethered) VR development: choosing the Oculus APIs and app versions, design issues and pitfalls to watch out for during development, performance optimisation, testing, and publishing on the Oculus Store.
Before we get started, let's think about why such development is needed in the first place.
There are currently two main areas of VR development - tethered and untethered, i.e. computer-based and mobile-based. Tethered VR uses powerful desktop graphics cards to drive images in high-end headsets such as the Oculus Rift and HTC Vive. Mobile (untethered) VR uses the user's phone inserted into a headset made of plastic, cardboard or foam; it relies solely on the computing resources of that phone, whose characteristics can vary widely.
Thoughts on mobile VR
Mobile VR is currently the cheapest way for consumers to immerse themselves in virtual reality; there are many supported mobile devices and low-cost glasses for it. Despite some problems (we'll discuss them below), current and future models of mobile devices are powerful enough to provide a compelling VR experience comfortable for most users (up to a respectable 60Hz on the Samsung Gear VR). More than 5 million Google Cardboard glasses have been sold, and Samsung claims a million users in the Gear VR ecosystem.
VR development with Oculus
The range of glasses is very wide, from the cheap Google Cardboard v2 (often given away for free at various events) and the more robust plastic variations, as well as foam versions such as Merge VR, to the Samsung Gear VR and a host of other mobile-connectable goggle options to be released in the near future. A wide range of devices and variations is good, but VR development ideally needs a precise set of learned hardware limitations and capabilities to provide the end user with a reliable, smooth and comfortable gaming experience. So we'll focus on the best-in-class devices to date: the Samsung Gear VR and Google Daydream VR.
Thanks to the low cost of entry for end-users and consumers, the mobile VR market is growing rapidly, with hundreds of new apps appearing every month. In 2016, with the release of the Oculus Rift and HTC Vive, and the soon-to-launch Sony PlayStation VR, virtual reality has finally reached general consumer awareness; VR appears at many public events and shows.
Even though Cardboard is primarily supported on Android, developers can conveniently publish their apps on the Play Store; there we already see plenty of cheap, low-quality VR apps that offer no enjoyable gaming experience. By contrast, the Gear VR shop and the Daydream shop (once it launches) will enforce quality control.
Of course, this creates additional barriers to publishing apps, but it provides end users with a certain level of quality assurance, which will accelerate the spread of VR: due to high quality, users will play their games more often and try out new ones. In the short term, quality control is a headache for developers, but overall it is better for everyone, because it helps VR get to the mass market.
Mobile VR has its own limitations that need to be considered before choosing a supported platform. Firstly, as the name implies, mobile VR content is delivered by mobile devices, not powerful computers. Although newer devices have much more processing power, there are still limits to the depth and complexity of a scene rendered in 3D (if you choose to go that route). Of course, this doesn't mean the devices are ineffective: a 3D-generated scene doesn't have to be photorealistic to convince users. Low-polygon worlds generated on mobile devices are interesting and effective in VR while still delivering high performance.
Another aspect: although mobile VR is computer-independent and frees the user from cables, it cannot yet track the headset's position in space. However the user moves, only the rotation of their head (around three axes) updates the camera. This means that many of the experiences tethered VR offers (crouching, approaching characters from any angle, moving freely within a room) are not available in mobile VR. However, hardware development is underway, which should solve this problem sooner rather than later.
So, mobile VR has less processing power and no positional head tracking; it's understandable why panoramic photo and video apps are prevalent and well supported. However, they lack any means of real interaction, the user can only look around, so some don't consider this 'proper' VR. Others believe that, given the improvements in mobile VR and the rise in computing power, developers should aim for full, high-quality VR: the gap will only narrow, and apps built around today's limitations risk becoming obsolete as the hardware rapidly develops.
Preparing for VR development
Publishing apps on the Google Play Store is easier, and it's a great place to start if you're happy releasing free content, but I'll mainly focus on development for the quality-controlled shop Oculus offers for the Samsung Gear VR; the recommendations will hold equally true for the Google Daydream VR ecosystem, which launches later this year.
Development environment, SDKs, and how to become an Oculus developer
To create a 3D VR application, you need a 3D development environment that supports the necessary VR Software Development Kits (SDKs); in our case this is the Oculus Mobile SDK. The best-suited 3D engine is Unity, which comes in free and paid versions. The free version has certain limitations and usage requirements, but it now includes everything you need to publish mobile VR for Android devices (such as the Samsung Gear VR and Google Daydream VR) and iOS (which doesn't have strong VR support yet, although Google Cardboard supports VR apps on the iPhone); unlike older Unity versions, you don't have to pay separately for these build targets.
After installing Unity, you will need to download the Oculus Mobile SDK for Unity and become an Oculus developer. This involves a simple registration process: create an Oculus ID to log in to the website, where you'll manage the applications you submit for testing before publishing. The Mobile SDK has gone through many iterations, is ready to use out of the box, and has very detailed documentation. Alongside it, you can also download the Audio SDK to make the most of Oculus' audio-positioning features, which deepen the immersion for VR app users; it too is user-friendly and well documented.
After setting up your Oculus ID, you can also register on the Oculus (as well as Unity) forums to find answers to any questions and problems that arise, and to become part of the growing community of VR developers.
We've covered the software side, but for professional development and quality apps you'll also need hardware, so you can build, test and run your apps locally before sending them to the shop. If money isn't an issue, try to get hold of as many supported Samsung devices as possible to test your app and check its performance. If money is tight, buy the "lowest common denominator", the Samsung Galaxy S6, to make sure the app meets at least the minimum supported spec. Having test hardware also means you can let users try the app before release and gather feedback. Usability and user-friendliness are key elements of VR development; it's very important that your VR apps don't make users feel "seasick"! But that's a topic for another post...
Samsung Gear VR
Make sure that the mobile device is switched to developer mode and that it allows the installation of applications from unknown sources:

1. To activate developer mode in Android, go to Settings > About device and tap Build number seven times.
2. Then go to Settings > Application Manager, select Gear VR Service and then Manage Storage. Tap VR Service Version several times until the Developer Mode switch appears, then toggle it on to enable Oculus Gear VR developer mode.
3. Finally, allow the device to install and run apps from unknown sources, because test builds aren't downloaded from the Oculus or Google Play Stores: go to Settings > Security, enable the Unknown sources option and select OK in the dialog box.

The mobile device will then be able to run applications under development for testing and demos. Note: if the Gear VR Service option is not available, you will need to insert your device into the Gear VR glasses and complete the installation and configuration of the Oculus Gear VR software, drivers and app; yes, this means you will have to buy the glasses!
The final element connecting hardware and software is to obtain an OSIG (Oculus Signature) file for each test device and embed it in your test builds, so that they run in the Oculus Gear VR system without being a published app. First download the Device ID app from the Google Play Store and run it to get the Device IDs of your devices. Once you have them, go to the Oculus OSIG Generator website and enter each Device ID to obtain a unique OSIG file. Once downloaded, place it in your Unity project (under Assets/Plugins/Android/assets/) to ensure that the app's builds will run on your mobile device.
Hardware and software are sorted, but what about design and development?
Of course, not every app you design needs a public release, so a good approach (especially with VR) is to build a first series of prototypes implementing simple ideas before moving on to a full development cycle for a polished app. Google has created a large number of interactive Daydream VR prototypes, simple and effective demos that help developers understand the features the new hardware and input controller will make available. If you're new to VR development, study them too, to learn what works and, most importantly, what doesn't work in VR, and to be aware of mobile VR's current limitations. If your app is slow or uncomfortable to use, it will not be published.
Keep it simple
Because of the limited input options in mobile VR, simple ideas and interactions work well. Of course, there are Bluetooth controllers for Android, the Samsung Gear VR supports them, and Google Daydream VR is claimed to ship with a compatible controller, but most users do not yet own a Bluetooth gamepad. So if you develop an app (usually a game) that works only with a gamepad and doesn't support the touchpad on the side of the glasses, you significantly shrink your potential audience.
The touchpad on the side of the Gear VR (v1) is embossed to resemble the D-pad of a game controller, which makes diagonal swipes and interactions harder, but for new VR users it's easier to master swiping forward, back, up and down with the protrusion marking the centre of the touch area. The new Gear VR, now available for pre-order and due later this year, has reverted to the flat touchpad design of the earlier Innovator Editions. That's a good design decision, because a flat pad can track more complex finger movements.
The main drawback of the side touchpad is that new VR users tend to grab the glasses with both hands while getting used to the experience. This often leads to accidentally exiting or pausing the app, depending on its design, which confuses users and makes demos difficult to run because, unlike tethered VR, you can't see what users see unless the phone's screen is mirrored to an external display.
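Since the Gear VR touchpad is surfaced to Unity as mouse input, tap and swipe handling can be sketched roughly like this (the threshold value and the handler names are illustrative assumptions, not part of any official API):

```csharp
using UnityEngine;

// Hedged sketch: on Gear VR a touchpad tap arrives as mouse button 0, so a
// swipe can be reconstructed from the touch position between press and release.
public class TouchpadSwipe : MonoBehaviour
{
    private Vector2 pressPosition;
    private const float SwipeThreshold = 100f; // pixels; tune per device

    void Update()
    {
        if (Input.GetMouseButtonDown(0))
            pressPosition = Input.mousePosition;

        if (Input.GetMouseButtonUp(0))
        {
            Vector2 delta = (Vector2)Input.mousePosition - pressPosition;
            if (delta.magnitude < SwipeThreshold)
                OnTap();
            else if (Mathf.Abs(delta.x) > Mathf.Abs(delta.y))
                OnSwipe(delta.x > 0 ? "forward" : "back");
            else
                OnSwipe(delta.y > 0 ? "up" : "down");
        }
    }

    void OnTap() { Debug.Log("Tap"); }
    void OnSwipe(string direction) { Debug.Log("Swipe " + direction); }
}
```

Sticking to these four cardinal swipes plus tap keeps the scheme usable on both the embossed v1 pad and the flat pad.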
Keep in mind that for most users trying out your app, this may be their first VR experience: if the input method is simple, they will be much quicker to master the technology, which could be awesome for them in and of itself. Fortunately, VR is growing in popularity, more and more people have access to VR, so this factor will need to be taken into account less and less; hopefully next year we won't have to worry about it at all.
For an app to be approved and user-friendly, it must consistently deliver 60 frames per second (FPS) on the Samsung Gear VR. This bar is recognised as the minimum comfortable VR frame rate for most users. Any drop in frame rate, even a short one, can cause 'motion sickness', because the virtual world starts to jerk and stutter as it tries to keep up with head movements.
This can be a challenge if you're not used to 3D engine optimisation or geometry simplification. A frame of your VR scene should stay around 50,000 polygons (100,000 at the absolute maximum), so you need to plan ahead, budget carefully, and use Unity's tricks to keep the picture good without sacrificing stability.
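A simple way to catch budget overruns during on-device testing is a frame-time watchdog. This is an illustrative helper, not an official Oculus tool:

```csharp
using UnityEngine;

// Logs a warning whenever a frame exceeds the 60 FPS budget of ~16.7 ms.
// Attach to any GameObject during development builds; remove for release.
public class FrameBudgetWatchdog : MonoBehaviour
{
    private const float BudgetSeconds = 1f / 60f;

    void Update()
    {
        // 10% tolerance so an occasional borderline frame doesn't spam the log
        if (Time.unscaledDeltaTime > BudgetSeconds * 1.1f)
            Debug.LogWarning(string.Format(
                "Dropped frame: {0:F1} ms (budget {1:F1} ms)",
                Time.unscaledDeltaTime * 1000f, BudgetSeconds * 1000f));
    }
}
```

Run it on the target phone rather than in the editor, for the reasons discussed in the performance section below.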
Fortunately, the latest Unity 5.4 supports single-pass stereo rendering, so you can get the same result with fewer hardware resources; the engine takes care of the VR-specific work: instead of rendering everything twice, it renders the frame once from two slightly offset viewpoints to produce each eye's view and the necessary 3D depth effect.
Mobile VR development with Oculus and Gear VR
John Carmack, one of the creators of DOOM, now works at Oculus and spends a lot of time on tools and development for mobile VR. Thanks to this, Gear VR has long supported Asynchronous TimeWarp, a technique the SDK uses to smooth over dropped frames and rid mobile hardware of image jerks. But that doesn't mean you can use it as a "crutch" and skip optimising your code! It has its limits and won't always save you and your users.
Oculus says to consider the following limitations when developing mobile VR for Gear VR glasses:
- 50-100 draw calls per frame
- 50,000-100,000 polygons per frame
- As few textures as possible (but their size can be huge)
- 1-3ms for scripts execution
- Effective CPU and GPU throttling to control heat, battery consumption and scene speed
Note. All other Android APIs and SDKs (such as Google Cardboard) typically do not provide access to direct control of the mobile device's CPU and GPU, only Oculus provides this capability for selected Samsung device models through partnering with the Gear VR and mobile SDK.
What are Oculus APIs?
The Oculus APIs provide everything a developer needs to create virtual reality games and applications in combination with an Oculus headset; the first version was released on 19 December 2015. The APIs and the accompanying SDKs show developers how to install and configure the Oculus Rift, build games using a game engine, and create "immersive audio" to complement VR games. To publish, developers sign their apps through Oculus; for development, a temporary signature (OSIG) is provided.
Does Oculus use C++?
Knowledge of C++ is mandatory when developing native experiences for certain VR headsets. For example, the Oculus Rift PC SDK is written in C++.
Oculus API app versions
- PC SDK: learn how to get started with the PC SDK to create great experiences with the Oculus Rift.
- Unity: learn the steps necessary to get your game or experience working with the Unity game engine.
- Audio SDK: learn how to install, configure, and use the Oculus Audio SDK to deliver immersive audio.
- Platform SDK: learn how to use the Platform SDK to enhance your app with social features, matchmaking, leaderboards, and other features.
- Avatar SDK: learn how to integrate avatars into your game or experience using native SDKs or game engines.
- Mobile SDK: learn how to develop games and experiences for Oculus Quest using the Mobile SDK.
General design tips and tricks
It's time to look at useful general VR design tips and tricks that help give beginners and experienced users alike a great VR experience.
The VR market is still fairly small, although a popular VR game can already sell 100,000 copies on the Oculus Gear VR shop. Don't expect to become a millionaire overnight; VR is nowhere near Angry Birds levels of popularity yet.
The VR developer community is open, welcoming, friendly and helpful. If you get confused, there are plenty of forums on Unity, Oculus, Gear VR and Android, as well as VR community channels on Slack. Find and attend a local VR developer meeting to meet and discuss your findings and challenges.
Don't be intimidated: Unity 5's VR tutorials offer quality, simple and straightforward design examples covering objects, scale, performance, interaction types and pretty much everything you need to understand the basics of VR development.
Oculus' guides to VR design and the use of mobile and sound SDKs are also invaluable sources of information, tips and detailed code examples for high performance and VR optimisation.
You'll need all of your target hardware, or at least a model with minimum supported specs, such as the Samsung Galaxy S6, to develop and test before publishing the app. (Strictly speaking, some Gear VR models support the Samsung Note 4, but only experienced developers should build and support apps for this model, because very serious optimization will be required to ensure a stable 60 FPS in VR for that device's chipset.)
Make the game duration convenient for those new to VR; provide short (around 15 minutes) and comfortable gaming sessions.
The length of gameplay segments will also help users keep an eye on the battery drain and heat of their mobile device, allowing them to recharge or cool down if necessary, without losing progress in the game.
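A session-length reminder along the lines above can be sketched as follows; the 15-minute figure comes from the advice in this post, and the pause-menu call is a hypothetical stand-in for your own UI:

```csharp
using UnityEngine;

// Sketch: after ~15 minutes of play, gently suggest a break so new users
// can rest and the device can cool down, without losing progress.
public class SessionTimer : MonoBehaviour
{
    private const float SessionSeconds = 15f * 60f;
    private float elapsed;
    private bool reminded;

    void Update()
    {
        elapsed += Time.unscaledDeltaTime;
        if (!reminded && elapsed >= SessionSeconds)
        {
            reminded = true;
            Debug.Log("Session reminder: suggest a short break and save progress.");
            // e.g. PauseMenu.Show("Time for a break?") -- hypothetical call
        }
    }
}
```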
If you're making a game in the 'horror' genre, make this clear, especially when testing, so users can decide whether or not to continue playing. Immersion in VR feels very real; surprises and sudden shocks have a much stronger effect on users.
Level of interaction: give users only simple controls at first so newcomers can enjoy the process; don't overwhelm them by forcing them to figure out complex controls while they're already absorbed by their first VR experience. Advanced controls can be introduced later by tweaking the game design to suit; ramping up complexity through gradual skill levels lets users feel experienced and proficient as they master more complex controls.
For the user's comfort, the scene should always respond to the user's head movements, even in menus and cut-scenes.
Also, never take control of the user's field of view or 'move' their head for them; this makes for an extremely unpleasant experience.
Avoid moving the user around the scene, and if they do move, they should move at a constant speed with no acceleration or deceleration. If the user needs to be moved, place them in the cab of a vehicle, for example, if possible, and let them focus on something in the foreground.
Avoid changing the user's point of view. Don't switch between the first and third person cameras: the user will feel like the world is "floating" out from under them.
If possible, allow the user to adjust their comfort level, so that players who are prone to 'motion sickness' can enable features that reduce discomfort and more experienced users can disable them. The following features can be used to reduce discomfort:
- Discrete rotation of the user's character;
- Darkening of the visibility area when moving and teleporting to a new location;
- Dimming and fading of peripheral vision when turning.
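Discrete (snap) rotation, the first comfort option above, can be sketched like this; the axis name and the 30-degree step are illustrative assumptions:

```csharp
using UnityEngine;

// Hedged sketch of snap rotation: instead of smooth turning, the player's
// rig rotates in fixed 30-degree steps, which many users find less
// nauseating. Attach to the root of the player/camera rig.
public class SnapRotation : MonoBehaviour
{
    public float stepDegrees = 30f;
    private bool turnedThisFlick;

    void Update()
    {
        float x = Input.GetAxis("Horizontal"); // gamepad stick, if present
        if (Mathf.Abs(x) > 0.7f && !turnedThisFlick)
        {
            transform.Rotate(0f, stepDegrees * Mathf.Sign(x), 0f);
            turnedThisFlick = true; // wait for the stick to re-centre
        }
        else if (Mathf.Abs(x) < 0.2f)
        {
            turnedThisFlick = false;
        }
    }
}
```

Exposing `stepDegrees` (or the option itself) in a settings menu lets sensitive players turn the feature on and experienced players turn it off, as suggested above.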
Don't forget that you are developing a stereoscopic application! All elements should be rendered twice and should be embedded in the world to be immersive, not on top of it as in a traditional UI.
If you have to use a traditional UI, project it onto a surface in the world rather than directly onto the screen to maintain a sense of depth. Placing it 1-3 metres away from the user is generally considered most comfortable.
If possible, "embed" the UI into a logical, appropriate 3D world object, such as a book, scroll, mobile phone or wrist display, so the user can interact with it in a natural way.
Design the UI layout so that it sits naturally within the user's comfortable line of sight and they don't have to move their head much to see every menu option; moving the head to select items is fine, but items placed far to the left or right cause neck strain.
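A minimal sketch of the placement advice above: position a world-space UI panel a couple of metres in front of the user at eye height, facing them. The 2-metre figure follows the 1-3 m comfort range; the field names are illustrative:

```csharp
using UnityEngine;

// Attach to the root of a world-space Canvas; call ShowInFrontOfUser()
// (here, on enable) to park the panel in the user's comfortable sightline.
public class PlaceMenuInFront : MonoBehaviour
{
    public Transform head;          // the VR camera transform
    public float distance = 2f;     // within the comfortable 1-3 m range

    void ShowInFrontOfUser()
    {
        Vector3 forwardFlat = head.forward;
        forwardFlat.y = 0f;         // keep the panel at eye height
        forwardFlat.Normalize();

        transform.position = head.position + forwardFlat * distance;
        transform.rotation = Quaternion.LookRotation(forwardFlat);
    }

    void OnEnable() { ShowInFrontOfUser(); }
}
```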
As said before, input is usually limited to touchpad interaction because most owners will not have a bluetooth gamepad with buttons to select menu items. Therefore, you need to think about alternative input mechanisms that make use of VR properties.
Mobile VR applications often use gaze interaction: a virtual cursor follows the user's head movements, and the user only needs to look at a menu item to interact with it. Usually a radial indicator gradually fills while the user keeps looking at a particular item; once it fills up, the item is selected.
Although gaze interaction is easy to use, consider tapping selections for experienced or impatient users: multi-level UI menus can be tedious and slow to navigate with a gaze.
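Gaze selection with a dwell timer can be sketched as below. The "MenuItem" tag, the fill image and the `OnGazeSelect` message are illustrative assumptions, not a standard API:

```csharp
using UnityEngine;
using UnityEngine.UI;

// Hedged sketch: a ray is cast from the centre of the user's view; while it
// stays on an object tagged "MenuItem", a radial Image fills up, and the
// item is selected when the dwell time is reached.
public class GazeSelector : MonoBehaviour
{
    public Camera eye;              // the VR camera
    public Image reticleFill;       // radial Image (fillAmount 0..1)
    public float dwellSeconds = 2f;

    private float dwell;
    private GameObject current;

    void Update()
    {
        RaycastHit hit;
        if (Physics.Raycast(eye.transform.position, eye.transform.forward,
                            out hit, 10f)
            && hit.collider.CompareTag("MenuItem"))
        {
            if (hit.collider.gameObject != current)
            {
                current = hit.collider.gameObject;
                dwell = 0f;         // reset when gaze moves to a new item
            }
            dwell += Time.deltaTime;
            reticleFill.fillAmount = dwell / dwellSeconds;
            if (dwell >= dwellSeconds)
            {
                current.SendMessage("OnGazeSelect"); // hypothetical handler
                dwell = 0f;
            }
        }
        else
        {
            current = null;
            dwell = 0f;
            reticleFill.fillAmount = 0f;
        }
    }
}
```

To support the impatient users mentioned above, a touchpad tap while `current` is non-null could trigger the selection immediately instead of waiting for the dwell to complete.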
If the UI is displayed on top of a running application, clearly let the user know they are in a menu and not in the world. Depending on the purpose of the menu, you can pause the game when it is displayed, or at least adjust the lighting and focus when in the menu.
A static icon in one corner of the view, following the user's head movements, constantly reminds them that they are in the menu, even if they look far left, right or behind and lose sight of the panel. Better still, have the entire menu follow the user's head so it is always in view.
Use the Gear VR glasses' physical Back button to let users step back through multi-level menus, or even enable front-to-back swiping, although the latter can be confusing if the menu is a library through which the user scrolls multiple cells of content, such as a photo or video catalogue.
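In Unity, a short press of the Gear VR back button is reported as `KeyCode.Escape`, so the "one level back per press" behaviour reduces to a few lines; the `MenuStack` mentioned in the comments is a hypothetical stand-in for your own menu system:

```csharp
using UnityEngine;

// Pops one menu level per back-button press; a long press is reserved by
// the system for the Oculus global menu and should not be handled here.
public class BackButtonHandler : MonoBehaviour
{
    void Update()
    {
        if (Input.GetKeyDown(KeyCode.Escape))
        {
            // e.g. if (MenuStack.Depth > 1) MenuStack.Pop();
            //      else show a quit confirmation
            Debug.Log("Back pressed: go up one menu level");
        }
    }
}
```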
Performance optimisation and testing
If you've made it to this point, where we've already covered why you should create mobile VR apps for the Samsung Gear VR (and prepare a bit for Google Daydream VR), how to set up a development environment, and general VR design guidelines, then you're clearly interested enough! From here on I'll assume at least a basic knowledge of 3D asset creation and app development, because I'll be using more specific terminology to cover key aspects of design and development in detail. You've been warned...
Optimising a mobile VR app's performance is key both to user comfort (let's set horror games aside for now) and to passing validation during the shop publishing process.
There are several aspects of performance: overall app performance, 3D optimization and battery consumption. All of these play an important role in keeping users playing the game as long as possible, leaving good reviews and telling their friends and acquaintances about it.
- Optimise the game to deliver 60 frames per second. Frame skipping is unacceptable: although Asynchronous TimeWarp does allow you to hide and smooth out some difficult scenes, don't rely on it completely.
- Don't rely on the frame counter in the Unity editor, because it does double work when running the scene on the computer. While it does give a good indication of performance levels, build and test the application on the target hardware to ensure smooth gameplay.
- Users often can't tell whether an image or scene more than 20 virtual metres away is stereoscopic or monoscopic. Take advantage of this by offloading distant areas to skyboxes to reduce the rendering load on the mobile device.
- Use Unity's built-in tools: the Profiler and Frame Debugger. These will show you if your app has lags or if it's too "heavy", allowing you to learn the construction of the scene step by step through rendering calls. It's very likely that you'll find objects that don't need to be rendered, which will reduce the total number of rendering calls.
- Also, whenever possible, batch render calls with Unity Editor's built-in Static Batching and Dynamic Batching tools.
- Remove surfaces of 3D models that are never visible, to get rid of unnecessary polygons.
- Also use occlusion culling to avoid rendering meshes that aren't visible, like the geometry of the room behind a door that isn't open yet.
- Simplify 3D meshes as much as possible to have the lowest level of detail of objects, but without losing the necessary information.
- Make the number of overdraw operations as small as possible, so that fewer objects are rendered on top of each other. The Scene View Control Bar in Unity will give you an idea of what you can optimize.
- Also, use lighting maps (lightmapping) to bake shadows on objects instead of using resource-intensive dynamic shadows.
- If you need to render a 3D object that is far away, use a model with a low level of detail (LOD) and fewer triangles, swapping in a high-LOD model as it gets closer to the user.
- Make sure CPU and GPU throttling is enabled, because failure to initialise these values will cause the application to execute by default in an environment with a reduced clock speed. Gear VR applications are usually limited by CPU speed, so targeting the CPU rather than the GPU can often improve performance. However, if an app is well optimised, it is possible to reduce the frequency of both the CPU and GPU, resulting in longer battery life, and therefore longer game session times.
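The throttling advice above can be sketched with the Oculus Utilities for Unity, assuming that package is imported into the project; `cpuLevel`/`gpuLevel` range from 0 (lowest clock) to 3 (highest), and the exact values here are illustrative:

```csharp
using UnityEngine;

// Hedged sketch: Gear VR apps are usually CPU-bound, so a well-optimised
// app might keep the CPU level up while lowering the GPU level to save
// battery and reduce heat, extending session length.
public class ThrottleSettings : MonoBehaviour
{
    void Start()
    {
        OVRManager.cpuLevel = 2; // favour the CPU...
        OVRManager.gpuLevel = 1; // ...and save power on the GPU
    }
}
```

Profile on the device after changing these values: the right combination depends on where your app's bottleneck actually is.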
The key to building a quality app is regular testing, with iterations between each session including suggestions and improvements to gameplay, UI, development and design instead of putting off all changes until the last minute when you think everything is 100% ready. This way you can continually make small improvements that will generally require less effort than suddenly discovering a major design flaw found too late and requiring a huge amount of rework and fixing.
As a developer you are too immersed in the process to notice problems and bugs in the application, so user testing, starting early and continuing throughout the development process, is critical to ensure that you don't miss something obvious that the user will immediately notice. However, there are tests that you can and should perform before showing the application to others.
The main types of testing you will do concern functionality and performance, to ensure the application functions correctly at the basic level needed for comfortable play. You can write unit tests for various aspects of the functionality, but you will still need to test manually and find problems yourself.
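Pure game logic can be unit-tested without a headset; Unity's editor test runner is NUnit-based. The `ScoreRules` class below is a made-up example included so the test has something to check:

```csharp
using NUnit.Framework;

// Hypothetical game-logic class: unlocks the next level at a score threshold.
public class ScoreRules
{
    private readonly int unlockScore;
    public ScoreRules(int unlockScore) { this.unlockScore = unlockScore; }
    public bool IsNextLevelUnlocked(int score) { return score >= unlockScore; }
}

[TestFixture]
public class ScoreRulesTests
{
    [Test]
    public void LevelUnlocksExactlyAtThreshold()
    {
        var rules = new ScoreRules(100);
        Assert.IsFalse(rules.IsNextLevelUnlocked(99));
        Assert.IsTrue(rules.IsNextLevelUnlocked(100));
    }
}
```

Device-dependent behaviour (frame rate, input, comfort) still needs manual, in-headset testing as described below.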
If you manage your development with an Agile methodology, you can create test suites based on your acceptance criteria and user stories so the application works and contains all the required features. Otherwise, think about test data sets that effectively cover all possible conditions and actions: not just expected user behaviour, but also what a user might do to break the flow and get "stuck" in the game (failing to tick the right box, or not scoring enough to reach the next level with no option to retry).
Functional testing of VR is harder than testing a standard, flat app, because it is best done inside the headset, which means you can't quickly switch to a spreadsheet or notepad to jot down problems you find. It is therefore recommended to test in pairs: one tester works in the app and performs all the tests, while the other writes down the remarks the first speaks aloud.
But until you get to that point, you can run the app directly in the Unity editor to test its functionality and performance without building and installing it on the mobile device. As noted in the performance optimisation section, the Profiler, Frame Debugger and Scene View in Unity allow initial performance testing, and the editor itself reports exceptions and errors in the code.
User testing requires more preparation and time to guarantee quality feedback about the app itself rather than the technology. As I mentioned in an earlier post, about 9 out of 10 people still haven't tried VR, so using them as novice testers requires preparing and facilitating sessions in advance.
When setting up a user testing session, give each tester time to familiarise and "acclimatise" themselves with VR before they start trying out the app. If testers are new to VR, this lets them first take in the technology and the immersion, so they can then be useful by providing feedback specifically about your application. Once they know what VR can do and how it works at a basic level, they will be ready to test your app with a clearer understanding of how it should work and what feelings it should evoke. Good introductory Gear VR apps include Samsung Introduction to Virtual Reality (free) and Welcome to Virtual Reality by SliceVR (paid).
Prepare a set of questions to ask users about the testing session of your app, for helpful feedback and information about how they felt, how easy it was to understand what to do, where they had problems or difficulties, and aspects that were uncomfortable for them (in terms of performance rather than content, especially in the case of horror games).
Remember that content shown on a mobile VR device is hard to mirror to a separate monitor, so you won't see in real time what testers are looking at (or pointing at in the air). Prepare printouts of the app's important screens and menus so that after a test session users can point at them and describe the screens and panels they saw (they won't necessarily use the names you gave those items).
If your budget is large enough, you can use companies offering VR testing services to save your own labour and time. Testronic Labs offers paired VR testing for functionality and compatibility, while Player Research, leaders in user research and user testing, provide developers with detailed post-test reports as part of their service.
So, your app is already running stably at 60 FPS and error free (as far as software goes), it has been tested and found to be comfortable and easy to use by its target audience of end users. Now it's time to submit the app to the Oculus Store and get ready for release!
Sending to a shop
To be able to sell a Gear VR app on the Oculus Store, the app needs to be checked by the Oculus Store team for comfort, performance and overall compliance, and then the green light is given to release the app.
It's a fairly straightforward process, but it does require some configuration work on your part. How much depends on the features the app uses (e.g. in-app purchases (IAP), multiplayer matchmaking, achievements, leaderboards, etc.). Many items require API and ID configuration, so you'll need to go back to the app project in Unity to make sure the correct values are used to unlock each achievement, in-app purchase and so on.
You should have already set up your Oculus ID, but if you haven't, go to oculus.com. Under Developers > Dashboard, select Create new organisation so you can create an application profile. Make sure to enter your organisation's important information correctly (address, financial info, etc.) so you can receive hardware and, more importantly, the monthly payouts for your app's sales.
Once your organisation is set up, you can start setting up your app: go to My Apps > Create New App and enter the required information.
IMPORTANT: There's currently no option to completely delete your app once it's created, so make sure all the details are correct while you're creating it - you can go back and change your app info at any time, but if you're like me and like a clean dashboard, get it right!
The first step is to choose a platform: we're shipping a mobile VR app, so choose Gear VR and enter the app's full name to create the first entry.
Once you've created your first app profile with a name and platform, Oculus will generate a unique app ID to use in your Unity project to initialise the various Oculus APIs, especially those relating to in-app purchases and the entitlement (licence) check in the final version.
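A hedged sketch of the entitlement check (API names are from the Oculus Platform SDK for Unity; in practice the app ID usually lives in the SDK's Platform Settings asset rather than in code, and callback shapes vary between SDK versions):

```csharp
using UnityEngine;
using Oculus.Platform;

public class EntitlementCheck : MonoBehaviour
{
    void Awake()
    {
        // Initialise the Platform SDK; it reads the app ID generated
        // by the Oculus dashboard from the project's Platform Settings.
        Core.AsyncInitialize();

        // Verify that the current user actually owns the app.
        Entitlements.IsUserEntitledToApplication().OnComplete(msg =>
        {
            if (msg.IsError)
            {
                Debug.LogError("Entitlement check failed - quitting.");
                Application.Quit();
            }
        });
    }
}
```

Running this check early at startup is what the store review expects of the final build.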
Once you've entered your financial information, you can also create IAP tokens and IDs under Edit Details > Platform > IAP; these can then be referenced in the Unity project to trigger the corresponding actions.
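For illustration, fetching the catalogue might look like this (a hedged sketch against the Oculus Platform SDK for Unity; the SKUs `premium_upgrade` and `coin_pack_100` are hypothetical and must match the IAP IDs you created in the dashboard):

```csharp
using UnityEngine;
using Oculus.Platform;
using Oculus.Platform.Models;

public class StoreCatalog : MonoBehaviour
{
    // Hypothetical SKUs - replace with the IAP IDs created under
    // Edit Details > Platform > IAP in the Oculus dashboard.
    private readonly string[] skus = { "premium_upgrade", "coin_pack_100" };

    void Start()
    {
        IAP.GetProductsBySKU(skus).OnComplete(msg =>
        {
            if (msg.IsError)
            {
                Debug.LogError(msg.GetError().Message);
                return;
            }
            foreach (Product p in msg.GetProductList())
                Debug.Log(p.Sku + ": " + p.FormattedPrice);
        });
    }
}
```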
App store information
Your app's basic Oculus Store information is entered under Edit Details > Submission Info. Here you provide a full and a brief description, and select genre, features, supported peripherals, age restrictions and price.
Some of these items you choose yourself, some require third-party approval (e.g. age ratings); as for price, the Oculus Store staff will contact you and agree an appropriate amount. Don't forget that by default all apps are considered free, so if you want to sell yours, change the price before submitting!
Another time-consuming element, as with any store, is the artwork that needs to be created for the app's listing. Images of various shapes and sizes are required, depending on how and where the entry appears in the store, but the instructions are clear and easy to follow (e.g. about where the logo must sit so the overlaid dynamic sale banners don't cover it).
There is one interesting asset that can be added to Gear VR apps (and which is not yet supported for Rift apps): a cubemap texture, which lets potential customers view a panoramic screenshot of your app in Gear VR while browsing the store.
Of course, the listing information is important, but you also need to upload an app build for review and for download after purchase. Before submitting, run the Oculus Submission Validator tool on your APK file and make sure of the following:
- The XML manifest and install location are correct: Gear VR apps must be installed to the device's internal storage, not to external storage.
- A version code is specified, typically 1 if this is the first version of the app, or a higher value for a new build submitted after a previous review.
- The APK is signed, so that once it has been verified and approved for release it can be published without further uploads.
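The checklist above maps to a few AndroidManifest.xml entries. As a rough illustration (the package name is hypothetical, and you should treat the current Oculus submission documentation as the authoritative source for required attributes):

```xml
<!-- Gear VR builds must install to internal storage, not an SD card -->
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
          package="com.example.mygearvrapp"
          android:installLocation="internalOnly"
          android:versionCode="1"
          android:versionName="1.0">
    <application>
        <!-- Marks the APK as a VR-only application for Gear VR -->
        <meta-data android:name="com.samsung.android.vr.application.mode"
                   android:value="vr_only"/>
    </application>
</manifest>
```

Note that `android:versionCode` is the integer the store compares between builds; `android:versionName` is the human-readable "1.0" shown to users.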
The build management section allows you to upload builds to different channels: Alpha, Beta, Release Candidate and Live. Keep in mind that many journalists have access to the Release Candidate channel, so if you haven't been doing PR or marketing for the app before release, be prepared for them to stumble across it and publish a review without warning. Before uploading a build to this channel, it's best to get in touch with them and iron out any issues if the app isn't 100% ready for release!
The app is ready to be submitted
Once you're satisfied that all the data has been entered, you can submit under Submission Info > Submit, which shows a handy checklist with the current status of each required section. When a nice row of green checkmarks finally appears, you'll have one last chance to review the whole list and all the details before clicking the [SUBMIT FOR REVIEW] button. Once the app has been submitted, the Oculus Store Gear VR team will review the app information, test it and contact you with any suggested changes, after which the app will be deemed ready for release.
THAT'S IT! You've made it all the way - great job, and good luck with your mobile VR development.