We were recently tasked with creating a VR experience set in a high school classroom. Each person needed to be represented by a unique avatar, but the project had a tight budget. We decided to piggyback on all the hard work Meta had already done and use the Oculus Avatar2 SDK.
We set up our project in the Meta Developer console.
Each person who uses a Quest 2 already has an account and can quickly create a personalized avatar.
We created our API endpoints for logins, classrooms, and the various other odds and ends our classroom would need.
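A minimal sketch of how such an endpoint can be called from Unity, assuming a hypothetical /login route and UnityWebRequest on the client side; the base URL, endpoint, and form field below are placeholders, not our actual API:

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.Networking;

// Sketch of a backend call from Unity. The base URL, endpoint, and form
// field are hypothetical placeholders, not the project's actual API.
public class ClassroomApiClient : MonoBehaviour
{
    const string BaseUrl = "https://example.com/api";

    // Posts a display name to a hypothetical /login endpoint and logs
    // whatever JSON the server sends back (e.g. a session token).
    public IEnumerator Login(string displayName)
    {
        var form = new WWWForm();
        form.AddField("displayName", displayName);

        using (var request = UnityWebRequest.Post($"{BaseUrl}/login", form))
        {
            yield return request.SendWebRequest();

            if (request.result != UnityWebRequest.Result.Success)
            {
                Debug.LogError($"Login failed: {request.error}");
                yield break;
            }

            Debug.Log($"Login response: {request.downloadHandler.text}");
        }
    }
}
```

Something like StartCoroutine(Login("Student A")) can then be called once the headset user is known.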
We signed up for Exit Games' Photon Fusion networking to handle multiple users in a session.
Issues that we faced during development
To work with Unity and the Meta Quest Developer Hub, you need to register the project with Meta twice: once as a Quest 2 app and again as an Oculus Rift app. The Rift registration is what allows you to sideload the build onto the device. Doing this, however, had a side effect: every avatar displayed as the same user when calling "LoadLoggedInUserCdnAvatar()". I found that if you build and distribute the apk/aab through a release channel and everyone logs in on their own Quest, each avatar displays correctly.
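If you are debugging something similar, it helps to log which account the headset is actually signed in with before asking the Avatar SDK to load anything. Here is a rough sketch assuming the Oculus Platform SDK is initialized with the app ID from the developer dashboard; how you then hand the result to your OvrAvatarEntity depends on your SDK version and setup:

```csharp
using Oculus.Platform;
using Oculus.Platform.Models;
using UnityEngine;

// Sketch: confirm which Meta account this headset is signed in with
// before loading an avatar. Assumes the Oculus Platform SDK is set up
// with the app ID from the developer dashboard.
public class LoggedInUserCheck : MonoBehaviour
{
    private void Start()
    {
        Core.AsyncInitialize().OnComplete(OnPlatformInitialized);
    }

    private void OnPlatformInitialized(Message<PlatformInitialize> message)
    {
        if (message.IsError)
        {
            Debug.LogError($"Platform init failed: {message.GetError().Message}");
            return;
        }

        Users.GetLoggedInUser().OnComplete(OnLoggedInUser);
    }

    private void OnLoggedInUser(Message<User> message)
    {
        if (message.IsError)
        {
            Debug.LogError($"GetLoggedInUser failed: {message.GetError().Message}");
            return;
        }

        // If every headset prints the same ID here, the avatars will all
        // load as the same user regardless of who is wearing the device.
        Debug.Log($"Logged-in user ID: {message.Data.ID}");

        // Once the account looks right, trigger LoadLoggedInUserCdnAvatar()
        // (or pass the ID along) on your OvrAvatarEntity.
    }
}
```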
Fusion setup is different from Photon (PUN). The idea is the same, but it functions differently. Since this is not a game, but what Exit Games classifies as "Industry", the pricing model is also different. I always liked how quickly Photon let me add multiplayer functionality to projects, so this works out well for me.
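To give a feel for the difference: where PUN connects through PhotonNetwork.ConnectUsingSettings() and then joins a room, Fusion centers everything on a NetworkRunner and a single StartGame call. Here is a rough sketch of starting a shared-mode session; the session name and player count are illustrative, not our production settings:

```csharp
using Fusion;
using UnityEngine;

// A minimal Fusion "Shared" mode connection, in place of PUN's
// connect-then-join-room flow. Values here are placeholders.
public class ClassroomSession : MonoBehaviour
{
    private NetworkRunner _runner;

    public async void JoinClassroom(string classroomCode)
    {
        _runner = gameObject.AddComponent<NetworkRunner>();
        _runner.ProvideInput = true; // this client will supply input

        var result = await _runner.StartGame(new StartGameArgs
        {
            GameMode = GameMode.Shared,      // all clients are peers, no dedicated host
            SessionName = classroomCode,     // one session per classroom
            PlayerCount = 30,                // roughly one class worth of headsets
            SceneManager = gameObject.AddComponent<NetworkSceneManagerDefault>()
        });

        if (!result.Ok)
        {
            Debug.LogError($"Failed to join session: {result.ShutdownReason}");
        }
    }
}
```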
The exported Android project doesn't come out with all the proper pieces in place. However, Unity's Oculus plugin takes care of building a functioning apk/aab without too much trouble. The only reason I would still want to export an Android project would be to integrate another library, such as OpenCV, to work on some runtime volumetric ideas. Finally, if anyone is interested in seeing any code or how I have Unity set up, let me know.