In an attempt to get an accurate sense of how the hardware would handle the high-quality 3D models of the display homes I had seen in the prototype, I decided a good place to start would be to get my studio project running on the HTC Vive Focus. Although my models were nowhere near as impressive as the professionals’, my scene had multiple rooms, particle effects, and shadows, so I decided it would make a fine test case.
The Focus is a mobile Android device, so it requires a different approach and a different set of SDKs from those I had used when developing for the HTC Vive. Developing on a Mac added extra fun when searching for tutorials and information, especially as the Focus was still very new at this stage. Using Android Studio, I installed the required SDKs, and after some tests with different virtual reality prefabs, I integrated the Vive Input Utility. This asset was great: it offers cross-platform compatibility, prefabs, and sample scenes that made the process relatively simple.
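To give a rough idea of what Vive Input Utility code looks like, here is a minimal sketch of polling a controller button through VIU's ViveInput API (a simplified illustration, not the project's actual code):

```csharp
using UnityEngine;
using HTC.UnityPlugin.Vive; // Vive Input Utility namespace

public class TriggerLogger : MonoBehaviour
{
    void Update()
    {
        // VIU abstracts the device behind HandRole and ControllerButton,
        // so the same call works on the Vive, the Focus, and other headsets.
        if (ViveInput.GetPressDown(HandRole.RightHand, ControllerButton.Trigger))
        {
            Debug.Log("Trigger pressed");
        }
    }
}
```

That device-agnostic layer is what makes the cross-platform story so painless.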
Through Android File Transfer, and with a few tweaks to the quality settings, we had a functional virtual reality scene running on the HTC Vive Focus! I must say, the Focus allowed for a very streamlined develop-and-test workflow. Pictured above is a staged photoshoot of me recreating what I would look like when testing the project; in reality, I’m just posing for my webcam to see how futuristic I look to colleagues and clients. The Focus let me transfer a new build from my Mac to the device and have it installed and running within minutes, without even needing to stand up.
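For the curious, the quality tweaks were the usual mobile VR suspects: shorter shadow distance, fewer pixel lights, and so on. Something along these lines, although I set mine through Unity's Quality Settings panel and the exact values here are purely illustrative:

```csharp
using UnityEngine;

public class MobileQualityTuner : MonoBehaviour
{
    void Awake()
    {
        // Illustrative values: rein in the expensive desktop settings
        // for the Focus's mobile GPU.
        QualitySettings.shadowDistance = 15f;             // shorter shadow draw distance
        QualitySettings.shadows = ShadowQuality.HardOnly; // skip soft shadows
        QualitySettings.pixelLightCount = 1;              // one real-time pixel light
        QualitySettings.antiAliasing = 2;                 // 2x MSAA is usually affordable
    }
}
```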
By studying the sample scenes included with the Vive Input Utility package, I was able to implement teleportation and various methods of interaction. The two examples I tested are shown in the images below.
The example in the image on the left required the user to teleport up close to the photographs of the houses, reach the controller out until it collided with a picture frame, and then press the trigger to initiate a scene change. This system was awkward, especially on the Focus, whose controller tracking is far more limited than the full Vive’s (the Focus controller tracks rotation but not position, so reaching out to touch things is imprecise). As the application will be used by a wide audience, it needed to be as intuitive, simple, and user-friendly as possible, so I kept investigating until I found an alternative approach.
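For illustration, a simplified version of that touch-the-frame interaction could look like the script below. The scene name field, the "Controller" tag, and the collider setup are all my own naming for the sketch, not the project's actual code:

```csharp
using UnityEngine;
using UnityEngine.SceneManagement;
using HTC.UnityPlugin.Vive;

// Attach to a picture frame that has a trigger collider;
// the controller model needs a collider tagged "Controller".
public class FrameTouchLoader : MonoBehaviour
{
    [SerializeField] private string houseSceneName; // hypothetical: scene for this house

    private bool controllerInside;

    void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag("Controller")) controllerInside = true;
    }

    void OnTriggerExit(Collider other)
    {
        if (other.CompareTag("Controller")) controllerInside = false;
    }

    void Update()
    {
        // Only change scenes while the controller overlaps the frame
        // and the trigger is pressed at the same time.
        if (controllerInside &&
            ViveInput.GetPressDown(HandRole.RightHand, ControllerButton.Trigger))
        {
            SceneManager.LoadScene(houseSceneName);
        }
    }
}
```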
The image on the right shows the second solution I developed. To interact with this menu system, the user presses a button to switch on the controller’s laser, points it at any one of the photographs, and pulls the trigger to select it, initiating a scene change. This approach was more user-friendly and flexible, as users could jump to a different house from across the room if they wanted to. The laser also provides visual feedback when it is pointed at something selectable, so the experience is more intuitive.
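Because VIU's laser pointer can drive Unity's standard EventSystem (given a raycaster on the pointer and a collider on the target), each photograph only needs to implement the usual pointer interfaces. A hedged sketch, with houseSceneName again being my own illustrative naming:

```csharp
using UnityEngine;
using UnityEngine.EventSystems;
using UnityEngine.SceneManagement;

// Attach to each selectable photograph. VIU's pointer raises the
// standard EventSystem events, so plain handler interfaces suffice.
public class PhotoMenuItem : MonoBehaviour,
    IPointerClickHandler, IPointerEnterHandler, IPointerExitHandler
{
    [SerializeField] private string houseSceneName; // hypothetical field name

    public void OnPointerClick(PointerEventData eventData)
    {
        SceneManager.LoadScene(houseSceneName); // laser click selects the house
    }

    // The hover callbacks are where the visual feedback mentioned above lives,
    // e.g. enabling a highlight material on the frame.
    public void OnPointerEnter(PointerEventData eventData) { }
    public void OnPointerExit(PointerEventData eventData) { }
}
```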