We used the OVR Integration and OVR Utilities packages for Unity (license uncertain, possibly MIT), along with Unity's NavMeshComponents (MIT), to make the game.
These packages, together with Unity's built-in Oculus support, enabled head and hand motion tracking, grabbing items on contact and at a distance, and enemies that patrol specified areas of the map.
We implemented a state machine to create enemies that change behavior. The two main states are Patrolling and Hunting (entered when the player is in view). Both states drive Unity's NavMeshAgent component over a NavMeshSurface baked with the NavMeshComponents SDK.
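The state machine described above could be sketched roughly as follows. This is a hypothetical reconstruction, not the project's actual code: the field names (`patrolPoints`, `player`, `sightRange`) and the distance-based visibility test are our assumptions; only NavMeshAgent and its `SetDestination` call come from the Unity API the text mentions.

```csharp
using UnityEngine;
using UnityEngine.AI;

// Hypothetical sketch of a two-state (Patrolling/Hunting) enemy controller.
public class EnemyAI : MonoBehaviour
{
    enum State { Patrolling, Hunting }

    public Transform[] patrolPoints; // assumed: waypoints placed in the editor
    public Transform player;
    public float sightRange = 15f;   // assumed detection radius

    State state = State.Patrolling;
    NavMeshAgent agent;
    int nextPoint;

    void Start()
    {
        agent = GetComponent<NavMeshAgent>();
    }

    void Update()
    {
        // Assumed visibility test; the real project also raycasts against walls.
        bool playerVisible =
            Vector3.Distance(transform.position, player.position) < sightRange;
        state = playerVisible ? State.Hunting : State.Patrolling;

        if (state == State.Hunting)
        {
            // Chase the player across the baked NavMeshSurface.
            agent.SetDestination(player.position);
        }
        else if (!agent.pathPending && agent.remainingDistance < 0.5f)
        {
            // Advance to the next waypoint once the current one is reached.
            nextPoint = (nextPoint + 1) % patrolPoints.Length;
            agent.SetDestination(patrolPoints[nextPoint].position);
        }
    }
}
```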
Hunting also pairs a collision shape with a raycast to ensure the enemy cannot "see" the player through walls.
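The wall check above amounts to a single raycast from the enemy toward the player, entered only after a trigger collider has detected the player nearby. A minimal sketch, assuming Unity's `Physics.Raycast` and that the player object carries its own collider (the method name and parameters here are illustrative):

```csharp
using UnityEngine;

// Hypothetical line-of-sight test: true only if the first collider hit
// between the enemy's eye and the player is the player itself.
static bool CanSeePlayer(Transform eye, Transform player)
{
    Vector3 toPlayer = player.position - eye.position;
    if (Physics.Raycast(eye.position, toPlayer.normalized,
                        out RaycastHit hit, toPlayer.magnitude))
    {
        // A wall hit first means the player is occluded.
        return hit.transform == player;
    }
    return true; // nothing between the eye and the player
}
```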
The crossbow instantiates arrow objects, and these arrows damage enemies. Enemies with an outer shell are intended to become vulnerable to ranged weapons only after the shell is dissolved.
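Spawning arrows from the crossbow likely reduces to a call to Unity's `Instantiate`. The sketch below is an assumption about how this might look, not the project's code: `arrowPrefab`, `muzzle`, and `fireForce` are invented names, and the Rigidbody impulse is one common way to launch a projectile.

```csharp
using UnityEngine;

// Hypothetical crossbow: spawns an arrow prefab and launches it forward.
public class Crossbow : MonoBehaviour
{
    public GameObject arrowPrefab; // assumed: prefab with a Rigidbody and damage script
    public Transform muzzle;       // assumed: spawn point at the crossbow's tip
    public float fireForce = 30f;  // assumed launch impulse

    public void Fire()
    {
        GameObject arrow = Instantiate(arrowPrefab, muzzle.position, muzzle.rotation);
        arrow.GetComponent<Rigidbody>()
             .AddForce(muzzle.forward * fireForce, ForceMode.Impulse);
    }
}
```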
The new Lightweight Render Pipeline (LWRP) was used to access Shader Graph, making shader and material creation easier. Shaders created this way include the player's see-through holographic hands, the crossbow's scan-line effect, and the enemies' grid-scrolling and dissolve shaders.
Blender was used to make the level, crossbow, and arrows.
A private GitHub repo was used for version control.