Week 9 Dev
Time used: 12 hours (2+4+4+2)
Last week, I spent two hours watching Austin Krauss’s talk “From Commodore to Oculus: My life in video games”. https://umdearborn.edu/alumni/get-involved/alumni-resources-keep-you-connected/um-dearborn-homecoming/speaker-series From his talk, I picked up some professional insights about game design. He contrasted working on competitive multiplayer, where players fight against each other, with zombie mode, which is more like co-op, where players fight the AI. In competitive multiplayer, designers don’t have to worry about the gameplay as much, because players create their own gameplay by hunting each other down; in zombie mode, designers have to make sure the gameplay stays engaging for a significant amount of time.
He also walked through the content-creation pipeline of his VR games: modeling assets in Maya or Blender and importing them into Unreal. When a build runs slowly on the Oculus Quest headset, he uses RenderDoc to debug the frame and analyze where things are slow and what can be optimized.
He also shared his perspective on VR’s future impact on gaming. The devices are improving steadily, but hardware remains a barrier to entry: headsets need to become much smaller and more accessible, closer to putting on a pair of glasses. He wasn’t sure whether we will see a shift as dramatic as past hardware transitions; he recalled how unbelievable the clarity was years ago when displays moved from standard definition to high definition. Whether VR will deliver the same kind of leap is uncertain, but in VR we can feel much closer to our multiplayer counterparts: the other player no longer feels like an avatar but like a person, and interacting with them creates a much deeper sense of connection.
To create a simple game loop, I created some targets and planned to cast fireballs when the user pressed the trigger on the left controller. I spent 4 hours implementing it. I added a custom reference to track the input action. When the player pressed the left trigger, the character fired a fireball from the crosshair, so it looked just like shooting from your own hands. Video: https://vimeo.com/manage/videos/641467423
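To make the setup concrete, here is a minimal sketch of how the trigger-to-fireball flow can be wired up with Unity’s Input System. The class and field names (FireballCaster, fireAction, fireballPrefab, crosshair, launchSpeed) are placeholders, not the actual project code:

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

// Hypothetical sketch: fires a fireball prefab when the bound action
// (e.g., the left trigger) is performed. All field names are placeholders.
public class FireballCaster : MonoBehaviour
{
    [SerializeField] InputActionReference fireAction;  // reference to the left-trigger action
    [SerializeField] GameObject fireballPrefab;        // projectile prefab with a Rigidbody
    [SerializeField] Transform crosshair;              // spawn point aligned with the crosshair
    [SerializeField] float launchSpeed = 10f;

    void OnEnable()
    {
        fireAction.action.Enable();
        fireAction.action.performed += OnFire;
    }

    void OnDisable()
    {
        fireAction.action.performed -= OnFire;
        fireAction.action.Disable();
    }

    void OnFire(InputAction.CallbackContext ctx)
    {
        // Spawn at the crosshair and launch along its forward direction,
        // so the shot appears to come from the player's own view/hand.
        GameObject fireball = Instantiate(fireballPrefab, crosshair.position, crosshair.rotation);
        if (fireball.TryGetComponent<Rigidbody>(out Rigidbody rb))
            rb.velocity = crosshair.forward * launchSpeed;
    }
}
```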
However, the flight direction of the fireballs seemed random and hard to control. I spent 4 hours debugging it and tried to integrate it with the XR Ray Interactors implemented in weeks 7 & 8, but I haven’t figured it out yet because the rotation input of the controllers is tricky.
There was another issue related to particle effects. I found a free asset for projectile effects, but after importing it into the VR project, the effects were visible in the Editor while in the in-game headset view they disappeared. I tested the asset’s built-in demo, and the fireballs were still invisible through the VR head-mounted display. When I disconnected the WMR headset and switched back to the normal camera, the particle effects reappeared, so I concluded the problem is caused by the difference between the two camera systems. I spent 2 hours trying to find a fix but haven’t found a solution, so I had to change the prefab and switch the fireball to a meteor-like effect. Video: https://vimeo.com/manage/videos/641467461
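For the aiming problem above, one direction I plan to try is taking the launch direction straight from the XR Ray Interactor instead of the raw controller rotation. This is only a sketch under that assumption, using the XR Interaction Toolkit; RayAimedLauncher and its fields are hypothetical names, not code from the project:

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Hypothetical sketch: launch the projectile along the XR Ray Interactor's
// ray instead of the raw controller rotation. Field names are placeholders.
public class RayAimedLauncher : MonoBehaviour
{
    [SerializeField] XRRayInteractor rayInteractor; // the left-hand ray from weeks 7 & 8
    [SerializeField] GameObject fireballPrefab;     // projectile prefab with a Rigidbody
    [SerializeField] float launchSpeed = 10f;

    public void Launch()
    {
        Vector3 origin = rayInteractor.transform.position;
        Vector3 direction = rayInteractor.transform.forward;

        // If the ray is currently hitting something, aim at the hit point
        // so the projectile lands where the visible ray points.
        if (rayInteractor.TryGetCurrent3DRaycastHit(out RaycastHit hit))
            direction = (hit.point - origin).normalized;

        GameObject fireball = Instantiate(fireballPrefab, origin, Quaternion.LookRotation(direction));
        if (fireball.TryGetComponent<Rigidbody>(out Rigidbody rb))
            rb.velocity = direction * launchSpeed;
    }
}
```

As for the invisible particles, one common culprit I haven’t confirmed yet is the particle material’s shader lacking support for the stereo rendering mode the XR camera uses (e.g., Single Pass Instanced); swapping the material to one of Unity’s built-in particle shaders would be a quick way to test that theory.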
This week I am going to refine the shooting direction, explore combat mechanics in VR, and create dummies that can be hit. If time allows, I will also try to solve the problem of the particle effects in the VR camera or make my own fireball prefab.