XR UX-UI
Last updated
One of the things we learned was that there are multiple ways to increase immersion in VR. By playing with the human senses you can 'trick' people into thinking the experience is more real.
For example, with our sea turtle experience we tested with and without underwater ambience. Those who tested with the ambience had a more immersive feeling of being under water than those who played without it.
This principle can be applied to all the senses (smell, sight, hearing, touch and taste).
This week Lili gave us a lecture on user experience. During this lecture we were shown multiple ways to make a user experience more pleasant in VR. One example of this is Inverse Kinematics (IK). If a user has arms in VR, IK can connect the body parts in such a way that they move more realistically. An example is making the arm bend at the elbow the way a real arm would, so the user believes it is their own.
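To make the idea concrete, here is a minimal 2D sketch of a two-bone IK solver (shoulder, elbow, hand) using the law of cosines. This is just an illustration of the general technique, not how any particular Unity plugin implements it; the function name and conventions are my own.

```python
import math

def two_bone_ik(target_x, target_y, l1, l2):
    """Solve 2D two-bone IK with the law of cosines.

    l1 = upper-arm length, l2 = forearm length; the shoulder sits at the
    origin. Returns (shoulder_angle, elbow_angle) in radians so that the
    hand lands on the target (clamped to the reachable range).
    """
    # Distance from the shoulder to the target, clamped so the target
    # is always reachable by the two segments.
    d = math.hypot(target_x, target_y)
    d = max(abs(l1 - l2), min(l1 + l2, d))

    # Interior elbow angle via the law of cosines; the joint's bend
    # angle is pi minus that interior angle.
    cos_elbow = (l1 ** 2 + l2 ** 2 - d ** 2) / (2 * l1 * l2)
    elbow = math.pi - math.acos(max(-1.0, min(1.0, cos_elbow)))

    # Shoulder angle: aim at the target, then rotate back to account
    # for the bent elbow.
    cos_offset = (l1 ** 2 + d ** 2 - l2 ** 2) / (2 * l1 * d)
    shoulder = math.atan2(target_y, target_x) - math.acos(max(-1.0, min(1.0, cos_offset)))
    return shoulder, elbow
```

In a real avatar you would run this every frame with the tracked hand position as the target, which is exactly why the elbow appears to bend naturally even though only the headset and controllers are tracked.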
InteractML is a tool (which also has a Unity plugin) that is based on machine learning. It can be used to teach your game gestures, which can then be linked to events. An example could be a flick of a wand, as in Harry Potter, casting a spell. I think you get the idea of what is possible.
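As a toy illustration of the idea behind gesture-to-event mapping (this is not the InteractML API, and the names, features and training set below are all made up), you can think of it as recording labelled motion samples during training and then classifying new samples, for instance with a nearest-neighbour lookup:

```python
import math

def nearest_gesture(sample, training_data):
    """Return the label of the training example closest to `sample`.

    `training_data` is a list of (feature_vector, label) pairs, e.g.
    controller velocity features captured while the designer performs
    each gesture during a training phase.
    """
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(training_data, key=lambda ex: dist(sample, ex[0]))[1]

# Hypothetical training set: wand-flick vs. idle, as 2D velocity features.
examples = [
    ((0.0, 0.1), "idle"),
    ((0.1, 0.0), "idle"),
    ((2.5, 3.0), "flick"),
    ((3.0, 2.0), "flick"),
]

# Each recognised gesture label can then be linked to a game event,
# e.g. "flick" -> cast the spell.
events = {"flick": "cast_spell", "idle": None}
```

Real tools use more capable models than this, but the workflow is the same: demonstrate examples, train, then fire an event whenever the classifier recognises the gesture at runtime.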
Today we received another XR workshop, this time on how to work with the Final IK and InteractML plugins. Unfortunately, Final IK costs around 80 euro, which means we probably will not be able to use it.
However, Final IK is a very nice way to 'predict' how body parts move and interact with one another. For example, you can use it to rotate an avatar's elbow when the hand moves or rotates, which makes the motion look more realistic.