
Improve VR interaction for Scripters

There is no good way to get any kind of natural interaction with my musical instruments in VR beyond creating independent, small Trigger Volumes, each with its own script, overlaying the instrument. That is a pain for anyone to set up in an Experience, because you can't combine it all into a single model for sale in the store.
I have spent countless hours trying to trick, cajole, urge and hack a solution. So, rather than complain (I don't complain, I rant), here are the results of my many tries, with suggestions on how to improve things for better, more natural interaction with objects in VR.


Case 1 - hand contacts the object directly. Every frame, I think, reports that something is contacting the object. This could be the hands, feet or other body parts. It doesn't honor Control Point Type, so you don't know what part is touching it. It is not event/interrupt driven, so the script has to sit in a polling loop checking whether anything is currently touching the object. You could execute the interaction logic (i.e. play a note on an instrument) the first time you see a collision, do nothing until you see that nothing is colliding anymore, then reset so the next collision triggers the interaction again (a rough sketch of that logic is below). The problem for my instrument playing is that in order to get close enough to strike it with my hand, other body parts are touching it, and since I can't filter on Control Point it isn't usable.
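Here is a minimal sketch of that Case 1 workaround, written as generic Python pseudocode rather than the platform's actual scripting API: collision_active() and play_note() are hypothetical stand-ins, and the point is just the edge-detection logic (fire on the transition from no contact to contact, then wait for contact to clear before re-arming).

```python
import time

def collision_active() -> bool:
    """Placeholder: would return True while anything overlaps the instrument's collision volume."""
    return False

def play_note() -> None:
    """Placeholder for the interaction logic, e.g. sounding the instrument."""
    print("note!")

def poll_loop(frame_time: float = 1.0 / 30.0) -> None:
    was_colliding = False
    while True:
        is_colliding = collision_active()
        # Fire only on the rising edge (nothing touching -> something touching);
        # holding contact across many frames does not re-trigger the note.
        if is_colliding and not was_colliding:
            play_note()
        was_colliding = is_colliding
        time.sleep(frame_time)
```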

Case 2 - using Trigger Events with the object directly. A non-starter, since it only appears to recognize Control Point Type and Enter/Exit events with Trigger Volumes, not with Collision Volumes associated with a mesh.

Case 3 - use a Trigger Volume overlaying the object and use Control Point Type to register hits. This only registers hits if you open and close the hands (working the VR controller between hits). It reports that you hit the Trigger Volume, but data.HitObject.Position only returns the position of the trigger volume, not the point on the trigger volume that the Control Point (i.e. the hand) actually touched, the way interaction does. That means you have to have individual Trigger Volumes overlaying each area of the model you want to interact with (roughly as sketched below).
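For illustration, here is roughly what that Case 3 setup forces you into, again as hedged pseudocode: because the reported position is just the volume's own position, each playable area needs its own Trigger Volume, and the script maps whichever volume fired to the note it covers. The volume names and the on_trigger_enter() hook are made up for the sketch, not real API.

```python
NOTE_BY_VOLUME = {
    "TriggerVolume_C4": 60,   # one small volume per bar/key/drum head
    "TriggerVolume_D4": 62,
    "TriggerVolume_E4": 64,
}

def play_note(midi_note: int) -> None:
    """Placeholder for whatever actually sounds the note."""
    print(f"play MIDI note {midi_note}")

def on_trigger_enter(volume_name: str, control_point_type: str) -> None:
    # Only the hands should play notes; other body parts are ignored.
    if control_point_type not in ("LeftHand", "RightHand"):
        return
    note = NOTE_BY_VOLUME.get(volume_name)
    if note is not None:
        play_note(note)
```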

My wish to make this better: every time a hand collides with an object's collision volume, it triggers an Enter event and identifies the ControlPointType. When the hand leaves the object's collision volume, it fires an Exit ControlPointType event. It would work whether the hand was open or closed (so I could hold a mallet or drum stick as a prop while interacting). This would make interacting much more natural. A rough sketch of what such handlers might look like follows.
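This is a sketch of the proposed events only, not an existing API: the engine would call on_collision_enter/on_collision_exit whenever a control point crosses the object's own collision volume, passing which control point it was and where on the object the contact happened. All names and fields here are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class CollisionEvent:
    control_point_type: str       # e.g. "LeftHand", "RightHand"
    hit_position: tuple           # contact point on the object's collision volume
    hand_closed: bool             # True while gripping, e.g. holding a mallet prop

def on_collision_enter(event: CollisionEvent) -> None:
    # Fires once when a control point first enters the collision volume,
    # open hand or closed hand alike.
    if event.control_point_type in ("LeftHand", "RightHand"):
        print(f"strike at {event.hit_position}")

def on_collision_exit(event: CollisionEvent) -> None:
    # Fires once when that control point leaves the collision volume,
    # re-arming the interaction for the next strike.
    pass
```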
