OK, this one isn’t strictly an Oculus Rift-only post, but it’ll be useful for a lot of Rift developers, as the Hydra and the Rift go together extremely well.
I’ve been doing some development work with the Razer Hydra recently, and it’s less than easy to persuade it to work with Unity 3D. So, here’s a quick guide to getting the usual Razer “Hands” working with your controller in Unity.
By the way, I’ll freely admit I’m not the most expert developer out there, so if there’s a better way to do this, I’d love to hear about it!
*Note: You’ll need Unity Pro to use the Razer Hydra, as it requires a .dll plugin. To the best of my knowledge, you can’t develop with the Hydra in Unity Free.
Step 1: Get The Razer Hydra Plugin
Open the Asset Store window in Unity and search for “Sixense Hydra”. Download and import the plugin.
Note that you don’t need any drivers for the Hydra installed - it’ll work without them.
Step 2: Open The Sample Scene
Save your current scene.
In your “Project” window, go to the Sixense Sample Scenes folder and open the SixenseHands scene.
Step 3: Copy The Hands
OK, this might seem like cheating, but it’s the easiest way I’ve found to get Hydra support into another project.
From your Hierarchy window, select and copy the “SixenseInput” object and the Left Hand and Right Hand object hierarchies.
Now, close the SixenseHands scene, open the scene you want to use Razer Hydra control in, and paste in everything you just copied.
Step 4: Assign Parent
If you’re using a FirstPersonController in your project (as you probably will be) drag the Left Hand and Right Hand onto the Main Camera, which is a child of the FirstPersonController.
Now, double-click one of the hands to center the view on them, and move them next to your First Person Controller. Move and scale them until they’re placed where you want them relative to the controller’s camera - you can click on the Main Camera to get a look at where they’ll be when the game starts up.
Step 5: And that’s it!
You can now start your game up, and after running through the on-screen prompts, you’ll have hands controlled by your Razer Hydra in your game. You can also now add scripts to the hands - collision, triggers, etc - and they’ll work as usual in Unity.
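As an example, here’s a minimal sketch of the kind of trigger script you could attach to one of the hands. The script name and the “Pickup” tag are my own placeholders - use whatever names and tags your scene actually defines - but OnTriggerEnter itself is the standard Unity callback, and it fires as usual once the hand has a Collider marked “Is Trigger” (with a Rigidbody on one side of the pair, as Unity requires):

```csharp
using UnityEngine;

// A minimal sketch: attach this to the Left Hand or Right Hand object.
// Requires a Collider on the hand with "Is Trigger" enabled, and a
// Rigidbody on one side of the collision pair, as usual in Unity.
public class HandTrigger : MonoBehaviour
{
    void OnTriggerEnter(Collider other)
    {
        // "Pickup" is a hypothetical tag for this example - swap in
        // whatever tags your own scene uses.
        if (other.CompareTag("Pickup"))
        {
            Debug.Log("Hand touched: " + other.name);
        }
    }
}
```

Because the hands are ordinary GameObjects once they’re in your scene, anything else you’d normally hang off a GameObject - OnCollisionEnter, audio sources, particle effects - works the same way.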
Accessing the buttons is a slightly different kettle of fish, and I’m still figuring out how to do it. If people are interested - let me know! - I’ll do a second part of this tutorial about that.
I hope that was useful! What would you like to see a tutorial on next?