When I'm playing a game normally I'm using a keyboard/mouse or a gamepad. Either way, I'm _mostly_ looking at the screen, but occasionally looking down to work out why I'm pressing all the wrong buttons, or reminding myself which button "A" is.
You can't do that with a Rift - your eyes are completely enclosed, and there's no way to see the controller, so re-centering yourself after you've lost your bearings is really hard. And if you put the controller down for a second...
Meanwhile, back in the dark days of 2013, the "Leap Motion" was launched. Essentially a small, chocolate-bar-sized detector that can tell where your fingers are, it failed to find any real traction. There just weren't enough uses for it, it was a bit glitchy, and frankly you were better off with a mouse/keyboard or a touch-screen for most purposes.
And then some people had a genius idea - strap a Leap Motion to the front of an Oculus Rift and use it to detect the hands of the person wearing it. Like this:
And now you've got something which is constantly detecting exactly where your hands are, works entirely intuitively, and gives you an actual workable control system for virtual reality.
Workable enough, in fact, that you can strap the office cleaner into it, and she picks it up in moments:
More details, and a good video of it being used to control something more complex here.
*Although I do wonder how they're going to stop version two from making people sick.
Original post on Dreamwidth - there are comments there.