The minimum requirements are: Windows 8 or later, a GPU and driver with DirectX 11.1 support, and a stereo-capable display.
The Stereoscopic checkbox in the Player Settings is strictly for DirectX 11.1's stereoscopic 3D support; it does not currently use AMD's quad-buffer extension. Make sure that this sample works on your machine. Stereo support works in both fullscreen and windowed mode.
When you launch the game, hold Shift to bring up the resolution dialog. If a stereo-capable display is detected, the dialog will show a Stereo 3D checkbox. On the API side, Camera exposes a few options: stereoEnabled, stereoSeparation, and stereoConvergence; use these to tweak the effect. You only need one camera in the scene; rendering of the two eyes is handled through these parameters.
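If you want to tune the effect per scene rather than hard-code it, a small script can drive these properties at runtime. The sketch below assumes the standard Camera properties named above and uses illustrative values; whether stereoEnabled is writable depends on the Unity version, so it is only read here.

    using UnityEngine;

    // Minimal sketch: tweak the stereoscopic parameters on the single scene camera.
    public class StereoTweaker : MonoBehaviour
    {
        public float separation = 0.022f;  // eye separation in world units (illustrative value)
        public float convergence = 10.0f;  // distance to the zero-parallax plane (illustrative value)

        void Update()
        {
            Camera cam = GetComponent<Camera>();

            // Only adjust when the player was launched with Stereo 3D enabled.
            if (cam == null || !cam.stereoEnabled)
                return;

            cam.stereoSeparation = separation;
            cam.stereoConvergence = convergence;
        }
    }

Attach the script to the camera and adjust the two values in the Inspector while running in stereo to find comfortable settings.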
Note that this checkbox is not for the Oculus Rift (or, to my knowledge, other VR headsets) at this time.
Note: currently, setting Unity to render in linear color space breaks stereoscopic rendering; this appears to be a Direct3D limitation. The Camera.stereoConvergence parameter also appears to have no effect at all when realtime shadows are enabled in forward rendering. With Deferred Lighting you will get some shadows, but they are inconsistent between the left and right eyes.
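Because the linear color space problem only shows up at runtime, a small startup check can make it obvious. This is a sketch using only standard Unity APIs (QualitySettings.activeColorSpace, Camera.stereoEnabled); the warning text is illustrative.

    using UnityEngine;

    // Warn if the project renders in linear color space while stereo output is active,
    // since that combination currently breaks stereoscopic rendering.
    public class StereoColorSpaceCheck : MonoBehaviour
    {
        void Start()
        {
            Camera cam = Camera.main;
            if (cam != null && cam.stereoEnabled &&
                QualitySettings.activeColorSpace == ColorSpace.Linear)
            {
                Debug.LogWarning("Linear color space is active; stereoscopic rendering may break.");
            }
        }
    }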
The Unity Free integration for Oculus is available from the Oculus site. You can use Unity 4.6 or later together with the Oculus integration package to deploy all of your VR content to the Rift.
Getting started: import the Unity 4 Oculus integration package into Unity, then open up the demo scene and you’re good to go.
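Once the package is imported, you typically drop the Oculus camera rig prefab into your own scene. The sketch below keeps a plain fallback camera working when the rig is absent, so the same scene still runs without a headset; the object name "OVRCameraController" is the one used by the Unity 4-era integration and should be treated as an assumption that may differ between SDK versions.

    using UnityEngine;

    // Minimal bootstrap sketch: disable the ordinary camera when an Oculus rig is present.
    public class VRCameraBootstrap : MonoBehaviour
    {
        public Camera fallbackCamera;  // a normal camera used when no Rift rig is in the scene

        void Start()
        {
            // The rig object name is an assumption; adjust it to match your SDK version.
            GameObject rig = GameObject.Find("OVRCameraController");

            if (fallbackCamera != null)
                fallbackCamera.gameObject.SetActive(rig == null);
        }
    }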
The release supports Windows, Mac, Linux and Gear VR with full access to LibOVR through a pure C# wrapper.
Features include:
See also the Oculus Unity Integration Guide.