I'm trying to integrate OGRE (version 1.9) with the latest Oculus Rift SDK (version 0.3.1).
In particular, I have to use the Direct3D9 render system and I'd like to use the SDK distortion rendering approach.
What I've done so far is:
- Create a custom render loop where buffer swapping isn't called
- Create two RTT textures into which the scene is rendered from two different cameras (left and right eye)
- Call ovrHmd_BeginFrame and ovrHmd_EndFrame in frameStarted and frameEnded callbacks
- Call ovrHmd_BeginEyeRender and ovrHmd_EndEyeRender in the preRenderTargetUpdate and postRenderTargetUpdate callbacks (roughly as sketched below)
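To make that wiring a bit more concrete, here is roughly how the loop and the listeners look. This is a simplified sketch rather than my exact code; names such as renderWindow, root, m_hmd, m_eyeTextures and the eyeForTarget helper are placeholders.

Code:
// Sketch of the custom render loop: no swapBuffers call, and the window is not
// auto-updated, because ovrHmd_EndFrame is what presents the distorted frame.
renderWindow->setAutoUpdated(false);
while (!shutdownRequested)
{
    Ogre::WindowEventUtilities::messagePump();
    if (!root->renderOneFrame())   // fires frameStarted/frameEnded and updates the RTTs
        break;
    // no renderWindow->swapBuffers() here
}

And the listener that maps the SDK calls onto the OGRE callbacks (m_eyeTextures are ovrTexture structs already filled with the D3D9 texture handles of the two RTTs):

Code:
#include <Ogre.h>
#include <OVR_CAPI.h>

class OculusListener : public Ogre::FrameListener, public Ogre::RenderTargetListener
{
public:
    OculusListener(ovrHmd hmd) : m_hmd(hmd), m_frameIndex(0) {}

    bool frameStarted(const Ogre::FrameEvent&)
    {
        ovrHmd_BeginFrame(m_hmd, m_frameIndex);
        return true;
    }

    bool frameEnded(const Ogre::FrameEvent&)
    {
        ovrHmd_EndFrame(m_hmd);   // this is where RenderBothDistortionMeshes runs
        ++m_frameIndex;
        return true;
    }

    // registered on both eye RTTs via RenderTarget::addListener
    void preRenderTargetUpdate(const Ogre::RenderTargetEvent& evt)
    {
        ovrEyeType eye = eyeForTarget(evt.source);   // maps the RTT to ovrEye_Left/Right
        m_eyePoses[eye] = ovrHmd_BeginEyeRender(m_hmd, eye);
        // ...update the corresponding eye camera from m_eyePoses[eye]...
    }

    void postRenderTargetUpdate(const Ogre::RenderTargetEvent& evt)
    {
        ovrEyeType eye = eyeForTarget(evt.source);
        ovrHmd_EndEyeRender(m_hmd, eye, m_eyePoses[eye], &m_eyeTextures[eye]);
    }

private:
    ovrEyeType   eyeForTarget(Ogre::RenderTarget* rt) const;  // helper, not shown
    ovrHmd       m_hmd;
    unsigned int m_frameIndex;
    ovrPosef     m_eyePoses[2];
    ovrTexture   m_eyeTextures[2];
};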
I can see my scene rendered with the distortion applied for a moment, but after that the output is a mess.
Sometimes I see a completely black window, sometimes I see this: http://i.imgur.com/mALc04F.jpg
(blue and white are the background colors of the left and right eye viewports).
Digging into the Oculus Rift SDK sources, I've discovered that the DistortionRenderer::RenderBothDistortionMeshes method (inside CAPI_D3D9_Util.cpp), which is invoked from ovrHmd_EndFrame, causes the issue.
In particular, I suspect the problem is the pixel and vertex shader state that this method sets on the device and never restores.
Adding these two lines at the end of the DistortionRenderer::RenderBothDistortionMeshes method fixes the issue:
Code:
//Revert render state
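For clarity, the kind of revert I mean is something like the following sketch, assuming the fix is to clear the vertex and pixel shaders that the method leaves bound; "device" stands for whatever IDirect3DDevice9 pointer the DistortionRenderer holds in CAPI_D3D9_Util.cpp:

Code:
//Revert render state
device->SetVertexShader(NULL);  // unbind the SDK's distortion vertex shader
device->SetPixelShader(NULL);   // unbind the SDK's distortion pixel shader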
Is it a bug?