Multiple Scene Managers and Viewports

Germanunkol

23-03-2015 13:26:43

Hi,

For my Oculus Rift application (http://www.ogre3d.org/forums/viewtopic.php?f=11&t=82363) I have a relatively complicated setup. It was working before, when I created all the layers manually using .xml files. Here's an example layer, which worked:


<Layer type="RTTLayer" name="ScreenLayer1" pick="true">
    <Property key="TextureSize" value="1280 300"/>
    <Property key="TextureName" value="ScreenTexture"/>
    <Property key="SceneManager" value="RoomSceneMgr"/>
    <Property key="Camera" value="LeftCamera"/>
    <Property key="Entity" value="ScreenEntity"/>
    <Property key="Material" value="ScreenRTTMaterial"/>
</Layer>


Now I am creating the layers at runtime, and my GUI no longer shows up on the textures.
What do I need to do in order for it to render the GUI?

Here's the setup:

A SceneManager (MainSceneManager) is set up with two cameras, one for the left and one for the right eye. In this scene manager, I set up multiple meshes and RTTLayers, and display the RTTLayers on those meshes.
When I connect the two cameras to the window via two viewports, everything works fine.

But here is where it gets complicated: For the Rift, I need to distort the viewports. So instead of rendering the contents of the cameras directly to the window's viewports, I render them onto two render textures. Then I create a second SceneManager (RiftSceneManager). Into this new scene I put two meshes, on which I display the (distorted) textures, one for each eye/camera.
Then I add a new camera to the RiftSceneManager and render that camera's viewport onto the main window.
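For reference, the Ogre side of this second stage looks roughly like this (simplified sketch, not the actual code from my project: leftCamera and window are the existing left eye camera and render window, the texture size is a placeholder, and mesh/material creation is omitted):


#include <OgreRoot.h>
#include <OgreTextureManager.h>
#include <OgreResourceGroupManager.h>
#include <OgreHardwarePixelBuffer.h>
#include <OgreSceneManager.h>
#include <OgreCamera.h>
#include <OgreViewport.h>
#include <OgreRenderWindow.h>

void setupRiftRenderStage(Ogre::Camera* leftCamera, Ogre::RenderWindow* window)
{
    // 1) Render the left eye camera of MainSceneManager into its own render texture.
    //    (The right eye is set up the same way with its own camera and "RightEyeRTT".)
    Ogre::TexturePtr leftEyeTex = Ogre::TextureManager::getSingleton().createManual(
        "LeftEyeRTT", Ogre::ResourceGroupManager::DEFAULT_RESOURCE_GROUP_NAME,
        Ogre::TEX_TYPE_2D, 1182, 1464, 0, Ogre::PF_R8G8B8, Ogre::TU_RENDERTARGET);
    Ogre::Viewport* leftEyeVp =
        leftEyeTex->getBuffer()->getRenderTarget()->addViewport(leftCamera);
    leftEyeVp->setBackgroundColour(Ogre::ColourValue::Black);

    // 2) A second scene manager that only contains the two distortion meshes.
    Ogre::SceneManager* riftSceneMgr =
        Ogre::Root::getSingleton().createSceneManager(Ogre::ST_GENERIC, "RiftSceneManager");
    // ... create the two distortion meshes here and give them materials whose
    //     texture units reference "LeftEyeRTT" / "RightEyeRTT" ...

    // 3) One camera in the rift scene renders the distortion meshes to the window.
    Ogre::Camera* riftCamera = riftSceneMgr->createCamera("RiftCamera");
    Ogre::Viewport* windowVp = window->addViewport(riftCamera);
}
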
In this scenario:
- If I initialize MyGUI using the MainSceneManager, the GUI does not show (i.e. the render textures are transparent; only the last created layer has some white pixel noise on it).
- If I initialize MyGUI using the RiftSceneManager, the GUI is drawn onto the main window, below the distorted meshes (but not onto the meshes in MainSceneManager).

I have tried to set setOverlaysEnabled to true on all involved viewports as per this wiki entry, but no luck so far...
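Roughly what I tried (viewport names taken from the sketch above; rightEyeVp is the analogous viewport on the right eye's render texture):


// Allow overlay rendering on every viewport involved in the chain.
leftEyeVp->setOverlaysEnabled(true);
rightEyeVp->setOverlaysEnabled(true);
windowVp->setOverlaysEnabled(true);
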

TL;DR:
How do I set up MyGUI so that it renders to textures, those textures are put on meshes, those meshes are rendered onto further textures, those textures are shown on meshes in a second scene manager, and that scene is finally rendered to the RenderWindow?

Any hints on how to set up such a pipeline?

Germanunkol

24-03-2015 12:44:03

Update:
When rendering with the dedicated graphics card (I have two), this is the result:


When rendering without the Oculus Rift module (i.e. without the second scene manager), this is the result:


Why is it rendering only white where it should be rendering the UI?

Ogre.log and MyGUI.log show no errors.

Germanunkol

21-05-2015 21:53:05

After a long time, I finally got it working!

It's a somewhat ugly hack, though. I had to comment out three lines of code:
Platforms/Ogre/OgrePlatform/src/MyGUI_OgreRenderManager.cpp
In void OgreRenderManager::renderQueueStarted (lines 196 ff.):


if (mWindow->getNumViewports() <= mActiveViewport
    || viewport != mWindow->getViewport(mActiveViewport))
    return;


With those lines commented out, it works.

I know too little about the render queue to fix this properly myself - for now I'll just leave those lines commented out.
However, this seems to mean that, out of the box, MyGUI doesn't render into a scene/viewport that is not directly attached to the current render window - which is, in my opinion, a missing feature.
Maybe this can be cleaned up somehow?
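Maybe something along these lines could replace the early-out (just an idea, not tested against the MyGUI sources; mWindow, mActiveViewport and viewport are the variables from the quoted code):


// Untested sketch of a less invasive check: only skip viewports that belong
// to the render window but are not the active one, and let viewports that
// belong to other render targets (e.g. render textures) through.
bool viewportBelongsToWindow = false;
for (unsigned short i = 0; i < mWindow->getNumViewports(); ++i)
{
    if (mWindow->getViewport(i) == viewport)
    {
        viewportBelongsToWindow = true;
        break;
    }
}
if (viewportBelongsToWindow
    && (mWindow->getNumViewports() <= mActiveViewport
        || viewport != mWindow->getViewport(mActiveViewport)))
    return;
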

Cheers,
Micha