Gamma Correction on MyGui?


28-11-2010 10:14:42

Hi Folks,

Is there a way to turn on gamma correction for MyGUI? I've got it enabled for the rest of my materials and they look fine, but the MyGUI texture looks washed out. Is there a way to enable it?

Thanks for your help
All the best,

29-11-2010 10:35:24


29-11-2010 10:43:23

Sorry, I should have mentioned: I am using DirectX.


29-11-2010 21:56:20

Could you show how you enable gamma correction in your materials and how you set it up? Screenshots would also help, so I can understand what you mean by "washed out".


30-11-2010 09:09:37

Could you show how you enable gamma correction in your materials and how you set it up? Screenshots would also help, so I can understand what you mean by "washed out".

Hi Altren,

I am attempting to use linear (non-gamma-corrected) colours in my game, as they produce more photorealistic lighting - ... _ch24.html

Ogre supports this by letting the hardware (un-)gamma-correct textures as they are loaded. This is preferable to pre-processing the textures, as it doesn't lose any detail. There are two ways that Ogre supports this:

1. The gamma parameter of Root::createRenderWindow. This converts linear colours to gamma-corrected colours when rendering to the render window.

2. The gamma parameter in the material script, so that a gamma-corrected texture is converted into linear space when the texture is loaded.
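For reference, the two setups look roughly like this (a sketch from memory of the Ogre 1.7-era API, so treat the exact spelling as an assumption): for (1), pass a NameValuePairList to createRenderWindow with miscParams["gamma"] = "true"; for (2), the material script takes a trailing gamma keyword on the texture line:

```
// (2) Sketch of a material script: the trailing "gamma" keyword asks Ogre
//     to treat the texture as sRGB and convert it to linear when sampling.
material ExampleGammaMaterial
{
    technique
    {
        pass
        {
            texture_unit
            {
                texture mytexture.png gamma
            }
        }
    }
}
```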

With MyGUI, support is needed for loading textures into linear space via TextureManager::load (with the hwGammaCorrection parameter set to true). Also, when widgets and sub-widgets are set to specific colours, those colours need to be converted to linear (as the RenderWindow gamma parameter will gamma-correct them when rendering).
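To make the colour conversion concrete, here is a sketch of the per-channel maths I mean. The helper names are my own, not MyGUI or Ogre API; it's just the standard sRGB decode/encode (IEC 61966-2-1) that a widget colour would need before the window's gamma stage re-encodes it:

```cpp
#include <cmath>

// Hypothetical helpers (not part of MyGUI): per-channel sRGB <-> linear
// conversion for values in [0, 1].

// Decode an sRGB-encoded channel to linear light.
inline float srgbToLinear(float c)
{
    return (c <= 0.04045f) ? c / 12.92f
                           : std::pow((c + 0.055f) / 1.055f, 2.4f);
}

// Encode a linear channel back to sRGB (what the hardware gamma stage does).
inline float linearToSrgb(float c)
{
    return (c <= 0.0031308f) ? c * 12.92f
                             : 1.055f * std::pow(c, 1.0f / 2.4f) - 0.055f;
}
```

So a widget colour authored as sRGB would go through srgbToLinear() before being handed to the renderer, and the hwGamma-enabled window undoes it on output.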

I made this change in OgreTexture::loadFromFile and it seemed to work:

    mTexture = manager->load(_filename, mGroup, Ogre::TEX_TYPE_2D, 0, 1.0f, false, Ogre::PF_UNKNOWN, true);

It would be good if there was a way to supply the hwGammaCorrection option through the MyGUI API.

Here is a screenshot before:

And with gamma correction:

(The colour on the right is where I have been experimenting with converting colours to linear, so that it matches the right-hand side of the texture.)

Hope this helps
All the best,


02-12-2010 09:35:18

Is it possible to assign a custom pixel shader to change the colours manually?


02-12-2010 10:56:15

I'm not sure, but I think you can do that by modifying MyGUI::OgreRenderManager::begin() - add mRenderSystem->bindGpuProgram(pointerToPixelShader)


02-12-2010 14:51:57

I'm not sure, but I think you can do that by modifying MyGUI::OgreRenderManager::begin() - add mRenderSystem->bindGpuProgram(pointerToPixelShader)

Hi Altren,

Thanks for the suggestion, but it doesn't seem to work - it crashes when binding the GpuProgram. I am getting the GpuProgram in initialise:

_gpuProgram = (Ogre::GpuProgram *)Ogre::HighLevelGpuProgramManager::getSingleton().getByName("MyGuiShaderPixel").get();
MYGUI_LOG(Info, _gpuProgram->getName() << " Shader Program");

And the log output comes out as expected (so the _gpuProgram pointer is valid).

Any ideas why it would be crashing?
Thanks for your help
All the best,


03-12-2010 14:27:47

Hi Altren,

Did a bit more investigation into why it was crashing. Here is the stack trace:

RenderSystem_Direct3D9.dll!std::_Tree<std::_Tmap_traits<IDirect3DDevice9 *,IDirect3DVertexShader9 *,std::less<IDirect3DDevice9 *>,Ogre::STLAllocator<std::pair<IDirect3DDevice9 * const,IDirect3DVertexShader9 *>,Ogre::CategorisedAllocPolicy<0> >,0> >::find(IDirect3DDevice9 * const & _Keyval=0x05510060) Line 1424 + 0x3 bytes C++
RenderSystem_Direct3D9.dll!Ogre::D3D9GpuVertexProgram::getVertexShader() Line 319 + 0x16 bytes C++
> RenderSystem_Direct3D9.dll!Ogre::D3D9RenderSystem::bindGpuProgram(Ogre::GpuProgram * prg=0x040b2e60) Line 3282 + 0x10 bytes C++
Orc.exe!MyGUI::OgreRenderManager::begin() Line 321 C++
Orc.exe!MyGUI::OgreRenderManager::renderQueueStarted(unsigned char queueGroupId=0, const std::basic_string<char,std::char_traits<char>,std::allocator<char> > & invocation="", bool & skipThisInvocation=false) Line 193 C++
OgreMain.dll!Ogre::SceneManager::fireRenderQueueStarted(unsigned char id='d', const std::basic_string<char,std::char_traits<char>,std::allocator<char> > & invocation="") Line 4032 + 0x12 bytes C++
OgreMain.dll!Ogre::SceneManager::renderVisibleObjectsDefaultSequence() Line 2293 + 0x1e bytes C++
OgreMain.dll!Ogre::SceneManager::_renderScene(Ogre::Camera * camera=0x040a7610, Ogre::Viewport * vp=0x05f10680, bool includeOverlays=true) Line 1527 C++
OgreMain.dll!Ogre::Camera::_renderScene(Ogre::Viewport * vp=0x05f10680, bool includeOverlays=true) Line 419 C++
OgreMain.dll!Ogre::Viewport::update() Line 215 C++
OgreMain.dll!Ogre::RenderTarget::_updateViewport(Ogre::Viewport * viewport=0x05f10680, bool updateStatistics=true) Line 200 C++
OgreMain.dll!Ogre::RenderTarget::_updateAutoUpdatedViewports(bool updateStatistics=true) Line 179 C++
OgreMain.dll!Ogre::RenderTarget::updateImpl() Line 155 C++
OgreMain.dll!Ogre::RenderTarget::update(bool swap=false) Line 614 C++
Orc.exe!Sirocco::OgreGfxEngine::render() Line 129 C++
Orc.exe!Sirocco::OgreGfxEngine::update() Line 173 C++
Orc.exe!Sirocco::RacingGameController::start() Line 470 C++
Orc.exe!WinMain(HINSTANCE__ * hInst=0x00400000, HINSTANCE__ * __formal=0x00000000, char * strCmdLine=0x00bc4dbd, HINSTANCE__ * __formal=0x00000000) Line 66 + 0x2d bytes C++
Orc.exe!__tmainCRTStartup() Line 547 + 0x1c bytes C
[Frames below may be incorrect and/or missing, no symbols loaded for kernel32.dll]
Orc.exe!boost::foreach_detail_::next<std::vector<boost::shared_ptr<Sirocco::RecordLapInfo>,std::allocator<boost::shared_ptr<Sirocco::RecordLapInfo> > >,boost::mpl::bool_<0> >(const boost::foreach_detail_::auto_any_base & cur={...}, boost::foreach_detail_::type2type<std::vector<boost::shared_ptr<Sirocco::RecordLapInfo>,std::allocator<boost::shared_ptr<Sirocco::RecordLapInfo> > >,boost::mpl::bool_<0> > * __formal=0x00000020) Line 732 C++
Orc.exe!Sirocco::Matrix3::QDUDecomposition(Sirocco::Matrix3 & kQ=, Sirocco::Vec3 & kD=, Sirocco::Vec3 & kU=) Line 783 + 0x16 bytes C++

I don't really know the Ogre codebase well enough to know exactly why it is crashing, but it looks like it could be a bad cast.


I wrote the shader in Cg - perhaps I need to write it in HLSL? What object does a Cg program become - a HighLevelGpuProgram rather than a D3D9GpuVertexProgram, perhaps? Maybe it's not possible to set it at this lower level?
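Following that line of thought - this is an unverified guess on my part, based on how Ogre 1.x seems to handle high-level programs: RenderSystem::bindGpuProgram() expects a low-level program, and D3D9RenderSystem casts its argument straight to the D3D9 shader types, so handing it a HighLevelGpuProgram (which is what a Cg program is) would produce exactly this kind of bad-cast crash. Ogre exposes GpuProgram::_getBindingDelegate() to get the underlying low-level program, so something like this might work instead:

```
// Sketch, assuming Ogre 1.7-era headers; an untested guess, not a confirmed fix.
Ogre::HighLevelGpuProgramPtr program =
    Ogre::HighLevelGpuProgramManager::getSingleton().getByName("MyGuiShaderPixel");
program->load();  // ensure the program has been compiled/loaded
// Bind the low-level delegate rather than the high-level wrapper itself.
mRenderSystem->bindGpuProgram(program->_getBindingDelegate());
```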

Any suggestions?
Thanks for your help!
All the best,