Unable to set desired RGBA Pixel Format

Problems building or running the engine, queries about how to use features etc.
def87
Halfling
Posts: 90
Joined: Wed Sep 19, 2012 1:41 pm

Unable to set desired RGBA Pixel Format

Post by def87 »

Hi,

after repeatedly trying to create a texture with the PF_A8B8G8R8 pixel format, which always returned a PF_A8R8G8B8 texture instead, I delved into the OpenGL source (I am not using Direct3D) and found this function:

Code: Select all

PixelFormat GLPixelUtil::getClosestOGREFormat(GLenum fmt)
This function determines the pixel format from the "internal format", which only defines the number of components and their size.
Because of this, a new GLTextureBuffer always returns PF_A8R8G8B8 for any 4-component, 8-bit internal color format.
This does not make sense to me at all.

Anyway, I am looking for a way to create a PF_A8B8G8R8 texture. It is a dynamic texture, and I don't want to reorder my texture data on the CPU.
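
For reference, this is roughly what I am doing (a minimal sketch with illustrative names, assuming the usual TextureManager::createManual() overload for dynamic textures):

Code: Select all

// Request an ABGR texture...
Ogre::TexturePtr tex = Ogre::TextureManager::getSingleton().createManual(
    "DynTex",                           // illustrative name
    Ogre::ResourceGroupManager::DEFAULT_RESOURCE_GROUP_NAME,
    Ogre::TEX_TYPE_2D,
    256, 256,                           // width, height
    0,                                  // no mipmaps
    Ogre::PF_A8B8G8R8,                  // requested format
    Ogre::TU_DYNAMIC_WRITE_ONLY_DISCARDABLE);

// ...but on the GL render system this logs the name of PF_A8R8G8B8,
// even though PF_A8B8G8R8 was requested:
Ogre::LogManager::getSingleton().logMessage(
    "actual format: " + Ogre::PixelUtil::getFormatName(tex->getFormat()));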

Any help would be appreciated.
WilliamKappler
Kobold
Posts: 33
Joined: Mon Feb 16, 2015 8:37 pm
Location: Raleigh, NC

Re: Unable to set desired RGBA Pixel Format

Post by WilliamKappler »

I believe this is because OpenGL just treats those channels essentially arbitrarily. You have to have RGBA, in that order, no matter what. So it's impossible to reorder them without a memory copy going on somewhere.
https://www.opengl.org/wiki/Image_Format wrote:
The components field is the list of components that the format stores. OpenGL only allows "R", "RG", "RGB", or "RGBA"; other combinations are not allowed as internal image formats.
It seems Ogre is actually translating RGBA -> ARGB:

Code: Select all

        case GL_RGBA8:
        case GL_SRGB8_ALPHA8:
            return PF_A8R8G8B8;

OgreGLPixelFormat.cpp
Not sure if it's then shifting those values around somewhere else in Ogre or what, exactly. Note RGBA8 != A8R8G8B8.

You could write a shader that maps the colors into different channels. But all in all, it's probably a lot simpler to just rearrange the data.
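
As an aside, if a shader change is unwanted, core OpenGL 3.3 also has texture swizzle state, which remaps channels at sampling time with no data copy. A sketch of that alternative technique, applied to the raw GL texture outside of Ogre's own texture management (assumes a GL 3.3+ context and loaded GL headers):

Code: Select all

// Swap R and B at sampling time, without touching the pixel data:
// .r will return the texture's blue channel, .b the red channel.
const GLint swizzle[4] = { GL_BLUE, GL_GREEN, GL_RED, GL_ALPHA };
glBindTexture(GL_TEXTURE_2D, glTextureId);   // illustrative GL texture id
glTexParameteriv(GL_TEXTURE_2D, GL_TEXTURE_SWIZZLE_RGBA, swizzle);
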
Software Designer • World Builder • Game Designer • Engineer
def87
Halfling
Posts: 90
Joined: Wed Sep 19, 2012 1:41 pm

Re: Unable to set desired RGBA Pixel Format

Post by def87 »

In my long experience with OpenGL on modern hardware, there is no real speed difference between uploading data as RGBA and uploading it as BGRA (using glTexSubImage2D(), for example).
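
For illustration, the two uploads differ only in the client-data format argument; the internal format stays GL_RGBA8 either way (sketch; assumes an existing 4x8-bit texture bound to GL_TEXTURE_2D):

Code: Select all

// Client memory laid out as R,G,B,A:
glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, width, height,
                GL_RGBA, GL_UNSIGNED_BYTE, rgbaPixels);

// Client memory laid out as B,G,R,A:
glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, width, height,
                GL_BGRA, GL_UNSIGNED_BYTE, bgraPixels);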

Of course I could change the behaviour inside Ogre to make GL_RGBA the default (currently it's GL_BGRA), but I would like to have both.
I've been doing this for years in pure OpenGL...

Swizzling components in shaders etc. can be done, of course, but it seems unnecessary to me.
Making Ogre use both and getting rid of getClosestOGREFormat() seems to be a major effort...

Another advantage would be that I could react to different data formats right inside the Ogre GL code for optimizations, because the format IS stored in the Texture class
(one dynamic texture getting RGBA data, another getting BGRA data).
WilliamKappler wrote:
It seems Ogre is actually translating RGBA -> ARGB:
More precisely, GL_RGBA8 -> GL_BGRA (a relation from the internal format to the external data format).

This is the basic flow of things (see the sketch after the list):

- the user creates a texture with some 4-component, 8-bit format, let's say PF_A8B8G8R8
- Ogre returns GL_RGBA8 as the internal format (correct)
- from this point on, Ogre returns PF_A8R8G8B8 (which translates to the GL_BGRA data format) for this internal format
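
In terms of the GLPixelUtil helpers in OgreGLPixelFormat.h, the lossy round trip looks like this (a sketch; the return values are what I observe with the GL render system):

Code: Select all

// The forward mapping keeps only the component count and size:
GLenum internal = Ogre::GLPixelUtil::getGLInternalFormat(Ogre::PF_A8B8G8R8);
// internal == GL_RGBA8; the requested component order is lost here.

// The reverse mapping can therefore only pick one fixed order:
Ogre::PixelFormat back = Ogre::GLPixelUtil::getClosestOGREFormat(internal);
// back == PF_A8R8G8B8, regardless of the format originally requested.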

As long as everybody is happy to use whatever data format Ogre is "suggesting", all works well and optimized. I am not criticizing that; I am just looking for a reasonable way to extend this behaviour to do what I want. :)