I haven't tried this, but I don't think there should be any problems. QuickGUI shouldn't interfere with CEGUI, so you can create all the required components and then inject input into both systems. If you were using OIS, for example, you'd inject mouse positions into both CEGUI and QuickGUI. Note that if you have overlapping buttons and inject a click into both systems, events will fire for both buttons. Also, I remember that I don't inject input while the mouse cursor isn't visible, so I may need to change some things if you want to use the CEGUI cursor (or even an OS cursor) and still inject input into QuickGUI.
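A rough sketch of what dual injection from an OIS mouse listener might look like. The CEGUI calls match its pre-0.8 injection API; the QuickGUI method names (injectMouseMove, injectMouseButtonDown) are assumptions on my part, so check your QuickGUI headers for the exact signatures:

```cpp
// Sketch: forward each OIS mouse event to both GUI systems.
class DualInputHandler : public OIS::MouseListener
{
public:
    bool mouseMoved(const OIS::MouseEvent& e)
    {
        // CEGUI (pre-0.8): absolute position injection.
        CEGUI::System::getSingleton().injectMousePosition(e.state.X.abs, e.state.Y.abs);
        // Assumed QuickGUI equivalent (relative deltas) - verify the name.
        QuickGUI::GUIManager::getSingleton().injectMouseMove(e.state.X.rel, e.state.Y.rel);
        return true;
    }

    bool mousePressed(const OIS::MouseEvent& e, OIS::MouseButtonID id)
    {
        CEGUI::System::getSingleton().injectMouseButtonDown(CEGUI::LeftButton);
        // Assumed QuickGUI equivalent - verify the name and button enum.
        QuickGUI::GUIManager::getSingleton().injectMouseButtonDown(QuickGUI::MB_Left);
        return true;
    }
};
```

Remember that, as noted above, both buttons will fire if they overlap under the click.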
As far as drawing goes, you can move QuickGUI to a higher render queue if there are zOrder issues between the two libraries. It currently defaults to the Overlay render queue group, but there is one above that.
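In Ogre terms, that means moving from RENDER_QUEUE_OVERLAY to RENDER_QUEUE_MAX (the group above it, from OgreRenderQueue.h). The setter on the GUI manager below is an assumption, the actual QuickGUI method may be named differently:

```cpp
// Sketch: draw QuickGUI after the overlay queue (where CEGUI renders).
QuickGUI::GUIManager* gui = QuickGUI::GUIManager::getSingletonPtr();
gui->setRenderQueueGroup(Ogre::RENDER_QUEUE_MAX); // one group above RENDER_QUEUE_OVERLAY
```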
A brief outline of the skinning, as it is implemented right now (it will be upgraded):
You create three sets of textures for every button type you want. Each widget has a *skin component* that it uses to determine which texture to apply; a button's default skin component is ".button". So the default qgui skin is made up of the following textures:
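A plausible layout, assuming the default skin is named "qgui" and the three sets correspond to the normal, mouse-over, and pressed states (the exact suffixes may differ in your QuickGUI version):

```
qgui.button.png        (normal state)
qgui.button.over.png   (mouse hovering)
qgui.button.down.png   (button pressed)
```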
Skins are loaded via the SkinSetManager and are represented by two files: <skinname>.skinset and Skinset.<skinname>.png. If the files are not present, they are built for you.
If you want to support multiple button styles, you could have a set of textures like:
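For example, adding a hypothetical ".mybutton" skin component alongside the default (same state suffixes as assumed above):

```
qgui.button.png
qgui.button.over.png
qgui.button.down.png
qgui.mybutton.png
qgui.mybutton.over.png
qgui.mybutton.down.png
```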
To load the skin and create the two files, you would call:
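Something along these lines, assuming the skin is named "qgui". The method name loadSkin and the image-type argument are assumptions, so consult the QuickGUI headers for the exact call:

```cpp
// Sketch: load (or build, if the files are missing) the "qgui" skinset.
QuickGUI::SkinSetManager::getSingleton().loadSkin("qgui", QuickGUI::SkinSet::IMAGE_TYPE_PNG);
```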
Since buttons have a skin component of ".button" by default, you will need to modify the skin component on any buttons that should use the other defined skin parts:
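A sketch of switching a button over to the hypothetical ".mybutton" component. The setSkinComponent and applyDefaultTexture names are assumptions; check the Widget API in your QuickGUI version:

```cpp
// Sketch: point the button at the alternate skin component,
// then re-apply the texture so the change takes effect.
QuickGUI::Button* myButton; // assume created earlier via the GUI manager
myButton->setSkinComponent(".mybutton");
myButton->applyDefaultTexture(); // currently required after changing the component
```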
I need to fix this so you don't have to make the call to apply the texture after setting the skin component.