Hi JohnJ, I just found a bug: if I add 40,000 entities with a page size of about 2000, I get this:
but if I reduce the number of entities to 20,000, or set the page size to 1000, I get the normal result:
Is this normal?
Can you reproduce the bug?
Thanks for reporting this. It looks like a bug in BatchedGeometry - I'll try to reproduce and fix it soon.
Ok, the problem has been reproduced and should be fixed in the CVS version.
The problem was that with large enough batch sizes, the number of vertices per batch exceeded 65535 (0xFFFF), the maximum value that can be stored in an unsigned 16-bit integer. BatchedGeometry was using 16-bit integers to store polygon connection data (indices), which obviously won't work when there are more than 65535 vertices to connect. Now BatchedGeometry will upgrade the index format to 32-bit if you use batches large enough to require it. However, if your batches are this large, it's recommended that you reduce them: I don't think all cards support 32-bit indexes, and batches this large probably won't load smoothly in real-time.
Just a question: could you add functions to load what I called a "dynamicColorMap" in a previous post? I don't think it contradicts your design or coding standards, and it would be very cool (just for me at the moment, I think), because every time the CVS version changes I have to re-apply my changes...
Also, it seems like a generally useful feature: it would let you easily bind an Ogre::Image::getData() buffer, which can be created very simply for lots of purposes, wouldn't it?