I am currently trying to change the RenderToVertexBuffer sample from being a particle system to a GPU-side mesh deformation.
The mesh holds position, normal and texture-coordinates for each vertex.
(I use a modified version of the manual sphere generation method from http://www.ogre3d.org/tikiwiki/ManualSphereMeshes for that.)
Code:
manual->position(x0, y0, z0); // position
manual->textureCoord((float)seg / (float)nSegments, (float)ring / (float)nRings); // actual texture coordinates (uv)
Ogre::Vector3 normal = Ogre::Vector3(x0, y0, z0).normalisedCopy();
manual->textureCoord(normal.x, normal.y, normal.z); // normal packed into a second texcoord set
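For reference, the per-vertex math behind the snippet above can be sketched without any Ogre dependency (the names nRings/nSegments/r are assumptions matching the wiki article): every vertex lies on the sphere, which is why its normalised position doubles as the normal.

```cpp
#include <cassert>
#include <cmath>

// Minimal standalone sketch of the ManualSphereMeshes vertex math.
struct Vec3 { float x, y, z; };

Vec3 sphereVertex(unsigned ring, unsigned seg,
                  unsigned nRings, unsigned nSegments, float r)
{
    const float pi = 3.14159265358979f;
    float r0 = r * std::sin(ring * pi / nRings);            // radius of this ring
    float y0 = r * std::cos(ring * pi / nRings);            // height of this ring
    float x0 = r0 * std::sin(seg * 2.0f * pi / nSegments);  // position on the ring
    float z0 = r0 * std::cos(seg * 2.0f * pi / nSegments);
    return Vec3{x0, y0, z0};                                // |result| == r
}
```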
Current vertex-declaration for the R2VB-object:
Code:
Ogre::VertexDeclaration* vertexDecl = r2vbObject->getVertexDeclaration();
std::size_t offset = 0;
offset += vertexDecl->addElement(0, offset, Ogre::VET_FLOAT3, Ogre::VES_POSITION).getSize(); // position
offset += vertexDecl->addElement(0, offset, Ogre::VET_FLOAT2, Ogre::VES_TEXTURE_COORDINATES,0).getSize(); // actual texture-coordinates (uv)
offset += vertexDecl->addElement(0, offset, Ogre::VET_FLOAT3, Ogre::VES_TEXTURE_COORDINATES,1).getSize(); // normal
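For clarity, the declaration above packs 3 + 2 + 3 floats per vertex; the following is plain arithmetic (not Ogre API) illustrating the stride the R2VB buffer has to match, using the element sizes of VET_FLOAT2/VET_FLOAT3:

```cpp
#include <cassert>
#include <cstddef>

// Byte layout produced by the vertex declaration above.
std::size_t r2vbVertexStride()
{
    const std::size_t float3 = 3 * sizeof(float); // VET_FLOAT3
    const std::size_t float2 = 2 * sizeof(float); // VET_FLOAT2
    std::size_t offset = 0;
    offset += float3; // VES_POSITION
    offset += float2; // VES_TEXTURE_COORDINATES, set 0 (uv)
    offset += float3; // VES_TEXTURE_COORDINATES, set 1 (normal)
    return offset;    // 32 bytes
}
```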
My deformation vertex shader:
Code:
void deform_vs(
    inout float3 pos    : POSITION,
    inout float2 uv     : TEXCOORD0,
    inout float3 normal : TEXCOORD1,
    uniform float4 bvspheres[2]
)
{
    for (int i = 0; i < bvspheres.length; ++i)
    {
        pos = push_vertex(pos, bvspheres[i]);
    }
}
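The body of push_vertex is not shown here; purely as a hypothetical illustration, here is a CPU sketch of one plausible behaviour, assuming each float4 in bvspheres packs (center.xyz, radius) and that vertices falling inside a bounding sphere are pushed out onto its surface:

```cpp
#include <cassert>
#include <cmath>

struct V3 { float x, y, z; };

// Hypothetical CPU reference for push_vertex; s = {cx, cy, cz, radius}.
V3 pushVertex(V3 pos, const float s[4])
{
    V3 d{pos.x - s[0], pos.y - s[1], pos.z - s[2]};
    float dist = std::sqrt(d.x * d.x + d.y * d.y + d.z * d.z);
    if (dist >= s[3] || dist == 0.0f)
        return pos;                // outside the sphere (or at center): untouched
    float scale = s[3] / dist;     // project onto the sphere surface
    return V3{s[0] + d.x * scale, s[1] + d.y * scale, s[2] + d.z * scale};
}
```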
Is there a way of being able to make changes to the vertex data via my deformation shader, or do I have to write everything mentioned above myself?
Currently, normals (VES_NORMAL) don't seem to be supported as a semantic in the R2VB implementation, and the ParticleGS sample does its rendering itself (albeit very simply; it's just particles).
The only way I see this working is if the semantics in my vertex declaration state what the data actually represents, rather than being texture coordinates that Ogre doesn't understand.
Any ideas?