zarfius
10-09-2009 03:32:37
This is more of a general PhysX question as I'm not using NxOgre. I am, however, using Ogre with PhysX, so I thought this might be an appropriate forum anyway (the PhysX forums are crap, btw).
So, I'm implementing a character controller with all your typical first-person behaviour. I've implemented walking (including stairs, etc.) and jumping without any trouble, and to some degree pick-up and throw. For the pick-up code I've used a fixed joint, attaching one end to the character controller's actor and the other end to the object's actor.
Then, when I rotate the player, I manually rotate the controller's actor so that the object stays in front of the camera, as if the player is holding it (nothing special here; just think about how every other first-person game works).
The problem is that the character controller's actor sometimes randomly stops rotating. The documentation says you should never change the properties of the controller's actor manually, so how am I supposed to achieve this without doing that? The first thing I tried was creating another actor attached to the player's actor by a second fixed joint and rotating that instead. The problem is that this extra actor's sole purpose is to work around the original issue; I don't need it to have a shape or body, but PhysX requires one.
Surely there is a better way to achieve this typical first-person character controller behaviour?
betajaen
10-09-2009 10:56:26
Basically, how it works is that your NxCharacter doesn't have a particular direction it's facing. It's faceless; it isn't facing any direction.
The yaw should be applied to the visualisation of the character only. To move the character, you just rotate the character's local movement vector (step forward, left, right, backwards) by the yaw value as a quaternion. You've done this already, and the PhysX tutorials teach it this way, so I assume your implementation is like this.
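In plain C++ terms, that step looks something like this. This is a self-contained sketch with no PhysX or Ogre types; the struct and function names here are illustrative, not from either SDK:

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

struct Quat
{
    float w, x, y, z;

    // Build a quaternion representing a rotation of 'yaw' radians about the Y axis.
    static Quat fromYaw(float yaw)
    {
        return Quat{ std::cos(yaw * 0.5f), 0.0f, std::sin(yaw * 0.5f), 0.0f };
    }

    // Rotate a vector by this unit quaternion (expanded form of q * v * conj(q)).
    Vec3 rotate(const Vec3& v) const
    {
        const float tx = 2.0f * (y * v.z - z * v.y);
        const float ty = 2.0f * (z * v.x - x * v.z);
        const float tz = 2.0f * (x * v.y - y * v.x);
        return Vec3{ v.x + w * tx + (y * tz - z * ty),
                     v.y + w * ty + (z * tx - x * tz),
                     v.z + w * tz + (x * ty - y * tx) };
    }
};

// World-space movement: rotate the character-local input by the camera yaw.
inline Vec3 worldMove(float yaw, const Vec3& localMove)
{
    return Quat::fromYaw(yaw).rotate(localMove);
}
```

The world-space result of worldMove is what you feed to the controller's move() call each frame; the controller itself never needs an orientation.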
As for carrying, I wouldn't use a joint at all. Years ago I was working on a project with a friend, and he suggested an additional character controller acting as a hand. Although it worked pretty well for physically knocking things over and pushing doors open, it wasn't so good for carrying.
Going by what I've seen in Source-based games (Half-Life 2, Fallout 3), the carried object is just "hovered", projected directly in front of the actor. You have the yaw and the character's position, so if you project that vector forward a bit you get the 3D point in space where the object should be. You can also use the pitch of how far up or down the character is looking; you can take this directly from the mouse and tweak it a little so it feels right.
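As a sketch of that projection (again plain C++, illustrative names, and an assumed convention that yaw = 0, pitch = 0 looks down -Z with positive pitch looking up):

```cpp
#include <cmath>

struct Vec3f { float x, y, z; };

// Compute where a carried object should hover: 'holdDistance' units along
// the view direction given by yaw (rotation about Y) and pitch (up/down).
inline Vec3f holdPosition(const Vec3f& eye, float yaw, float pitch, float holdDistance)
{
    // Spherical-to-Cartesian view direction under the stated convention.
    const Vec3f dir{ -std::sin(yaw) * std::cos(pitch),
                      std::sin(pitch),
                     -std::cos(yaw) * std::cos(pitch) };
    return Vec3f{ eye.x + dir.x * holdDistance,
                  eye.y + dir.y * holdDistance,
                  eye.z + dir.z * holdDistance };
}
```

Recompute this every frame from the current camera state; the "tweak it a little" part is mostly adjusting holdDistance and perhaps clamping the pitch contribution.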
Now, moving the object to that position and then keeping it there is a little tricky. If you replace it with a dummy kinematic NxActor or a character controller, you risk it clipping through other NxCharacters or static NxActors.
What you could do is make a semi-kinematic actor: one that responds to collisions one way, and doesn't move around unnecessarily.
This is NxOgre code:
class kactor : public NxOgre::Machine
{
  public:

    kactor(NxOgre::Shape* shape, float mass, const NxOgre::Vec3& position,
           OGRE3DRenderSystem* render_sys, NxOgre::Scene* scene)
    : mScene(scene), mRenderSystem(render_sys)
    {
      NxOgre::RigidBodyDescription description;
      description.mMass = mass;
      description.mBodyFlags |= NxOgre::Enums::BodyFlags_DisableGravity;
      mActor = mRenderSystem->createBody(shape, position, "cube.1m.mesh", description);
      mScene->registerMachine(this); // so simulate() is called every frame
    }

    ~kactor()
    {
      mRenderSystem->destroyBody(mActor);
      mScene->unregisterMachine(this);
    }

    void simulate(float user_deltatime)
    {
      // Cancel all accumulated momentum each frame, so the actor registers
      // collisions but resists being pushed around by them.
      mActor->setLinearMomentum(NxOgre::Vec3(0,0,0));
      mActor->setAngularMomentum(NxOgre::Vec3(0,0,0));
    }

    NxOgre::Scene*      mScene;
    OGRE3DRenderSystem* mRenderSystem;
    OGRE3DBody*         mActor;
};
You can pick out what it means. Basically, every frame it zeroes the linear and angular momentum of the actor (the thing you're carrying). It can still collide with other NxActors in the scene, and for a small instant it responds to those collisions itself. You can make it less rigid by setting the momentums to 1/100th of what they were a frame ago, so there is a little leeway.
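The softened version of that per-frame step is just scaling instead of zeroing. Sketched in plain C++ (no NxOgre types; names illustrative, and the 0.01 fraction is only a starting point to tune):

```cpp
#include <cmath>

struct Momentum { float x, y, z; };

// Keep a small fraction of last frame's momentum instead of zeroing it,
// so collision responses aren't perfectly rigid.
inline Momentum dampMomentum(const Momentum& m, float keep = 0.01f)
{
    return Momentum{ m.x * keep, m.y * keep, m.z * keep };
}
```

In the kactor above, simulate() would read the current momentums, pass them through something like this, and set them back, rather than setting them to zero.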
Since you have a semi-kinematic actor that does (almost) two-way collisions, and you have the 3D coordinates of where the actor should be at any given moment, it's just a matter of setting exactly enough velocity that it moves to the correct 3D position before the next frame, when the momentum from that velocity is cancelled out. I'll leave the maths to you, although I can't imagine it would be too difficult.
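For what it's worth, that maths is just "remaining distance divided by the timestep". A minimal sketch (illustrative names):

```cpp
struct V3 { float x, y, z; };

// Velocity that moves the actor from 'current' to 'target' in exactly one
// timestep 'dt'. Applied each frame; the simulate() step then cancels the
// leftover momentum so the object doesn't overshoot or keep drifting.
inline V3 carryVelocity(const V3& current, const V3& target, float dt)
{
    const float inv = 1.0f / dt;
    return V3{ (target.x - current.x) * inv,
               (target.y - current.y) * inv,
               (target.z - current.z) * inv };
}
```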
Hope this helps.
zarfius
10-09-2009 13:01:01
The yaw should be applied to the visualisation of the character only. To move the character, you just rotate the character's local movement vector (step forward, left, right, backwards) by the yaw value as a quaternion. You've done this already, and the PhysX tutorials teach it this way, so I assume your implementation is like this.
Yep, the character controller moves a scene node (bodyNode) which has the camera attached to it (representing the head). Moving the mouse left and right changes the yaw of the body node, and moving it up and down changes the pitch of the camera. Keeping them separate like this makes it easy to calculate the direction of movement while still allowing the player to look up and down.
As for carrying, I wouldn't use a joint at all. Years ago I was working on a project with a friend, and he suggested an additional character controller acting as a hand. Although it worked pretty well for physically knocking things over and pushing doors open, it wasn't so good for carrying.
An interesting idea. One thing I've just come to realise is that you don't really want to give the player "extra strength" while holding an object. In my current implementation the player can't push over a stack of objects when not holding anything, but while holding something, the carried object can push anything out of the way.
Going by what I've seen in Source-based games (Half-Life 2, Fallout 3), the carried object is just "hovered", projected directly in front of the actor. You have the yaw and the character's position, so if you project that vector forward a bit you get the 3D point in space where the object should be. You can also use the pitch of how far up or down the character is looking; you can take this directly from the mouse and tweak it a little so it feels right.
It's funny you mention that. I'm modelling my current implementation on the behaviour of Source-based games, specifically Portal (I'm not trying to get it exactly the same, just using it as a reference). Using the joint I've got it really close to the same behaviour with a fairly simple implementation; if it weren't for the weird issue where rotating the character controller's actor randomly stops working halfway through a rotation, I would be sticking with it.
Now, moving the object to that position and then keeping it there is a little tricky. If you replace it with a dummy kinematic NxActor or a character controller, you risk it clipping through other NxCharacters or static NxActors.
Good point. When using the joint, I find that pushing the object against a wall makes it jitter around, because the joint is trying to pull it back into position. The logical solution is to make the joint breakable, but so far I haven't been able to find values that work well: either the joint is too strong and won't break when it should, or it's so weak that it breaks simply from moving around.
What you could do is make a semi-kinematic actor: one that responds to collisions one way, and doesn't move around unnecessarily.
Yeah, I see what you mean. I'll have a play around and see how it goes.
You can pick out what it means. Basically, every frame it zeroes the linear and angular momentum of the actor (the thing you're carrying). It can still collide with other NxActors in the scene, and for a small instant it responds to those collisions itself. You can make it less rigid by setting the momentums to 1/100th of what they were a frame ago, so there is a little leeway.
It looks like I'm going to have to do a lot of tweaking to get this right. I guess that's why they say 90% of game development is polish.
Thanks for your help.
zarfius
11-09-2009 01:30:53
I sorted out this issue.
It turns out the actor was falling asleep after some time, and that's why the rotations stopped responding. Moving the character controller would wake the actor up, but standing still and rotating wouldn't. I'm not sure why setGlobalOrientationQuat wasn't waking the actor automatically, but calling wakeUp(1) manually just before setting the orientation appears to solve the problem (unless I'm just getting really lucky in my testing).
It still makes me curious why the PhysX documentation says you shouldn't manually change the properties of the character controller's actor, but I assume they're talking about moving the actor rather than rotating it. This would make sense, because the controller->move() function does a lot more than just move the actor. Also, I don't think rotating the controller's actor actually rotates its bounding shape; ironically, this side effect makes things a lot easier for me.
So for now I'm very happy with the results. I'll explore this further only if it becomes an issue again.
Thanks anyway for the help; at least I have a number of options for the future.