Arcanor
10-03-2008 03:07:57
To move my character I set a movement vector in my code, then call setRawNextMovementDirection().
When I run in Debug mode I've got the speed of the character more or less where I want it to be.
The problem happens when I run my application in Release mode. The frame rate is about 6 to 8 times higher (around 200 fps, versus about 30 fps in Debug mode), and when I move my character in Release mode it moves much more slowly than I want it to.
To correct for this, I've added a multiplier to my move vector to compensate for the elapsed time per frame. This works perfectly well for the character's rotation speed when turning left/right. However, it doesn't seem to work when moving the character.
I suspect what's going on is that setRawNextMovementDirection() is normalizing my move vector and thus ignoring the multiplier I'm using. If this is the case, then how am I supposed to regulate the speed of the character based on my frame rate, as I would normally do?
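For reference, here's a sketch of the frame-rate-independent movement I'm trying to achieve. The Vec2 class, the SPEED constant, and frameDisplacement() are just placeholders I made up to illustrate the idea, not the engine's actual API: the direction only carries orientation (it gets normalized anyway), and the magnitude comes from speed times the frame's delta time.

```java
// Sketch of frame-rate-independent movement. Vec2, SPEED, and
// frameDisplacement() are hypothetical stand-ins, not the engine's API.
public class MovementSketch {
    static class Vec2 {
        double x, y;
        Vec2(double x, double y) { this.x = x; this.y = y; }
        // Return a unit-length copy (zero vector stays zero).
        Vec2 normalized() {
            double len = Math.sqrt(x * x + y * y);
            return len == 0 ? new Vec2(0, 0) : new Vec2(x / len, y / len);
        }
    }

    // Units the character should cover per second, independent of FPS.
    static final double SPEED = 5.0;

    // Displacement for one frame: the normalized input supplies only the
    // direction; SPEED * dtSeconds supplies the magnitude.
    static Vec2 frameDisplacement(Vec2 moveInput, double dtSeconds) {
        Vec2 dir = moveInput.normalized();
        return new Vec2(dir.x * SPEED * dtSeconds, dir.y * SPEED * dtSeconds);
    }

    public static void main(String[] args) {
        Vec2 input = new Vec2(3, 4); // arbitrary input; gets normalized
        // 30 fps (Debug) vs 200 fps (Release): distance covered per
        // second comes out the same either way.
        Vec2 debugStep = frameDisplacement(input, 1.0 / 30.0);
        Vec2 releaseStep = frameDisplacement(input, 1.0 / 200.0);
        double debugPerSecond = 30 * Math.hypot(debugStep.x, debugStep.y);
        double releasePerSecond = 200 * Math.hypot(releaseStep.x, releaseStep.y);
        System.out.println(debugPerSecond + " " + releasePerSecond);
    }
}
```

So if the engine normalizes whatever I pass in, the dt scaling would have to happen somewhere the normalization can't cancel it out, e.g. in a separate speed parameter rather than baked into the vector's length.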
Thanks in advance!