[ACCEPTED] Z up?

Marc

02-12-2007 21:35:05

Hi !

I read through the tutorials and looked at the example code and it looks really impressive.

All examples/tutorials are written for Y up. Is this done to be in line with the main Ogre examples, or is there some technical limitation that prevents PagedGeometry from being used in a Z-up world? E.g. the pages - are they always aligned in the xz plane, or can they be aligned to xy? What about those Loaders that take a function for calculating the 3rd coordinate - does that only work for Y? I guess this could be addressed by writing my own Loader though. But the page alignment is something I can't get from the doc. Are there other things to be considered when using PagedGeometry in a Z-up scene?

Best regards
Marc

JohnJ

02-12-2007 22:18:22

Unfortunately, PagedGeometry's use of Y as the up vector is hard-coded into almost every subsystem of the engine. The reason I did this was:

A. It's a lot simpler and easier when you don't have to constantly perform up-vector independent calculations.

B. It never occurred to me that anyone would actually use Z as an up vector in a game - in every mathematical and logical representation of 3D space I've ever seen (except in some quirky modelers), X is the width dimension, Y is the height (up/down) dimension, and Z is the depth dimension.

You could probably convert PagedGeometry to use Z as an up vector if you wanted, but it would take a bit of boring work swapping Z's and Y's, and would be very difficult to keep up-to-date with the latest releases and bug fixes.

Marc

03-12-2007 13:59:43

Well, that's completely contrary to my opinion. I can't understand why y should be up. Most games are somewhat landscape-human-walking-on-the-floor-like. It feels absolutely natural to me to have x and y as the positional coordinates of entities in such a scene, with the 3rd coordinate z additionally for the height. You'd use lots of algorithms like pathfinding that work in 2D on top of that, and it's strange to always omit the value y in between when connecting to those algorithms. And since all your examples are landscape-human-walking-on-the-floor-like, that's really strange to me. It's different for space flight simulators, where all dimensions are somewhat equal.

The only reason I can see for why y is up is the way you start working with programming in 3D. The screen coordinates of your display are x and y from 2D, and then you add z as your depth. Since you are a human and look horizontally all day, and your display stands in front of you, y is up and z is the depth. But as you turn around, z isn't the depth in world coordinates anymore ...

Anyways. I agree that swapping y/z myself everywhere in the code isn't an option if I have to re-apply the patch for every update you do. I can imagine a lot of strange things happening without knowing whether they're because of my z-up modifications or your new patch features. :/

Are you ever going to consider supporting z up? I would be willing to work on that, but not if such changes clearly have no chance of getting into the supported release.

Best regards
Marc

JohnJ

03-12-2007 15:06:30

Obviously it all depends on your opinion, and what you're "used" to.

The only reason I can see for why y is up is the way you start working with programming in 3D. The screen coordinates of your display are x and y from 2D, and then you add z as your depth.
Yeah, I think that's the main reason why most people use x as width, y as height, and z as depth - whether you're using 2D or 3D, you always know that x/y represent width and height - it's consistent. Similarly, if a Z component exists, it will always represent depth - always (likewise, terms like the "Z-buffer" won't ever confuse you, whether you're using 2D or 3D coordinates).

It feels absolutely natural to me to have x and y as the positional coordinates of entities in such a scene, with the 3rd coordinate z additionally for the height. You'd use lots of algorithms like pathfinding that work in 2D on top of that, and it's strange to always omit the value y in between when connecting to those algorithms.
Good point, that makes sense. But in a way, you might sometimes want to use x/z coordinates in a pathfinding algorithm, if only to stay aware of the fact that it's being applied to 3D data, not 2D data.

Are you ever going to consider supporting z up? I would be willing to work on that, but not if such changes clearly have no chance of getting into the supported release.
I'm not sure. I wouldn't completely rule out the possibility, but I don't expect to be able to add z-up support any time soon.

I've got an idea: what if you rotated Ogre's root scene node by 90 degrees, so even though you're working in Z-up, it will actually render in a Y-up orientation? It might work, but I'm not sure.

Marc

03-12-2007 16:34:36

I've got an idea: what if you rotated Ogre's root scene node by 90 degrees, so even though you're working in Z-up, it will actually render in a Y-up orientation? It might work, but I'm not sure.

I'm never sure if it's a good idea to do that. I remember sinbad saying something about not doing that with the terrain scene manager, for example, because then the LOD calculation might get confused. But then again, I might add another scene node, attach all my stuff to that one and rotate it.

What is the Y-up coordinate system you are actually using? x to the right, y up, z into the screen? That would be a left-handed coordinate system?!? Or negative z into the screen?

From your tutorials it seems the only thing that connects PagedGeometry to Ogre is the camera. So what if I subclass Ogre's camera and overload its positional methods to do the transformation? In the constructor of PagedGeometry, you get the sceneMgr from the camera. It's hard to tell without reading all the rest of the code whether that one would have to be wrapped the same way to make it work - and other objects you get from that again as well ...

JohnJ

03-12-2007 17:13:47

What is the Y-up coordinate system you are actually using? x to the right, y up, z into the screen? That would be a left-handed coordinate system?!? Or negative z into the screen?
I think Ogre uses a right-handed coordinate system, so:

+X = right
+Y = up
-Z = forward (into the screen)

Personally, I used to prefer working with DirectX's default (+Z = forward) system (which seems more intuitive to me), but I don't mind -Z so much.

So, using right-handed coordinates, I assume you're using this:

+X = right
+Y = forward (into the screen)
+Z = up

Actually, it makes a little more sense to use Z as up in a right-handed coordinate system, as far as the signs of the directions go, but Z still makes more sense as a depth component to me.

From your tutorials it seems the only thing that connects PagedGeometry to Ogre is the camera. So what if I subclass Ogre's camera and overload its positional methods to do the transformation?
I don't think it would be that simple, because PagedGeometry is also tied to Ogre through the root scene node, which it uses to attach all entities. Now that I think about it, it might not be so hard after all to support Z as an up-vector if I were to have PagedGeometry create its own root scene node, which could then be rotated/transformed to accommodate any coordinate system you want. The global camera coordinates can of course easily be converted to values local to the PagedGeometry root node's coordinate system, so that's not a problem either.

It looks like official Z-up support is a definite possibility now. I'll see what I can do.

Edit: I just got PagedGeometry working with Z as the up-vector (or practically any other direction you can think of). I'll test it a little more and upload it to CVS for you to try out.

Marc

03-12-2007 22:15:32

DirectX's default coordinate system is

+x = right
+y = up
+z = forward ???

That would be a left-handed coordinate system? DirectX's default coordinate system is left-handed?

Actually, the coordinate system I use is

+x = forward
+y = left
+z = up

This might look strange at first, but it makes perfect sense if you look at it from above. If you place an entity in the scene with a transformation matrix equal to identity, the entity looks in the positive x direction. From there you can rotate it counter-clockwise, like you learned in school. So not only is z up vs. y up the question - which way your entities face when you place them with an identity transformation is also related to how you generally set up your scene.
Another benefit is that y and z form the actual screen coordinates and sit next to each other that way, instead of having y as forward in between.

But I guess that all depends on personal preferences. I just found that using this coordinate system I have a lot less trouble finding where things are and placing them where I want them to be.

Lol, I actually hoped to talk you into supporting z up somehow, but your edit line is really surprising. Great success! Thanks. Notify me when it's in CVS and I'll experiment with it.

JohnJ

04-12-2007 00:52:44

Sorry it took so long - I was away for a few hours.

Anyway, try the CVS now. Any right-handed coordinate system should work fine, since you can supply an up-vector and a right-vector. For example, to configure PagedGeometry to use your coordinate system, simply add this one line:
`pagedGeometry->setCoordinateSystem(Vector3::UNIT_Z, Vector3::NEGATIVE_UNIT_X);`
That will set Z as the up vector and -X as the right vector. Let me know if it works.

DirectX's default coordinate system is

+x = right
+y = up
+z = forward ???

That would be a left-handed coordinate system? DirectX's default coordinate system is left-handed?

As far as I know, yes, DirectX uses a left-handed coordinate system.

Edit: Here's the proof:
http://msdn2.microsoft.com/en-us/library/bb204853.aspx
Typically 3D graphics applications use two types of Cartesian coordinate systems: left-handed and right-handed. In both coordinate systems, the positive x-axis points to the right, and the positive y-axis points up. You can remember which direction the positive z-axis points by pointing the fingers of either your left or right hand in the positive x-direction and curling them into the positive y-direction. The direction your thumb points, either toward or away from you, is the direction that the positive z-axis points for that coordinate system. The following illustration shows these two coordinate systems.

Direct3D uses a left-handed coordinate system. (...)

Actually, the coordinate system I use is

+x = forward
+y = left
+z = up

Wow. That's pretty much the most non-standard coordinate system I've ever seen. It sure seems confusing and counter-intuitive to me, but I guess that just goes to show how important opinion is in coordinate systems.

But I guess that all depends on personal preferences. I just found that using this coordinate system I have a lot less trouble finding where things are and placing them where I want them to be.
Yeah. I find the X/Y/Z = right/up/back or right/up/forward to be the most intuitive for me, and that's probably because it's what I've always seen used and therefore what I've always used myself.

Marc

05-12-2007 19:30:28

`pagedGeometry->setCoordinateSystem(Vector3::UNIT_Z, Vector3::NEGATIVE_UNIT_X);`

That'd result in a left-handed coordinate system, but

(0) pagedGeometry->setCoordinateSystem(Vector3::UNIT_Z, Vector3::NEGATIVE_UNIT_Y);

or

(1) pagedGeometry->setCoordinateSystem(Vector3::UNIT_Z, Vector3::UNIT_X);

should work. I experimented with modifying Example 7 by removing the terrain loading and letting the height function always return 0. (1) works if I also change the tree positioning

position.x = Math::RangeRandom(0, 1500);
position.z = Math::RangeRandom(0, 1500);

to

position.x = Math::RangeRandom(0, 1500);
position.y = Math::RangeRandom(0, 1500);

But (0) places all trees along one line - even if I try

position.x = Math::RangeRandom(0, 1500);
position.y = Math::RangeRandom(0, 1500);
position.z = Math::RangeRandom(0, 1500);

Strange. The grass works in both cases though.

[edit] I found it. Because of the negative y axis, (0, 1500) is out of bounds after applying the inverse transform at the beginning of addTree(). Then some coordinate gets set to 0. So

position.x = Math::RangeRandom(0, 1500);
position.y = Math::RangeRandom(-1500, 0);

does the trick. While this is somewhat confusing in combination with

pagedGeometry->setMapBounds(TBounds(0, 0, 1500, 1500));

, it works for me. I can live with (1).

Thanks a lot
[/edit]

JohnJ

05-12-2007 21:57:05

I'm glad you got it working.

While this is somewhat confusing in combination with

pagedGeometry->setMapBounds(TBounds(0, 0, 1500, 1500));

, it works for me.

I can fix that problem, but in order for it to be coordinate-system independent, all the setBounds() functions would have to be changed to accept two Vector3's (for minimum and maximum bounds) instead of Ogre's TRect&lt;Real&gt;. So, for example, instead of calling:
`(...)->setBounds(TBounds(0, 0, 1500, 1500));`
You would call:
`(...)->setBounds(Vector3(0, 0, 0), Vector3(1500, 0, 1500));`
Which is probably a little more confusing. Which would you prefer?

Marc

06-12-2007 00:17:22

`(...)->setBounds(Vector3(0, 0, 0), Vector3(1500, 0, 1500));`

Isn't that what you'd have to call? Since I have z up, I want the grid to extend along x and y, so I would call

`(...)->setBounds(Vector3(0, 0, 0), Vector3(1500, 1500, 0));`

Anyways, I think that's even more complicated, because it creates the impression that you can specify the plane of the grid once again - in addition to the setCoordinateSystem() you already added for me. I think it's OK the way it was, with a mention somewhere that the bounds given with

`(...)->setMapBounds(TBounds(0, 0, 1500, 1500));`

have to be given in the PagedGeometry-local coordinate system. If I set one of the coordinate axes to negative, then it's logical that the bounds in world coordinates are transformed like that. Maybe it'd help if the vectors you give to setCoordinateSystem() were not up and right, but the vectors that span the plane the bounds refer to. So in your case x and z, and in mine (0) -y and x - or - (1) x and y.

Vectrex

06-12-2007 01:29:32

Does any of this slow things down at all? Because if it does, it'd be better to use a #define or something.

Actually what I use as a my coordinate system is

+x = forward
+y = left
+z = up

...that's CRAZY

JohnJ

06-12-2007 04:20:40

Does any of this slow things down at all? Because if it does, it'd be better to use a #define or something.
Not much - basically it attaches all PagedGeometry to a SceneNode instead of directly attaching it to the root scene node. But a #define is still a good idea, so I think I'll do that.

Isn't that what you'd have to call? Since I have z up, I want the grid to extend along x and y, so I would call (...)
Yeah, that's right (my mistake).

Maybe it'd help if the vectors you give to setCoordinateSystem() were not up and right, but the vectors that span the plane the bounds refer to
That's a good idea for helping with the bounds, but I think overall using an up and right vector is less error-prone; for example, if someone mistakenly supplied UNIT_Z as the forward vector instead of NEGATIVE_UNIT_Z with a right vector of UNIT_X, all their trees would be upside down (otherwise a left-handed coordinate system would have to be used, which I think is impossible in Ogre).

> pagedGeometry->setCoordinateSystem(Vector3::UNIT_Z, Vector3::NEGATIVE_UNIT_X);

That'd result in a left-handed coordinate system, but

Not really - given any two axes, there are two main types of coordinate systems: one right-handed and one left-handed. The setCoordinateSystem() function will work with practically any two vectors (as long as they are at right angles to each other) - it simply calculates the appropriate forward vector automatically, in a way that results in a right-handed coordinate system.

For example, the above example (setCoordinateSystem(Vector3::UNIT_Z, Vector3::NEGATIVE_UNIT_X)) results in a valid right-handed coordinate system because it calculates that the forward vector needs to be NEGATIVE_UNIT_Y (not UNIT_Y) in order to be right-handed.

Marc

06-12-2007 13:20:13

Not really - given any two axes, there are two main types of coordinate systems: ...
I meant it in relation to my (0) and (1).

What do you mean by #define? Instead of setCoordinateSystem(), hard-compile the coordinate system into the lib?

JohnJ

06-12-2007 15:12:03

What do you mean by #define? Instead of setCoordinateSystem(), hard-compile the coordinate system into the lib?
No, just a #define that will remove the setCoordinateSystem() code from the library, for people who don't use it and don't want it.

kungfoomasta

06-12-2007 17:42:13

I agree with Vectrex, that is crazy!

In math classes I was always taught using 3D graphs with y up, so it feels natural. I wonder how many users would actually prefer to change from this... well, it's already implemented it seems - great job on that.

jeng

14-12-2007 14:22:12

This is very handy, thanks... A number of game engines use Z up, including ours.

oracle1124

21-12-2007 02:59:42

The only reason I can see for why y is up is the way you start working with programming in 3D. The screen coordinates of your display are x and y from 2D, and then you add z as your depth. Since you are a human and look horizontally all day, and your display stands in front of you, y is up and z is the depth. But as you turn around, z isn't the depth in world coordinates anymore ...

Here is my two cents ....

Y is up regardless of 2D or 3D. If you go from 2D to 3D, x and y are screen coordinates and z is the depth. Back on the 2D screens, x ran from left to right and y from top to bottom. The same holds true for 3D games: x is your left/right, y is your up/down and z is your forward/back.

The same holds true for mathematics and physics. When you're at school learning about coordinates, x is left to right and y is up and down.

Y-up is sort of an "unofficial" standard across many fields, not just Ogre/PG/video games.

Cheers
o