Hardware Support

Topper

14-02-2011 16:25:06

Hi,

I've got an NVIDIA GeForce 9600 GT (http://www.nvidia.de/object/product_geforce_9600gt_de.html) and it supports PhysX. But NxOgre's hasHardware() returns false. If I force it to true anyway, my program crashes. Am I missing something here?

proof

14-02-2011 18:13:08

Well, my 9400GT supports PhysX too (at least that's what it says), but I can't run any of the PhysX SDK samples in hardware mode, so it may not be NxOgre, but something wrong with PhysX.

betajaen

14-02-2011 18:48:10

See if it's enabled in the NVIDIA control panel.

Hardware is overrated anyway; it's really only useful for many cloths/fluids.

Topper

14-02-2011 19:34:21

Hi,

Yes, I have it activated for my graphics card. The fluid demo from NVIDIA runs with hardware support on my PC with about 60,000 particles, and with software it is extremely slow. I would really like to use hardware support for my fluids, so is there something I can do to turn this on?

betajaen

14-02-2011 19:48:08

If the PhysX samples do not work under hardware, then you should contact NVIDIA. Probably the PhysX forums. They can help better than I can. ;)

Topper

14-02-2011 20:11:44

Hi,

I tried the ForceField demo from PhysX, and there I can switch between hardware and software without problems. So I think it's an NxOgre problem, or some setting I'm not setting.


mWorld = NxOgre::World::createWorld();

mWorld->getRemoteDebugger()->connect();
mWorld->setCCDEnabled(true);
mWorld->setCCDEpsilon(0.01);
mWorld->setSkinWidth(0.0001);

NxOgre::ResourceSystem::getSingleton()->openProtocol(new Critter::OgreResourceProtocol());

NxOgre::SceneDescription scene_description;
scene_description.mGravity = NxOgre::Constants::MEAN_EARTH_GRAVITY;
scene_description.mUseHardware = mWorld->hasHardware();

mScene = mWorld->createScene(scene_description);

mMeshManager = NxOgre::MeshManager::getSingleton();

mDefaultMaterial = mScene->getMaterial(0);
mDefaultMaterial->setRestitution(0.1f);
mDefaultMaterial->setDynamicFriction(0.9f);
mDefaultMaterial->setStaticFriction(0.9f);


Ogre::SceneManager* mgr = Ogre::Root::getSingleton().getSceneManager("Default");
mRenderSystem = new Critter::RenderSystem(mScene,mgr);

betajaen

14-02-2011 20:34:36

Did you turn on hardware in the WorldDescription? It's off by default.
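
(In code, that suggestion would look roughly like this; a sketch only, assuming the Detritus-era WorldDescription with the mNoHardware member discussed later in the thread:)


NxOgre::WorldDescription world_description;
world_description.mNoHardware = false;   // ask NxOgre not to pass NX_SDKF_NO_HARDWARE to PhysX

mWorld = NxOgre::World::createWorld(world_description);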

Topper

14-02-2011 21:16:21

Yes, I tried that. But the mNoHardware description says it is off by default. So setting it to false would do nothing, because the flag is already set inside PhysX and the OR operator used there cannot clear it.

I changed this in the NxWorld class, because there was no other way to get at this flag.

betajaen

14-02-2011 21:56:07

Set it to true. In BuggySwires/Detritus (I think), it's just a boolean value.

I know hardware works, as I tried it a week ago.

Topper

14-02-2011 22:17:27

Hi,

You're right, in Detritus it's only a bool variable, but at the moment it does nothing. The OR (|=) doesn't work because PhysX 2.8.4 sets the NX_SDKF_NO_HARDWARE flag by default. If mNoHardware is true, the OR doesn't remove the flag because it's already set (it would have to be an XOR), and if it is false the flag is also still set by default.


if (description.mNoHardware)
  sdk_description.flags |= NX_SDKF_NO_HARDWARE;


I've changed the code and now it works, believe me. Maybe you changed this already in the newest version (BuggySwires), but not in Detritus 1.6.3330.


if (description.mUseHardware)
  sdk_description.flags ^= NX_SDKF_NO_HARDWARE;
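
As an aside, here is a minimal standalone sketch (plain C++, not NxOgre code, with a made-up stand-in flag value) of why |= can never clear an already-set bit, why ^= happens to work here, and why &= ~flag would be the more defensive way to clear it, since XOR would re-set the bit if it were ever already clear:


#include <cstdio>

enum { SKETCH_NO_HARDWARE = (1 << 0) };   // stand-in for NX_SDKF_NO_HARDWARE

int main()
{
  unsigned int flags = SKETCH_NO_HARDWARE;   // PhysX 2.8.4 default: "no hardware" bit already set

  flags |= SKETCH_NO_HARDWARE;               // OR can only add bits, so it stays set
  std::printf("after |=   : %u\n", flags);   // prints 1

  flags ^= SKETCH_NO_HARDWARE;               // XOR toggles it off, but only because it was set
  std::printf("after ^=   : %u\n", flags);   // prints 0

  flags = SKETCH_NO_HARDWARE;
  flags &= ~SKETCH_NO_HARDWARE;              // AND-NOT clears it unconditionally, whatever its state
  std::printf("after &= ~ : %u\n", flags);   // prints 0

  return 0;
}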

betajaen

14-02-2011 22:44:40

Weird.

In BuggySwires, the code is like this. I'm positive it's the same as Detritus, but apparently not.

sdk_description.flags = 0;
sdk_description.flags |= NX_SDKF_PER_SCENE_BATCHING;

if (description.mNoHardware)
  sdk_description.flags |= NX_SDKF_NO_HARDWARE;


So by default hardware is set to on (via = 0), then the mNoHardware flag, which is on by default, switches the hardware flag off.

Topper

14-02-2011 23:04:38

Yeah, I see, these two lines are missing in Detritus.


sdk_description.flags = 0;
sdk_description.flags |= NX_SDKF_PER_SCENE_BATCHING;

dirkmitt

14-04-2013 11:57:38

I'm sorry to bother you with such a petty question. I realize that hardware support for NxOGRE may strike you as pointless. But I have a GeForce GTX 460 GC, and have just updated my NVIDIA drivers to version 314.22. That means I have PhysX system software version 9.12.1031 now. I'm noticing for the first time that even if I switch mUseHardware = true in my NxOGRE demos, I get only CPU mode.

I have Buggyswires.

As a matter of principle, I'd like to be able to get my build to do GPU support.

Could you please tell me how I need to fiddle with the NxOGRE source code to make this possible?

I cannot deduce from this thread what solution you two have found.

This is the code that I put into NxOgreWorld.cpp:


sdk_description.flags = 0;
sdk_description.flags |= NX_SDKF_PER_SCENE_BATCHING;

sdk_description.flags |= NX_SDKF_NO_HARDWARE;

if ( !(description.mNoHardware) )
  sdk_description.flags ^= NX_SDKF_NO_HARDWARE;


After all, 'mUseHardware' was not a member of 'NxOgre::WorldDescription'.

Dirk

P.S. I have a vague suspicion that within the header file NxOgreWorld.h, the declaration


static World* createWorld(const WorldDescription& = WorldDescription());


could be flawed, because it might fail to call the default constructor for the class WorldDescription. After all, createWorld wants to receive a reference. A reference to what, a temporary object that goes out of scope?

But the world would still be created: just not on the basis of a valid world description. And hence, no HW context may be created... (?)

P.P.S. Nope. I only got rid of a harmless compiler warning by turning that into a pass-by-copy.
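
For what it's worth, binding a const reference to a default-argument temporary is well-defined in C++: the temporary lives until the end of the full call expression, so it cannot go out of scope while createWorld is still running. A small standalone sketch (illustrative names only, not NxOgre code):


#include <iostream>

struct Description
{
  Description() : useHardware(true) { }   // stand-in for WorldDescription's defaults
  bool useHardware;
};

// Same pattern as createWorld: a const reference whose default argument is a
// default-constructed temporary. The temporary remains alive for the whole call.
static void createThing(const Description& desc = Description())
{
  std::cout << "useHardware = " << desc.useHardware << std::endl;
}

int main()
{
  createThing();          // binds the reference to the default temporary
  Description d;
  d.useHardware = false;
  createThing(d);         // binds the reference to a named object
  return 0;
}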

Also, there were a few places sprinkled about the NxOGRE and Critter code where mUseHardware defaulted to false, and where I changed the default to true. But the following is one piece of code within NxOgreWorld.cpp, in which my change is going to stay commented out until NVIDIA plays nice:


bool World::hasHardware() const
{
  if (mDeadSDK)
    return false;
  // return true;
  return mSDK->getHWVersion() != NX_HW_VERSION_NONE;
}


Finally, because Critter was meant to be statically linked, it's not just necessary to place the recompiled NxOgre.dll into the OGRE executable's folder, but also to recompile Critter, and then to recompile the NxOgre Tutorials project, each of which is set up on my box to put its target files into the correct folders afterward. I'm assuming that people have also placed the required 'physxcudartxx.dll' file where it belongs.

And as it stands, I do not have GPU support with this project. What I now suspect NVIDIA may have done is drop support for 'physxcudart_20.dll' explicitly, and in doing so land certain of their own old SDK tools in pure CPU mode as well. I.e., 'cudart32_30_9.dll' might be whitelisted. But then, to use it, you'd need the general PhysX 3.x DLLs that have been compiled against the newer dynamic libraries.

Dirk

dirkmitt

15-04-2013 18:40:34

What has happened is that the NVIDIA Developer Forum, where I'm a minor member, has in fact given me tech support. This is what they wrote:


Posted on 4/15/2013 9:57 AM
Dirk,

Apex 1.0 PhysX Lab does not have any GPU-accelerated physics, so it will always run in "CPU mode."

If you want PhysX to run on the GPU and you are developing software (as you indicate), there are a few steps that may be required.

When initializing the SDK:
1. Clear the NX_SDKF_NO_HARDWARE bit in the NxPhysicsSDKDesc::flags member.
2. Set the GPU Heap Size: NxPhysicsSDKDesc::gpuHeapSize = 64;

Then, when you create a fluid or cloth effect, ensure that you set the hardware flags in those descriptors:
1. Set the NX_FF_HARDWARE bit in the NxFluidDesc::flags.
2. Set the NX_CLF_HARDWARE bit in the NxClothDesc::flags.

Also, if your _old_ PhysX 2.8.4 is _old enough_, it may not have support for the newest hardware, so you may want to update it.


I have followed all their advice, and now my NxOGRE demos, which were actually Betajaen's demos, do run with Cloth and Fluid in GPU mode. :D

I did catch the fact that under PhysX 2.8.x, only cloth and fluids will receive H/W support, because they use the particle-oriented method of doing physics (which conventional rigid bodies do not). And soft bodies are really an extension of cloths. Also, since my frame rates for the OGRE demos tend to be in the thousands of FPS under virtually no load, it's understood that they won't change much if I switch on H/W support. But for any more serious uses, this could still come in handy.

What I've done now is to add my own custom 'unsigned int' member in NxOgreWorldDescription.h, which I named 'mGpuHeapSize', and then in NxOgreWorldDescription.cpp I added a default of 64 (MB) for mGpuHeapSize to the function WorldDescription::reset().
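
A rough idea of what those two additions could look like, condensed into one self-contained sketch (the real WorldDescription has many more members; only the new mGpuHeapSize field and its default are taken from the description above):


// Sketch only, not the actual NxOgre class.
struct WorldDescriptionSketch
{
  WorldDescriptionSketch() { reset(); }

  void reset()
  {
    // ... the existing defaults would be restored here ...
    mGpuHeapSize = 64;   // GPU heap size in MB, matching NVIDIA's suggested NxPhysicsSDKDesc::gpuHeapSize = 64
  }

  unsigned int mGpuHeapSize;
};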

Next, in NxOgreWorld.cpp I put:



NxPhysicsSDKDesc sdk_description;
sdk_description.cookerThreadMask = description.mCookerThreadMask;

sdk_description.flags = 0;
sdk_description.flags |= NX_SDKF_PER_SCENE_BATCHING;

sdk_description.flags |= NX_SDKF_NO_HARDWARE;

if ( !(description.mNoHardware) ) {
  sdk_description.flags ^= NX_SDKF_NO_HARDWARE;
  sdk_description.gpuHeapSize = description.mGpuHeapSize;
}


And finally, in NxOgreCloth.cpp, NxOgreFluid.cpp and NxOgreSoftBody.cpp respectively, I simply added a line thus:



NxClothDesc desc;
::NxOgre::Functions::PrototypeFunctions::ClothDescriptionToNxClothDesc(description, desc);

desc.flags |= NX_CLF_HARDWARE;

(...)

fd.flags |= NX_FF_HARDWARE;

(...)

NxSoftBodyDesc desc;
::NxOgre::Functions::PrototypeFunctions::SoftBodyDescriptionToNxSoftBodyDesc(description, desc);

desc.flags |= NX_SBF_HARDWARE;



to avoid requiring a per-object, redefined ClothDescription, FluidDescription, or SoftBodyDescription object.

Yet I felt that the need might arise to set a non-default GPU heap size. And finally, I changed the default for mUseHardware to true in SceneDescription::reset(). And then, amazingly, I also discovered that my header file declaration and source file use of WorldDescription fully work, implying no silly mistakes.

Dirk

P.S. One plausible reason not to have SceneDescription::reset() activate mUseHardware by default would crop up if any part of NxOgre were to call this function outside of initializing a default scene description, for example if an animation had crashed and wanted to restart. Yet, because my code reserves 64 MB of GPU heap anyway, regardless of whether hardware is going to be used, I decided to have it enabled by default.

The NxOgre demos would all override it to false as-is, for which reason I edited each one.