Nvidia selling lies with PhysX?

Discussion in 'Hardware & Software' started by Asus, Jul 13, 2010.

  1. Asus

    Asus Newbie

    Joined:
    Aug 22, 2003
    Messages:
    10,350
    Likes Received:
    0
    The gist is that the code PhysX runs on CPUs is mostly x87 and single-threaded, when it could have used SSE instructions and been multithreaded. Currently, without an Nvidia GPU, PhysX falls back to the CPU and is very slow. But it COULD be very fast...possibly faster than when run on an Nvidia card (given a CPU with 4 free cores and current double-precision SSE, compared to older CPUs).
    And support for the old Ageia PhysX cards is gone in current drivers, so backward compatibility can't be what's holding Nvidia back.
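    The x87-versus-SSE point is easy to picture in plain C. Below is a toy sketch of my own (nothing to do with the actual PhysX source): the same array addition written once as a scalar loop, which is roughly what x87-style code boils down to, and once with SSE intrinsics that handle four floats per instruction.

```c
#include <assert.h>
#include <xmmintrin.h>  /* SSE intrinsics */

/* Scalar version: one float per iteration (roughly what x87 code does). */
static void add_scalar(const float *a, const float *b, float *out, int n) {
    for (int i = 0; i < n; i++)
        out[i] = a[i] + b[i];
}

/* SSE version: four floats per instruction (n assumed a multiple of 4). */
static void add_sse(const float *a, const float *b, float *out, int n) {
    for (int i = 0; i < n; i += 4) {
        __m128 va = _mm_loadu_ps(a + i);
        __m128 vb = _mm_loadu_ps(b + i);
        _mm_storeu_ps(out + i, _mm_add_ps(va, vb));
    }
}
```

    Real physics kernels are obviously more involved than an array add, but the 4-wide throughput difference is the whole argument in miniature.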

    So did Nvidia pick the slowest code path for CPUs to make their cards look better at running PhysX as a selling point, or do they just not care?
     
  2. VirusType2

    VirusType2 Newbie

    Joined:
    Feb 3, 2005
    Messages:
    18,192
    Likes Received:
    0
    Yeah, you must have an Nvidia card installed or they'll nerf the PhysX code. On purpose. It's part of the agreement; otherwise you aren't licensed to use PhysX at all. (And without it, you can't play certain games.)
     
  3. Asus

    Asus Newbie

    Joined:
    Aug 22, 2003
    Messages:
    10,350
    Likes Received:
    0
    example of scaling with PhysX (last graph).

    So that ATI card + CPU combo is being held back, since the CPU is running the PhysX code on a single thread using x87 instructions.
    SSE alone would double performance (per the TR article). Multithreading would help even more.
    If you had a six-core CPU running a game that didn't take advantage of four cores (say, using two with four free) but had PhysX, you could run the game as well as the benchmark does with two Nvidia GPUs.
     
  4. Dinnesch

    Dinnesch Space Core

    Joined:
    Jan 31, 2009
    Messages:
    1,285
    Likes Received:
    6
    I've never really liked this PhysX stuff. Sure, those realistic-looking cloth effects look great.
    But compared to a lot of CPU-based physics engines, the 8 FPS I get in Mirror's Edge (when PhysX is on) on my i7-860 and HD 5850 doesn't really convince me.
     
  5. VirusType2

    VirusType2 Newbie

    Joined:
    Feb 3, 2005
    Messages:
    18,192
    Likes Received:
    0
    Yeah, I'm not a fan of proprietary computational code. This is bad for competition.

    I suppose licensing a physics engine like Havok costs a developer much more money, so they may opt for PhysX (unfortunately for ATI owners). EDIT: and it sounds like (after reading the response below) PhysX is free for developers to use, further encouraging them.
     
  6. Tee Kyoo Em

    Joined:
    May 19, 2003
    Messages:
    892
    Likes Received:
    0
    I don't see how NVIDIA has been "selling lies" to anyone regarding PhysX.

    They have never proclaimed PhysX as a platform-neutral standard with a software-fallback path that rivals the hardware-accelerated one. It was pretty obvious to me that they would optimize PhysX to run best on their hardware.

    On top of that, the PhysX SDK is freely available. Anyone interested in implementing physically-based simulations using PhysX can and should measure the performance of the software fallback path and then decide whether PhysX is the right solution for their project.
     
  7. Asus

    Asus Newbie

    Joined:
    Aug 22, 2003
    Messages:
    10,350
    Likes Received:
    0
    What's funny about this speculation is that it might be possible to get not just equal but better performance on a PC with 4-8 cores if they switched what they optimized for... (the CPU running PhysX instead of the GPU)

    See, when AMD bought ATI, they could have made CrossFire 'run faster' on all-AMD machines (AMD CPU, ATI chipset and ATI GPU), which really means slowing it down on everything else. But no, they kept their departments separate. They even let Intel's chipsets run CrossFire...
    Why does Nvidia disable the ability to run an ATI GPU alongside an Nvidia GPU acting as the PPU? Since Vista, you need two Nvidia cards to have a dedicated PPU. PhysX could have been kept separate when they bought Ageia. Maybe licensing could have been where the money came from, while still letting it run on the old PPU card, or any GPU or CPU, rather than tying it to Nvidia graphics cards.
     
  8. VirusType2

    VirusType2 Newbie

    Joined:
    Feb 3, 2005
    Messages:
    18,192
    Likes Received:
    0
    No, they aren't playing fair. It's called the network effect. For example, Apple's FaceTime video chat on the iPhone requires both parties to be using an iPhone.

    The more people that have an iPhone, the more people will want an iPhone so they'll be compatible. Outsiders (Android users, for example) are left out of the loop.

    eBay did it, Twitter did it, etc. When the network effect works, it works amazingly well. But customers suffer.

    EDIT: http://en.wikipedia.org/wiki/Network_effect
     
  9. Asus

    Asus Newbie

    Joined:
    Aug 22, 2003
    Messages:
    10,350
    Likes Received:
    0
    Exactly, virus.
    I deleted part of my reply above since this sums it up better than what I wrote.

    "Kanter notes that there's no technical reason not to use SSE on the PC—no need for additional mathematical precision, no justifiable requirement for x87 backward compatibility among remotely modern CPUs, no apparent technical barrier whatsoever. In fact, as he points out, Nvidia has PhysX layers that run on game consoles using the PowerPC's AltiVec instructions, which are very similar to SSE. Kanter even expects using SSE would ease development: "In the case of PhysX on the CPU, there are no significant extra costs (and frankly supporting SSE is easier than x87 anyway)."

    So even single-threaded PhysX code could be roughly twice as fast as it is with very little extra effort.

    Between the lack of multithreading and the predominance of x87 instructions, the PC version of Nvidia's PhysX middleware would seem to be, at best, extremely poorly optimized, and at worst, made slow through willful neglect. Nvidia, of course, is free to engage in such neglect, but there are consequences to be paid for doing so. Here's how Kanter sums it up:

    The bottom line is that Nvidia is free to hobble PhysX on the CPU by using single threaded x87 code if they wish. That choice, however, does not benefit developers or consumers though, and casts substantial doubts on the purported performance advantages of running PhysX on a GPU, rather than a CPU. (hence the question about Nvidia selling lies)

    Indeed. The PhysX logo is intended as a selling point for games taking full advantage of Nvidia hardware, but it now may take on a stronger meaning: intentionally slow on everything else.
    "
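    To make the multithreading half of Kanter's point concrete, here's a toy sketch of my own (not anything from the PhysX SDK) of the kind of data-parallel split he's describing: each thread integrates its own slice of the bodies, so four free cores can in principle chew through four slices at once, with no locking needed because no two threads write the same body.

```c
#include <assert.h>
#include <pthread.h>

#define NUM_BODIES  1024
#define NUM_THREADS 4

/* Toy "physics" state: one vertical position and velocity per body. */
static float pos[NUM_BODIES];
static float vel[NUM_BODIES];

typedef struct { int start, end; float dt; } Slice;

/* Each worker integrates its own disjoint slice of bodies. */
static void *integrate_slice(void *arg) {
    Slice *s = (Slice *)arg;
    for (int i = s->start; i < s->end; i++) {
        vel[i] += -9.81f * s->dt;   /* apply gravity */
        pos[i] += vel[i] * s->dt;   /* advance position */
    }
    return 0;
}

/* One simulation step, fanned out across NUM_THREADS workers. */
static void step_parallel(float dt) {
    pthread_t tid[NUM_THREADS];
    Slice slice[NUM_THREADS];
    int chunk = NUM_BODIES / NUM_THREADS;
    for (int t = 0; t < NUM_THREADS; t++) {
        slice[t].start = t * chunk;
        slice[t].end   = (t == NUM_THREADS - 1) ? NUM_BODIES : (t + 1) * chunk;
        slice[t].dt    = dt;
        pthread_create(&tid[t], 0, integrate_slice, &slice[t]);
    }
    for (int t = 0; t < NUM_THREADS; t++)
        pthread_join(tid[t], 0);
}
```

    Real engines have dependencies between bodies (contacts, joints) that make the split harder than this, but broad-phase and integration work really is this embarrassingly parallel.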
     
  10. VirusType2

    VirusType2 Newbie

    Joined:
    Feb 3, 2005
    Messages:
    18,192
    Likes Received:
    0
    It really does look more like dirty tactics, gimping their code on non-Nvidia hardware. However, they did have the option of not letting it work at all, which is what most companies do. But it makes sense that they let it work somewhat. The reason is, if they completely dropped PhysX support [for non-Nvidia hardware], fewer developers would use PhysX in their games, since fewer people would be able to play them. Imagine a game that doesn't support ATI cards at all. [Very roughly] half of gamers wouldn't be able to play it, and thus wouldn't buy it. So that would be a terrible idea on Nvidia's part.

    Right now, they give you an advantage for running an Nvidia GPU. If they can lock in a large portion of customers (say, if Nvidia outnumbers ATI 3 to 1), they may tighten the noose and make it completely incompatible with ATI, which would have a snowball effect, putting ATI out to pasture.

    However, like Tee Kyoo Em pointed out, the developer of a game can tone down the PhysX effects to be just a complement, so a strong CPU is not required to run the PhysX code.
     
  11. Tee Kyoo Em

    Joined:
    May 19, 2003
    Messages:
    892
    Likes Received:
    0
    Well, what I've actually said is that you can measure the performance of PhysX for free without having to rely on "lies that are being sold to you".

    I've always seen the PhysX SDK as a wrapper for NVIDIA GPUs and not as a general-purpose API adhering to an open standard with a guarantee of good performance on any hardware, let alone in software.

    Besides, you might as well blame all the developers who chose to go with PhysX, since they pretty much knew how their game would perform on hardware lacking an NVIDIA GPU. "The way it's meant to be played", I guess.

    Ultimately, I can see PhysX eventually going the way of the dodo with efforts such as OpenCL and free alternatives like Bullet Physics.
     
  12. Asus

    Asus Newbie

    Joined:
    Aug 22, 2003
    Messages:
    10,350
    Likes Received:
    0
  13. BabyHeadCrab

    BabyHeadCrab The Freeman

    Joined:
    Dec 2, 2003
    Messages:
    13
    Likes Received:
    392
    physics processing unit

    haha
     
  14. Tee Kyoo Em

    Joined:
    May 19, 2003
    Messages:
    892
    Likes Received:
    0
    A whole lotta words for ya.

    I didn't get around to mentioning this yesterday, but here goes:

    I wouldn't actually be surprised if you could outperform a GPU with a multi-core CPU. The problem is that just because a certain CPU outperforms a certain GPU in a certain test case, it doesn't mean that holds true in general. Such test cases can usually afford to spend all available CPU cycles on the test alone. I'll go out on a limb here and say that you'd get rather different results in a "real-world test" using a game that has to process all kinds of other tasks in addition to physically-based simulation, especially one that plays to the strengths of a GPU with a massive number of cores.


    Anyway, according to the article about market share, AMD is at 20% while NVIDIA is at 24% (not surprised about Intel leading the market by merrily churning out massive amounts of crappy graphics chips that ship with motherboards). However, if you have a look at the Steam Hardware Survey results you get a slightly different picture. As of this writing, a whopping 59% of people are sporting an NVIDIA chip of some sort, while "only" 33% are sporting an AMD/ATI chip of some sort. I'd like to pretend that Valve's results are the more relevant ones, since they're gathered directly from gaming machines, and with Intel being the sore loser at about 6% I almost feel as if the world is still okay (at least in the Valve universe).


    Lastly, baby head crab.

    Ha ha ha.
     
  15. Druckles

    Druckles Party Escort Bot

    Joined:
    Dec 13, 2004
    Messages:
    10,656
    Likes Received:
    10
    More relevant if you're looking at a specific market. ATI get quite a bit of business from people not willing to spend £300 on a graphics card. It wasn't until last year that you could spend quite that much on a single ATI card.

    Just because 'everyone does it' and it's a viable business scheme doesn't mean it's fair or right. In the same way that Valve put their consumers first, I feel PhysX is not a reason to purchase an NVIDIA card. Encouraging business practices that are unfair to me is a silly idea.

    Basically, I am a consumer, and I don't give a **** whether it's a good tactic for them; it puts me at a disadvantage.
     
  16. Dinnesch

    Dinnesch Space Core

    Joined:
    Jan 31, 2009
    Messages:
    1,285
    Likes Received:
    6
    I hope Nvidia will do something about this 'scandal' and improve their CPU support. After reading the following, as an ATI-using graphics whore, my heart is broken:

    Code:
       
    [B]Mafia 2 system requirements[/B]
    
    MINIMUM SYSTEM REQUIREMENTS
        Operating System: Microsoft Windows XP (SP2 or later) / Windows Vista / Windows 7
        Processor: Pentium D 3Ghz or AMD Athlon 64 X2 3600+ (Dual core) or higher
        RAM: 1.5 GB
        Video Card: nVidia GeForce 8600 / ATI HD2600 Pro or better
        Hard Disc Space: 8 GB
        Sound Card: 100% DirectX 9.0c compatible sound card
        Peripherals: Keyboard and mouse or Windows compatible gamepad
    
        RECOMMENDED SYSTEM REQUIREMENTS
        Operating System: Microsoft Windows XP (SP2 or later) / Windows Vista / Windows 7
        Processor: 2.4 GHz Quad Core processor
        RAM: 2 GB
        Video Card: nVidia GeForce 9800 GTX / ATI Radeon HD 3870 or better
        Hard Disc: 10 GB
        Sound Card: 100% DirectX 9.0c compliant card
        Peripherals: Keyboard and mouse or Windows compatible gamepad
    
        PHYSX/APEX ENHANCEMENTS SYSTEM REQUIREMENTS
        Operating System: Microsoft Windows XP (SP2 or later) / Windows Vista / Windows 7
        Minimum Processor: 2.4 GHz Quad Core processor
        Recommended Processor: 2.66 GHz Core i7-920
        RAM: 2 GB
    
        Video Cards and resolution: APEX medium settings
        Minimum: NVIDIA GeForce GTX 260 (or better) for Graphics and a dedicated NVIDIA 9800GTX (or better) for PhysX
        Recommended: NVIDIA GeForce GTX 470 (or better)
    
        Video Cards and resolution: APEX High settings
        Minimum: NVIDIA GeForce GTX 470 (or better) and a dedicated NVIDIA 9800GTX (or better) for PhysX
        Recommended: NVIDIA GeForce GTX 480 for Graphics and a dedicated NVIDIA GTX 285 (or better) for PhysX
        NVIDIA GPU driver: 197.13 or later.
        NVIDIA PhysX driver: 10.04.02_9.10.0522. Included and automatically installed with the game. 
     
  17. BabyHeadCrab

    BabyHeadCrab The Freeman

    Joined:
    Dec 2, 2003
    Messages:
    13
    Likes Received:
    392
    What PC games in the pipeline that use even half the power of these GPUs will actually take advantage of PhysX? The market isn't there. It's not a loss for consumers or ATI, really. Nvidia, however, appears to have ****ed up.
     
  18. Tee Kyoo Em

    Joined:
    May 19, 2003
    Messages:
    892
    Likes Received:
    0
    It seems that some people have a need for controversies in their life and would say anything to justify their purchase, despite rational and unbiased arguments having been brought into the debate.

    Since I've seen you spazzing out over the announcement of the Natural Selection 2 Alpha in another thread, I'd like to point out that Unknown Worlds are actually using PhysX, and to top it all off they haven't taken advantage of any hardware acceleration so far (if they're ever going to). I wonder what you're gonna make of that.
     
  19. DEATH eVADER

    DEATH eVADER Space Core

    Joined:
    Nov 10, 2003
    Messages:
    8,147
    Likes Received:
    14
    Is PhysX even desirable in multiplayer? How would the physics calculations be done, on the client or the server? It's fine as long as the PhysX debris is visuals-only and has no effect on the gameplay itself (i.e. you can't get killed by a piece of flying shrapnel); otherwise it would generate more useless data to clog up the network connection. It's hard enough as it is to find a decent server that doesn't spaz out every 120 seconds.
     
  20. Tee Kyoo Em

    Joined:
    May 19, 2003
    Messages:
    892
    Likes Received:
    0
    PhysX, Havok and Bullet all provide similar APIs and physics models to work with. The bare gist of it is that they'll let you set up scenes consisting of collision shapes, rigid bodies, joints, etc. that are then integrated over time. Additionally, PhysX allows you to offload the processing onto the hardware.

    The integration of the physically-based simulation with the networking model of a game isn't provided by the physics library and must be dealt with by the game developer. It doesn't matter if it's PhysX or any of the other libraries, you need to be careful with how much and what kind of data you're going to send across the wire in order to maintain the illusion of a shared, virtual environment. To that end, techniques such as client-side prediction, client-side (only) simulation, etc. are still applicable today.
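    The usual trick for keeping cosmetic physics off the wire can be sketched in a few lines. This is a toy illustration of my own (not any engine's actual netcode): bodies are tagged either server-authoritative or client-only eye candy, and only the authoritative ones get serialized into the network snapshot.

```c
#include <assert.h>
#include <string.h>

/* Toy world state: a body is either gameplay-relevant (server-authoritative)
   or cosmetic debris (simulated client-side only and never sent). */
typedef struct {
    float pos[3];
    int   cosmetic;   /* 1 = client-only eye candy */
} Body;

/* Serialize only the authoritative bodies into a network buffer.
   Returns the number of bytes written. */
static int snapshot(const Body *bodies, int n, unsigned char *buf) {
    int off = 0;
    for (int i = 0; i < n; i++) {
        if (bodies[i].cosmetic)
            continue;                        /* debris stays off the wire */
        memcpy(buf + off, bodies[i].pos, sizeof bodies[i].pos);
        off += (int)sizeof bodies[i].pos;
    }
    return off;
}
```

    Since the cosmetic debris never affects gameplay, each client can simulate it locally with no need for the server to reconcile anyone's particle showers.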

    Last but not least, NVIDIA recently responded to the current hubbub over the performance of PhysX, which will hopefully help to calm the hysteria down: http://www.thinq.co.uk/2010/7/8/nvidia-were-not-hobbling-cpu-physx/
     
