AGEIA PhysX - useless?

Discussion in 'Graphics Cards & Displays' started by Max_87, Jul 24, 2006.

  1. azcn2503

    azcn2503 Newbie

    Right.. so you guys would rather see the physics processing put back on to the GPU/CPU? Why would you want to bottleneck like that? Do you work for Intel or something?

    Ageia have a much more powerful physics solution than Havok, ATI and Nvidia. There are currently 65 titles under development, and there will be many more by the end of the year.

    55 million sphere collisions a second cannot be performed by a CPU and/or GPU without a loss in frame rate. PhysX, however, makes this possible.
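    For anyone wondering what a single "sphere collision" test actually involves, here's a minimal sketch in Python. This is just the textbook sphere-vs-sphere overlap check, not Ageia's or anyone else's actual code; the function name and layout are made up for illustration:

    ```python
    def spheres_collide(c1, r1, c2, r2):
        """Return True if two spheres overlap.

        c1, c2 are (x, y, z) centre tuples; r1, r2 are radii.
        A collision test is just: distance between centres <= sum of radii.
        Comparing squared distances avoids the square root entirely.
        """
        dx = c1[0] - c2[0]
        dy = c1[1] - c2[1]
        dz = c1[2] - c2[2]
        dist_sq = dx * dx + dy * dy + dz * dz
        radius_sum = r1 + r2
        return dist_sq <= radius_sum * radius_sum

    # Two unit spheres whose centres are 1.5 apart: they overlap.
    print(spheres_collide((0, 0, 0), 1.0, (1.5, 0, 0), 1.0))  # True
    ```

    Each test is only a handful of multiplies and adds, so the hard part is doing tens of millions of them per second on top of everything else the CPU/GPU is already doing.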
     
  2. Papercut

    Papercut Newbie

    Would YOU pay US$250 for it?
     
  3. Max_87

    Max_87 huehuehue

    read: "under development"

    I will never buy PhysX unless/until I see all those titles (which are now under development) actually released, PhysX supporting a lot of game titles, AND PhysX really doing something in those games. Oh, look at the CPU utilization graph posted on hardforums: PhysX doesn't seem to do much at all. They can even run the game without the card (with physics acceleration enabled) with almost no difference in performance. :lol:

    It's not that I'd rather see the physics processing placed on the GPU/CPU or whatever. ROFL :lol: It's just not worth spending $300 on PhysX right now. Oh, I don't care how many million/billion sphere collisions a second it can do in the demos, I want to see it actually do that in real games. I will wait until PhysX proves itself to be worth its price :mrgreen:

    Ohh, I work for intel :liar: :roll: yay! :dance: :haha:
     
    Last edited: Jul 26, 2006
  4. azcn2503

    azcn2503 Newbie

    Yes. I bought it today and it's on next day delivery.

    There is obviously demand for physics in computer games. I would like to see as many games as possible using Ageia's PhysX APIs. I'm glad to be one of the people who will experience this technology as it advances.

    As for PhysX being useless, have you tried running CellFactor with EnablePhysX=false, then playing the game with cloth and liquid effects enabled? As soon as a piece of cloth or some fluid comes on screen, the frame rate drops below 1 FPS. Proof that PhysX isn't useless. End of topic.
     
  5. Max_87

    Max_87 huehuehue

    That guy is running CellFactor with EnablePhysX=false. Yeah, his results might not be accurate and should not be taken too seriously.

    Like I said, until those games are actually released and they really do take advantage of PhysX, I won't buy it. I'm not going to spend $300 just to play CellFactor Beta R36 and watch millions of boxes bouncing around :lol: To me, it's just not worth spending $300 for a beta game, some demos, and some "under development" games to run smoothly. Of course, if you have loads of cash to burn, go ahead, no one's stopping you! :twisted: I will wait :p I'm not saying anything about how useful PhysX is now :p

    Oh, if you want to talk about how useful PhysX is now... It is VERY useful at running the millions-of-boxes-bouncing-around demo and CellFactor Beta R36 :mrgreen:

    Edit: oh, I forgot Ghost Recon.
     
    Last edited: Jul 26, 2006
  6. PsYkHoTiK

    PsYkHoTiK Admin nerd

    Havok's solution makes more sense than dropping $250... :p

    http://www.havok.com/content/view/187/77/

    Spare cycles from your SM3.0 graphics card. No bottleneck at all.

    And AGEIA is better? lol (no Max, it's not better at GRAW :p )

    http://techreport.com/reviews/2006q2/physx/index.x?pg=5
    http://www.anandtech.com/video/showdoc.aspx?i=2751&p=4

    In game performance degrades with it on... :lol:

    I've played GRAW with the PhysX driver installed (although no card), and from everything I've seen, I prolly won't get a better experience with a PPU.

    Another argument is that people with higher end CPUs and graphics cards already have enough power to calculate physics on their own. A lot of people can't see the purpose of the PhysX card. You see a bit more on screen but get low frame rates, and obviously it's not the other components but the PhysX card itself causing it. Poor...

    If you think the CellFactor demo is worth it then go ahead. :p I would rather get hit by return shipping + a 15% restocking fee than lose money that I could use to get other useful hardware.
     
    Last edited: Jul 27, 2006
  7. Adrian Wong

    Adrian Wong Da Boss Staff Member

    It's meant to inject more realism into games, so I think it should really be part of current 3D cards rather than separate, primarily because of the cost issue.

    By integrating the physics circuitry onto the card, or even better into the GPU itself, you get closer integration with better performance and lower cost.
     
  8. azcn2503

    azcn2503 Newbie

    The CPU and GPU alone will not be able to manage things like real-time liquids, cloth, steam/smoke etc. Try running CF with all the settings enabled. I don't see why people keep saying it's useless when it's obviously doing a fantastic job. Can't you embrace new technology? Are you working for old people?
     
  9. Max_87

    Max_87 huehuehue

    Hm.... so it isn't better at GRAW when physics effects are enabled?

    Yeah, PhysX got a lot of negative responses about the physics effects in GRAW not seeming like much, and like you said, more powerful processors should be able to handle them. This is especially true from an "eye candy" point of view. However, it's questionable how much physics calculation is actually involved in all that debris flying around after an explosion, since it doesn't look like much from an "eye candy" perspective.

    Some might argue that there is a lot of physics calculation involved even though it doesn't look like much, and that's why performance degrades so much even with the help of PhysX. If we were to run that without PhysX, we might be getting under 1 FPS.

    Oh, but it might be exactly as you said: our more powerful CPUs/GPUs are already able to calculate all that :lol: and that defeats the whole purpose of having PhysX in your system.

    About running physics on your GPU, take a look at this http://www.theinquirer.net/default.aspx?article=33071
    So ATI/NVIDIA are bullshitting when they say their current GPUs can do physics calculations? :lol:

    Haha, exactly. Spending $300 just to run CellFactor and some other demos, ROFL. People nowadays are just too rich :haha: I would at least wait until other games are released and see how PhysX actually performs. By then, ATI/NVIDIA might already have come up with a better solution, who knows? :lol:
     
    Last edited: Jul 27, 2006
  10. Max_87

    Max_87 huehuehue

    LoL!... I don't think current GPUs are doing any physics calculation at all :lol: ATI/NVIDIA haven't implemented this in their GPUs yet, have they? How do you know the GPUs can't manage all those physics calculations? :mrgreen: We will never know until ATI/NVIDIA actually do this on their GPUs.

    :haha:
     
  11. Adrian Wong

    Adrian Wong Da Boss Staff Member

    Nope, still no physics offloading to the GPUs in ATI/NVIDIA cards yet.

    But I think it's a feature they are likely to add, since their top GPUs are so fast that they are CPU-limited out of the box. You have to run them at very high settings and resolutions just to benefit from their raw performance.

    So yeah, even today's GPUs have more processing power than most people need. But the GPU is not really a general-purpose processor. Most likely, physics calculations will require additional logic to implement.
     
  12. Olle P

    Olle P Newbie

    Exactly my point! To me "realism" is mostly about the stuff you don't see on screen, but merely experience the end result of.

    Take for example a first person shooter:
    Now we can have the exact trajectory of every single bullet calculated, including ricochets, even though we don't see them. We can compute more about the effects on and in the target, also quite possibly out of sight. Furthermore, we can improve vehicle physics by keeping track of a vehicle's total load and the shift in its centre of gravity as it uses fuel, spends ammo and generally moves around. Etc, etc...
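    The ricochet part of that is cheap to describe on paper: each impact is one vector reflection of the bullet's velocity off the surface. Here's a toy Python sketch (not any engine's actual code; the function name and restitution parameter are made up for illustration):

    ```python
    def reflect(velocity, normal, restitution=1.0):
        """Reflect a velocity vector off a surface with unit normal `normal`.

        Standard mirror reflection: v' = v - 2 * (v . n) * n,
        scaled by a restitution factor for energy lost in the ricochet.
        """
        dot = sum(v * n for v, n in zip(velocity, normal))
        return tuple(restitution * (v - 2.0 * dot * n)
                     for v, n in zip(velocity, normal))

    # A bullet travelling down and to the right hits a horizontal floor
    # (normal pointing straight up) and ricochets up and to the right.
    print(reflect((300.0, -100.0, 0.0), (0.0, 1.0, 0.0)))  # (300.0, 100.0, 0.0)
    ```

    One reflection is trivial; the point is doing it for every bullet in flight, every frame, on top of all the other simulation work.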

    Why then have the GPU spend lots of its effort on calculations that aren't directly related to the graphics?

    /Olle
     
    Last edited: Jul 27, 2006
  13. Adrian Wong

    Adrian Wong Da Boss Staff Member

    Because the GPU is far outstripping the CPU and even the monitor.

    Even if you actually plunk down the money for a GeForce 7950 GX2 board, you won't be able to fully utilize its processing power without a really fast system to back it up and a 30" monitor that supports Extreme HD.

    So, by offloading physics processing to the GPU, you can make use of the GPU's excess processing capability. That will save you from having to buy the best CPU and the largest monitor on the market.

    It will also give everyone more reason to buy faster graphics cards, of course. Hehe.. :D
     
  14. Olle P

    Olle P Newbie

    Depends on what GPU I have, doesn't it?
    Yeah right!
    The other option is not to spend money on a state-of-the-art GPU, but make do with a lower end graphics card (like a GF7300) that will not sit idle most of the time, together with a mid range CPU, and then spend the extra money on a physics processor!

    /Olle
     
  15. The_YongGrand

    The_YongGrand Just Started

    Hmm. The physics engine should actually focus on the physics of gas, liquid and solid systems, rather than just gravity. Players will feel the game content much more than with just gravity.

    Also, integrate the physics engine into video cards. That will save much more money and save the developers' time. At least they won't have to write separate code for the physics card. :D
     
  16. Papercut

    Papercut Newbie

    But when people want the best possible performance in games, a low-end graphics card is obviously not an option. Pairing that with a physics card is like giving with one hand and taking with the other as far as the gaming experience is concerned :?
     
  17. Chai

    Chai Administrator Staff Member

    Same here. That's something I will never do, not even if the PPU card can truly flex its muscles with full game support.
     
  18. Olle P

    Olle P Newbie

    How does that compare to:
    Which to me implies that any GPU provides sufficient computing power for as much graphics performance as you want. The difference in experience between a low end and a high end graphics card should thus be negligible, but the high end card will have more excess processing power (that could be used for other tasks).

    The difference in price tag between high end and low end graphics really overwhelms the cost of a PhysX card.
    Just from my regular computer store:
    XFX GeForce 7950GX2 570M: 7,195 SEK
    Compared to:
    Asus AGEIA TM PhysX P1: 2,595 SEK
    Gainward GeForce 7900GT 512MB, GS: 3,695 SEK
    ------------------ Total: 6,290 SEK
    Or:
    Asus AGEIA TM PhysX P1: 2,595 SEK
    Gainward GeForce 7300GT, GS: 899 SEK (ARP's own "Editor's Choice"!)
    ----------------- Total: 3,494 SEK

    I, for one, wouldn't call the 7900GT a "low end" card but rather high end. And it's still cheaper to buy that plus a PhysX card than to get the highest end graphics card!

    /Olle
     
  19. Adrian Wong

    Adrian Wong Da Boss Staff Member

    Actually, I didn't imply that ANY GPU will have excess processing power. I was talking about the best in GPUs versus the best in CPUs and the best in monitors.

    Take Quad SLI for example. Don't you think it has excessive processing power?

    I'm not saying that we should all go buy Quad SLI just to run physics on-chip. That's what NVIDIA and even ATI hope you will do, since it will help sell their cards.

    But it makes sense to integrate physics processing into the graphics card itself. Instead of increasing the fillrate and bandwidth, maybe GPU designers can add physics processing circuitry to future GPUs.

    Certainly, it will be cheaper to add physics processing to the GPU than to have a separate physics chip and card. Not only because it costs less per chip, but also because it will ship in much greater volume than a separate card would.

    Also, by tying it to the GPU, it will encourage game companies to support physics processing. After all, millions of cards will be shipped with physics processing.
     
  20. Papercut

    Papercut Newbie

    OK, using your figures... that Gainward 7900GT costs 3,695 SEK and the 7300GT + PhysX combo costs just a bit less at 3,494 SEK. Which do you think will give you better frame rates and a better gaming experience, given a decent CPU and all other factors constant? I daresay the 7900GT will be the answer to both. ;)
    That's what I'm trying to say.

     

Share This Page