Techgage.com > Archives > Reviews and Articles
Thread: AGEIA PhysX.. First Thoughts
Topic Review (Newest First)
08-16-2006 03:44 PM
Unregistered
PPU not enough

Great thread guys. Love the back and forth debates. I'll probably be logging in and creating an account, but alas, I am at work and can't ;P

Just my two cents worth here: I believe that the PhysX card is a step in the right direction, and only that. Seems to me that it's only half a card... why only process the physics? Personally I think that they should create a card that fits into a PCIe slot, works together with the CPU and GPU, and contains the ENTIRE physics engine. That way any game built to take advantage of said engine would be able to nearly instantly get all the physics information from an outside source. Coding for games would be much easier and faster as well, allowing companies to focus more on AI and graphics... Then again, it seems that everyone has a different idea as to what physics should be included in a game... so why not just make them all scalable? Gravity has an acceleration number to it - just make it selectable by the game itself... same with the friction of a given surface. Just link the object and/or texture to the physical properties wanted and voila!
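The "make it all scalable" idea above is essentially data-driven physics: gravity and per-surface friction live in data the game selects, not in engine code. A toy sketch in Python (all names hypothetical, 1-D for brevity):

```python
from dataclasses import dataclass, field

# Hypothetical sketch of the idea above: physical properties are plain data
# that each game (or level) can dial up or down, rather than hard-coded.

@dataclass
class Material:
    name: str
    friction: float      # 0.0 = ice, 1.0 = rubber
    restitution: float   # bounciness on impact

@dataclass
class Body:
    material: Material
    mass: float
    velocity: float
    position: float

@dataclass
class World:
    gravity: float = -9.81          # selectable by the game itself
    bodies: list = field(default_factory=list)

    def step(self, dt: float) -> None:
        for b in self.bodies:
            b.velocity += self.gravity * dt
            b.position += b.velocity * dt
            if b.position <= 0.0:   # hit the ground plane
                b.position = 0.0
                b.velocity = -b.velocity * b.material.restitution

# A "moon level" just swaps the numbers in the data:
moon = World(gravity=-1.62)
moon.bodies.append(Body(Material("regolith", 0.9, 0.1),
                        mass=1.0, velocity=0.0, position=2.0))
for _ in range(100):
    moon.step(1 / 60)
```

The point is the last few lines: changing the feel of the world is a data edit, not a code change.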

Sorry if not much of that makes much sense, my mind keeps floating when I'm at work :P And with all that said, I'm probably going to pick up a physx card on my next pc build, which will hopefully be in about a month or so.

P.S. Out of curiosity: if it would be so great for games to take advantage of the second core of the CPU, why haven't they done so? Multi-core processors have been out for some time now, so I'm thinking there must be some reason why people aren't doing it...
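On the P.S.: one reason games were slow to use the second core is the frame-by-frame hand-off between simulation and rendering - the two threads must agree on a consistent snapshot of the world every frame. A toy sketch (hypothetical structure, not any real engine) of pushing physics integration to a worker thread:

```python
import threading
import queue

# Sketch: the main thread would render while a worker thread integrates
# physics. The hard part in real engines is exactly this hand-off.

def physics_worker(jobs: queue.Queue, results: queue.Queue) -> None:
    while True:
        state = jobs.get()
        if state is None:           # shutdown sentinel
            break
        pos, vel = state
        vel += -9.81 * (1 / 60)     # integrate one 60 Hz step
        pos += vel * (1 / 60)
        results.put((pos, vel))

jobs: queue.Queue = queue.Queue()
results: queue.Queue = queue.Queue()
t = threading.Thread(target=physics_worker, args=(jobs, results))
t.start()

state = (10.0, 0.0)                 # position (m), velocity (m/s)
for _ in range(60):                 # one simulated second
    jobs.put(state)                 # main thread would render here...
    state = results.get()           # ...then pick up the new snapshot
jobs.put(None)
t.join()
# state[0] is now roughly 5.0 m: one second of free fall from 10 m.
```

Every frame the main thread stalls on `results.get()`; hiding that latency (simulating a frame ahead, double-buffering state) is where the real engineering effort goes.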
05-08-2006 10:29 PM
Buck-O Excellent post.
And I think you are absolutely right on every count.

I think the timing is a double edged sword.
For one, they released it too late. If this had been released back in the day of the GeForce4 and the Athlon XP, it would have sold like hot cakes. But unfortunately, at this point in time, as we have all mentioned, the PCI bus is simply too old and too slow to work with a modern graphics card system. And any fluid charted benchmark run in FRAPS with and without PPU support, as you said, shows that plain as day, with significant (upwards of 30FPS in most cases) drops in framerate as the PPU is activated for a "physics enhanced" visual. Which at this point is nothing but some particle effects. And really, as I've said before, all of those particle effects still have to be rendered on screen via the GPU, and if the GPU is an older GPU, you're simply not going to be able to push those sorts of visuals.

Now the other edge of that sword is that it's released too early, because we are in the era of PCIe and the faster direct buses of modern processor technology. They should have waited until PCIe support was broadened, and there was PCIe support for a card like the PPU. At present, I can't think of any motherboard worth its weight that will work properly with an SLI setup and a PPU (or any card installed in a PCIe x1 slot, for that matter).

So it's really probably the worst time to put a product like this to market. Plus add in all the other hoopla coming with the release of Vista and DX10, and the uncertainty of the market surrounding it. It's really a bad move, in my honest opinion.


Truly, time will be the only real measure of what will become of Ageia and their PPU. But I can almost guarantee that before it's over, and before we see a real implementation of the device worth its while, it will take ATI or nVidia buying them up and putting the PPU on a card with the GPU. My guess would be that the most likely candidate for this would be ATI and their CrossFire setup. And I say that solely because of their implementation of the Master card, which could easily and seamlessly hold the PPU onboard and simply use the slave card for graphics rendering only. I think this would be in ATI's best interest, as it would likely vault them back into the lead in the GPU market.

In fact, I think I should patent that idea before ATI buys it up. ROFL!

Thanks for contributing to the discussion.
05-07-2006 01:01 AM
Unregistered
PhysX PPU...The Future of PC Physics?

I have read all your posts with great interest, I feel that some very good points are being made, so here's my 2 cents worth ;-)

I believe the 'IDEA' of having a dedicated PPU in your increasingly expensive monster rig is highly appealing, even intoxicating, and I believe this 'IDEA', coupled with some clever marketing, will ensure a good number of highly overpriced, or at least expensive, sales of this mystical technology in its current (inefficient) form.

For some, the fact that it's expensive and also holds such high promises will ensure its place as a 'must have' component for the legions of early adopters. The brilliant idea of launching them through Alienware, Falcon Northwest and the top-of-the-line Dell XPS600 systems was a stroke of marketing genius, as this adds to the allure of owning one when they finally launch to the retail market... If it's good enough for a system most of us can never afford but covet nonetheless, it's damn well good enough for my 'monster RIG'. This arrangement will all but guarantee sales of the first wave of cards on the market. I have noticed that some UK online retailers have already started taking pre-launch orders for the £218 OEM 128MB version; I just have to wonder how many of these pre-orders have actually been sold?

The concept of a dedicated PPU is quite simply phenomenal. We spend plenty of money upgrading our GPUs and CPUs, and quite recently Creative brought us the first true APU (the X-Fi series), so it makes sense for there to be a dedicated PPU, and perhaps even an AIPU to follow.

The question is, will these products actually benefit us to the value of their cost?

I would say that a GPU, or in fact up to 4 GPUs running over PCIe x32 (2x PCIe x16 channels), becomes increasingly less value for money the more GPUs are added to the equation. I.e. a 7900GTX 512MB at £440 is great bang for the buck compared to Quad SLI 7900GTX 512MB at over £1000; the framerates in the Quad machine are not 4x the single GPU. Perhaps this is where GPUs could truly be considered worthy of nVidia or ATI's SLI physics load-balancing concept. SLI GPUs are not working flat out 100% of the time... Due to the extremely high bandwidth of dual PCIe x16 ports there should be a reasonable amount of bandwidth to spare for physics calculations, perhaps more if dual PCIe x32 (or even quad x16) motherboards inevitably turn up. I am not saying that GPUs are more efficient than a DEDICATED, designed-for-the-purpose PPU, just that if ATI and nVidia decided the market showed enough potential, they could simply design in or add PPU functionality to their GPU cores or graphics cards. This would allow them to tap into the extra bandwidth PCIe x16 affords.
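The diminishing returns described above are just Amdahl's law: only the GPU-bound fraction of frame time scales with extra cards, while CPU, driver, and sync overhead do not. A quick sketch with made-up numbers:

```python
# Amdahl-style estimate of multi-GPU scaling. 'gpu_bound' is the fraction
# of frame time spent on work that extra GPUs can share; the figure 0.8
# below is illustrative, not a measurement.

def speedup(gpu_bound: float, n_gpus: int) -> float:
    return 1.0 / ((1.0 - gpu_bound) + gpu_bound / n_gpus)

for n in (1, 2, 4):
    print(f"{n} GPU(s): {speedup(0.8, n):.2f}x")
```

With 80% of the frame GPU-bound, two cards give about 1.67x and four cards only 2.5x, which matches the "not 4x" observation above.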

The Ageia PhysX PPU in its current form runs over the PCI bus, a comparatively narrow-bandwidth bus, and MUST communicate with the GPU in order for it to render the extra particles and objects in any scene. This, in my mind, creates a bottleneck, as the PPU can only communicate at the bandwidth and speed afforded by the slower PCI bus. The slowest path governs the speed of even the fastest... This would mean that adding a dedicated PPU, even a very fast and efficient one, would be severely limited by the bus it runs over. This phenomenon is displayed in all the real-world benchmarks I have seen of the Ageia PhysX PPU to date: the framerates actually DROP when the PPU is enabled.
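To put rough numbers on that bottleneck (theoretical peak rates; real throughput on the shared PCI bus is lower still):

```python
# Per-frame data budget over each bus at 60 FPS, using standard
# theoretical peak figures. Overhead and bus sharing are ignored.

PCI_BUS  = 133e6   # bytes/s: 32-bit @ 33 MHz, shared with other devices
PCIE_X1  = 250e6   # bytes/s per direction, PCIe 1.x
PCIE_X16 = 4e9     # bytes/s per direction, PCIe 1.x

FPS = 60
for name, bw in [("PCI", PCI_BUS), ("PCIe x1", PCIE_X1), ("PCIe x16", PCIE_X16)]:
    print(f"{name:9s}: {bw / FPS / 1e6:7.2f} MB of physics traffic per frame")
```

Roughly 2 MB per frame is all plain PCI can move at 60 FPS, and that ceiling is shared with every other PCI device, which is consistent with the framerate drops described above.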

To counter this, I believe Ageia, through ASUS, BFG and any other manufacturing partner they sign up with, will have to release products designed for the PCIe bus. I believe Ageia knows this, as the early manufacturing samples could be installed in a PCI slot as well as a PCIe slot (although not at the same time ;-) ). I believe the PCI bus was chosen for launch due to the very high installed user base of PCI motherboards: nearly every standard PC that would want a PPU has a free PCI slot. I believe this is a mistake, as the users most likely to purchase this part in the 'premium price' period would likely have PCIe in their system, or at least would be willing to shell out an extra £50-140 for the privilege. Although I could be completely wrong in this, as it may allow for some 'double selling': when they release the new and improved PCIe version, the early adopters will be forced to buy into it again at a premium price.

This leads me neatly onto the price. I understand that Ageia, quite rightly, are handing out the PhysX SDK freely; this is to allow maximum compatibility and support in the shortest period of time. This does however mean that the end user who purchases the card in the beginning will have to pay the full price for the card... £218 for the 128MB OEM version. As time goes by and more units are sold, the installed user base of the PPU will grow and the balance will shift: Ageia will be able to start charging developers to use their 'must have' hardware physics support in their games/software, and this will subsidise the cost of the card to the end user, making it even more affordable to the masses and therefore even more of a 'must have' for the developers. This will take several generations of the PPU before we feel the full impact, I believe.

If ATI and nVidia are smart, they can capitalise on their high initial installed user base and properly market the idea of hardware physics for free with their SLI physics; they may be able to throw a spanner in the works for Ageia while they attempt to attain market share. This may benefit the consumer, although it may also knock Ageia out of the running, depending on how effective ATI and nVidia's driver-based solutions first appear. It could also prompt a swift buyout from either ATI or nVidia, like nVidia did with 3dfx.

Using the CPU for physics, even on a multicore CPU, in my opinion is not the way forward. The CPU is not designed for physics calculations, and from what I hear it is not (comparatively) very efficient at performing them. A dedicated solution will always be better in the long run. This will free up the CPU to run the OS, handle AI calculations as well as antivirus, firewall and background applications, and generally keep the entire system secure and stable. Multicore will be a blessing for PCs and consoles, but not for such a specific and difficult (for a CPU) task.

"Deep breath" ;-)

So there you have it: my thoughts on the PPU situation as it stands now and into the future. Right now I will not be buying into the dream, but simply keeping the dream alive by closely watching how it develops until such a time as I believe the 'right time' comes. £218 for an unproven, generally unsupported, and possibly seriously flawed incarnation of the PPU dream is not, in my opinion, the right time. Yet ;-)

JKay6969
05-02-2006 04:50 AM
Buck-O The two games I'd really like to see take full PPU support would be BF2 and FEAR, perhaps even Far Cry. These three games seem to keep patching up for the latest in gaming tech, so I'm hoping the trend continues with the PPU.

A shame about GRAW though. I personally didn't like the game that much. Multiplayer was pretty rock solid though (and classic GR/R6, which I like).
[It should be noted that my experiences with GRAW are with the 360 version]
05-01-2006 09:39 PM
Greg King Concerning the games that are not built with a PPU in mind: you are right when you say that the PPU will be useless, but in its defense, it's early. Games will need to be programmed to take advantage of the PPU, and that's why Ageia is giving their SDK out for free to anyone who is interested in including the features that a PPU would bring to their game.


I received the PhysX card from Ageia and BFG today and have had about an hour to play around with it and see just what it could do. Alas, all that was included was the CellFactor demo that we saw at GDC. It's nice to be able to play this on my own PC, but I want more. I am supposed to be getting Ghost Recon: Advanced Warfighter from Ubisoft soon, and that game takes advantage of the PhysX card, so time will tell as to where this whole "PhysX revolution" is going.


I must say however that as much as I am all for the PPU, it's only a novelty until there are games to take advantage of its processing power.


This will be something to keep up on in the coming months when games are released that were built specifically to use the PPU. I am not holding my breath, but it would be killer to be able to patch our current games, such as BF2 and CS:S, to use this card as well. That might just be a pipe dream though...
05-01-2006 08:31 PM
twolf Wow, what service! I sent a letter to Rob, and before I got back to the forums he had resolved the problem and written back to me. Thanks!!!!

Jakal: thanks for the info, that was exactly the info I was looking for!!
04-29-2006 03:18 AM
Buck-O
Quote:
Originally Posted by Unregistered
i tried to register, it said you will receive an email on how to activate your name, but i never got it?

buck-o, you stated you were going to get one for review. just curious as to what game you intend on trying it on?

and it does not specify on this, but i was wondering - it states look for games optimized for the physx card... if a game is not built for the card, the card would be useless, wouldn't it?

and could you shed some light on something for me, as i am a mechanic, not a computer guru:

what is the limit on pci-x for volume? like how much can you transfer between the video card and the motherboard at a time?
Well, that's a bummer on the registration. Maybe fire off an e-mail to Rob Williams and see if he can get your account pushed through.

As for the review card: I am not getting one. I would like to, but I'm not. DarkSynergy, however, is getting one. So you should direct that question toward him. But I'm sure he will answer anyway.

And you nailed it. If the game isn't specifically written to take advantage of the PhysX card... it won't even know it's there. Which is one of the reasons I'm not hip on it. Because it has no backward compatibility, unless the developers see fit to release a patch for the game to utilize it. And even then I'm sure the performance increase would be minimal.


As for your last question... it's already been answered.
04-28-2006 09:49 PM
Jakal
Quote:
Originally Posted by Unregistered
what is the limit on pci-x for volume? like how much can you transfer between the video card and the motherboard at a time?
First off, it's PCI Express (PCIe), not PCI-X - those are two different things. A PCIe 1.x x16 slot runs 16 lanes at 2.5Gbps each, so 40Gbps per direction on the wire, or 80Gbps counting both directions at once. That's gigabits, not gigabytes. Because of the 8b/10b encoding, only 8 of every 10 bits on the wire are actual data, so the usable rate works out to about 4GB/s in each direction, 8GB/s total. No current card comes close to saturating that - the 6600GT, for example, gets nowhere near it - so bus bandwidth to the video card isn't really the limit these days.
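For reference, here is how the PCIe 1.x x16 figures fall out of the lane math (2.5 GT/s per lane, 8b/10b encoding):

```python
# PCIe 1.x x16 bandwidth from first principles.
LANE_RATE_GBPS = 2.5       # line rate per lane, Gb/s (PCIe 1.x)
ENCODING       = 8 / 10    # 8b/10b: 10 line bits carry 8 data bits
LANES          = 16

raw_gbps  = LANE_RATE_GBPS * LANES   # raw line rate, one direction
data_gbps = raw_gbps * ENCODING      # usable data rate, one direction
data_gBps = data_gbps / 8            # same, in gigabytes per second

print(raw_gbps, data_gbps, data_gBps)   # → 40.0 32.0 4.0
```

Doubling the 4 GB/s figure for simultaneous traffic in both directions gives the 8 GB/s total quoted above.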
04-28-2006 09:08 PM
Unregistered
i tried

Quote:
Originally Posted by Buck-O
Hey, thanks for replying back.

No worries, I understand what you're saying just fine.
It's my dyslexia getting the best of me that you've got to worry about.


Anyway, yes, I do imagine it will eventually become an enthusiast motherboard option, rather than an onboard card. I think that's the next logical step. Though I don't know how people will feel about paying $300+ for a motherboard with that feature on there. Who knows. Only time will tell. But I think if anyone would do it, it would be nVidia, as they already have a good, marketable motherboard base.

As for two GPUs on one card, it's been done. WAY back in the day, 3dfx had a card that actually had 4 GPUs on one card. It was a monster. Absolutely huge. I don't think it ever went into production. But I think a couple of companies actually built dual-GPU setups with the last series of Voodoo cards. I also know that (I think) MSI has a dual 6800GT single card that works in their special SLI motherboard. And I've seen prototypes of a card running two laptop-spec mobile GeForce Go chips on a special daughterboard that you could attach them to, which then fit into a PCI Express slot. The notion being that with the Go cards, you could upgrade easily and cheaply... or so they say.

Anyway, back to Ageia.
Only time will tell with the PPU setup. And I'm sure that by the end of the year we will have a better idea of what exactly they have in mind, and how developers and hardware vendors will accept it.

If you're enjoying the conversation... why don't you join the forums and continue the discussion here, and in the other forums here at Techgage.
i tried to register, it said you will receive an email on how to activate your name, but i never got it?

buck-o, you stated you were going to get one for review. just curious as to what game you intend on trying it on?

and it does not specify on this, but i was wondering - it states look for games optimized for the physx card... if a game is not built for the card, the card would be useless, wouldn't it?

and could you shed some light on something for me, as i am a mechanic, not a computer guru:

what is the limit on pci-x for volume? like how much can you transfer between the video card and the motherboard at a time?
04-28-2006 04:14 AM
Buck-O
Quote:
Originally Posted by username
buck-o i made that post. sorry i didn't register but i didn't figure i would not get a response and most people just start slandering and don't have a useful debate!

first off i am sorry, although it wasn't mentioned i am a horrible writer and bad at getting my point across but i will try.

ok, you also agreed that for bigger levels, more players, better ai, etc... you would need a much larger computer, not just a ppu card. this was the point i was trying to make: they could use both sides of the dual core for the ai and game engine etc... without having to use it for physics.

i am sure it would evolve eventually maybe on the motherboard as you stated ( perhaps like nic cards seem to be doing)

my problem with it being on the video cards is how much more can you transfer between a video card and a mother board?

if they can do so much more on a video card why is sli + crossfire even considered why not just build it a double card? <-- worded bad i am trying to say put 2 gpu on 1 card

i don't believe there is any game out there currently that needs 2 video cards, but it seems like it is getting closer. now if they put the ppu on there too, would we then need 3 video cards? isn't there a limit on how much you can transfer between a card and a motherboard?

thanks you guys for keeping it civil !!!

Hey, thanks for replying back.

No worries, I understand what you're saying just fine.
It's my dyslexia getting the best of me that you've got to worry about.


Anyway, yes, I do imagine it will eventually become an enthusiast motherboard option, rather than an onboard card. I think that's the next logical step. Though I don't know how people will feel about paying $300+ for a motherboard with that feature on there. Who knows. Only time will tell. But I think if anyone would do it, it would be nVidia, as they already have a good, marketable motherboard base.

As for two GPUs on one card, it's been done. WAY back in the day, 3dfx had a card that actually had 4 GPUs on one card. It was a monster. Absolutely huge. I don't think it ever went into production. But I think a couple of companies actually built dual-GPU setups with the last series of Voodoo cards. I also know that (I think) MSI has a dual 6800GT single card that works in their special SLI motherboard. And I've seen prototypes of a card running two laptop-spec mobile GeForce Go chips on a special daughterboard that you could attach them to, which then fit into a PCI Express slot. The notion being that with the Go cards, you could upgrade easily and cheaply... or so they say.

Anyway, back to Ageia.
Only time will tell with the PPU setup. And I'm sure that by the end of the year we will have a better idea of what exactly they have in mind, and how developers and hardware vendors will accept it.

If you're enjoying the conversation... why don't you join the forums and continue the discussion here, and in the other forums here at Techgage.
04-27-2006 11:19 PM
username
me again

buck-o i made that post. sorry i didn't register but i didn't figure i would not get a response and most people just start slandering and don't have a useful debate!

first off i am sorry, although it wasn't mentioned i am a horrible writer and bad at getting my point across but i will try.

ok, you also agreed that for bigger levels, more players, better ai, etc... you would need a much larger computer, not just a ppu card. this was the point i was trying to make: they could use both sides of the dual core for the ai and game engine etc... without having to use it for physics.

i am sure it would evolve eventually maybe on the motherboard as you stated ( perhaps like nic cards seem to be doing)

my problem with it being on the video cards is how much more can you transfer between a video card and a mother board?

if they can do so much more on a video card why is sli + crossfire even considered why not just build it a double card? <-- worded bad i am trying to say put 2 gpu on 1 card

i don't believe there is any game out there currently that needs 2 video cards, but it seems like it is getting closer. now if they put the ppu on there too, would we then need 3 video cards? isn't there a limit on how much you can transfer between a card and a motherboard?

thanks you guys for keeping it civil !!!
04-27-2006 05:21 PM
Unregistered http://www.anandtech.com/cpuchipsets...spx?i=2376&p=2

This article by AnandTech describes the PhysX card very well, and the reasons why the card can perform more effectively than a dual-core processor for physics processing. One thing I haven't seen mentioned is the fact that many multiplayer games will probably REQUIRE PhysX cards. I read an article earlier that said 70 game developers, and over 100 games in development, will make use of PhysX cards. While particle effects have little impact on gameplay, I'd be disappointed to see liquid weapons or destructible environments removed from the impending multiplayer experience. ATI boasts that its current X19xx series cards have an onboard PPU that can be enabled with a driver update.

In reply to Buck-O's last post: many measures are being taken to allow developers to rapidly code destructible environments and other physics-driven game elements without inducing a significant performance hit compared to "hand coded" effects of the same nature.

Many exciting technologies are coming out this year and the next; it will be interesting to see how everything pans out.
04-27-2006 01:12 PM
M3G@
hmmm ...

" As these things are no easier to program for with an add-in card than they were before "

Actually, one great advantage for game developers using the Ageia developer kit is that they do not have to deal with what is currently a very complex, drawn-out process - the SDK reduces the time it takes to do complex physics -
04-27-2006 12:49 PM
M3G@
Phys X is Xciting

This is a huge deal for gamers and developers - I don't know if Ageia has the money to keep this rolling, but if they can work it into games properly over the course of the next two to three years, we could enjoy great advances that should probably have taken place already -

There was a point made earlier that you can just place physics onto the second core of a dual core cpu -

It would handle physics rather slowly actually

Developers and the big names in gaming have already decided to take physics off the cpu -

What about saving that core for better AI?
It would be much more efficient to give that core AI responsibility -

Plus with new games & Direct X 10 API things are going to be much different in execution -

I would rather have my graphics card render - So if Nvidia and Ati expect me to buy a second video card for physics when I can get a ground-up developed card for physics at a cheaper price they've lost their marbles -

However it is likely that Nvidia and Ati will be successful in their physics efforts - I would like to see what the performance differences are -
04-26-2006 04:13 AM
Buck-O
Quote:
Originally Posted by DarkSynergy
Ahh, Quotefest 2K6 continues

See, SLI and CrossFire are backward compatible, but obviously no one coded their older games to take advantage of this technology. Each game from here on out will, however, code for this tech and use it to its advantage. Me playing Red Alert 2 does little to nothing for my SLI system. We have to give the PhysX approach time to sink in before we say that it's a pointless direction and past games won't benefit from it at all.

To rebut your comment that there are plenty of gamer-directed gimmicks: there are also plenty of innovations designed with gamers in mind that have taken off.

All of this is our personal opinion, and in my own opinion, I don't see a long life for Ageia anyway. I think, and this is just my opinion, that they will do well initially and then get gobbled up by either ATI or nVidia, and that will be decided by which company does worse in their approach to GPU physics.

I am all in favor of the direction that Ageia is trying to take the industry. Is it a step back? By all means, no. Will it work? That's to be decided by you and me, the end users. I am all for deeper games and more lifelike physics. I want to completely destroy a building and have it fall on my enemies to kill them. Can this all be coded to take advantage of a second CPU core? Yes it can. Will it? That is also something only time will tell. It's all about support, and from the looks of it, Ageia has a lot of support... at least initially.

I am enjoying this "debate" so please, let's keep it up. I will, however, have to bow out, as I have to be at work in a few hours. Thank you for keeping it civil, as some would resort to mindless put-downs and similar comments. I am now a member of the Church of Buck-O!
Sorry for my late reply. Been a busy 24 hours for me. Detail season is back in full swing, and as such, clients are calling. So I'm a busy man.

Anyway, the biggest reason why SLI and CrossFire work is because the game doesn't require any special coding: the graphics load is split between the two cards, each rendering part of the frame (or alternate frames). Simple software trick to get maximum possible performance out of the hardware. The game doesn't specifically need to be coded for it to work well, though not to its fullest extent. And reasonably speaking, there really aren't any games out there that REQUIRE a dual-GPU setup to be experienced at the fullest extent of the graphical options (I am of course excluding anti-aliasing, as I consider that to be a driver trick and not an actual game graphics feature, along with certain filtering options, as they are driver-dependent, not game-dependent). You can happily game with all of a particular engine's eye candy turned on with a single video card. Even the "lowly" 6800 GT is capable of providing this level of gameplay for most titles.

The problem with Ageia in this sense is that they are saying, "certain games won't be as good as they can be, unless you have our PhysX card to open up all of the game's available physics". I don't think that will make a particular game sell any better, or really make people want to rush out and buy a PhysX card.

The other sad part in all of this is that I see a lot of the larger "money maker" developers (i.e. EA and Sierra, to name two) that are interested in profit over quality of content pushing out games with bloated physics engines and not bothering to properly code for them, adding additional unnecessary load on the CPU that can only be offloaded by a PhysX card. And naturally, the developers will not bother to properly code the game to take advantage of a dual-core CPU to offload that physics data from the main CPU. Essentially putting you between a rock and a hard place. I think that is ultimately the position many people will be in, and I think that's a real shame. Much for the same reason I stated above as a step backwards: we are, once again, moving toward required add-on bloat-control hardware, where proper coding would correct the issue head on.

With that in mind, I wholeheartedly agree with you that I don't feel Ageia will be around very long. And I think they wholeheartedly expect that, and are in fact probably setting themselves up for an inevitable buyout in the future.

I think one of the bigger questions you need to ask yourself when thinking about the PhysX card is... will the developers want to code for a game that allows you to topple buildings bit by bit, with physics-reacting destructible material that has proper load bearing, weighting, and physics effects? I'm willing to bet not, as these things are no easier to program for with an add-in card than they were before. Though the code can be a lot crappier, and much more resource-intensive, as a result of the add-on card (as stated above).

I think that we will see some PhysX emulators from the mod community before too long that will allow a dual-core CPU to run all of the calculations of a PhysX card. Much in the same way we had Glide wrappers and emulators back in the good ole days. I very seriously doubt that a commercial developer would take the heat for infringing on physics offloading with all of the patents Ageia holds. So I expect some good things from those in the open-source arena.

Discussion on a logical level is always welcome in my book. And i too appreciate it.

Services are held, every Friday, and the First Saturday of every month.