Techgage.com > Archives > Reviews and Articles

Reviews and Articles Discussion for Techgage content is located here. Only staff can create topics, but everyone is welcome to post.
Old 04-25-2006, 03:07 AM   #16
Buck-O
Coastermaker
 
Join Date: Oct 2005
Location: Seattle, WA
Posts: 153

Quote:
Originally Posted by Unregistered
Actually, if I read the article right, Ageia is giving developers the license and making the money off the cards.

Yes, dual processors are becoming popular, but games are getting much, much more complex.

I have been jumping around the game forums, and a lot of people are complaining that they want better AI, bigger maps, and more players per map.

And if you go to the Age 3 web forums, there is a thread called "Ask Sandy" (a developer), and this was asked:

16) Will there be much much much larger maps in the expansion pack?
A – Only if we find out suddenly that everyone's computers are much much much more powerful. Unlike my answers, the map sizes were not chosen arbitrarily.
http://forum.agecommunity.com/ibb/po...tRepeater1-p=3

I will take a separate card!
Good grief, do you work for Ageia... or an advertising company contracted out by Ageia? I'm sorry, but I see very little relevance in this post, and even less actual information that refutes any of the claims I've made. All I see is happy spin on a product that doesn't live up to the hype marketing has built around it. As I said in my initial post, 90% of the unregistered user posts in this thread sound fishy, WAY too positive, and I think they're highly suspect as plants. And I think Rob should do everything in his ability to investigate those posts, because as far as I'm concerned, they are bogus.

But I'll certainly take on every point you've attempted to make.

Great, that's fantastic that they are handing over their physics engine for free. It's about the only way any developer would be stupid enough to code for it. But even then, getting handed an SDK for free probably comes with a lot of marketing dollars behind it. Because, again, the PhysX card does not make sense. It's like having a 4WD car and being told that using 4WD is pointless, but that bolting on a special bumper attachment with a wheezy engine and two additional drive wheels will make it a better performer. Huh? Why not just use the 4WD in the first place and forget that little attachment ever existed? Because marketing wants you to think otherwise.

Either way, by virtue of handing out the SDK for free, and by making such a large public spectacle of it, if a company were to produce a PROPER SMP-based physics engine that could fully utilize a dual core and make better use of resources than a PhysX card could (which, again, wouldn't be difficult given the speed of the PCI bus), Ageia would have a lawsuit all over their asses for patent infringement and copying their intellectual property. So really, in the end, all Ageia will turn out to be is a marketing company with a few big patents for a technology that is outdated by modern hardware standards and could easily be coded around.

The best bit is that you debunked your own last point with the answer you gave for why it was valid.

Adding a PhysX card to your computer won't add any benefit that allows for larger levels, more players, or more expansive maps. His answer... "only if we find out suddenly that everyone's computers are much much much more powerful".

Simply plopping in a PhysX card DOES NOTHING TO REMEDY THIS! All it can do is take advantage of some poorly coded eye candy that your video card may or may not be able to render fully on screen at a reasonable frame rate.

Because, as he said, the limiting factor is the power of the computer. Not the video card, not the PhysX card... the WHOLE COMPUTER. If you've only got 512MB of RAM, a PhysX card won't make Battlefield 2 run any better on your system.

If you have an old Athlon XP 1800+, Quake 4 will not run any faster or look any better with a PhysX card.

In fact, the only performance segment where a PhysX card would be worth its while is in HIGH END systems. And most entry-level high end systems employ dual-core CPUs, at which point proper coding would make the PhysX card completely worthless, because the data could be calculated faster, sent to the video card quicker, and run with less overall system overhead (no crappy legacy PCI bus constraint to get in the way) by using SMP to dedicate one of the cores to nothing but physics calculation.
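That SMP split is easy to sketch. The following is a toy illustration, not real game code (the names, bodies, and numbers are all made up, and CPython threads wouldn't truly parallelize the math anyway): the point is the structure, where physics integration lives on its own thread while the main thread stays free for rendering and input.

```python
import threading
import queue

def physics_worker(bodies, dt, steps, out):
    # Integrate simple constant-velocity motion on a dedicated
    # "physics" thread; the main thread never blocks on this work.
    for _ in range(steps):
        bodies = [(x + vx * dt, vx) for (x, vx) in bodies]
    out.put(bodies)

results = queue.Queue()
worker = threading.Thread(
    target=physics_worker,
    args=([(0.0, 2.0), (10.0, -1.0)], 0.5, 4, results),
)
worker.start()
# ... the main thread would render the previous frame here ...
worker.join()
final = results.get()
print(final)  # [(4.0, 2.0), (8.0, -1.0)]
```

In a real engine the worker would run every frame and hand finished state back through a shared buffer, but the division of labor is the same: one core does nothing but physics.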

But even then, the game has to be coded around the lowest common denominator, with eye candy thrown on top for those with greater horsepower behind them. This has been, and always will be, the modus operandi of developers around the world, and the PhysX card will never change that, regardless of how much their marketing might try to make you believe otherwise.

Not that it makes any difference, though, considering all of this is just falling on deaf ears anyway.

Last edited by Buck-O; 04-25-2006 at 03:20 AM.
Old 04-25-2006, 03:15 AM   #17
Greg King
I just kinda show up...
 
Join Date: Jul 2005
Location: Indiana
Posts: 2,086

Yes!

As will I. There will have to be a great game, or two or three, at launch for it to take off, but I feel that Ageia's separate slot approach is heading down the right path.

And yes, you have read correctly. The SDK is being given away for all to design games with, and the money will be made on the cards themselves. They are somewhat at the mercy of how well the games are designed to take advantage of the PPU, but they have some heavy hitters on board with them, so I am not worried too much about that.

Give me my dedicated card and keep the GPU clock cycles to the games!

You can argue that dual core will be the untimely coup de grâce of the PPU, but I feel that the more game programmers learn to code for dual cores, the more they will take advantage of the fact that they can either code the game to use one core for gaming and one for physics, or both for the game and none for physics. You can argue that you only need one core for in-game physics, but I can argue that with a dedicated PPU, the entire processing power of the CPU can be used to the full advantage of the gaming end user. We could both be wrong; only time will tell. Stay tuned, as I have a PPU on the way for review and I will let you all know what I find. If it sucks in real time, I will let you know, but if it rules, you will know that as well.
__________________
"It is far better to grasp the universe as it really is than to persist in delusion, however satisfying and reassuring."
- Carl Sagan

Primary:
Intel i7-5960X | ASUS X99-Deluxe | 16GB Crucial DDR4 | Intel 730 240GB SSD | Crucial M4 256GB SSD
WD 1TB Black x1 | 2 x EVGA 770 GTX Superclocked SLI | Corsair H110 Water Cooler
Corsair 750D | Windows 8.1 x64 | Dell 2410 x 3 @ 5760x1200

ESXi Host:
Intel i5 3570 | ASRock Pro4-M | 24GB Patriot DDR3 | WD 250GB | QNAP NAS iSCSI Shared Storage
Old 04-25-2006, 03:22 AM   #18
Greg King
I just kinda show up...
 
Join Date: Jul 2005
Location: Indiana
Posts: 2,086

Quote:
Originally Posted by Buck-O
Good grief, do you work for Ageia... or an advertising company contracted out by Ageia? I'm sorry, but I see very little relevance in this post, and even less actual information that refutes any of the claims I've made. All I see is happy spin on a product that doesn't live up to the hype marketing has built around it. As I said in my initial post, 90% of the unregistered posts in this thread sound fishy, WAY too positive, and I think they're highly suspect as plants. And I think Rob should do everything in his ability to investigate those posts, because as far as I'm concerned, they are bogus.

But I'll certainly take on every point you've attempted to make.

Great, that's fantastic that they are handing over their physics engine for free. It's about the only way any developer would be stupid enough to code for it. But even then, getting handed an SDK for free probably comes with a lot of marketing dollars behind it. Because, again, the PhysX card does not make sense.

Either way, by virtue of handing out the SDK for free, and by making such a large public spectacle of it, if a company were to produce a PROPER SMP-based physics engine that could fully utilize a dual core and make better use of resources than a PhysX card could (which, again, wouldn't be difficult given the speed of the PCI bus), Ageia would have a lawsuit all over their asses for patent infringement and copying their intellectual property. So really, in the end, all Ageia will turn out to be is a marketing company with a few big patents for a technology that is outdated by modern hardware standards and could easily be coded around.

The best bit is that you debunked your own last point with the answer you gave for why it was valid.

Adding a PhysX card to your computer won't add any benefit that allows for larger levels, more players, or more expansive maps. His answer... "only if we find out suddenly that everyone's computers are much much much more powerful".

Simply plopping in a PhysX card DOES NOTHING TO REMEDY THIS! All it can do is take advantage of some poorly coded eye candy that your video card may or may not be able to render fully on screen at a reasonable frame rate.

Because, as he said, the limiting factor is the power of the computer. Not the video card, not the PhysX card... the WHOLE COMPUTER. If you've only got 512MB of RAM, a PhysX card won't make Battlefield 2 run any better on your system.

If you have an old Athlon XP 1800+, Quake 4 will not run any faster or look any better with a PhysX card.

In fact, the only performance segment where a PhysX card would be worth its while is in HIGH END systems. And most entry-level high end systems employ dual-core CPUs, at which point proper coding would make the PhysX card completely worthless, because the data could be calculated faster, sent to the video card quicker, and run with less overall system overhead (no crappy legacy PCI bus constraint) by using SMP to dedicate one core to nothing but physics calculation.

But even then, the game has to be coded around the lowest common denominator, with eye candy thrown on top for those with greater horsepower behind them. This has been, and always will be, the modus operandi of developers around the world, and the PhysX card will never change that, regardless of how much their marketing might try to make you believe otherwise.

Not that it makes any difference, though, considering all of this is just falling on deaf ears anyway.
I agree with what you say, but I am also optimistic to see what Ageia can accomplish with the right backing. Also, to say that your words "fall on deaf ears" is a bit arrogant, don't you think? You make some valid points, I will admit that, but writing off an entirely new concept before it is even launched to the public at large strikes me as premature.

I can also admit that the PPU is geared toward users with high-end systems, but then again, so are SLI and CrossFire, and both of those concepts seem to be doing quite well.

I am more excited about the direction that Ageia is trying to take the industry than about where they might or might not actually take it.

No, I do not work for them.
Old 04-25-2006, 03:35 AM   #19
Buck-O
Coastermaker
 
Join Date: Oct 2005
Location: Seattle, WA
Posts: 153

Quote:
Originally Posted by DarkSynergy
Yes!

As will I. There will have to be a great game, or two or three, at launch for it to take off, but I feel that Ageia's separate slot approach is heading down the right path.

And yes, you have read correctly. The SDK is being given away for all to design games with, and the money will be made on the cards themselves. They are somewhat at the mercy of how well the games are designed to take advantage of the PPU, but they have some heavy hitters on board with them, so I am not worried too much about that.

Give me my dedicated card and keep the GPU clock cycles to the games!

You can argue that dual core will be the untimely coup de grâce of the PPU, but I feel that the more game programmers learn to code for dual cores, the more they will take advantage of the fact that they can either code the game to use one core for gaming and one for physics, or both for the game and none for physics. You can argue that you only need one core for in-game physics, but I can argue that with a dedicated PPU, the entire processing power of the CPU can be used to the full advantage of the gaming end user. We could both be wrong; only time will tell. Stay tuned, as I have a PPU on the way for review and I will let you all know what I find. If it sucks in real time, I will let you know, but if it rules, you will know that as well.
Well, at least this is a post from a respected user...

But again, I fail to see the merit.

"I feel that Ageia's separate slot approach is heading down the right path."

How do you figure? Like I said, it's like adding something extra that doesn't need to be there. And it's a technology that is two years late and $200 short of being impressive.

"Give me my dedicated card and keep the GPU clock cycles to the games!"

I'm assuming this is a Freudian slip and you actually mean CPU. The GPU does nothing in terms of physics processing for the game; that is all handled by and loaded onto the CPU. However, the dirty little secret here is that any and all data processed by the PhysX card still has to be rendered by the GPU. If your GPU can't hack it, you still won't see any marked improvement in gameplay. All you will be is a frustrated consumer who just wasted $200 on something he can't get the full benefit of.
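To put rough numbers on that point, here is a hypothetical back-of-the-envelope model (the rates and per-object cost are invented purely for illustration): a frame is gated by its slowest pipeline stage, so a fast PPU buys nothing if the GPU can't draw the extra debris it produces.

```python
def effective_fps(physics_fps, gpu_fps):
    # A pipelined frame is only as fast as its slowest stage.
    return min(physics_fps, gpu_fps)

def render_fps(base_fps, extra_objects, cost_per_object=0.0001):
    # Crude model: each extra physics-driven object adds render time.
    return 1.0 / (1.0 / base_fps + extra_objects * cost_per_object)

# A PPU simulating at 120 Hz can't lift a GPU stuck at 30 Hz:
print(effective_fps(120, 30))  # 30
# 500 extra debris objects drag a 60 fps GPU down to ~15 fps,
# no matter how fast the PPU churns through the physics:
print(round(effective_fps(1000, render_fps(60, 500))))  # 15
```

The model is deliberately crude, but it captures the argument: speeding up the physics stage only helps when physics, not rendering, is the bottleneck.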

"You can argue that dual core will be the untimely coup de grâce of the PPU but I feel that the more game programmers learn to code with dual cores, the more they will take advantage of the fact that they can either code the game to use one core for gaming and one for physics or both for the game and none for physics."

Again, they have to program to the lowest common denominator, making the effectiveness of coding for dual core AND PhysX very dubious at best. And as it stands right now, dual-core implementation in games has been very poor, because dedicated offloading to an individual CPU has been limited by how well the game's kernel can support multiple threads. Considering that PhysX is essentially a separate entity, offloading it to a secondary CPU would be very simple to implement, even at a user level, within Windows. In fact, I wouldn't at all be surprised if, in the coming months as Ageia-supported games are released, we see hacks that allow the PhysX computations to be offloaded directly to a second core, in their own thread.

"Stay tuned as I have a PPU on the way for review and I will let you all know what is found out. If it sucks in real time, then I will let you know but if it rules, you will know that as well"

Believe me, I'm ripe with anticipation. However, I want to see results in real games, with real advantages, and no special demos to oooh and ahhh at. Because right now, the marketing is about the only thing I see.
Old 04-25-2006, 03:51 AM   #20
Greg King
I just kinda show up...
 
Join Date: Jul 2005
Location: Indiana
Posts: 2,086

There was no slip; the NVIDIA and ATI approach is to let the GPU take care of the physics load. I talked to a few developers at GDC about such an approach, and they want all the GPU they can get.

You and I both agree on the demands placed on the GPU. Every little item on the screen that the PPU allows the user to interact with must still be rendered by the GPU, making the load on said GPU that much greater. I was told by Ageia that there really are not any minimum requirements for using the PPU, but rather minimum requirements for the game itself. The bottleneck will be the GPU in such instances.

I am truly waiting with bated breath for this card, but to discredit a concept before it has even launched is naive at best. These aren't just game "demos" that we have seen; I was actually able to play CellFactor in real time and interact with all the objects on the screen. The future is wide open for Ageia... it's just up to them to choose the right path.

They are marketing this as they should, so again, you and I agree on this. One can argue against anything and make at least some valid points. I am not saying, by any stretch of the imagination, that I am right, but I will not let you pick apart what I say and then concede that you have been right all along. I see you and me as the same voice, only on different ends of the spectrum: I am for seeing the good, and you, you are only for the bad. Debates like this, though, will benefit the end users. End users who do not have the systems that you and I have. End users who want this card but do not know what it is all about. I will give my honest opinion of the PPU in my upcoming review. An opinion that will be how I feel about the card, and not how Ageia wants me to feel.

I love a good debate, so keep it coming. I still think that you and I are only feeding off of each other.




Quote:
Originally Posted by Buck-O
Well, at least this is a post from a respected user...

But again, I fail to see the merit.

"I feel that Ageia's separate slot approach is heading down the right path."

How do you figure? Like I said, it's like adding something extra that doesn't need to be there. And it's a technology that is two years late and $200 short of being impressive.

"Give me my dedicated card and keep the GPU clock cycles to the games!"

I'm assuming this is a Freudian slip and you actually mean CPU. The GPU does nothing in terms of physics processing for the game; that is all handled by and loaded onto the CPU. However, the dirty little secret here is that any and all data processed by the PhysX card still has to be rendered by the GPU. If your GPU can't hack it, you still won't see any marked improvement in gameplay. All you will be is a frustrated consumer who just wasted $200 on something he can't get the full benefit of.

"You can argue that dual core will be the untimely coup de grâce of the PPU but I feel that the more game programmers learn to code with dual cores, the more they will take advantage of the fact that they can either code the game to use one core for gaming and one for physics or both for the game and none for physics."

Again, they have to program to the lowest common denominator, making the effectiveness of coding for dual core AND PhysX very dubious at best. And as it stands right now, dual-core implementation in games has been very poor, because dedicated offloading to an individual CPU has been limited by how well the game's kernel can support multiple threads. Considering that PhysX is essentially a separate entity, offloading it to a secondary CPU would be very simple to implement, even at a user level, within Windows. In fact, I wouldn't at all be surprised if, in the coming months as Ageia-supported games are released, we see hacks that allow the PhysX computations to be offloaded directly to a second core, in their own thread.

"Stay tuned as I have a PPU on the way for review and I will let you all know what is found out. If it sucks in real time, then I will let you know but if it rules, you will know that as well"

Believe me, I'm ripe with anticipation. However, I want to see results in real games, with real advantages, and no special demos to oooh and ahhh at. Because right now, the marketing is about the only thing I see.
Old 04-25-2006, 03:53 AM   #21
Buck-O
Coastermaker
 
Join Date: Oct 2005
Location: Seattle, WA
Posts: 153

Quote:
Originally Posted by DarkSynergy
I agree with what you say, but I am also optimistic to see what Ageia can accomplish with the right backing. Also, to say that your words "fall on deaf ears" is a bit arrogant, don't you think? You make some valid points, I will admit that, but writing off an entirely new concept before it is even launched to the public at large strikes me as premature.

I can also admit that the PPU is geared toward users with high-end systems, but then again, so are SLI and CrossFire, and both of those concepts seem to be doing quite well.

I am more excited about the direction that Ageia is trying to take the industry than about where they might or might not actually take it.

No, I do not work for them.
Unfortunately, the only backing I see for them now is hype and cheesy oooh-ahhh demos, with no real substance. Nothing has really jumped out and impressed the socks off me.

And yes, I am one arrogant SOB. Considering that after my initial post it took nearly a week for someone to respond, even somewhat, and it was yet another unregistered user, I didn't assume I'd get any feedback on the post. Until you showed up. So until now... yes, I did assume it would fall on deaf ears.

As for writing it off... need I remind everyone of the hype around RDRAM? Yeah, Rambus for the win... NOT! Or better yet, 3dfx's "22-bit pseudo color is good enough". Or "Daikatana will be the best game ever!" There are plenty of examples of overhyped technology aimed at gamers that flopped miserably with millions of marketing dollars behind it. I'm not saying it's guaranteed that PhysX will turn out this way. I'm just saying that I refuse to hold my breath for something amazing.

But the thing you forget about SLI and CrossFire is that both of these technologies are backwards compatible with almost every legacy game, are nearly future-proof (in a gaming sense; don't even get me started on HD content flags, ugh), and offer real-world performance benefits that the user can see immediately and take full advantage of out of the box. PhysX has yet to prove its worth in that respect. It is a future-spec'd piece of hardware, and that doesn't give you much but a pot to piss in, and no physics-accelerated window to throw it out of.

See, I am NOT excited about where Ageia wants to take the industry with PhysX... because I believe it's a step BACKWARDS. We have just gotten away from Glide-enhanced games and EAX-exclusive games, and moved toward standardized hardware specs that take full advantage of 90% of the hardware still out there. And now this... a dedicated physics board. I mean, come on... you are stepping back into the days of 1997 and the release of the 3dfx Voodoo 3D graphics accelerator, not into the future of consolidated, widely compatible hardware. I fail to see on what level this is a "good direction" for the hardware industry to be headed. We've been down this road before, and the sign said "dead end".

Now, if Ageia had worked out a deal with ATI and NVIDIA to put the PPU on board with the GPU, then yeah, I'd be all for it. Or better yet, as a sort of integrated "east bridge" on the motherboard. Even though in either case the technology would probably eventually go exclusive to one of the major hardware vendors, whoever that may be, it would still be a step forward. Not a step back to the dark ages of add-on boards, with every PCI slot full of a piece of acceleration hardware. This is the new age of PCI Express; let's use it.

And don't worry, I know you don't work for them.
Old 04-25-2006, 04:09 AM   #22
Greg King
I just kinda show up...
 
Join Date: Jul 2005
Location: Indiana
Posts: 2,086

Quote:
Originally Posted by Buck-O
Unfortunately, the only backing I see for them now is hype and cheesy oooh-ahhh demos, with no real substance. Nothing has really jumped out and impressed the socks off me.

And yes, I am one arrogant SOB. Considering that after my initial post it took nearly a week for someone to respond, even somewhat, and it was yet another unregistered user, I didn't assume I'd get any feedback on the post. Until you showed up. So until now... yes, I did assume it would fall on deaf ears.

As for writing it off... need I remind everyone of the hype around RDRAM? Yeah, Rambus for the win... NOT! Or better yet, 3dfx's "22-bit pseudo color is good enough". Or "Daikatana will be the best game ever!" There are plenty of examples of overhyped technology aimed at gamers that flopped miserably with millions of marketing dollars behind it. I'm not saying it's guaranteed that PhysX will turn out this way. I'm just saying that I refuse to hold my breath for something amazing.

But the thing you forget about SLI and CrossFire is that both of these technologies are backwards compatible with almost every legacy game, are nearly future-proof (in a gaming sense; don't even get me started on HD content flags, ugh), and offer real-world performance benefits that the user can see immediately and take full advantage of out of the box. PhysX has yet to prove its worth in that respect. It is a future-spec'd piece of hardware, and that doesn't give you much but a pot to piss in, and no physics-accelerated window to throw it out of.

See, I am NOT excited about where Ageia wants to take the industry with PhysX... because I believe it's a step BACKWARDS. We have just gotten away from Glide-enhanced games and EAX-exclusive games, and moved toward standardized hardware specs that take full advantage of 90% of the hardware still out there. And now this... a dedicated physics board. I mean, come on... you are stepping back into the days of 1997 and the release of the 3dfx Voodoo 3D graphics accelerator, not into the future of consolidated, widely compatible hardware. I fail to see on what level this is a "good direction" for the hardware industry to be headed. We've been down this road before, and the sign said "dead end".

Now, if Ageia had worked out a deal with ATI and NVIDIA to put the PPU on board with the GPU, then yeah, I'd be all for it. Or better yet, as a sort of integrated "east bridge" on the motherboard. Even though in either case the technology would probably eventually go exclusive to one of the major hardware vendors, whoever that may be, it would still be a step forward. Not a step back to the dark ages of add-on boards, with every PCI slot full of a piece of acceleration hardware. This is the new age of PCI Express; let's use it.

And don't worry, I know you don't work for them.
Ahh, Quotefest 2K6 continues.

See, SLI and CrossFire are backwards compatible, but obviously no one coded older games to take advantage of the technology. Every game from here on out, however, will code for this tech and use it to its advantage. Me playing Red Alert 2 does little to nothing for my SLI system. We have to give the PhysX approach time to sink in before we say that it's a pointless direction and that past games won't benefit from it at all.

To rebut your comment that there are plenty of gamer-directed gimmicks: there are also plenty of innovations designed with gamers in mind that have taken off.

All of this is our personal opinion, and in my own opinion, I don't see a long life for Ageia anyway. I think, and this is just my opinion, that they will do well initially and then get gobbled up by either ATI or NVIDIA, which will be decided by whichever company does worse in its approach to GPU physics.

I am all in favor of the direction that Ageia is trying to take the industry. Is it a step back... by all means, no. Will it work? That's to be decided by you and me, the end users. I am all for deeper games and more lifelike physics. I want to completely destroy a building and have it fall on my enemies to kill them. Can this all be coded to take advantage of a second CPU core... yes, it can. Will it? That is something only time will tell. It's all about support, and from the looks of it, Ageia has a lot of support... at least initially.

I am enjoying this "debate" so please, let's keep it up. I will, however, have to bow out, as I have to be at work in a few hours. Thank you for keeping it civil, as some would resort to mindless put-downs and similar comments. I am now a member of the Church of Buck-O!

Last edited by Greg King; 04-25-2006 at 04:11 AM.
Old 04-26-2006, 04:13 AM   #23
Buck-O
Coastermaker
 
Join Date: Oct 2005
Location: Seattle, WA
Posts: 153

Quote:
Originally Posted by DarkSynergy
Ahh, Quotefest 2K6 continues.

See, SLI and CrossFire are backwards compatible, but obviously no one coded older games to take advantage of the technology. Every game from here on out, however, will code for this tech and use it to its advantage. Me playing Red Alert 2 does little to nothing for my SLI system. We have to give the PhysX approach time to sink in before we say that it's a pointless direction and that past games won't benefit from it at all.

To rebut your comment that there are plenty of gamer-directed gimmicks: there are also plenty of innovations designed with gamers in mind that have taken off.

All of this is our personal opinion, and in my own opinion, I don't see a long life for Ageia anyway. I think, and this is just my opinion, that they will do well initially and then get gobbled up by either ATI or NVIDIA, which will be decided by whichever company does worse in its approach to GPU physics.

I am all in favor of the direction that Ageia is trying to take the industry. Is it a step back... by all means, no. Will it work? That's to be decided by you and me, the end users. I am all for deeper games and more lifelike physics. I want to completely destroy a building and have it fall on my enemies to kill them. Can this all be coded to take advantage of a second CPU core... yes, it can. Will it? That is something only time will tell. It's all about support, and from the looks of it, Ageia has a lot of support... at least initially.

I am enjoying this "debate" so please, let's keep it up. I will, however, have to bow out, as I have to be at work in a few hours. Thank you for keeping it civil, as some would resort to mindless put-downs and similar comments. I am now a member of the Church of Buck-O!
Sorry for my late reply; it's been a busy 24 hours for me. Detail season is back in full swing, and as such, clients are calling. So I'm a busy man.

Anyway, the biggest reason SLI and CrossFire work is that the game doesn't require any special coding: the graphics load is split between the two video cards by the driver. It's a software trick to get the maximum possible performance out of the hardware. The game doesn't specifically need to be coded for it to work well, though perhaps not to its fullest extent. And reasonably speaking, there really aren't any games out there that REQUIRE a dual-GPU setup to be experienced with their graphical options maxed out (I am of course excluding anti-aliasing and certain filtering options, which I consider driver tricks rather than actual in-game graphics features, since they are driver-dependent, not game-dependent). You can happily game with all of a particular engine's eye candy turned on with a single video card. Even the "lowly" 6800 GT is capable of providing this level of gameplay in most titles.

The problem with Ageia, in this sense, is that they are saying, "certain games won't be as good as they could be unless you have our PhysX card to open up all of the game's available physics." I don't think that will make a particular game sell any better, or really make people want to rush out and buy a PhysX card.

The other sad part in all of this is that I see a lot of the larger "money maker" developers (EA and Sierra, to name two), who are interested in profit over quality of content, pushing out games with bloated physics engines and not bothering to properly code for them, adding unnecessary load on the CPU that can only be offloaded by a PhysX card. And naturally, those developers will not bother to properly code the game to take advantage of a dual-core CPU to offload that physics work from the main CPU, essentially putting you between a rock and a hard place. I think that is ultimately the position many people will be in, and I think that's a real shame, much for the same reason I called it a step backwards above: we are, once again, moving toward required add-on "bloat control" hardware, where proper coding would correct the issue head-on.
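The "offload physics to a second core" idea that keeps coming up in this thread can be made concrete with a small sketch. This is purely illustrative (the toy Euler step and all names are my own, not anything from Ageia's SDK): a worker process advances the simulation on another core while the main thread stays free for rendering and AI.

```python
# Illustrative sketch: run a toy physics step on a second core while the
# main thread stays free for rendering/AI. The "simulation" here is just
# Euler integration of particle positions; names are hypothetical.
from concurrent.futures import ProcessPoolExecutor

def physics_step(positions, velocities, dt):
    """Advance each particle one step: p' = p + v * dt."""
    return [p + v * dt for p, v in zip(positions, velocities)]

if __name__ == "__main__":
    positions = [0.0, 1.0, 2.0]
    velocities = [1.0, 1.0, -1.0]
    with ProcessPoolExecutor(max_workers=1) as pool:
        # Hand the physics step to another core...
        future = pool.submit(physics_step, positions, velocities, 0.1)
        # ...while the main thread would render, run AI, etc., here.
        positions = future.result()
    print(positions)  # [0.1, 1.1, 1.9]
```

Whether developers actually write this kind of code, rather than leaning on a PPU, is exactly the question the post above raises.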

With that in mind, I wholeheartedly agree with you that I don't feel Ageia will be around very long. And I think they fully expect that, and are in fact probably setting themselves up for an inevitable buyout in the future.

I think one of the bigger questions you need to ask yourself when thinking about the PhysX card is: will developers want to code for a game that allows you to topple buildings bit by bit, with destructible materials that have proper load bearing, weighting, and physical effects? I'm willing to bet not, as these things are no easier to program with an add-in card than they were before. Though the code can be a lot sloppier, and much more resource-intensive, as a result of the add-on card (as stated above).

I think we will see some PhysX emulators from the mod community before too long that will allow a dual-core CPU to run all of the calculations of a PhysX card, much in the same way we had Glide wrappers and emulators back in the good old days. I very seriously doubt that a commercial developer would take the heat for infringing on physics offloading, given all the patents Ageia holds, so I expect some good things from those in the open-source arena.

Discussion on a logical level is always welcome in my book, and I too appreciate it.

Services are held every Friday, and the first Saturday of every month.
Buck-O is offline   Reply With Quote
Old 04-27-2006, 12:49 PM   #24
M3G@
Guest Poster
 
Posts: n/a
Lightbulb Phys X is Xciting

This is a huge deal for gamers and developers. I don't know if Ageia has the money to keep this rolling, but if they can work it into games properly over the course of the next two to three years, we could enjoy great advances that probably should have taken place already.

There was a point made earlier that you can just place physics onto the second core of a dual-core CPU.

It would actually handle physics rather slowly.

Developers and the big names in gaming have already decided to take physics off the CPU.

What about saving that core for better AI? It would be much more efficient to give that core AI responsibility.

Plus, with new games and the DirectX 10 API, things are going to be much different in execution.

I would rather have my graphics card render. So if NVIDIA and ATI expect me to buy a second video card for physics when I can get a ground-up, purpose-built physics card at a cheaper price, they've lost their marbles.

However, it is likely that NVIDIA and ATI will be successful in their physics efforts. I would like to see what the performance differences are.
  Reply With Quote
Old 04-27-2006, 01:12 PM   #25
M3G@
Guest Poster
 
Posts: n/a
Default hmmm ...

" As these things are no easier to program for with an add in card then they where before "

Actually, one great advantage for game developers using the Ageia SDK is that they do not have to deal with what is currently a very complex, drawn-out process. The SDK reduces the time it takes to do complex physics.
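For a sense of what "complex, drawn-out" means here: even the most trivial hand-rolled physics involves writing your own integration and collision handling for every object. A toy sketch (entirely my own illustration, not code from Ageia's SDK; a real SDK wraps broad-phase collision, constraint solvers, and far more than this):

```python
# Toy hand-rolled physics: a ball bouncing on a floor at height 0. This
# shows the kind of per-step bookkeeping a physics SDK hides; it is an
# illustration only, not Ageia code.
GRAVITY = -9.8       # m/s^2
RESTITUTION = 0.5    # fraction of speed kept after each bounce

def step(height, velocity, dt):
    """One Euler step with crude hand-written collision detection/response."""
    velocity += GRAVITY * dt
    height += velocity * dt
    if height < 0.0:                        # hit the floor...
        height = 0.0                        # ...clamp position
        velocity = -velocity * RESTITUTION  # ...and reflect, losing energy
    return height, velocity

h, v = 10.0, 0.0
for _ in range(200):                        # simulate 2 seconds at 100 steps/s
    h, v = step(h, v, 0.01)
print(h)                                    # somewhere between the floor and 10 m
```

Multiply this by thousands of interacting bodies, joints, and fluids, and the appeal of a ready-made SDK becomes obvious.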
  Reply With Quote
Old 04-27-2006, 05:21 PM   #26
Unregistered
Guest Poster
 
Posts: n/a
Default

http://www.anandtech.com/cpuchipsets...spx?i=2376&p=2

This AnandTech article describes the PhysX card very well, along with the reasons the card can handle physics processing more effectively than a dual-core processor can. One thing I haven't seen mentioned is the fact that many multiplayer games will probably REQUIRE PhysX cards. I read an article earlier that said 70 game developers, and over 100 games in development, will make use of PhysX cards. While particle effects have little impact on gameplay, I'd be disappointed to see liquid weapons or destructible environments cut from the coming multiplayer experience. ATI, for its part, claims its current X19xx-series cards can act as a PPU once a driver update enables it.

In reply to Buck-O's last post: many measures are being taken to allow developers to rapidly code destructible environments and other physics-driven game elements without inducing a significant performance hit compared to "hand-coded" effects of the same nature.

Many exciting technologies are coming out this year and next; it will be interesting to see how everything pans out.
  Reply With Quote
Old 04-27-2006, 11:19 PM   #27
username
Guest Poster
 
Posts: n/a
Default me again

Buck-O, I made that post. Sorry I didn't register, but I didn't figure I would get a response, and most people just start slandering rather than having a useful debate!

First off, I'm sorry: although it wasn't mentioned, I'm a horrible writer and bad at getting my point across, but I will try.

OK, you also agreed that for bigger levels, more players, better AI, etc., you would need a much more powerful computer, not just a PPU card. This was the point I was trying to make: they could use both cores of a dual-core CPU for the AI, game engine, and so on, without having to spend one on physics.

I am sure it would evolve eventually, maybe onto the motherboard as you stated (perhaps like NICs seem to be doing).

My problem with putting it on the video card is: how much more can you transfer between a video card and the motherboard?

If they can do so much more on a video card, why is SLI or CrossFire even considered? Why not just build a double card, i.e., put two GPUs on one card?

I don't believe there is any game out there currently that needs two video cards, but it seems like we're getting closer. Now, if they put the PPU on there too, would we then need three video cards? Isn't there a limit on how much you can transfer between a card and the motherboard?

Thank you guys for keeping it civil!
  Reply With Quote
Old 04-28-2006, 04:14 AM   #28
Buck-O
Coastermaker
 
Buck-O's Avatar
 
Join Date: Oct 2005
Location: Seattle, WA
Posts: 153
Default

Quote:
Originally Posted by username
Buck-O, I made that post. Sorry I didn't register, but I didn't figure I would get a response, and most people just start slandering rather than having a useful debate!

First off, I'm sorry: although it wasn't mentioned, I'm a horrible writer and bad at getting my point across, but I will try.

OK, you also agreed that for bigger levels, more players, better AI, etc., you would need a much more powerful computer, not just a PPU card. This was the point I was trying to make: they could use both cores of a dual-core CPU for the AI, game engine, and so on, without having to spend one on physics.

I am sure it would evolve eventually, maybe onto the motherboard as you stated (perhaps like NICs seem to be doing).

My problem with putting it on the video card is: how much more can you transfer between a video card and the motherboard?

If they can do so much more on a video card, why is SLI or CrossFire even considered? Why not just build a double card, i.e., put two GPUs on one card?

I don't believe there is any game out there currently that needs two video cards, but it seems like we're getting closer. Now, if they put the PPU on there too, would we then need three video cards? Isn't there a limit on how much you can transfer between a card and the motherboard?

Thank you guys for keeping it civil!

Hey, thanks for replying back.

No worries, I understand what you're saying just fine. It's my dyslexia getting the best of me that you've got to worry about.

Anyway, yes, I do imagine it will eventually become an enthusiast motherboard option rather than an add-in card; I think that's the next logical step. Though I don't know how people will feel about paying $300+ for a motherboard with that feature on there. Who knows? Only time will tell. But I think if anyone would do it, it would be NVIDIA, as they already have a good, marketable motherboard base.

As for two GPUs on one card, it's been done. WAY back in the day, 3dfx had a card that actually had four GPUs on it. It was a monster, absolutely huge. I don't think it ever went into production, but I think a couple of companies actually built dual-GPU setups with the last series of Voodoo cards. I also know that (I think) MSI has a dual-6800GT single card that works in their special SLI motherboard. And I've seen prototypes of cards running two laptop-spec mobile GeForce Go modules on a special daughterboard that then fits into a PCI Express slot, the notion being that with the Go modules you could upgrade easily and cheaply... or so they say.

Anyway, back to Ageia. Only time will tell with the PPU setup, and I'm sure that by the end of the year we will have a better idea of what exactly they have in mind, and how developers and hardware vendors will accept it.

If you're enjoying the conversation, why don't you join the forums and continue the discussion here, and in the other forums at Techgage?
Buck-O is offline   Reply With Quote
Old 04-28-2006, 09:08 PM   #29
Unregistered
Guest Poster
 
Posts: n/a
Default i tried

Quote:
Originally Posted by Buck-O
Hey, thanks for replying back.

No worries, I understand what you're saying just fine. It's my dyslexia getting the best of me that you've got to worry about.

Anyway, yes, I do imagine it will eventually become an enthusiast motherboard option rather than an add-in card; I think that's the next logical step. Though I don't know how people will feel about paying $300+ for a motherboard with that feature on there. Who knows? Only time will tell. But I think if anyone would do it, it would be NVIDIA, as they already have a good, marketable motherboard base.

As for two GPUs on one card, it's been done. WAY back in the day, 3dfx had a card that actually had four GPUs on it. It was a monster, absolutely huge. I don't think it ever went into production, but I think a couple of companies actually built dual-GPU setups with the last series of Voodoo cards. I also know that (I think) MSI has a dual-6800GT single card that works in their special SLI motherboard. And I've seen prototypes of cards running two laptop-spec mobile GeForce Go modules on a special daughterboard that then fits into a PCI Express slot, the notion being that with the Go modules you could upgrade easily and cheaply... or so they say.

Anyway, back to Ageia. Only time will tell with the PPU setup, and I'm sure that by the end of the year we will have a better idea of what exactly they have in mind, and how developers and hardware vendors will accept it.

If you're enjoying the conversation, why don't you join the forums and continue the discussion here, and in the other forums at Techgage?
I tried to register; it said I would receive an email on how to activate my name, but I never got it.

Buck-O, you stated you were going to get one for review. Just curious as to what game you intend to try it on?

And it does not specify this anywhere, but I was wondering: it says to look for games optimized for the PhysX card... if a game is not built for the card, the card would be useless, wouldn't it?

And could you shed some light on something for me, as I am a mechanic, not a computer guru:

What is the bandwidth limit of PCI Express? Like, how much can you transfer between the video card and the motherboard at a time?
  Reply With Quote
Old 04-28-2006, 09:49 PM   #30
Jakal
Tech Monkey
 
Jakal's Avatar
 
Join Date: Nov 2005
Location: Missiskippy
Posts: 634
Default

Quote:
Originally Posted by Unregistered
What is the bandwidth limit of PCI Express? Like, how much can you transfer between the video card and the motherboard at a time?
That's PCI Express (PCIe), by the way; PCI-X is a different, older server bus. A PCIe x16 slot, which newer graphics cards use, runs 16 lanes at 2.5Gbit/s each, or 40Gbit/s of raw signalling in each direction. PCIe's 8b/10b encoding spends 2 of every 10 bits on the wire, so the usable rate is 32Gbit/s, which works out to 4GB/s in each direction, or 8GB/s with both directions saturated at once. Note that's gigabits vs. gigabytes. For comparison, a plain PCI slot, which the first PhysX cards use, tops out around 133MB/s shared among all PCI devices, a tiny fraction of what the video card gets.
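That arithmetic can be sketched in a few lines. This assumes first-generation PCIe numbers (2.5Gbit/s per lane with 8b/10b encoding), which is what shipped in 2006:

```python
# Back-of-the-envelope PCIe 1.0 bandwidth, per direction.
LANE_RATE_GBPS = 2.5        # raw signalling rate per lane (PCIe 1.0)
ENCODING_EFFICIENCY = 0.8   # 8b/10b: 8 data bits per 10 bits on the wire

def usable_gbytes_per_sec(lanes):
    """Usable one-way bandwidth in GB/s for a PCIe 1.0 link."""
    raw_gbps = lanes * LANE_RATE_GBPS
    return raw_gbps * ENCODING_EFFICIENCY / 8  # bits -> bytes

print(usable_gbytes_per_sec(16))  # x16 graphics slot: 4.0 GB/s each way
print(usable_gbytes_per_sec(1))   # x1 slot: 0.25 GB/s
```

The same formula scales to any lane count, which is why an x1 physics card has a much smaller pipe to the rest of the system than an x16 graphics card does.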
__________________
Intel C2Quad Q9400 @3.6Ghz | Asus PM5Q Deluxe | OCZ Reaper HPC PC2-8500 8GB | XFX Black Edition 260/216| knobs are great for twisting, turning, squeezing and pulling... especially your own..... that's how doors open | Chaos Havok: grrrr im lagging me | <@Deathspawner> I wish I was in Windows :-/
Jakal is offline   Reply With Quote