Old 08-17-2011, 08:42 AM   #17
Join Date: Mar 2008
Location: New Zealand
Posts: 67

Originally Posted by 2Tired2Tango View Post
I'm sorry if this is a foregone conclusion sort of thing... But I don't get it...

What's with NVidia? Their chips are furnaces, they suck more power than some CPUs and far as I can tell, they're nothing to write home about...

Case in point... I recently had a system come to me with power problems. In the process of trying to diagnose the screwup I pulled out the NVidia card and stuck in a $39.00 ATI card, and the system straightened right out... But it didn't end there: suddenly it was playing 1080p AVI tests perfectly, and DPC latency tests showed far lower latency than with the NVidia card... Games worked about the same... A $39.00 passively cooled ATI Radeon 4500 beat the pants off a $250 NVidia card with all its fan noise, external power, etc...

And now we have to use external coolers on them???

So why is NVidia such a big contender?
I'm sorry but I just don't understand it....
Your overall complaint currently applies to both brands. Both companies' higher-end cards use too much power and run too hot, IMO. The HD 6900s aren't exactly quiet or cool. I think it's because we're still on 40nm when we should have been at 32nm by now. When 28nm arrives, I think we'll see lower-powered, cooler-running cards that give better performance.
Relayer is offline