Old 08-10-2011, 06:18 PM   #11
Tech Monkey
Join Date: Mar 2009
Location: Ontario Canada
Posts: 648

Originally Posted by marfig View Post
I see your point.

I think the single most important factor is screen resolution.
The test monitor on my bench is 1080p... Not sure what he's got at home.

So what then? Well, most of the problem is a certain "moar is better" culture that surrounds the gaming community, coupled with lots of misinformation and an almost pathological vulnerability to the placebo effect; "I swear I play better at 120 fps than at 80 fps!" (a physiological impossibility).
Especially on a 60 Hz screen...

Sorry guys, I don't care what frame rate you're rendering at... you're still watching 60 Hz.
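The point above can be sketched with a little arithmetic: a fixed-refresh display can only show one frame per refresh, so distinct frames shown per second is capped at the refresh rate no matter how fast the GPU renders. This is a simplified model (it ignores tearing, where slices of two frames share one refresh), and the function name is mine, not from the post:

```python
# Simplified model: a display at refresh_hz shows, at each refresh,
# whichever frame finished rendering most recently. Tearing is ignored.

def frames_displayed(render_fps, refresh_hz=60, seconds=1.0):
    """Count distinct rendered frames the display actually shows."""
    frame_time = 1.0 / render_fps
    shown = set()
    refreshes = int(refresh_hz * seconds)
    for i in range(refreshes):
        t = i / refresh_hz
        # index of the latest frame fully rendered by refresh time t
        latest = int(t / frame_time)
        shown.add(latest)
    return len(shown)

for fps in (30, 60, 80, 120):
    print(fps, "fps rendered ->", frames_displayed(fps), "frames shown")
```

Rendering at 80 or 120 fps still yields at most 60 distinct frames on screen per second; rendering below 60 fps shows correspondingly fewer.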

I won't deny anyone the pleasure of spending money on something really cool and powerful. Pride is a good thing. But then, for some reason, people try to get all sorts of justifications to this behavior. They shouldn't! It's perfectly fine. But they do anyways. And it's those justifications that create myths and misinformation.
I don't much care how people waste their money either... it's the "being misled" part that sticks in my craw... I once had a guy ask me --are you ready for this-- which power cord would sound better on his stereo... Yes, I said power cord... ROFL...

But up the screen resolution (and the type of games one plays, or the work they do on the computer) and things start to become more justified. At 1920x1080 certain games will simply not run well on low end cards. Gamers want their games to run at the top settings because these do affect the quality (the beauty) of what one sees on the screen. At that resolution they can't do this with modern games on a $100 card and some will still have some troubles with a $200 card.
Don't get me wrong, I'm not saying there isn't a difference... I'm just wondering if it's anywhere near as big a difference as is claimed...
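The resolution argument quoted above comes down to pixel counts: each frame at 1920x1080 has more than twice the pixels of 720p, so the GPU has to shade and fill proportionally more every frame. A quick back-of-the-envelope check (the comparison resolutions are my illustrative picks, not from the post):

```python
# Pixels per frame at a few common resolutions, relative to 1280x720.
resolutions = {
    "1280x720":  1280 * 720,
    "1680x1050": 1680 * 1050,
    "1920x1080": 1920 * 1080,
}

base = resolutions["1280x720"]
for name, pixels in resolutions.items():
    print(f"{name}: {pixels:,} pixels ({pixels / base:.2f}x vs 720p)")
```

At 1080p that's 2,073,600 pixels per frame, 2.25x the work of 720p before any extra effects are turned on, which is why a card that's fine at lower resolutions can struggle at 1920x1080 on top settings.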

"An unexamined life is not worth living" ... Socrates
2Tired2Tango