-   Video Cards and Displays
    -   Nvidia Titan

DarkStarr 02-19-2013 11:31 AM

Nvidia Titan
So what do we think of the Titan? I doubt it will stay within that 250w envelope; in actual testing I think it will end up much closer to 300w+. I gotta say, I dunno how well it's gonna sell at that premium, because a lot of people who purchased 680s at launch price aren't gonna want to sell for hundreds less to buy new, especially if what they have works fine. That's not to say some won't, but I imagine the resale value on the 680s is gonna drop quite a bit. I also wonder how well it's going to live up to expectations, since Nvidia is known for artificially limiting GPU compute. Will this card show its true power, or will it basically just be on par with where 680 compute should have been (Folding@home, for example)? Personally I think it's too late for strong sales, and in most situations it's not going to be enough to convince users to upgrade. I do hope, however, that it puts some pressure on AMD to step up the Radeons, even if it means my 7970 ends up outdated :(

EDIT: WOW! Only an 8+6 pin setup! That doesn't bode too well for overclocking... I mean, if it does go above that 250w, there isn't much headroom.

EDIT 2: Neat little graph, which if true is pretty damn impressive.
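As a back-of-the-envelope check on that headroom worry: the PCIe spec allows 75W from the slot, 75W from a 6-pin connector, and 150W from an 8-pin connector, so an 8+6-pin card tops out at 300W in-spec. A quick sketch (the `headroom_w` helper is just for illustration):

```python
# In-spec power budget for a PCIe card, per the PCI Express CEM limits:
# slot = 75 W, 6-pin = 75 W, 8-pin = 150 W.
SLOT_W, SIX_PIN_W, EIGHT_PIN_W = 75, 75, 150

def headroom_w(tdp_w, connectors=("8pin", "6pin")):
    """Watts left between the card's TDP and what its connectors can deliver."""
    budget = SLOT_W + sum(EIGHT_PIN_W if c == "8pin" else SIX_PIN_W
                          for c in connectors)
    return budget - tdp_w

# Titan: 250 W TDP on an 8+6-pin layout -> only 50 W of in-spec headroom.
print(headroom_w(250))
# A hypothetical 8+8-pin layout would leave 125 W instead.
print(headroom_w(250, ("8pin", "8pin")))
```

So the 8+6 layout really does leave only ~50W of in-spec margin over the 250W TDP, which is why it looks tight for overclocking.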

Rob Williams 02-19-2013 02:19 PM

I think it's an amazing card, both technologically and performance-wise (based on the specs, that is... I don't currently have one). As for TDP, that's generally for cards used normally, not stress-tested, so I think it'll definitely go far beyond that as well. Heck, my own PC with a six-core CPU and a 680 can reach 1kW when fully stressed (Folding@home).

I do think sales are going to be rough, but the fact that AMD isn't following up with a launch of its own right away is going to help things. It's also an expensive-as-hell card, so its customer base is already very limited. For what it offers, though, it's priced right, I'd say, if you want to go the 3x1 route. I think even for a single 30", this card would be overkill for any game. But then again, it's kind of a "future-proofed" card, more so than the others anyway.

As for overclocking, did you even read our article? NVIDIA hit 1,100MHz in the lab, which is something like 270MHz over stock.

DarkStarr 02-19-2013 04:46 PM

You say 3x1, EXCEPT when did Nvidia allow a single card to do 3 monitors IN Surround? Last I knew, it took SLI to run Surround. As for the 1,100MHz, that could be without load, or down to any number of variables. Looks pretty good if true, though. It also seems they changed direction damn fast on the boost: going from no overvoltage to OV and pushing it farther. Not allowing voltage control probably hurt their sales a bit. That, or the outrage, though I doubt that since they don't listen any other time.

EDIT: Quoted from the Nvidia site:

> SLI motherboard is required for all SLI configurations. NVIDIA recommends a PCI-E 2.0 x16 motherboard for the best performance. Please make sure you have updated your motherboard SBIOS to the latest version.

> NVIDIA® GeForce® GTX 295, GeForce GTX 590, and GeForce GTX 690 cards do not require an SLI motherboard and work on all compatible motherboards.

Rob Williams 02-19-2013 05:26 PM

That's for 3D Vision Surround, which requires a ton of horsepower (it's rendering the frames twice over). You can run up to 4 monitors from any 600 series card that has enough connectors.

Doomsday 02-20-2013 04:53 AM

Whoaa Beastly! :D

DarkStarr 02-21-2013 01:33 AM

Hope you're right. I don't have any Nvidia hardware anymore, definitely not any 600 series cards.

Rob Williams 02-21-2013 04:53 AM


Originally Posted by DarkStarr (Post 64398)
Hope you're right. I don't have any Nvidia hardware anymore, definitely not any 600 series cards.

Something I wrote from our quick look at the 680 on release:

"With its Kepler architecture, NVIDIA “fixes” something that has bugged me for a while: the inability to run three monitors off of a single GPU. In fact, the company has gone one further by offering support for up to 4 monitors (a typical configuration might be 3×1 and then another monitor up top, center). Given just how powerful today’s GPUs are, it’s nice to have the option to stick to just one for multi-monitor gaming."

As far as I was aware at the time, that spanned the entire Kepler line - as long as the GPU has that many video connectors, of course. That pretty much rules out most of the low end, but a 660 and up should handle it no problem.

DarkStarr 02-21-2013 10:39 AM

Sorry, I meant more that I could run 2 screens on an 8600 GT but could only game on one. I was unsure whether Nvidia had allowed gaming across multiple screens on a single card. Also, last I knew, it was either one screen or 3+. My SLI 480 drivers wouldn't allow 2, which is a huge part of why I sold them for a single 7970.

Rob Williams 02-22-2013 03:24 PM

Well... I sometimes run an emulator full-screen on a second screen, but I can't really "use" my primary one, because it ends up booting the emulator out to windowed mode. I think this is more of a game issue than an NVIDIA issue, though. I'll be playing a regular game on a second monitor next week, so I'll see then what's possible.

RainMotorsports 02-24-2013 05:38 AM


Originally Posted by DarkStarr (Post 64402)
Sorry, I meant more that I could run 2 screens on an 8600 GT but could only game on one. I was unsure whether Nvidia had allowed gaming across multiple screens on a single card.

Of course you can. It's pretty much a matter of game support. I could run Sins of a Solar Empire on 2 screens with my laptop's 9800M. Many games that support windowed mode can be forced across as many screens as you can drive, as well.

Unless you had an epic card like this, I wouldn't recommend gaming like this, but you can run 3 screens indirectly via many tricks. Some cards are offered with 4 or 5 ports via an extra chip, and you can also use USB adapters, which are basically a pathway while the GPU still does the rendering. All of these solutions are generally fine for desktop and software workflows, but generally not for gaming across.

I accidentally ran 3 screens thanks to the on-CPU Intel graphics. I forget exactly how I had it configured, but the special drivers for multi-GPU support with the Intel weren't installed. I had 2 screens on the Nvidia, one on the Intel, and all 3 showed up in the Nvidia control panel.

DarkStarr 02-24-2013 03:24 PM

Windowed does not count. It slows down the FPS, for one, and no true gamer would run a game windowed across multiple screens.

Kougar 02-25-2013 10:10 AM

For what I think... I think it's awesome. Shorter length, lower noise, and less power draw than a 690, yet significantly better performance in HPC workloads. Still waiting to see how it compares in single-precision workloads like Folding@home, though. It clocks up to 1GHz on air, which is impressive given the 8800 GTX's G80 core wasn't even close and was smaller in die area. Can't wait to see what an EVGA Titan Hydro Copper can do, as it should cut the temps in half. EVGA is even going to be launching two versions of the watercooled model this time around.

I am very, very happy to hear that NVIDIA is completely removing the artificial double-precision performance cap as well. It needed to go, and now, in DP-capable programs, the Titan actually has a performance justification for existing beyond being a simple, expensive luxury product.

The monitor overdrive option is also extremely awesome. With those Korean panels easily able to clock higher, users needed a simple exposed setting rather than having to hack drivers to test it out. I can only hope this will help encourage people to start asking for better refresh rates, and then maybe manufacturers will start making such displays. Because if A- panels are capable of hitting 120Hz, I'm sure better quality ones, or at least tweaked electronics designs, will easily be able to do it with IPS panels.
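For context on why overclocking those panels was nontrivial: the pixel clock the cable and panel electronics must carry scales linearly with refresh rate. A rough sketch (the blanking figures below are illustrative CVT-reduced-blanking-style values, not any real panel's timings):

```python
# Rough pixel-clock estimate for a panel at a given refresh rate.
# Blanking intervals are illustrative defaults, not exact monitor timings.
def pixel_clock_mhz(h_active, v_active, refresh_hz, h_blank=160, v_blank=41):
    """Pixel clock in MHz = total horizontal pixels * total lines * refresh."""
    h_total = h_active + h_blank
    v_total = v_active + v_blank
    return h_total * v_total * refresh_hz / 1e6

# 2560x1440 (the typical Korean-panel resolution) at a few refresh rates:
for hz in (60, 96, 120):
    print(f"{hz} Hz -> ~{pixel_clock_mhz(2560, 1440, hz):.0f} MHz pixel clock")
```

At 120Hz the estimate lands well past the roughly 330MHz ceiling of dual-link DVI, which is part of why pushing those panels beyond spec needed driver-level tricks rather than a simple setting.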

marfig 02-28-2013 09:29 AM

I'm not convinced by this card.

I think it completely eclipses the 690, while asking too much money per FPS when compared to the 400 USD 680. This is a card that managed to make their top offering in the 600 line obsolete while not adding anything that would make 680 users look further. It's simply a card with nothing to offer, except to anyone willing to pay 2,000 or 3,000 USD for a dual or triple-card solution.

2x 680 = ~800 USD
2x Titan = 2000 USD

Meanwhile, the 7870 is even faster than the 680, making the Titan an even less appealing solution to AMD users. This card simply has no place in the vast majority of the market.

Sure, we can go wide-eyed and be impressed by the specs. But specs need to translate into a desire to buy, and the Titan does anything but. When that happens, it's just a bad product. What's more, the Titan is apparently going to remain an isolated product: the 700 and 800 lines of cards will quickly eclipse it at much lower prices.

Frankly, if all I want is to gape at specs, I can imagine a monster card in my dreams. I don't need anyone to actually make one.

DarkStarr 02-28-2013 09:50 AM

My thought on this is that Nvidia wanted nothing more than the TITLE: they wanted to say they have the fastest single-GPU card in the world. That's why it's priced like this, and that's also why it won't sell. It's a pretty good card, but around the net I'm seeing people say "oh, Titan? Nah, let me get a second 680."

Anyway, AMD is making a dual-GPU card with two 1GHz chips on it. Sounds really epic, TBH.

marfig 02-28-2013 11:21 AM

Tomorrow or the day after, the embargo on the Titan will end and we'll get to see a whole bunch of benchmark data. That will put any doubts to rest. My expectation is that it's going to be generally disappointing, particularly when compared to the 7970. (No wonder AMD reacted by saying they won't be replying to the Titan.)

BTW, I did mean 7970, not 7870, in the previous post. But I guess everyone got it :)


Powered by vBulletin® Version 3.8.9
Copyright ©2000 - 2017, vBulletin Solutions, Inc.
Copyright © 2005 - 2017, Techgage Networks Inc.