NVIDIA Unveils GeForce GTX 690: Dual GK104 Flagship Launching May 3rd
by Ryan Smith on April 28, 2012 10:39 PM EST

As we mentioned back on Monday, NVIDIA was going to be making some kind of GeForce announcement this evening at the NVIDIA Gaming Festival 2012 in Shanghai, China. NVIDIA’s CEO Jen-Hsun Huang has just finished his speech, announcing NVIDIA’s next ultra-premium video card, the GeForce GTX 690.
Launching later this week, the GeForce GTX 690 will be NVIDIA’s new dual-GPU flagship video card, complementing their existing single-GPU GeForce GTX 680. Equipped with a pair of fully enabled GK104 GPUs, NVIDIA is shooting for GTX 680 SLI performance on a single card, and with GTX 690 they just might get there. We won’t be publishing our review until Thursday, but in the meantime let’s take a look at what we know so far about the GTX 690.
| | GTX 690 | GTX 680 | GTX 590 | GTX 580 |
|---|---|---|---|---|
| Stream Processors | 2 x 1536 | 1536 | 2 x 512 | 512 |
| Texture Units | 2 x 128 | 128 | 2 x 64 | 64 |
| ROPs | 2 x 32 | 32 | 2 x 48 | 48 |
| Core Clock | 915MHz | 1006MHz | 607MHz | 772MHz |
| Shader Clock | N/A | N/A | 1214MHz | 1544MHz |
| Boost Clock | 1019MHz | 1058MHz | N/A | N/A |
| Memory Clock | 6.008GHz GDDR5 | 6.008GHz GDDR5 | 3.414GHz GDDR5 | 4.008GHz GDDR5 |
| Memory Bus Width | 2 x 256-bit | 256-bit | 2 x 384-bit | 384-bit |
| VRAM | 2 x 2GB | 2GB | 2 x 1.5GB | 1.5GB |
| FP64 | 1/24 FP32 | 1/24 FP32 | 1/8 FP32 | 1/8 FP32 |
| TDP | 300W | 195W | 375W | 244W |
| Transistor Count | 2 x 3.5B | 3.5B | 2 x 3B | 3B |
| Manufacturing Process | TSMC 28nm | TSMC 28nm | TSMC 40nm | TSMC 40nm |
| Launch Price | $999 | $499 | $699 | $499 |
First and foremost, the GTX 690 won’t be launching until this Thursday (May 3rd), and while we won’t be able to publish our review until then, NVIDIA has provided a bounty of information on the GTX 690 ahead of the formal launch. Spec-wise – and this is something NVIDIA is trying to make clear from the start – unlike what they did with the GTX 590, NVIDIA is targeting close to full GTX 680 SLI performance here. As GK104 is a much smaller and less power-hungry GPU from the get-go, NVIDIA doesn’t have to do nearly as much binning in order to get suitable chips to keep power consumption in check. With the GTX 690 NVIDIA is able to reach their target TDP of 300W with all functional units enabled and with clockspeeds above 900MHz, which means performance should indeed be much closer to the GTX 680 in SLI than the GTX 590 was to its SLI counterparts.
Altogether, for the GTX 690 we’re looking at a pair of fully enabled GK104 GPUs (1536 CUDA cores each) clocked at 915MHz, paired with 4GB of 6GHz GDDR5 (2GB per GPU), all on a single card. The GPU boost target will be 1019MHz, though until we have a chance to review the card it’s hard to say what average clockspeeds will be in most games. Put together, this means the GTX 690 should be able to reach at least 91% of the GTX 680 SLI’s performance, and probably closer to 95% depending on where GPU boost tops out.
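As a rough sanity check on those figures, here’s a minimal Python sketch of the clock-ratio math, under the simplifying assumption that performance scales linearly with clockspeed (real games won’t scale perfectly, so treat these as bounds rather than predictions):

```python
# Back-of-the-envelope estimate: if performance scaled linearly with
# clockspeed, the GTX 690's floor relative to GTX 680 SLI would be the
# ratio of base clocks, and its rough ceiling the ratio of boost clocks.
gtx690_base, gtx690_boost = 915, 1019    # MHz, per NVIDIA's specs
gtx680_base, gtx680_boost = 1006, 1058   # MHz, per NVIDIA's specs

floor = gtx690_base / gtx680_base        # ~0.91
ceiling = gtx690_boost / gtx680_boost    # ~0.96

print(f"GTX 690 vs. GTX 680 SLI: {floor:.0%} to {ceiling:.0%} (estimate)")
```

The boost-to-boost ratio lands around 96%, which lines up with the "probably closer to 95%" figure above.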
As for power consumption, NVIDIA is designing the GTX 690 around a 300W TDP, with a typical board power (and default power target) of 263W. Compared to the 375W TDP of the GTX 590, this will allow the card to be used in more power- and cooling-limited computers, something NVIDIA lost going from the GTX 295 to the GTX 590. The tradeoff, of course, is that clockspeeds had to be lowered relative to the GTX 680 to pull this off, which is why even with a more liberal GPU boost both the base and boost clocks are slightly below the GTX 680’s. As always, NVIDIA will be doing some binning here to snag the best GK104 GPUs for the GTX 690, which is the other factor in bringing down power consumption versus the 2 x 195W GTX 680 SLI.
With that said, similar to what AMD did with their dual-GPU Radeon HD 5970 and 6990, NVIDIA is in practice targeting two power/performance levels with the GTX 690: a standard performance level and a higher performance level for overclockers. On the hardware side of things the card is equipped with two 8-pin PCIe power sockets, enabling it to safely draw up to 375W, 75W over its standard 300W TDP. Delivering that power will be 10 power phases (5 per GPU), so the GTX 690 will have power delivery capabilities nearly identical to the GTX 680’s.
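For reference, a quick sketch of where that 375W ceiling comes from, using the standard PCIe power budgets (75W from the x16 slot and 150W per 8-pin connector):

```python
# Standard PCIe power budgets: 75W from the x16 slot,
# plus 150W per external 8-pin PCIe connector.
SLOT_W = 75
EIGHT_PIN_W = 150

max_safe_draw = SLOT_W + 2 * EIGHT_PIN_W
print(max_safe_draw)  # 375 -- i.e. 75W of headroom over the 300W TDP
```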
Meanwhile, on the software side the GTX 690 will have an adjustable power target and clock offsets, just like the GTX 680. NVIDIA is giving the GTX 690 a max power target of +35%, which given the card’s default power target of 263W means it can be set to draw up to roughly 355W. Until we have a chance to review the card it’s not clear just how many bins above the boost clock GPU boost can go, but even if the GTX 690 were unable to quite catch the GTX 680 SLI at standard settings, the combination of a higher power target and a core clock offset should be more than enough to overclock it to GTX 680 specs. On paper at least, some further overclocking should even be possible: the standard power target for the GTX 680 is 170W, so at 2 x 170W for GTX 680 SLI there’s still at least 15W of additional headroom to play with, all else held equal.
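To put numbers on that, here’s a small sketch of the power target arithmetic as described above, using NVIDIA’s stated figures:

```python
# GTX 690 power target math, using NVIDIA's stated figures.
default_target = 263                  # W, GTX 690 default power target
max_target = default_target * 1.35    # +35% max power target -> ~355W

gtx680_target = 170                   # W, standard GTX 680 power target
sli_target = 2 * gtx680_target        # 340W for a GTX 680 SLI setup

headroom = max_target - sli_target    # what's left past SLI-equivalent power
print(round(max_target), sli_target, round(headroom))  # 355 340 15
```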
Speaking of specifications and performance, for those of you who are curious about PCIe bridging, NVIDIA has finally moved away from NF200 in favor of a third-party bridge chip. With the GTX 590 and earlier dual-GPU cards NVIDIA used their NF200 bridge, a PCIe 2.0-capable bridge chip designed by NVIDIA’s chipset group. However, as NVIDIA no longer has a chipset group they also no longer have a group to design such chips, and with NF200 now outdated in the face of PCIe 3.0, NVIDIA has turned to PLX to provide a PCIe 3.0 bridge chip.
As for the card’s construction, while we don’t have the complete specifications on hand, we know that the basic design of the GTX 690 is very similar to the GTX 590’s. This means it will have a single axial fan sitting at the center of the card, with a GPU and its RAM on either side. Heat from one GPU goes out the rear of the card, while heat from the other GPU goes out the front. Heat transfer will once again be provided by a pair of nickel-tipped aluminum heatsinks attached to vapor chambers, which also marks the first time we’ve seen a vapor chamber used with a 600 series card.
NVIDIA tells us that they’ve done some further work here to minimize noise by tweaking their fan ducting to reduce obstructions – primarily by eliminating the variations in baseplate height that had previously been necessary to accommodate the GPUs – and they are claiming that the GTX 690 should be notably quieter than the GTX 680 SLI. The GTX 590 was already a bit quieter than the GTX 580 SLI, so given the quieter nature of the GTX 680 SLI this is something we’ll be paying particular attention to.
Elsewhere, compared to the GTX 590 the biggest change most buyers will likely notice is the shroud material. NVIDIA has replaced the GTX 590’s plastic shroud with a metal one, specifically a mixture of cast aluminum parts and injection-molded magnesium parts. Ostensibly the use of metal as opposed to plastic further reduces noise, but along with the polycarbonate windows over the heatsinks, I suspect this was largely done to reinforce the card’s ultra-premium nature and make it look more lavish.
Meanwhile, when it comes to display connectivity NVIDIA is using the same 3x DL-DVI and 1x miniDP port configuration that we saw on the GTX 590. This allows NVIDIA to drive three 3D Vision monitors over DL-DVI – the first DisplayPort-enabled 3D Vision monitors only just started shipping – with the tradeoff being reduced external ventilation.
Finally, we have the matter of pricing and availability. In typical fashion, NVIDIA has given us the pricing at the last moment. The MSRP on the GTX 690 will be $999, exactly double the price of the GTX 680. Given what we know of the GTX 690’s specs this doesn’t come as any great surprise: NVIDIA has little incentive to price it significantly below a pair of GTX 680 cards ($1000) when performance will be within 10%, particularly since AMD’s own dual-GPU card has yet to launch. This makes the GTX 690 the most expensive GeForce ever, eclipsing even 2007’s $830 GeForce 8800 Ultra.
Spring 2012 GPU Pricing Comparison

| AMD | Price | NVIDIA |
|---|---|---|
| | $999 | GeForce GTX 690 |
| | $499 | GeForce GTX 680 |
| Radeon HD 7970 | $479 | |
| Radeon HD 7950 | $399 | GeForce GTX 580 |
| Radeon HD 7870 | $349 | |
| | $299 | GeForce GTX 570 |
| Radeon HD 7850 | $249 | |
| | $199 | GeForce GTX 560 Ti |
| | $169 | GeForce GTX 560 |
| Radeon HD 7770 | $139 | |
Availability will also be a significant issue. As it stands NVIDIA cannot keep the GTX 680 in stock in North America, and while the GTX 690 may be a very low volume part, it requires two binned GPUs, which are going to be even harder to come by. We’ll know more on Thursday, but as it stands this will probably be the lowest volume ultra-performance card launch in years. While I have no doubt that NVIDIA can produce these cards in sufficient volume when they have plenty of GPUs, until TSMC’s capacity improves NVIDIA has no chance of meeting the demand for GK104 GPUs, and that bodes very poorly for the GTX 690. Consequently, while this technically won’t be a paper launch, it’s certainly going to feel like one; thanks to the low supply, only a couple of major retailers will have cards on May 3rd, with wider availability not coming until May 7th.
Wrapping things up, while we have the specs for the GTX 690, this only begins to scratch the surface. Specs won't tell you about real-world performance; for that you'll have to check back on May 3rd for our complete review.
109 Comments
MUYA - Monday, April 30, 2012
Just imagine come May 3rd... they announce the MSRP at less than $999... hmm *wishful thinking* $799, and lo and behold, the GTX 680 gets a price cut too! *slap*

...ya, a GPU pipeline dream
wwwcd - Monday, April 30, 2012
$1k for one piece of hot (very hot) sh*t. Sorry Nvidia. Useless!

Golgatha - Monday, April 30, 2012
I think nVidia can afford to throw a high flow bracket on their flagship card. With the stacked DVI ports, it actually does make a substantial difference this generation.

Golgatha - Monday, April 30, 2012
Just noticed one other thing. WTF is with 3 DVI ports and no HDMI?!

EJ257 - Monday, April 30, 2012
Seriously. I wish they dropped the 2nd and 3rd DVI and gave us another 2x DP and HDMI.

Also, the first chart has a typo? How are the 580 and 680 both $499?
prophet001 - Monday, April 30, 2012
First, I would definitely get 2 680s and SLI them. 10% is a significant amount of performance, especially when it comes to extending the life of your system.

Second, do dual-GPU cards suffer from the same micro-stutter problems as SLI setups do?
If they do, why would they sell cards like that? If they don't, why don't they take whatever they learned and apply it to SLI setups?
CeriseCogburn - Tuesday, May 1, 2012
They did. Read some reviews, SLI is smooth as silk with frame rate target.

CF 7970 is not as good a gaming experience, and the drivers suck.
Even when 7970CF has a frame rate win, the gaming sucks in comparison.
Glad to help.
Farkus - Monday, April 30, 2012
I'll have to say I was disappointed with my 5970s; even with the voltage tweaker, I could barely get them OC'd at all. The price of them almost equalled two 5870s, which would easily out-perform a single 5970. They had a cold bug which required them to warm up first, and you had to jack up the idle clock if you didn't want 2D artifacting or lock-ups going into games off the desktop. Bottom line here: the only reason to buy one of these is if you plan to SLI them and deal with any idiosyncrasies, if you're up for it. Otherwise get two singles and SLI them. As much as I am a fan of technology, I think it's asking a lot for a dual card (not to mention two of them) to perform like singles. Besides, it's easier to cool two singles. When I did have the 5970s warmed up and running in CrossFire it was quite the deal. I've since replaced them with two 5870s @ 900MHz and never had a peep out of them, and they do fly. My two PNY liquid-cooled 580s are absolute screamers at 915MHz. I'm not sure if quad-SLI is that much of an advantage over two single cards maxxed out. I wouldn't do it again, just to avoid any headaches. Now go ahead and rip on me.

CeriseCogburn - Tuesday, May 1, 2012
amd drivers suck, nvidia drivers do not suck. You experienced the amd suck of their drivers.

The reviewers already reported how amd CF drivers suck with the 79xx, many problems, and some games just won't run at all, at any resolution.
Now recently the 12.4 drivers have been breaking CF installs.
Get with it bro, buy Nvidia, and don't blame the massive suck of amd on the winning green team, who really cares for their end users and proves it.
BTW - now if you had a clue instead of a fanboy brainwash for the last 5 years, Nvidia just gave you another gigantic present: FXAA and Adaptive V-Sync and frame rate target (with card partner software) in the latest 301.24 Nvidia driver, all the way back to the 8 series, meaning it includes ALL the nvidia rebrands the entire amd fanboy anger and rage crew spewed about for years on end.
ROFL - HA HA HA guess who has the only laugh, let alone the last?
von Krupp - Monday, April 30, 2012
...yup.

Now we must wait to see if the second half of what we thought might happen will indeed happen. That is, Nvidia is going to hold GK110 and refine it until AMD launches its next-generation response. I would be surprised if they bothered to release a $1500+ GK110-based consumer solution to trump the dual-GPU 7990 when it comes out. As awesome as GK110 sounds, I remain skeptical that a single GK110 could trump two HD 7970s or two GTX 680s all by itself. So yes, I believe they will wait for the HD 8970 before firing back with another architecture.
That all said, that is one very pretty card. The recent price shifts make me feel silly for spending $1100 on my cards, but c'est la vie. This is the tech world, get what you need when you need it (within a three month margin) or else you'll always be waiting.