NVIDIA GeForce GTX 680M: Kepler GK104 Goes Mobile
by Jarred Walton on June 4, 2012 9:12 PM EST
Origin PC spoiled the GTX 680M launch party a bit with their announcement of their new EON15-S and EON17-S notebooks this morning, but NVIDIA asked us to avoid discussing the particulars of the new mobile GPU juggernaut until the official NDA time. As you’ve probably guessed, that time is now (or 6PM PDT June 4, 2012 if you’re reading this later). NVIDIA also shared some information on upcoming Ultrabooks, which we’ll get to at the end.
NVIDIA has had their fair share of success with Kepler so far, and the GTX 680 desktop cards continue to sell out. Newegg, for example, currently lists 18 GTX 680 cards, but only one is in stock: the EVGA GTX 680 FTW, which comes with a decent overclock and a starting price $70 higher than the standard GTX 680. On the laptop side, we've already had a couple of Kepler-based GK107 laptops in for review, and graphics performance has shown a large improvement relative to the previous generation of midrange Fermi parts.
In high-end notebooks, so far the only Kepler GPU has been a higher clocked GK107, the GTX 660M, but increasing the core clocks will only take you so far. NVIDIA has continued to sell their previous generation GTX 570M and 580M as the GTX 670M and 675M (with a slight increase in core clocks), but clearly there was a hole at the top just waiting for the GTX 680M, and it's now time to plug it. Below is a rundown of three of NVIDIA's fastest mobile GPUs to help put the GTX 680M in perspective.
NVIDIA High-End Mobile GPU Specifications

| | GeForce GTX 680M | GeForce GTX 675M | GeForce GTX 660M |
|---|---|---|---|
| GPU and Process | 28nm GK104 | 40nm GF114 | 28nm GK107 |
| CUDA Cores | 1344 | 384 | Up to 384 |
| Memory Eff. Clock | 3.6GHz | 3GHz | 4GHz |
| Memory | Up to 4GB GDDR5 | Up to 2GB GDDR5 | Up to 2GB GDDR5 |
Just running the raw numbers here, the GTX 680M has up to 20% more memory bandwidth than GTX 675M/580M, thanks to the improved memory controller and higher RAM clocks available with Kepler. The bigger improvement however comes in the computational area: even factoring in the double-speed shader clocks, GTX 680M has potentially 103% more shader performance than its predecessor. NVIDIA gives an estimated performance improvement of up to 80% over GTX 580M, which is a huge jump in generational performance. And while Fermi on the desktop still offers potentially better performance in several compute workloads, there’s a reasonable chance that the gap won’t be quite as large on notebooks—not to mention compute generally isn’t as big of a factor for most notebook users. (And for those that need notebooks with more compute performance, there’s always the Quadro 5010M—likely to be supplemented by a new Quadro in the near future.)
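For those who want to check the raw numbers themselves, here's the arithmetic behind those percentages. Note that the 720MHz (GTX 680M core) and 620MHz (GTX 675M core, with Fermi's double-speed 1240MHz shader clock) figures come from NVIDIA's published specs rather than the table above.

```python
# Back-of-the-envelope check of the bandwidth and shader-throughput claims.
# Clock assumptions (not in the table above): GTX 680M core at 720MHz;
# GTX 675M core at 620MHz, with Fermi shaders running at 2x core clock.
mem_bw_680m = 3.6e9 * 256 / 8    # 115.2 GB/s: 3.6GHz effective, 256-bit bus
mem_bw_675m = 3.0e9 * 256 / 8    # 96.0 GB/s: same bus width, lower clock
shader_680m = 1344 * 720e6       # Kepler cores run at the core clock
shader_675m = 384 * 2 * 620e6    # Fermi hot clock: 2x the 620MHz core

print(f"bandwidth: +{mem_bw_680m / mem_bw_675m - 1:.0%}")          # +20%
print(f"shader throughput: +{shader_680m / shader_675m - 1:.0%}")  # +103%
```

The 20% bandwidth figure falls straight out of the memory clocks (both parts use a 256-bit bus), and the 103% shader figure holds even after crediting Fermi's double-speed shader domain.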
Unfortunately, we’ll have to wait a bit longer to do our own in-house investigation of GeForce GTX 680M performance, as we don’t have any hardware in hand. NVIDIA did provide some performance benchmarks with a variety of games, though, and we’re going to pass along that information in the interim. As always, take such information with a grain of salt, as NVIDIA may be picking games/settings that are particularly well suited to the GTX 680M, but for many of the titles there’s a canned benchmark that should allow for “fair” comparisons.
Assuming the above chart uses the built-in benchmarks in the games that support them, we do have a few points of comparison with the Alienware M18x in GTX 580M and HD 6990M configurations. We'll skip those, however, as the only game where we appear to run at identical settings is DiRT 3 (43.8FPS if you're wondering). Luckily, NVIDIA has included similar performance tables in previous launches, so we do have some overlap with their GTX 580M information. First, here's their full benchmarking page from the 580M launch, and then we'll summarize the points of comparison.
Tentative Gaming Performance Comparison (Using NVIDIA GTX 580M/680M Results)

| Game | GTX 680M (FPS) | GTX 580M (FPS) | Increase |
|---|---|---|---|
| Aliens vs. Predator | 59.7 | 39 | 53% |
| Far Cry 2 | 115.6 | 79 | 46% |
| Lost Planet 2 | 57.9 | 33 | 75% |
| Stalker: Call of Pripyat | 96.4 | 50 | 93% |
| StoneGiant (DoF Off) | 67 | 46 | 46% |
| StoneGiant (DoF On) | 36 | 25 | 44% |
| Street Fighter IV | 165.5 | 138 | 20% |
| Total War: Shogun 2 | 97.8 | 59 | 66% |
| Witcher 2 High | 43.7 | 26 | 68% |
| Witcher 2 Ultra | 20.1 | 10 | 101% |
There are discrepancies between the test notebooks (Clevo's X7200 with an i7-980X versus the i7-3720QM), but both chips have the same maximum Turbo Boost clock of 3.6GHz, and at these settings we should be largely GPU limited, so the above scores look pretty reasonable. The only games that don't see a >40% increase are Civilization V (which has proven to be CPU limited in the past) and Street Fighter IV (which already runs at >120FPS on both GPUs). There are a few titles where we even see nearly a doubling of performance. We don't have raw numbers, but NVIDIA is also claiming around a 15-20% average performance advantage over AMD's Radeon HD 7970M—hopefully we'll be able to do our own head-to-head in the near future.
Overall, using NVIDIA’s own numbers it looks like GTX 680M ought to be around 50% faster than GTX 580M. If that doesn’t seem like much, consider that the difference between GTX 480M and GTX 580M was only around 20% (according to NVIDIA and using 3DMark11). A 50% increase in mobile graphics performance within the same power envelope is a huge step; if Kepler manages to reduce power use at all then it will be an even bigger jump. Put another way, a single GTX 680M in the above games using NVIDIA’s own results ends up offering 86% of the performance of GTX 580M SLI, and it will definitely use a lot less power and have fewer headaches than mobile SLI.
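For the curious, averaging the per-game speedups in the table above actually gives a slightly higher figure than 50%; NVIDIA's overall claim presumably averages over a larger set of titles than the subset we can cross-reference. A quick sketch:

```python
from math import prod

# Per-game FPS pairs (GTX 680M, GTX 580M) taken from the table above.
fps = {
    "Aliens vs. Predator": (59.7, 39), "Far Cry 2": (115.6, 79),
    "Lost Planet 2": (57.9, 33), "Stalker: Call of Pripyat": (96.4, 50),
    "StoneGiant (DoF Off)": (67, 46), "StoneGiant (DoF On)": (36, 25),
    "Street Fighter IV": (165.5, 138), "Total War: Shogun 2": (97.8, 59),
    "Witcher 2 High": (43.7, 26), "Witcher 2 Ultra": (20.1, 10),
}
ratios = [new / old for new, old in fps.values()]
# Geometric mean is the usual way to average speedup ratios.
geomean = prod(ratios) ** (1 / len(ratios))
print(f"geometric mean speedup: +{geomean - 1:.0%}")  # roughly +60%
```

This subset skews toward NVIDIA's best-case results, which is exactly why canned vendor benchmarks deserve the grain of salt mentioned earlier.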
As usual, NVIDIA had a wealth of other information to share about their product and software features, and with their latest drivers NVIDIA is adding a few new items. No, we’re not even talking about CUDA or PhysX here (though NVIDIA does at least list those as important features). Optimus also gets a plug, and just as with the 400M and 500M series, all 600M GPUs support Optimus. The difference is that this time around, instead of just Alienware supporting Optimus with their M17x R3, NVIDIA also has MSI and Clevo on board for GTX 680M Optimus.
Briefly covering the other features, Kepler adds support for TXAA, a temporal anti-aliasing algorithm that NVIDIA touts as providing quality near the level of 8xMSAA with a performance hit similar to that of 2xMSAA, or alternatively even better quality for a performance hit similar to 4xMSAA. It sounds like TXAA for now will require application support, and NVIDIA provided the above slide showing some of the upcoming titles that will have native TXAA built into the game. NVIDIA also made mention of FXAA (fast approximate anti-aliasing), a full-scene post-process shader technique that can help remove jaggies with a very minor performance hit (around 4%). New with their latest drivers is the ability to force-enable FXAA in all games.
Another newer addition is Adaptive V-Sync, which sounds similar in some ways to Lucid’s Virtu MVP solution. In practice, however, it sounds like NVIDIA is simply enabling/disabling V-Sync based on the current frame rate. If a game is running at more than 60FPS, V-Sync will turn on to prevent tearing, while at <60FPS V-Sync will turn off to improve performance and reduce stuttering.
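As described, the policy boils down to a simple threshold on the current frame rate. Here's a minimal sketch of that behavior; the actual driver heuristic is NVIDIA's and not public, so treat this purely as an illustration.

```python
def adaptive_vsync(fps: float, refresh_hz: float = 60.0) -> bool:
    """Return True if V-Sync should be enabled at the current frame rate.

    At or above the refresh rate, sync to eliminate tearing; below it,
    present frames immediately to avoid the 60->30FPS quantization
    stutter that plain V-Sync causes on a 60Hz panel.
    """
    return fps >= refresh_hz

print(adaptive_vsync(75.0), adaptive_vsync(45.0))  # True False
```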
Besides GTX 680M, there should be quite a few Ultrabook announcements coming out at Computex with support for NVIDIA GPUs. We’ve already looked at Acer’s TimelineU M3, and we mentioned ASUS’ UX32A/UX32VD and Lenovo’s new U410. Ultrabooks are quickly reaching the point where they’re “fast enough” for the vast majority of users; the one area where they appear deficient is in graphics performance. Ivy Bridge and HD 4000 in a ULV chip simply aren’t able to provide the same sort of performance we find in the higher TDP chips.
That’s where NVIDIA plans on getting a lot of wins with their GT 610M (48 core Fermi) and their GT 620M (96 core Fermi); GT 620M will initially be available as a 40nm and 28nm part, but we're still trying to find out if GT 610M will also have a 28nm variant. For larger laptops, GT 610M wouldn’t make much sense, but in an Ultrabook it may be just what you need. If so, keep your eyes on our Computex 2012 and Ultrabook coverage, as there’s surely more to come.
JarredWalton - Tuesday, June 5, 2012
Link please? And are you talking about the 7970M or the 7970?
Diablo III is an NVIDIA supported title, so I'd be surprised if it's actually slower on their hardware, but stranger things have happened (e.g. DiRT 2 and Civ5 are "ATI" titles where NVIDIA held an advantage for a long time). Arkham Asylum is so old and undemanding that you'd be looking at 300 FPS vs. 290 FPS and very likely CPU limited; perhaps you meant Arkham City? If that's what you're talking about, it looks like the latest Batman with high-end hardware is hitting other bottlenecks on most GPUs (http://www.anandtech.com/show/5818/9). Skyrim also favors NVIDIA quite a lot on desktop GPUs, and Civ5 is basically a toss up.
Where are your "plethora of titles"? They're not even there on the desktop GPUs! Switch to mobile GPUs and you're looking at a Pitcairn card that has 1280 cores clocked at 850MHz (a 15% drop from the desktop HD 7870 1GHz) and memory still clocked at 4800MHz. In most titles, particularly at 1080p laptop resolutions, I expect we're basically shader/core limited and the extra memory bandwidth will only help in select titles.
But hey, let's do a reasonable estimate. Take the 1900x1200 results here (http://www.anandtech.com/bench/Product/548?vs=598) and subtract 15% from HD 7870 and 32% from GTX 670 as a rough estimate. Do you know what you get? Downclocked 7870 on average would be 10% slower than downclocked GTX 670. The titles where AMD leads out of the list (only looking at actual games) are Crysis: Warhead (6%), Metro 2033 (2%), and Civilization V (15%) -- so really, two ties and only one clear lead. The titles where NVIDIA leads are DiRT 3 (5%), Total War: Shogun 2 (6%), Batman: Arkham City (28%), Portal 2 (14%), Battlefield 3 (30%), StarCraft II (25%).
Given we're only estimating performance, it does appear relatively close and AMD will certainly lead in some of their traditionally strong titles, but "pure hogwash" is a stretch. You're also not "paying double", unless you can manage to buy just the MXM module separate from everything else. You're looking at a $2000 notebook with HD 7970M (and relatively spartan components for storage), or a $2300 notebook with GTX 680M. The GTX 680M will also come with Optimus, so battery life at least won't suck when you're not gaming. That's pretty darn reasonable: 10% better performance and likely double the battery life for only 15% more money.
TokamakH3 - Tuesday, June 5, 2012
Comparing an estimated total system cost to downplay the massive price difference in these GPUs is very disingenuous. And you even overestimate the system price. A P150EM with 7970m can be had for ~$1600-1700. Face it, AMD owns the price/performance mobility crown this time around, just like last generation.
bhima - Tuesday, June 5, 2012
Though that is true, I wish there were a lot of options to get an AMD mobile GPU. VERY few vendors are using them right now, and the only one worth it IMO is the Sager with the 7970m. I'd like to see some Sager/ASUS machines with the more midrange 7870m to compete with the 660m.
So while AMD may be winning the price/performance war in mobile, they aren't winning the distribution war.
TokamakH3 - Wednesday, June 6, 2012
Yeah, the 660m wins midrange for sure.
On the high end, there are a whole lot more 7970ms available than 680ms.
whatthehey - Tuesday, June 5, 2012
Whatever. AMD owns shit right now. They've got decent GPUs, but their drivers are garbage, ESPECIALLY on laptops. If you're stupid enough to buy a laptop with AMD graphics, either you don't care at all about battery life, or you don't care about drivers. Even at the same price, I'd take the slower GTX 675M over the HD 7970M in a laptop. On desktops, AMD GPUs are better, but they still have driver issues.
kyuu - Wednesday, June 6, 2012
Bullcrap. They do need a better iGPU/dGPU switching mechanism to compete with Optimus, but otherwise all this crap about their drivers is FUD. Yeah, their early drivers for 7xxx had issues, but Nvidia had some problems with early Kepler drivers. Driver issues happen with both companies.
whatthehey - Wednesday, June 6, 2012
Gotta luv the AMD fanboys. Have you even used a laptop with AMD graphics lately? The drivers are a major problem; hell, they're a problem on desktops as well! For one, the user interface is complete garbage that any software developer with half a brain would throw out. You don't use .Net for drivers unless you're lazy and want your product to feel sluggish. On a fast PC, after first boot it will take several seconds for the Catalyst Control Center to appear; on a slower PC or a laptop? 10+ seconds isn't unusual.
But getting back to laptops, if you don't think drivers are a problem for AMD laptops, you haven't purchased a laptop with an AMD GPU lately, at least not from a big OEM. Sony and HP are two of the biggest PC manufacturers and you can't update drivers for shit on their laptops with AMD graphics. If they're discrete only graphics, you can at least try to get someone else to download the drivers for you, but if they're switchable graphics you're screwed. Hard!
It's not all AMD's fault, but it's at least partly their fault. If nVidia can get Sony and HP to allow driver updates on Optimus laptops, AMD should be able to do the same. Why haven't they? Because they just don't give a damn about you once you've purchased their product. Acer is better about driver updates, but they build horrible plastic shit to sell for bottom dollar at retailers.
The only decent laptops with AMD graphics right now are the MacBook Pros, and you'll have to pay homage to the lifeless shell of Steve Jobs by spending more money and running the oh-so-lovely OS X if you go that route. And what do you get? Gaming performance under OS X that's worse than HD 6470M under Windows. So you install Boot Camp and run Windows, right? But then why the hell did you pay $1700 for a Mac that will have lousy battery life under Windows (and you're still up a creek when it comes to getting the latest drivers for Windows)?
Yup, AMD has the best drivers and support for laptops ever! The only laptops with AMD graphics that are worth buying are the ones with Llano and now Trinity. But guess what? You get POS laptops there as well, with their awesome 15.6" 1366x768 displays that look washed out as hell! At least you can get driver updates for those (usually).
Meaker10 - Wednesday, June 6, 2012
Don't buy Sony or HP then lol.
I can think of plenty of machines with AMD graphics that look good.
You have the 17.3" samsung with 3d display, Alienware M17X and M18X. Also now that clevo sorted out their issue their 15.6" and 17.3" machines look good too.
But hey, haters gonna hate.
katana111 - Friday, June 8, 2012
You are out of your mind if you assume NVIDIA has stable drivers and AMD does not. Has it ever dawned on you that AMD products do not need patch after update to get them running properly?
After having bought various nvidia products in the past and being burned, not a chance in hell am I touching that stuff again.
I've still got products running on AMD and they work fine. Whereas, anything with nvidia in it that is a few years old might as well be thrown away and are typically just collecting dust.
Second mistake is attacking the MacBooks, which are excellent machines.
Speaking of quality, AMD cards render video like streaming TV in WMC significantly better than NVIDIA cards.
CeriseCogburn - Thursday, July 12, 2012
Thank you for telling it like it is.
Amd fans blabber price and fps (even as they lose miserably and then moan price again) a million times as their lips bleed, and they all lie about drivers "being the same", and are often completely clueless and totally ignorant by choice and by default, blissfully ignorant, incapable of admitting a single fact, and utterly foobar and totally broken, but they now love Winzip.
lol - yes they love Winzip