Meet The EVGA GeForce GTX 660 Ti Superclocked

Our first card of the day is EVGA’s entry, the EVGA GeForce GTX 660 Ti Superclocked. Among all of the GTX 670 cards we’ve looked at and all of the GTX 660 Ti cards we’re going to be looking at, this is the card most like its older sibling. In fact, with only a couple of cosmetic differences it’s practically identical in construction.

GeForce GTX 660 Ti Partner Card Specification Comparison

| | GeForce GTX 660 Ti (Ref) | EVGA GTX 660 Ti Superclocked | Zotac GTX 660 Ti AMP! | Gigabyte GTX 660 Ti OC |
|---|---|---|---|---|
| Base Clock | 915MHz | 980MHz | 1033MHz | 1033MHz |
| Boost Clock | 980MHz | 1059MHz | 1111MHz | 1111MHz |
| Memory Clock | 6008MHz | 6008MHz | 6608MHz | 6008MHz |
| Frame Buffer | 2GB | 2GB | 2GB | 2GB |
| TDP | 150W | 150W | 150W | ~170W |
| Width | Double Slot | Double Slot | Double Slot | Double Slot |
| Length | N/A | 9.5" | 7.5" | 10.5" |
| Warranty | N/A | 3 Year | 3 Year + Life | 3 Year |
| Price Point | $299 | $309 | $329 | $319 |

EVGA will be clocking the GTX 660 Ti SC at 980MHz for the base clock and 1059MHz for the boost clock, which represents a 65MHz (7%) and 79MHz (8%) overclock respectively. Meanwhile EVGA has left the memory clock untouched at 6GHz, the reference memory clockspeed for all of NVIDIA’s GTX 600 parts thus far.
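As a quick sanity check on those percentages, the factory overclock deltas work out as follows (a throwaway sketch using the reference and Superclocked clocks quoted above):

```python
# Factory overclock deltas for the EVGA GTX 660 Ti Superclocked,
# relative to NVIDIA's reference clocks.
ref_base, ref_boost = 915, 980    # MHz, reference GTX 660 Ti
sc_base, sc_boost = 980, 1059     # MHz, EVGA Superclocked

for name, ref, oc in [("base", ref_base, sc_base),
                      ("boost", ref_boost, sc_boost)]:
    delta = oc - ref
    print(f"{name}: +{delta}MHz ({100 * delta / ref:.0f}%)")
# prints: base: +65MHz (7%)
#         boost: +79MHz (8%)
```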

The GTX 660 Ti is otherwise identical to the GTX 670, for all of the benefits that entails. While NVIDIA isn’t shipping a proper reference card for the GTX 660 Ti, they did create a reference design, and this appears to be what the EVGA card is based on. Both the EVGA and Zotac cards are using identical PCBs derived from the GTX 670’s PCB, which is not unexpected given the power consumption of the GTX 660 Ti. The only difference we can find on this PCB is that instead of solder pads for 16 memory chips there are solder pads for 12, reflecting the fact that the GTX 660 Ti can have at most 12 memory chips attached.

With this PCB design the PCB measures only 6.75” long, with the bulk of the VRM components located at the front of the card rather than the rear. Hynix 2Gb 6GHz memory chips are placed both on the front of the PCB and the back, with 6 on the front and 2 on the rear. The rear chips are directly behind a pair of front chips, reflecting the fact that all 4 of these chips are connected to a single memory controller.
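The arithmetic behind that memory configuration is straightforward: eight 2Gb chips give the card its 2GB frame buffer, and the 192-bit bus at the 6008MHz effective data rate yields roughly 144GB/s of peak bandwidth. A quick illustrative sketch (our own back-of-the-envelope numbers, not vendor data):

```python
# Back-of-the-envelope numbers for the memory subsystem described above:
# eight 2Gb (256MB) GDDR5 chips on a 192-bit bus at 6008MHz effective.
chips = 8
chip_gbit = 2
capacity_gb = chips * chip_gbit / 8               # 2.0 GB total

bus_width_bits = 192
effective_mhz = 6008                              # effective data rate
bandwidth_gbs = bus_width_bits / 8 * effective_mhz / 1000  # bytes/transfer * MT/s

print(capacity_gb)                 # 2.0
print(round(bandwidth_gbs, 1))     # 144.2
```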

With the effective reuse of the GTX 670 PCB, EVGA is also reusing their GTX 670 cooler. This cooler is a blower, and due to the positioning of the GPU and various electronic components the blower fan sits entirely off of the PCB by necessity, located behind the card in an enclosed housing. This housing pushes the total length of the card out to 9.5”. Inside the enclosure is a block-style aluminum heatsink with a copper baseplate that provides cooling for the GPU. Meanwhile, attached to the PCB is a moderately sized aluminum heatsink clamped down on top of the VRMs towards the front of the card. There is no cooling provided for the GDDR5 RAM.

Elsewhere, at the top of the card we’ll find the 2 PCIe power sockets and 2 SLI connectors. Meanwhile at the front of the card EVGA is using the same I/O port configuration and bracket that we saw with the GTX 670. This means they’re using the NVIDIA standard: 1 DL-DVI-D port, 1 DL-DVI-I port, 1 full size HDMI 1.4 port, and 1 full size DisplayPort 1.2. This also means that the card features EVGA’s high-flow bracket, a bracket with less shielding in order to maximize the amount of air that can be exhausted.

Rounding out the package is EVGA’s typical collection of accessories and knick-knacks. In the box you’ll find a pair of molex power adapters, a quick start guide, and some stickers. The real meat of EVGA’s offering is on their website, where EVGA card owners can download their wonderful video card overclocking utility (Precision X) and their stress test utility (OC Scanner X). The RivaTuner-powered Precision X and OC Scanner X still set the gold standard for video card utilities thanks to their functionality and ease of use. Personally I’m not a fan of the new UI – circular UIs and sliders aren’t particularly easy to read – but it gets the job done.


As with all EVGA cards, the EVGA GeForce GTX 660 Ti Superclocked comes with EVGA’s standard 3 year transferable warranty, with individual 2 or 7 year extensions available for purchase upon registration, which also unlocks access to EVGA’s step-up upgrade program. Finally, the EVGA GeForce GTX 660 Ti Superclocked will be hitting retail with an MSRP of $309, $10 over the MSRP for reference cards.

313 Comments

  • TheJian - Monday, August 20, 2012 - link

    Please point me to a 7970 for $360. The cheapest on newegg even after rebate is $410.

    Nice try though. "I'm pretty disappointed". Why? You got a 30in monitor or something? At 1920x1200 this card beats the 7970 ghz edition in a lot of games. :) Skyrim being one and by 10fps right here in this article...LOL.

    Mod to 670 isn't worth it when the shipped cards already beat it (3 of them did here). Remember, you should be looking at 1920x1200 and ignoring Ryan's BS resolution that 2% or less use (it's a decimal point at the steampowered hardware survey). If you're not running at 2560x1600, read the article again ignoring Ryan's comments. It's the best card at 1920x1200, regardless of Ryan's stupid page titles "that darned memory"...ROFL. Why? Still tromps everything at 1920x1200...LOL.

    Got anything to say Ryan? Any proof we'll use 2560x1600 in the world? Can you point to anything that says >2% use it? Can you point to a monitor using it that isn't a 27/30in? Raise your hand if you have a 30in...LOL.
  • JarredWalton - Tuesday, August 21, 2012 - link

    http://www.microcenter.com/single_product_results....

    That's at least $20 cheaper than what you state.
  • CeriseCogburn - Thursday, August 23, 2012 - link

    That's a whitebox version card only. LOL
  • TheJian - Friday, August 24, 2012 - link

    And the prices just dropped, so yeah, I should be off by ~$20 by now :) White box, as stated. No game. Well, Dirt Showdown doesn't count, it's rated so low ;)

    But nothing that states my analysis is incorrect. His recommendations were made based on 2560x1600 even though, as proven, 98% play at 1920x1200 or less, and the monitor he pointed me to isn't even sold in the USA. You have to buy it in Korea. With a blank FAQ page, help is blank, no phone and a gmail acct for help. No returns. Are you going to buy one from out of country from a site like that? Nothing I said wasn't true.
  • Mr Perfect - Thursday, August 16, 2012 - link

    I wonder if any board partners will try making the board symmetrical again by pushing it up to 3GB? It's not like the extra RAM would do any good, but if you could keep an already memory-bandwidth-starved card humming along at 144GB/s and prevent it from dropping all the way down to 48GB/s, it might help.
  • CeriseCogburn - Sunday, August 19, 2012 - link

    It doesn't drop to 48GB/s, that was just the reviewer's little attack.
    You should have noticed the reviewer can't find anything wrong, including sudden loss of bandwidth, in this card or the prior released nVidia models with a similarly weighted setup.
    The SPECULATION is what the amd fanboys get into, then for a year, or two, or more, they will keep talking about it, with zero evidence, and talk about the future date when it might matter.... or they might "discover" an issue they have desperately been hunting for.
    In the meantime, they'll cover up amd's actual flaws.
    It's like the hallowed and holy of holies amd perfect circle algorithm.
    After years of the candy love for it, it was admitted it had major flaws in game, with disturbing border lines at shader transitions.
    That was after the endless praise for the perfect circle algorithm when, push come to shove, and only in obscurity, we were told that no in-game advantage for it could be found, never mind the endless hours and tests spent searching for that desperately needed big amd fanboy win...
    So that's how it goes here. A huge nVidia advantage is either forgotten about and not mentioned, or actually derided and put down with misinformation and lies, until some amd next release, when it finally can be admitted that amd has had a huge fault in the exact area that was praised, and nVidia has a huge advantage and no fault even though it was criticized, and now it's okay because amd has fixed the problem in the new release... (then you find out the new release didn't really fix the problem, and a new set of spins and half-truths starts after a single mention of what went wrong).
    Happened on AA issues here as well. Same thing.
  • JarredWalton - Tuesday, August 21, 2012 - link

    Most games are made to target specific amounts of memory, and often you won't hit the bottlenecks unless you run at higher detail settings. 1920x1200 even with 4xAA isn't likely to hit such limits, which is why the 2560x1600 numbers can tell us a bit more.

    Best case for accessing the full 2GB, NVIDIA would interleave the memory over the three 64-bit connections in a 1:1:2 ratio. That means in aggregate you would typically get 3/4 of the maximum bandwidth once you pass 1.5GB of usage. This would explain why the drop isn't as severe at the final 512MB, but however you want to look at it there is technically a portion of RAM that can only be accessed at 1/3 the speed of the rest of the RAM.

    The better question to ask is: are we not seeing any major differences because NVIDIA masks this, or because the added bandwidth isn't needed by the current crop of games? Probably both are true to varying degrees.
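    Jarred's description can be turned into a toy model. The split below (the first 1.5GB striped across all three 64-bit controllers at full speed, the final 512MB on a single controller at one third speed) and the ~144.2GB/s peak figure come from this thread; the `effective_bandwidth` function and its uniform-streaming assumption are purely illustrative, since NVIDIA's actual interleave policy is undocumented:

    ```python
    # Toy model of the asymmetric 2GB / 192-bit layout: three 64-bit
    # controllers carry 512MB + 512MB + 1024MB. The first 1.5GB can stripe
    # across all three; the last 512MB lives on a single controller.
    FULL_BW = 144.2           # GB/s, all three controllers together
    SINGLE_BW = FULL_BW / 3   # GB/s, the final 512MB on one controller

    def effective_bandwidth(used_gb):
        """Aggregate bandwidth if 'used_gb' of VRAM is streamed once, uniformly."""
        fast = min(used_gb, 1.5)            # GB served at full speed
        slow = max(used_gb - 1.5, 0.0)      # GB served at one-third speed
        time = fast / FULL_BW + slow / SINGLE_BW
        return used_gb / time

    print(round(effective_bandwidth(1.0), 1))  # full speed below 1.5GB
    print(round(effective_bandwidth(2.0), 1))  # drops once the slow 512MB is in play
    ```

    Under this naive streaming assumption the penalty past 1.5GB is even larger than the 3/4 aggregate figure above, which is one more reason real-world behavior depends heavily on how the driver places data.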
  • CeriseCogburn - Thursday, August 23, 2012 - link

    " GTX 660 Ti and 7950 tied at roughly 67fps. If you want a brief summary of where this is going, there you go. Though the fact that the GTX 660 Ti actually increases its lead at 2560 is unexpected. "

    Theory vs fact.
  • TheJian - Monday, August 20, 2012 - link

    Memory starved at what? NEVER at 1920x1200 or less. Are you running a 30in monitor? All 24in monitors are 1920x1200 or below on newegg (68 of them!). 80% of the 27inchers are also this way on newegg.com. 3GB has been proven useless (well 4gb was):
    http://www.guru3d.com/article/palit-geforce-gtx-68...

    "The 4GB -- Realistically there was NOT ONE game that we tested that could benefit from the two extra GB's of graphics memory. Even at 2560x1600 (which is a massive 4 Mpixels resolution) there was just no measurable difference."

    "But 2GB really covers 98% of the games in the highest resolutions. "
    Game over even on 2560x1600 for 4GB or 3GB. Ryan is misleading you...Sorry. Though he's talking bandwidth mostly, the point is 98% of us (all 24in and down, most 27in) are running at 1920x1200 or BELOW.
  • Galcobar - Thursday, August 16, 2012 - link

    Was wondering about how the Zotac was altered to stand in as a reference 660 Ti.

    Were the clock speeds and voltages lowered through one of the overclocking programs, or was a reference BIOS flashed onto it? I ask because as I understand AMD's base/boost clock implementation, the base clock is set by the BIOS and is not alterable by outside software.
