ZOTAC Quietly Releases GeForce GT 710 Graphics Card with PCIe x1 Interface
by Anton Shilov on April 22, 2016 10:00 AM EST

ZOTAC has quietly introduced a new video card that is compatible with virtually every desktop PC released in recent years. The new GeForce GT 710 graphics card with a PCIe 3.0 x1 interface is not going to outperform modern higher-end iGPUs in games, but it will let owners of very low-cost systems, particularly those which may not even have a PCIe x16 slot, add support for another display or improve on the performance of completely outdated iGPUs.
The ZOTAC GeForce GT 710 1 GB (ZT-71304-20L) video card is powered by a cut-down version of NVIDIA’s GK208 GPU with 192 CUDA cores, 16 texture units and 8 ROPs. The GPU is based on the Kepler architecture, which supports Direct3D feature level 11_0, OpenGL 4.5 as well as OpenCL 1.2 APIs. The chip is clocked at 954 MHz and has compute performance of around 366 GFLOPS (well below that of modern iGPUs). The card is equipped with 1 GB of DDR3-1600 memory featuring 12.8 GB/s bandwidth.
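The compute and memory figures above follow directly from the spec sheet. A quick back-of-the-envelope check (assuming the usual two FLOPs per CUDA core per clock for a fused multiply-add, and DDR3's eight transfers per byte-lane):

```python
# Sanity check of the GT 710 spec-sheet numbers quoted above.

CUDA_CORES = 192
CORE_CLOCK_MHZ = 954
FLOPS_PER_CORE_PER_CLOCK = 2  # one FP32 fused multiply-add per cycle

peak_gflops = CUDA_CORES * CORE_CLOCK_MHZ * FLOPS_PER_CORE_PER_CLOCK / 1000
print(f"Peak FP32 compute: {peak_gflops:.0f} GFLOPS")  # ~366 GFLOPS

# DDR3-1600: 1600 MT/s across a 64-bit (8-byte) bus
MEM_TRANSFER_RATE_MTS = 1600
BUS_WIDTH_BYTES = 64 // 8

mem_bandwidth_gbs = MEM_TRANSFER_RATE_MTS * BUS_WIDTH_BYTES / 1000
print(f"Memory bandwidth: {mem_bandwidth_gbs:.1f} GB/s")  # 12.8 GB/s
```

Both results match the article's figures of roughly 366 GFLOPS and 12.8 GB/s.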
The card comes in a half-height, half-length (HHHL) form factor and ships with two brackets (for low-profile and standard PCs) to maximize compatibility with various computers. The board has minimal power consumption (19 W) and does not require active cooling, which also means it is whisper quiet.
The main selling points of the ZOTAC GT 710 are its PCIe 3.0 x1 interface and its three display outputs — DVI, HDMI 1.4 and D-Sub. Some entry-level PCs simply do not have a PCIe x16 or x8 slot to install a graphics card, but virtually all desktops released in the last ten years have at least one PCIe x1 slot, and ZOTAC's new card promises to be compatible with such systems. Owners of such PCs who need one or two more display outputs, or who just find their iGPUs too slow in Windows 10, can buy the GeForce GT 710 1 GB PCIe 3.0 x1 graphics adapter. The board supports up to three displays, which should be enough for many workloads.
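Since the x1 link is the card's headline feature, it is worth putting its bandwidth in context. The sketch below computes the approximate usable bandwidth per PCIe generation and lane count; the figures are line-rate ceilings after encoding overhead, not measured throughput:

```python
# Approximate usable PCIe bandwidth per link, by generation.
# Gen 1/2 use 8b/10b encoding; Gen 3 uses 128b/130b.

PCIE_LANE = {
    # generation: (line rate in GT/s, encoding efficiency)
    1: (2.5, 8 / 10),
    2: (5.0, 8 / 10),
    3: (8.0, 128 / 130),
}

def link_bandwidth_gbs(gen: int, lanes: int = 1) -> float:
    """Usable one-way bandwidth in GB/s for a link with `lanes` lanes."""
    rate_gts, efficiency = PCIE_LANE[gen]
    return rate_gts * efficiency / 8 * lanes  # bits -> bytes

print(f"PCIe 3.0 x1:  {link_bandwidth_gbs(3, 1):.2f} GB/s")   # ~0.98 GB/s
print(f"PCIe 1.0 x16: {link_bandwidth_gbs(1, 16):.2f} GB/s")  # ~4.00 GB/s
```

So a single PCIe 3.0 lane offers just under 1 GB/s — far less than a full x16 slot, but plenty for a GPU of this class.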
NVIDIA GPU Specification Comparison

| | GT 710 | GT 720 | GT 630 | GT 610 |
|---|---|---|---|---|
| CUDA Cores | 192 | 192 | 192 | 48 |
| Texture Units | 16 | 16 | 16 | 8 |
| ROPs | 8 | 8 | 16 | 4 |
| Core Clock | 954MHz | 797MHz | 875MHz | 710MHz |
| Shader Clock | N/A | N/A | N/A | 1620MHz |
| Memory Clock | 1.8GHz DDR3 | 1.8GHz DDR3 / 5GHz GDDR5 | 1.8GHz DDR3 | 1.8GHz DDR3 |
| Memory Bus Width | 64-bit | 64-bit | 64-bit | 64-bit |
| VRAM | 1GB or 2GB | 1GB or 2GB | 1GB or 2GB | 1GB |
| TDP | 19W | 19W | 50W | 29W |
| GPU | GK208 | GK208 | GK107 | GF119 |
| Launch Timeframe | January 2016 | March 2014 | April 2012 | May 2012 |
| Launch Price | $30 - $50 | $49 | OEM | $49 |
When it comes to performance in games, the GeForce GT 710 is unlikely to be fast enough for even moderately demanding titles. The product may be faster than the iGPUs integrated into entry-level Celeron or Pentium processors, but only in online games that do not require a lot of GPU horsepower anyway.
As for the market prospects of ZOTAC's GeForce GT 710 1 GB PCIe 3.0 x1, it should be noted that this is a niche product designed for owners of low-end systems who need a better GPU or additional display outputs. People who use such systems typically do not upgrade often, so ZOTAC's new video card will hardly become a bestseller. Nonetheless, it will be a unique device for those who really need it.
ZOTAC does not list recommended prices on its website. However, two stores selling the device in Europe have it listed at €58.30 ($65.80) and €84.70 ($95). This is definitely higher than you'd otherwise expect for a bottom-tier NVIDIA card, though it may well be that retailers are counting on its unique nature.
Source: ZOTAC (via the Tech Report)
55 Comments
magreen - Friday, April 22, 2016 - link
Definitely confusing. But the photo shows a x1 interface.

nightbringer57 - Friday, April 22, 2016 - link
Clearly a x1 card. The x4 is a typo.

Ryan Smith - Friday, April 22, 2016 - link
Sorry about that. Fixed.

BrokenCrayons - Friday, April 22, 2016 - link
x1 - But your comment brings up an interesting thought on our perspective about performance. I know the numbers aren't quite comparable because of the cross-generational nature of the comparison, but bear with me. A relatively high end GeForce 8800 GTS munched up over 100W to feed 92 stream processors clocked at 500MHz and 320 or 640MB of VRAM over a first generation PCIe x16 slot that offered about 4 GB/s of bandwidth. A PCIe 3.0 x1 slot only has about 1 GB/s of throughput but is going to be burdened with 192 much more modern CUDA cores at almost double the clock speed. True the 710 has a lot less memory bandwidth (the 8800 had a 320 bit wide bus) but I would think if it was given access to a PCIe x16 slot, it could most certainly put the added throughput to use. It's just a pity this thing isn't on GDDR5 as it probably has enough GPU power to take advantage of a lot more memory bandwidth if comparing it roughly to an 8800 GTS is any indication.

nightbringer57 - Friday, April 22, 2016 - link
The 8800GTS's shaders were clocked at 1200MHz ;) And even at that time, on PCI Express 1.0/1.1, you could barely see the difference between x4 and x16 slots, let alone x8 and x16 slots.
BrokenCrayons - Friday, April 22, 2016 - link
Yeah, I admit I didn't look more than briefly at the 8800 spec chart when I was poking up my previous post. That whole difference between core and shader clock didn't cross my mind. I'd still be pretty interested in an Anandtech analysis measuring the difference between a PCIe x16 and x1 GT 710.fanofanand - Friday, April 22, 2016 - link
I was just going to say, I had the second gen 8800 GTS and I remembered clocking it over 1200. Fantastic card while it lasted.extide - Friday, April 22, 2016 - link
It's x1, the x4 is a typo.

ingwe - Friday, April 22, 2016 - link
Please tell me I am not the only one who laughed at the pun in the title.xthetenth - Friday, April 22, 2016 - link
Isn't part of the idea of PCIe that you can plug whatever size connector into whatever size slot? I'm pretty sure that PCIe slots have an open back just for that sort of scenario.