ZOTAC Quietly Releases GeForce GT 710 Graphics Card with PCIe x1 Interface
by Anton Shilov on April 22, 2016 10:00 AM EST

ZOTAC has quietly introduced a new video card that is compatible with virtually every desktop PC released in recent years. The new GeForce GT 710 graphics card with a PCIe 3.0 x1 interface is not going to outperform modern higher-end iGPUs in games, but it will help owners of very low-cost systems, particularly those that may not even have a PCIe x16 slot, to add support for another display or to improve on the performance of thoroughly outdated iGPUs.
The ZOTAC GeForce GT 710 1 GB (ZT-71304-20L) video card is powered by a cut-down version of NVIDIA’s GK208 GPU with 192 CUDA cores, 16 texture units and 8 ROPs. The GPU is based on the Kepler architecture, which supports Direct3D feature level 11_0, OpenGL 4.5 and OpenCL 1.2. The chip is clocked at 954 MHz and offers around 366 GFLOPS of compute performance (well below that of modern iGPUs). The card is equipped with 1 GB of DDR3-1600 memory with 12.8 GB/s of bandwidth.
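For reference, those headline numbers fall straight out of the specifications above. A quick sketch (the 2x factor assumes one fused multiply-add per CUDA core per clock, the usual convention for peak FP32 figures):

```python
# Peak figures for the ZOTAC GT 710, from the specs quoted in the article.
cuda_cores = 192
core_clock_hz = 954e6
peak_flops = cuda_cores * core_clock_hz * 2         # FMA = 2 FLOPs/cycle
print(f"Peak FP32: {peak_flops / 1e9:.0f} GFLOPS")  # -> 366 GFLOPS

bus_bytes = 64 // 8                                 # 64-bit memory bus
transfers_per_s = 1600e6                            # DDR3-1600: 1600 MT/s
bandwidth = bus_bytes * transfers_per_s
print(f"Memory bandwidth: {bandwidth / 1e9:.1f} GB/s")  # -> 12.8 GB/s
```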
The card comes in a half-height, half-length (HHHL) form factor and ships with two brackets (low-profile and standard) to maximize compatibility with various computers. The board has minimal power consumption (19 W) and does not require active cooling, which also means it is whisper quiet.
The main selling points of the ZOTAC GT 710 are its PCIe 3.0 x1 interface and its three display outputs: DVI, HDMI 1.4 and D-Sub. Some entry-level PCs simply do not have a PCIe x16 or x8 slot for a graphics card, but virtually all desktops released in the last ten years have at least one PCIe x1 slot, and ZOTAC’s new graphics card promises to be compatible with such systems. Owners of such PCs who need one or two more display outputs, or who simply find their iGPU too slow in Windows 10, can add the GeForce GT 710 1 GB PCIe 3.0 x1 graphics adapter. The board supports up to three displays, which should be enough for many workloads.
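For readers who want to confirm which slots their own desktop actually exposes before buying, something along these lines works on Linux (an illustrative sketch, not from ZOTAC; it shells out to dmidecode, which needs root, and field formatting varies by board vendor):

```python
# List the motherboard's expansion slots and whether they are in use.
import subprocess

out = subprocess.run(["dmidecode", "-t", "slot"],
                     capture_output=True, text=True, check=True).stdout
for line in out.splitlines():
    line = line.strip()
    if line.startswith(("Designation:", "Type:", "Current Usage:")):
        print(line)
```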
NVIDIA GPU Specification Comparison

| | GT 710 | GT 720 | GT 630 | GT 610 |
|---|---|---|---|---|
| CUDA Cores | 192 | 192 | 192 | 48 |
| Texture Units | 16 | 16 | 16 | 8 |
| ROPs | 8 | 8 | 16 | 4 |
| Core Clock | 954MHz | 797MHz | 875MHz | 710MHz |
| Shader Clock | N/A | N/A | N/A | 1620MHz |
| Memory Clock | 1.8GHz DDR3 | 1.8GHz DDR3 / 5GHz GDDR5 | 1.8GHz DDR3 | 1.8GHz DDR3 |
| Memory Bus Width | 64-bit | 64-bit | 64-bit | 64-bit |
| VRAM | 1GB or 2GB | 1GB or 2GB | 1GB or 2GB | 1GB |
| TDP | 19W | 19W | 50W | 29W |
| GPU | GK208 | GK208 | GK107 | GF119 |
| Launch Timeframe | January 2016 | March 2014 | April 2012 | May 2012 |
| Launch Price | $30 - $50 | $49 | OEM | $49 |
When it comes to gaming, the GeForce GT 710 is unlikely to be fast enough for demanding titles. The product may be faster than the iGPUs integrated into entry-level Celeron or Pentium processors, but only in lightweight online games that do not require much GPU horsepower anyway.
As for the market prospects of ZOTAC’s GeForce GT 710 1 GB PCIe 3.0 x1, it should be noted that this is a niche product designed for owners of low-end systems who need a better GPU or additional display outputs. People who use such systems typically do not upgrade often, so ZOTAC’s new video card will hardly become a bestseller. Nonetheless, it will be a unique device for those who really need it.
ZOTAC does not list a recommended price on its website. However, two stores selling the device in Europe have it listed at €58.30 ($65.80) and €84.70 ($95). That is definitely higher than you would otherwise expect for a bottom-tier NVIDIA card, though it may well be that retailers are counting on its unique nature.
Source: ZOTAC (via the Tech Report)
55 Comments
Tom Braider - Monday, April 25, 2016 - link
Speaking from personal experience, this is not always a matter of 'pain'. If you want to use a hi-res display with a Celeron- or Pentium-class iGPU, you will quickly discover that the on-board HDMI/DVI outputs will not support anything over 1920x1080, and running a WQHD monitor over VGA is every bit as crappy as you can imagine.
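The resolution ceiling the commenter describes is ultimately a pixel-clock limit: single-link DVI, and HDMI before version 1.3, top out at a 165 MHz pixel clock. A rough check, assuming the standard CVT reduced-blanking totals for 2560x1440 (an assumption, since the comment gives no timings):

```python
# Pixel clock needed for WQHD at 60 Hz vs. the 165 MHz single-link cap.
h_total, v_total, refresh_hz = 2720, 1481, 60   # CVT-RB totals for 2560x1440
pixel_clock_mhz = h_total * v_total * refresh_hz / 1e6
print(f"Required: {pixel_clock_mhz:.0f} MHz")   # ~242 MHz, far past 165 MHz
```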
pugster - Tuesday, April 26, 2016 - link
The users of these video cards are mostly companies that are using outdated PCs and need two or more displays. Some people I know are using two video cards to drive three or more displays.

ShieTar - Sunday, April 24, 2016 - link
It's just about a GT 640 cut in half. Testing won't tell you much: it is obviously too slow for any kind of relevant gaming or compute application, but it will handle 2D graphics and Full HD video playback just fine. It may or may not handle HEVC-coded 4K video, but it doesn't have HDMI 2.0, so it doesn't make sense to use it in a 4K video player anyway.

Also, this card has been available to OEMs for over 2 years (though at first with only 512MB of memory). I'm a little surprised they are releasing this now as an end-customer version; I would have rather expected them to get ready to produce low-performance versions of Maxwell now that Pascal is about to be released.
Barilla - Sunday, April 24, 2016 - link
First thing that comes to mind is any Xeon E5 machine, as there are no E5s with an iGPU as far as I know. You need some kind of video output, but not necessarily the graphics processing power, so why waste money on a fancy GPU?
I wonder if we are gonna see a USB 3.1 graphics card soon. Bandwidth is pretty much the same as PCIe x1, although latency is higher; but as long as individual transfers are combined into bulk transfers, latency shouldn't be that much of an issue.
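The bandwidth claim holds up on paper. A back-of-the-envelope sketch (line rate times encoding efficiency; real protocol overhead trims both sides further):

```python
# Effective bandwidth after line encoding, in GB/s.
def effective_gb_s(raw_gbps, enc_num, enc_den):
    return raw_gbps * enc_num / enc_den / 8

print(f"PCIe 3.0 x1 : {effective_gb_s(8, 128, 130):.2f} GB/s")   # ~0.98
print(f"USB 3.1 Gen2: {effective_gb_s(10, 128, 132):.2f} GB/s")  # ~1.21
print(f"USB 3.0     : {effective_gb_s(5, 8, 10):.2f} GB/s")      # ~0.50
```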
I use a USB 2.0 graphics adapter for my 3rd monitor at work. While not for gaming, I don't perceive any latency issues when using that display. Application windows move as smoothly on that display as on my other displays. Win 7 with Aero enabled. I expect you could do a lot with a USB 3.1 graphics card.nightbringer57 - Friday, April 22, 2016 - link
USB to HDMI/DVI/VGA adapters don't typically contain a GPU. Their drivers usually "hack" the existing graphics card and create some sort of a virtual additional monitor, rendered by the existing GPU, then compress (in the case of USB2 adapters, compress harshly) the frame and send it to the adapter for output. This compression causes the usual latency noticed on such adapters, albeit barely noticeable in some cases. You probably couldn't use your USB adapter on a completely GPU-less computer.
Due to latency issues (AFAIK), USB3 adapters do the same thing, except with much better results, as bandwidth and latency are so much better than on USB2.
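To put rough numbers on the compression point above (the ~35 MB/s practical USB 2.0 throughput used here is a typical real-world figure, not something from the comment):

```python
# Uncompressed 1080p60 vs. what USB 2.0 can realistically carry.
frame_bytes = 1920 * 1080 * 4                 # 32-bit color
stream_mb_s = frame_bytes * 60 / 1e6          # 60 fps
print(f"Uncompressed 1080p60: {stream_mb_s:.0f} MB/s")    # ~498 MB/s
print(f"Compression needed  : ~{stream_mb_s / 35:.0f}x")  # vs ~35 MB/s
```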
descendency - Saturday, April 23, 2016 - link
You might see a Type-C Thunderbolt-backed one, but not a USB 3.1-based card, due to the latency issues of using the USB bus.
This seems like it would be a perfect HTPC video card except that it lacks HDMI 2.0, making it useless for that purpose.JeffFlanagan - Friday, April 22, 2016 - link
As long as it's a 1080p theater, HDMI 1.4 will get the job done. Anyone with a 4K theater can afford a better video card, and may have at least a pair of 970s so they can game.