ASUS P8Z68-V PRO: Board Features

Market Segment: Performance
CPU Interface: LGA 1155
CPU Support: i3/i5/i7 Sandy Bridge
Chipset: Z68
Base Clock Frequency: 80.0 MHz to 300.0 MHz in 0.1 MHz intervals
DDR3 Memory Speed: 1333 MHz by default, 800-2400 MHz supported
Core Voltage: Auto, offset or fixed modes, 0.800 V to 1.990 V in 0.005 V intervals
CPU Clock Multiplier: Dependent on CPU
DRAM Voltage: Auto, 1.2 V to 2.2 V in 0.00625 V intervals
DRAM Command Rate: Auto, 1T-3T
Memory Slots: Four 240-pin DDR3 DIMM slots in dual-channel
  Regular unbuffered DDR3 memory
  Up to 32GB total supported
Expansion Slots: 3 x PCI Express 2.0 x16 slots (x16/x0 or x8/x8, final slot at x1 or x4)
  2 x PCI Express 2.0 x1 slots
  2 x PCI slots
  Supports NVIDIA SLI and AMD CrossFireX
Onboard SATA/RAID: 2 x SATA 6.0 Gbps ports (white) supporting RAID 0/1/5/10
  4 x SATA 3.0 Gbps ports (light blue) supporting RAID 0/1/5/10
  2 x SATA 6.0 Gbps ports (dark blue) from Marvell
  1 x eSATA 3.0 Gbps port from JMicron JMB362
Onboard: 4 x SATA 3Gbps connectors
  4 x SATA 6Gbps connectors
  6 x fan headers (2 x 4-pin, 4 x 3-pin)
  3 x USB 2.0 headers supporting 6 additional USB 2.0 ports
  1 x USB 3.0 header supporting 2 additional USB 3.0 ports
  1 x power on button
  1 x reset button
  1 x front panel audio header
  1 x S/PDIF out connector
  2 x FireWire/IEEE 1394 headers
  TPU/EPU/MemOK buttons
Onboard LAN: Intel® 82579 Gigabit Ethernet
Onboard Audio: Realtek® ALC892 8-Channel HD Audio
Power Connectors: 24-pin EATX power connector
  8-pin EATX 12V power connector
Fan Headers: 2 x CPU fan (one 4-pin, one 3-pin)
  2 x chassis fan (one 4-pin, one 3-pin)
  2 x power fan (3-pin)
IO Panel: 1 x RJ45 LAN connector
  1 x audio jack set (Line-in, Line-out, Mic-in)
  6 x USB 2.0 ports
  2 x USB 3.0 ports
  1 x optical S/PDIF out connector
  1 x IEEE 1394 port
  1 x eSATA 3Gbps port
  HDMI, DVI, and VGA video outputs
BIOS Version: 8801
Warranty Period: 3 Years

ASUS P8Z68-V PRO: In The Box

  • IO Shield
  • SLI bridge
  • 4 x Locking SATA cables with one right-angled end
  • USB 3.0 back panel

Compared to some of the P67 extras we've seen, ASUS are middle of the road with this offering.

ASUS P8Z68-V PRO: Software

AI Suite II

ASUS kindly wrap all their software into one package called the AI Suite II. You may remember this from our P8P67 PRO review. Not a lot has changed to be honest—the major addition is that you can now overclock the integrated GPU from the software. I'll give a quick rundown:


AI Suite II is a program that lets you adjust:

  • The overclock of the system, in terms of BCLK, multiplier, voltages, iGPU speed
  • The switching frequencies for the DIGI VRM
  • The power saving utilities on offer
  • The full fan speed profile for the chassis and CPU fans, with double ramping
  • Thresholds for fans and voltages to give OS warnings if they go out of any comfort zones
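To illustrate what a two-stage ("double ramping") fan profile of the kind AI Suite II exposes looks like, here is a minimal sketch in Python; the temperature breakpoints and duty cycles are invented for illustration and are not ASUS's defaults:

```python
# Hypothetical two-stage ("double ramping") fan curve: a shallow ramp up to a
# mid temperature, then a steeper ramp towards full speed. All breakpoints
# below are illustrative only, not values from AI Suite II.

def fan_duty(temp_c, idle_duty=30, mid_temp=50, mid_duty=60, max_temp=75):
    """Return fan duty cycle (%) for a CPU temperature in degrees Celsius."""
    if temp_c <= 30:
        return float(idle_duty)               # flat floor below 30 C
    if temp_c <= mid_temp:
        # first, shallow ramp: idle_duty -> mid_duty over 30..mid_temp
        span = (temp_c - 30) / (mid_temp - 30)
        return idle_duty + span * (mid_duty - idle_duty)
    if temp_c <= max_temp:
        # second, steeper ramp: mid_duty -> 100% over mid_temp..max_temp
        span = (temp_c - mid_temp) / (max_temp - mid_temp)
        return mid_duty + span * (100 - mid_duty)
    return 100.0                              # flat ceiling above max_temp

print(fan_duty(40))   # 45.0 -> on the shallow first ramp
print(fan_duty(65))   # 84.0 -> on the steeper second ramp
```

The two slopes are what distinguishes this from the single linear ramp most utilities offer: the fan stays quiet through light loads and only climbs aggressively once the CPU is genuinely hot.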

Other features include using a Bluetooth device as a media player or as an overclock tool for the system, and options to update and change the BIOS and its logo.



The auto overclock system we will cover later, but the voltage and frequency screen is of interest. Unlike most other Cougar Point motherboard manufacturers, ASUS lets a user control the BCLK and the multiplier for the CPU from the OS in real time—not requiring reboots in-between. There are a couple of negative points, however—the iGPU frequency requires a reboot after every selection, and the changes you make to the CPU frequency aren't written to the BIOS (and thus aren't permanent) nor are they initialized on reboot, requiring a manual adjustment on every boot.
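For reference, the effective CPU frequency is simply the base clock multiplied by the CPU multiplier, which is why OS-level control of both is useful; a trivial sketch of the arithmetic (illustrative values, not board defaults):

```python
# CPU frequency = BCLK x multiplier. Sandy Bridge nominally runs a 100 MHz
# BCLK, and on this board BCLK is adjustable in 0.1 MHz steps.
def cpu_freq_mhz(bclk_mhz, multiplier):
    return bclk_mhz * multiplier

print(cpu_freq_mhz(100.0, 34))   # 3400.0 -> 3.4 GHz at stock BCLK
print(cpu_freq_mhz(103.0, 34))   # 3502.0 -> a small BCLK nudge to 3.5 GHz
```

On Sandy Bridge the practical BCLK headroom is only a few MHz either way, which is why the multiplier does almost all of the overclocking work.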


Of particular interest is the fan profiling, which is pretty substantial compared to other motherboard manufacturers—even ECS, whose recent efforts we have liked. There is a raft of preset options, or each fan group can be controlled independently. Unfortunately, you can only control the CPU fans as a group and the chassis fans as a group, rather than each fan individually.


Anand will be covering Virtu in some detail in his Z68 article; however, I'd like to explain my experiences with the software on the ASUS Z68 board.

If you've never heard of Virtu, it's a program developed by Lucidlogix which essentially enables the use of both the integrated GPU in the processor and the discrete GPU connected to a PCIe slot on the board.

Virtu can run in three modes—off (normal mode with a discrete GPU, integrated GPU is turned off), i-Mode (i for integrated) and d-Mode (d for discrete). These latter two modes consist of:

For i-Mode, the integrated GPU (iGPU) is used as the main video output, allowing the discrete graphics (dGPU) to run at idle. The iGPU is used for doing the majority of non-3D activity (transcoding, etc.) and as such the overall power of the system should be lower. Under full 3D mode, Virtu uses the dGPU to process the raw data, and then takes a copy of the frame buffer and puts this into the iGPU, which the iGPU then displays. There is a small overhead associated with this transfer, limiting the capabilities of the dGPU. In this mode, we are limited to one dGPU on board.

For d-Mode, the dGPU is used as the main video output, meaning this time the video cable is connected to the dGPU. In this mode, multi-dGPU setups such as SLI and Crossfire can be activated. For all work, the dGPU provides the video output. The only difference in d-Mode to any regular computer is that we now have access to the superior transcoding capabilities of the iGPU—any program which can use the iGPU for a process boost is monitored by Virtu so the iGPU can be invoked to do its job. There are power savings in this mode, limited to the power difference between the iGPU doing the computation rather than the dGPU on any other system.

To sum up: both modes allow Quick Sync. i-Mode is limited to one dGPU and offers power savings, but at the expense of 3D performance; d-Mode supports multi-GPU, with no power savings unless you count using Quick Sync instead of GPU transcoding.

Does that make sense at all? Well, in my line of computer use, I can see three main issues at stake here:

1) In i-Mode, how good are the power savings?
2) In i-Mode, how bad is the overhead in 3D mode?
3) In d-Mode, is 3D performance hindered by Virtu?

Luckily, I was able to answer all my own questions.

In i-Mode, I took an AMD GPU (5850 1GB) and an NVIDIA GPU (GTX460 768MB) and performed my normal power tests on them—Idle, HD Video Playback, OCCT (CPU Stress Test), and High Resolution Metro 2033. The results are as follows:

AMD 5850 Power in Watts (Normal/i-Mode):
Idle: 81/85
Video: 85/92
OCCT: 148/150
Metro: 202/205

GTX 460 Power in Watts (Normal/i-Mode):
Idle: 74/75
Video: 106/80
OCCT: 148/144
Metro: 216/212
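Tallying the deltas from the two tables above (i-Mode minus normal, in watts) makes the pattern easier to see; a quick sketch:

```python
# Power readings from the tables above, in watts: (normal, i-Mode)
amd_5850 = {"Idle": (81, 85), "Video": (85, 92), "OCCT": (148, 150), "Metro": (202, 205)}
gtx_460  = {"Idle": (74, 75), "Video": (106, 80), "OCCT": (148, 144), "Metro": (216, 212)}

for name, card in (("AMD 5850", amd_5850), ("GTX 460", gtx_460)):
    for test, (normal, imode) in card.items():
        delta = imode - normal  # negative means i-Mode saved power
        print(f"{name} {test}: {delta:+d} W")
```

The only sizeable negative delta is the GTX 460 under video playback (-26 W); every AMD 5850 result actually goes up slightly with Virtu enabled.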

In my experience, the only power saving observed was with the GTX 460 when watching HD video. This is essentially in line with the Virtu advertising, but I'm under the impression it has to be taken with a pinch of salt. In these 'modern' times, AMD and NVIDIA are pretty good with their sleep states—meaning in light video loads, you don't need to power up the whole GPU. This is what keeps the AMD GPU from increasing its power under HD video mode. In fact, with the AMD GPU, I saw an increase in power from using Virtu, as the iGPU was enabled all the time—under normal mode, it's not activated and it is power gated so it uses almost no power.

In terms of 3D, I ran my series of gaming benchmarks and resolutions for comparison. We actually came across a flaw in Metro 2033 thanks to this—when the benchmark program is loaded, it checks the current video output for the DirectX capabilities. As we're being channeled through the iGPU when we load it, it sees DX10. The dGPU is only invoked when the benchmark launches into full 3D mode—so we're stuck benchmarking in DX10. The Dirt 2 benchmark is full DX11, as the settings are decided after the game launches (and the dGPU becomes active).

AMD 5850 FPS (Normal/i-Mode):
Metro 2033 DX10 1920x1080: 23.1/21.8 (-5.6%)
Metro 2033 DX10 1680x1050: 58.5/55.7 (-4.8%)
Dirt 2 DX11 1920x1080: 66.4/60.75 (-8.5%)
Dirt 2 DX11 1680x1050: 82.3/74.65 (-9.3%)

GTX 460 FPS (Normal/i-Mode):
Metro 2033 DX10 1920x1080: 18.4/16.8 (-8.7%)
Metro 2033 DX10 1680x1050: 46.5/43.4 (-6.7%)
Dirt 2 DX11 1920x1080: 45.0/40.6 (-9.8%)
Dirt 2 DX11 1680x1050: 77.9/73.2 (-6.0%)
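The quoted percentage drops can be reproduced directly from the FPS pairs above:

```python
# FPS pairs from the tables above: (normal, i-Mode)
results = {
    "5850 Metro 1920x1080": (23.1, 21.8),
    "5850 Metro 1680x1050": (58.5, 55.7),
    "5850 Dirt2 1920x1080": (66.4, 60.75),
    "5850 Dirt2 1680x1050": (82.3, 74.65),
    "460 Metro 1920x1080": (18.4, 16.8),
    "460 Metro 1680x1050": (46.5, 43.4),
    "460 Dirt2 1920x1080": (45.0, 40.6),
    "460 Dirt2 1680x1050": (77.9, 73.2),
}

for name, (normal, imode) in results.items():
    pct = (imode / normal - 1) * 100  # percentage change vs. normal mode
    print(f"{name}: {pct:.1f}%")
```

Running this reproduces every figure quoted in the tables, from -4.8% up to -9.8%.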

On average, we see a decrease in performance of around 5-10% due to Virtu in this mode. In my view, this is quite a lot, but then again I do like to play a few games now and again. There are potential advantages in the power savings, and there will be a few situations where the iGPU just won't be able to handle certain calculations and a consumer will require a dGPU, but is it really worth a 10% decrease in single dGPU performance?

Now to my third question: is there a difference in d-Mode compared to a normal non-Virtu environment? Thankfully, running the gaming benchmarks, I saw no change in FPS values:

Dirt2, single GPU, 1920x1080: 66.4 FPS
d-Mode: 66.4 FPS

Dirt2, single GPU, 1680x1050: 82.3 FPS
d-Mode: 82.3 FPS

Now I must stress that even with statistical variance, it would be rare to get these values spot on, but these were the averages of several runs in each case.

In either mode, programs can be specified to switch where the power is coming from—the picture above is the default list of programs when I am in d-Mode on the ASUS P8Z68-V Pro.

Comments

  • AnnihilatorX - Wednesday, May 11, 2011 - link

    Short answer, no...
  • mczak - Wednesday, May 11, 2011 - link

    IMHO doesn't make a whole lot of sense overclocking the IGP but not the memory. The IGP could potentially benefit quite a bit.
  • Markstar - Wednesday, May 11, 2011 - link

    Great review - makes me wish Anandtech had more of these (along with reviews of low-capacity SSDs). :p
  • Shadowmaster625 - Wednesday, May 11, 2011 - link

    You guys need to stop giving Asus a free pass on their horrible website that you can't even download drivers from. And their horrible BIOS flashing tools that do not work. Never ever buy asus.
  • sor - Wednesday, May 11, 2011 - link

    I'm actually kind of pissed at them myself. my P8P67Pro is a good board, but it has problems with SAS cards. They updated the BIOS to fix some compatibility issues, but my LSI card still does not work. It's a pretty well documented issue, and I think it's sort of ridiculous. I will consider buying a new Z68 of a different brand so I can use my LSI card if they don't have a BIOS fix soon.

    I didn't have any trouble flashing to the latest ASUS BIOS with a DOS boot cd.
  • The_Assimilator - Wednesday, May 11, 2011 - link

    You're using a piece of server hardware in a desktop board and you don't understand why it doesn't work? Did you perhaps think about that, or about verifying compatibility between the board and the card before purchasing?

    Don't blame the manufacturer for your own ignorance.
  • sor - Wednesday, May 11, 2011 - link

    Give me a break. That's the most ridiculous comment I've heard in a long while. You're the one showing your ignorance now.
  • sor - Wednesday, May 11, 2011 - link

    Sorry, I can't let this go. I'm not sure why you decided to take an antagonistic tone, but apparently hardware is mysterious to you? What makes you think this is a server part? What makes you think it's unreasonable to believe that a card that adheres to the PCIe standard should be able to work in a PCIe slot? Why is it unreasonable to think that a card that worked in an old P35 desktop and my wife's older Q965 should work in my new P67 desktop? I can only imagine you read "SAS" and thought "ooh scary, that's the shiny server stuff, what is this guy thinking? I need to belittle him". Nevermind that they make Windows XP drivers for it. Nevermind that I've installed over 300 LSI SAS cards in various hardware over the last year at work.

    Compatibility lists are not exhaustive, and are usually 6-12 months out of date. I still maintain that it's a reasonable expectation that a PCIe card shipping with desktop OS drivers, that works in every other motherboard I can get a hold of, should work in the PCIe slot of my new motherboard, and if not, it's the motherboard's fault.
  • L. - Thursday, May 12, 2011 - link

    I would say, let's blame the manufacturers ... So many inconsistencies should not be tolerated, like for example some sticks not working on some boards etc.. or ridiculous PCB design leading to not being able to put a 'standard' modern cooler / ridiculous cooler design the other way around etc. etc.
  • Pneumothorax - Wednesday, May 11, 2011 - link

    Does overclocking the IGP help Quicksync speeds? I don't think I've seen testing done on this so far.
