Puget Systems Echo: Intel and AMD Showdown at 65 Watts
by Dustin Sklavos on March 21, 2012 2:35 AM EST - Posted in Systems, AMD, Intel, Mini ITX, Sandy Bridge, Llano, boutique, Mini-Tower
System Performance
When I started testing these two systems from Puget Systems, I honestly wasn't prepared for the kind of tug of war that would occur. Fundamentally, the results are about what you'd expect, and they're all on the page: the Intel CPU outclasses the AMD APU at every turn, while AMD's integrated graphics hardware thoroughly outclasses Intel's. What impressed me was just how wide the gaps were. Take a look.
While the A6-3500's CPU performance would certainly be fine for a notebook, it's absolutely lousy on the desktop. Granted, much of our competition is pretty unfair, with overclocked systems abounding, but look at how badly it struggles even against a last-generation Phenom II X4 955, much less the Intel Core i5-2320 in the Alienware X51. The i7-2600S is consistently two to three times faster in roughly the same power envelope.
To be fair, though, these results need to be framed in a more meaningful way than just "the A6-3500's CPU is dog slow." We need to consider the environments in which these systems are going to be used, and at the risk of sounding like an AMD apologist, I don't see many situations where the Intel chip's mammoth lead over the A6-3500 is going to be relevant. The A6-3500 is fine for basic Photoshop work, and neither of these systems is really ideal for serious video editing, where you need a much faster storage subsystem and CPU/GPU than either can provide, internally or externally.
Where a computer is much more likely to see frequent (if casual) use is in trying to run games, and here's where things take a turn.
The Intel Core i7-2600S with its crippled IGP can't even run two of our benchmarks, and only produces playable performance in one of them: Portal 2, with its ancient (albeit updated) Source engine. Meanwhile, the Radeon HD 6530D inside the A6-3500 can stretch its legs and deliver playable performance across every game except Battlefield 3, where a dip in resolution or settings will render that game playable as well.
Granted, these are conservative settings at a relatively low resolution, but the point remains that if someone wants to play a game on the A6-3500, they can, and reasonably comfortably. I've seen someone suddenly decide they want to try to play a game only to discover their system's integrated graphics can't handle it at all, and forums are rife with threads from people asking how to upgrade the graphics in their cheap desktops or notebooks, only to be met with the same answer: "you're screwed." With larger desktop systems it's a different matter, but for mini-ITX and laptops you have to be prepared to live with whatever graphics the system includes from the factory.
62 Comments
djfourmoney - Thursday, April 12, 2012 - link
Same here, I need to buy some USB stuff since I have DirecTV and internal PCI HDTV Tuners. Just adds to the budget, I really can't afford to spend the extra $135 (Case, USB Combo Tuner, $15 extra for Mini-ITX).
Mothergoose729 - Wednesday, March 21, 2012 - link
In the review, the power consumption of each platform was tested only under CPU load. This is inaccurate and unfair because the GPU power consumption contributes a lot to heat and detracts from efficiency. A combination of a FurMark GPU torture test and a CPU-intensive load tester is needed to get an accurate measurement of the power consumption of these chips.
Dustin Sklavos - Wednesday, March 21, 2012 - link
Incorrect. The systems are tested under load, CPU and GPU combined. I ran the stress test in AIDA64, stressing the CPU, GPU, and system memory. Previous results used whatever the most stressful scenario I could find to maximize power consumption; sometimes it was Mafia II, sometimes Left 4 Dead 2, sometimes AIDA64. My goal is consistently to maximize power consumption, and the CPU and GPU are being stressed in tandem here.
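For anyone curious about approximating this kind of combined-load power measurement themselves, here is a rough, hedged sketch (not the tooling used in the review): it pegs every CPU core from Python while a separate GPU load, such as a looping 3D benchmark, runs alongside it, and polls a power reading once a second. The read_wall_power() helper is a hypothetical stand-in for whatever logging interface your wall meter exposes.

```python
# Hedged sketch only: saturate all CPU cores while an external GPU load
# (e.g. a looping 3D benchmark) runs, then record the peak reading from a
# wall meter. read_wall_power() is hypothetical; swap in your meter's API.
import multiprocessing
import time

def burn_cpu(stop_time):
    # Tight floating-point loop that keeps one core busy until stop_time.
    x = 0.0001
    while time.time() < stop_time:
        x = (x * 1.000001) % 1000.0

def read_wall_power():
    # Hypothetical placeholder: replace with your power meter's interface.
    return 0.0

if __name__ == "__main__":
    duration = 120  # seconds of combined load
    stop_time = time.time() + duration
    workers = [multiprocessing.Process(target=burn_cpu, args=(stop_time,))
               for _ in range(multiprocessing.cpu_count())]
    for w in workers:
        w.start()
    peak = 0.0
    while time.time() < stop_time:
        peak = max(peak, read_wall_power())
        time.sleep(1)
    for w in workers:
        w.join()
    print("Peak wall power observed: %.1f W" % peak)
```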
Mothergoose729 - Wednesday, March 21, 2012 - link
In the review, both CPUs fit a 65 watt envelope. While it is true that AMD A8 processors feature more cores and better graphics, they also have a much higher TDP. To my knowledge, the A6 processor in this review is the fastest, or one of the fastest, chips under 65 watts.
BornDaemon - Wednesday, March 21, 2012 - link
Registered just to post this - this is an SFF with a low noise output and small energy footprint. Why was this not tested as an HTPC, looking at different outputs, image quality analysis between the HD 2000 and the AMD chip, etc.? Seems a lot more likely it will be used hooked up to a TV than as a gaming rig, in my mind.
HW_mee - Thursday, March 22, 2012 - link
I believe Anandtech already has a comparison of the Intel HD graphics and the Llano GPU somewhere on the site, making such an analysis worthless: http://www.anandtech.com/show/4479/amd-a83850-an-h...
chuckula - Wednesday, March 21, 2012 - link
How DARE you only "lean" towards Llano!! This just shows that Anandtech is an evil Intel SHILL operation bought and paid for with evil Intel Blood Money! Any *objective* review would never have even considered using parts that aren't blessed by the holy elders of AMD! It's disgusting that you would even write an article that insults AMD by using the word Intel in it! And to have the nerve to suggest that people should choose a system based on their needs instead of just signing over their children and life savings to AMD is absolutely appalling!
I will never read this site again after such a twisted and disgustingly biased article! GOOD DAY SIR!
P.S. --> To the two people who were dumb enough to take this seriously, yes, I am joking. It does show that the AMD cultists who constantly bash Anandtech don't have a clue, though; they don't realize that the easiest way for Anandtech to give better reviews of AMD products is for AMD to actually make better products.
Mayuyu - Wednesday, March 21, 2012 - link
IMO, you should have reviewed the video image quality difference between Intel and AMD. It is a much more relevant test than gaming for this system. Stuff like how many frames Quicksync vs. AMD can decode from a 1080p H.264 40Mbps stream.
MadVR Performance..., etc.
chuckula - Wednesday, March 21, 2012 - link
Quicksync has exactly 0 to do with video decoding; this keeps coming up over and over, and it's a little depressing how uneducated most people are. I can (and have) done full H.264 1080p video decoding with a 3-year-old Core 2 notebook with X4500 graphics over an HDMI output with audio under Linux, so video playback is a piece of cake. Quicksync is for video *trans*coding, which is 1. usually done offline and 2. often done on a separate box from the HTPC. The HTPC plays back the video *after* transcoding.
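To make the transcode-versus-playback distinction concrete, here is a minimal, hedged sketch of the offline transcode step described above. It assumes a recent ffmpeg build that exposes Intel's Quick Sync H.264 encoder as h264_qsv; the file names and bitrate are placeholders, not anything referenced in this thread.

```python
# Hedged sketch: an offline Quick Sync transcode, assuming an ffmpeg build
# with the h264_qsv hardware encoder available. Paths and bitrate are
# placeholders; the HTPC would simply play back the resulting file later.
import subprocess

def transcode_qsv(src, dst, bitrate="8M"):
    # -c:v h264_qsv selects the Quick Sync H.264 encoder; -c:a copy leaves
    # the audio stream untouched so only video is re-encoded.
    subprocess.run(
        ["ffmpeg", "-i", src,
         "-c:v", "h264_qsv", "-b:v", bitrate,
         "-c:a", "copy",
         dst],
        check=True,
    )

if __name__ == "__main__":
    transcode_qsv("source_1080p.mkv", "living_room_copy.mp4")
```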
zebrax2 - Wednesday, March 21, 2012 - link
A good review. Some of the commentators seem to forget that this is not a processor review but rather a system review. Dustin reviewed what was available; it is not his problem that a certain processor is not available for the system, it is Puget's. I actually think this review puts AMD in a good light. Even though the processor (and possibly also the RAM) used in the system was not the best one can get, it still managed to impress the reviewer.