Thank you so much for this article. I'm looking to buy a new sub-$500 laptop and was looking at the AMD A8 processors... and then the A10s got announced.
Regardless, I was curious how they would stack up, and it's really helpful to see how midrange equipment compares. Looks like I'm gonna pick up an A10 laptop.
Hopefully prices come down a bit; right now the only A10-4600M laptops I can find are going for over $700. They're decent chips overall, but I'm not convinced they're better than a dual-core Sandy Bridge with GT 540M. The Acer I used is clearly not the best representative of that market, as the 13.3" chassis is quite thin and just can't cool the CPU+GPU well enough to avoid throttling; pretty much any 15.6" chassis should do better.
It seemed to me that the Llano laptops were marketed the same way. The A8 always seemed considerably more expensive than the A6 model, when the chips could not be that different in cost. However, the A8 was usually better equipped, with more RAM.
Pretty cool that the A10-4600M provides the same performance as Sandy Bridge with a GT 540M at half the energy consumption. It's been two weeks since Trinity's launch, and too bad it's still not possible to find these laptops.
Right, and the price is not going to be nice; later in the thread the author recommends a GT 540M Optimus laptop with a link: $599 at Micro Center, $679 at Newegg.
So long gone is the hope that the A10 comes in a cheap $350 or $400 Walmart special like the (low-end) Brazos.
If AMD is going to charge high prices for its budget chip, the rest of the laptop had better be awesome, not some creaking plastic crud, and the screen had better be good.
I think what we'll see instead is junky cheap builds that cost a lot.
For people planning to play into Hell or Inferno difficulties: be aware that elite monster packs will have 3 magical abilities on Hell and 4 on Inferno. You could also run into two or more packs of monsters at once, so you could be looking at 6, 8, or more magical effects. Also, some of the monster abilities cause them to replicate. I've definitely seen 30 or more monsters at once with the entire screen covered in fire, poison, and lightning.
I don't play on a laptop or use an IGP, but I assume this could have a negative impact on performance. Normal mode and even Nightmare mode would probably not be too bad, though.
I'm in Nightmare Act 4, and in the Keep (the Tristram equivalent, the home base) I get over 100fps. Taking on average mobs outside the Arreat Gate drops fps to 80. And that's on a 6950 at 1920x1200 with everything maxed.
On my laptop with an 8600M GT I could play fine in Normal until I got to Act 4. I can still play, but it's annoyingly choppy when there are lots of mobs. The 8600M GT is listed as supported at 'Low' on the Blizzard site.
Problem is, to test on Hell I have to play through all of Normal, then all of Nightmare. I know people who have already done that, sure, but I only got the game two days ago, and I have a family and a life outside of playing games. Hence the disclaimer at the beginning. I'll update the text to mention slowdowns in later areas.
If you want, you can use my account to test. You'll just have to sign a contract saying I get that MSI laptop if you abuse my account.
Seriously though, I wouldn't mind you using my account (the greater good and all that). Only problem I see is I'm playing on Europe servers so the lag might spoil testing.
There is a basic reason why the game runs so well in Act 1 Normal... play through Act 3 Hell, then come back and redo your review. Only the 650M has a chance of playable frame rates in those levels, and we haven't even covered multiplayer. My 7870, overclocked to 1100MHz, has some slowdowns in those levels under some high-stress scenarios, and the game basically becomes an absolute nut-fest in later difficulties. People will want to play through the later difficulties; it's part of the game's progression. Now, I get that it's hard to benchmark through the randomness, but you can make subjective comparisons or do several run-throughs. I can say with absolute certainty that none of the APUs has a chance at playable frame rates in the scenarios where it will matter. D3 is a very unforgiving game: it can take a split second to die, so smooth frame rates in non-Normal difficulties are essential.
Did the test include multiplayer? The game stutters when more than one player is in the game on some configurations, while it's totally smooth in single player...
Also, memory usage tends to increase greatly in later acts, which may hurt performance if memory is shared...
Tell you what, guys: email me your account login and password and don't play the game for a day, and give me instructions on a good stressful area to play on Hell difficulty, and then I can test that area. Otherwise, I simply don't have the 40+ hours needed to get to that point in the game in less than a week.
And in case it's not clear, I'm mostly joking here. I've got several items I'm working on reviewing that are going to be higher priority than revisiting Diablo III performance in later acts. Perhaps this summer I'll have a chance to go back, but by then it won't really matter that much. So I'd suggest taking these figures as a way of getting relative performance from the various GPUs/IGPs, and then extrapolate from there. If you need to play on Hell difficulty on a laptop with maximum details enabled, you're probably going to want at least a GK107 dGPU (or perhaps Southern Islands).
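If you want to put rough numbers on that extrapolation, here's a back-of-envelope sketch in Python. Both the Act 1 figures and the late-game scaling factors are placeholders (readers in this thread report late-game framerates at roughly 1/2 to 1/3 of Act 1), not measured data:

    # Back-of-envelope estimate of late-game FPS from Act 1 Normal results.
    # The Act 1 numbers and the 1/3-1/2 scaling factors are placeholder
    # assumptions based on reader reports in this thread, not measurements.
    act1_fps = {"Trinity A10 IGP": 36.0, "HD 4000": 24.0, "GT 630M": 59.0}
    late_scale = (1 / 3, 1 / 2)

    for gpu, fps in act1_fps.items():
        lo, hi = fps * late_scale[0], fps * late_scale[1]
        verdict = "likely fine" if lo >= 30 else "expect trouble in Act 3+"
        print(f"{gpu}: {fps:.0f} fps (Act 1) -> ~{lo:.0f}-{hi:.0f} fps late game; {verdict}")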
Hehe, but it's not that hard; you don't even have to be on the higher difficulty levels. This is only Act 1 of Normal, which is more like a tutorial and considerably less populated (and Task Manager is claiming ~300MB of RAM, which increases up to 1GB later, still on Normal).
All the things mentioned later, like freezing monsters, duplicates, or 100+ creeps on screen, happen on Nightmare too, and even in late Normal, so it shouldn't be that much of a bother...
On the account topic, for user/password use freejack/demise001xp; that's mine :)
(Joking of course, but you could give me YOUR user/password and authenticate it with one of those mobile apps while on chat, and I could level you up pretty fast, as I've been playing since Diablo 1. Being on Normal, you don't have much to lose, and I'll even leave you some nice gear to start Nightmare with. Seriously, we're talking about a few hours' work.)
And it's all in good faith, since I don't play D3 on a laptop anyway :)
All joking about account sharing notwithstanding, would AT buying a new D3 account for testing and letting a volunteer (not me) level it up for late-game/Hell testing be a viable option?
We do have a couple people playing the game, so at some point we'll be able to test later levels. Give me a chance to: A) have a holiday (today), B) write a few other articles, C) enjoy the game without more benchmarking of it (which totally kills the fun!). Probably in a week or so I can come back with results from Act II or later.
My diplomacy skills are of the Europe 1914 level; the odds of my being able to sweet talk someone I don't know well into anything are slim to none.
Better results in a week or so isn't that bad a delay. I'm just mildly frustrated since I've had a few people ask what sort of hardware they need to play the game, and it seems that all the numbers I can find are from very early in the game and thus not representative of what's really needed.
I'm curious about that as well, SoC GPUs like the SGX 543MP4 are getting pretty complex and Intel themselves used to use integrated PowerVR GPUs in their chipsets.
Actually, Metro/WinRT won't be used for gaming. If you want a restricted environment, there's already XNA. Porting games to the WinRT framework (or Windows Runtime, as they call it) would be too difficult, with too little incentive or anything to gain. Game developers will never target Metro/WinRT if they don't have to, and on x86 machines they don't: the desktop is there, you can still build for Windows 7 where most users are, and so on. It won't happen much here until the next-gen consoles, either. Plus, Macs have gotten a whole lot better in that department, and plenty of game engines are available for them now. Taking those kinds of engines and porting them to C++/WinRT isn't something taken lightly; it probably wouldn't even be possible without a rewrite, which defeats the purpose. The performance wouldn't be good, and the sandbox is probably too restrictive. In practice it's a more restrictive environment than the mobile sandboxed OSes: several mobile OSes run Firefox, for example. WinRT never will; WinRT won't even run IE.
Did I miss the part where you talk about using an external monitor, or how else were you able to run all of these GPUs at all three resolutions? I'm not saying the data isn't important, as it could be relevant to different notebooks that use the same or similar hardware just with higher-res screens.
Also, I've played this game on an old desktop with a GTX 285 @ 1080p and everything turned up. While that is fairly smooth and playable, I still get quite a few moments of "stuttering" in Hell difficulty. I also play on basically the same Acer notebook with the GT 540M, and even at the lowest possible graphics settings and resolution in Normal mode, it's hard for me to characterize that performance as anything other than horrible in comparison to the desktop.
Yes, all of the above-1366x768 results were done on an external LCD where required (which was the case for Llano, Trinity, the VAIO C, the TimelineX, and the Vostro; the other laptops had 1080p displays, except for the quad-core SNB, which has a 1600x900 LCD, so I didn't run the 1080p tests on it).
Good review for what it is, but I think it could have been a little more complete with some additional information:
1) Use Act 3 Bastion's Keep for the "intensive" case instead of Act 1 Old Town. I think this would be more representative of the game's peak demand (probably just a run-through of the signal fires quest, since it's easy to get to).
2) Include a brief section on how much of an impact additional players have on the game. I find it can actually be quite significant. This doesn't have to be a full-depth review, just a quick look.
Overall, I'm using an A8-3500M + 6750M Crossfire (overclocked to 2.2GHz) at 1366x768, and my framerates during battles (i.e. when it counts) average about 1/2 to 1/3 of what the reviewer posts, because the game gets much more intensive than Act 1, and having a party also slows things down significantly compared to solo.
Just some ideas to expand the review if you want =)
I have been testing this out on my W520 for the sake of seeing what I can do to play Diablo and maintain decent battery life.
For what it's worth, turning off shadows and playing at 1366x768 on the HD 3000 results in roughly 28fps, more than enough to play the game through the first difficulty anyhow. I have been using this for some time now with 4 players in game. When running at 1080p it dips down into the low 20s and is occasionally a problem in Act 3, so I wouldn't suggest it.
Point is, though, that anyone who has a notebook with SB and no video card CAN still play this game, even if it isn't ideal.
Given this is a cross-platform game, it would have been interesting to provide Mac results with similar hardware. I play using a GT 330M and a dual-core i7, and it runs pretty well. I'd like to see how it stacks up to the latest AMD chips and HD 3000 on a Mac.
Anecdotally, Brian and Anand have both commented that Diablo 3 on a MacBook Pro under OS X runs like crap. I'm not sure if they're running on latest generation MBP13 or something else, though, so that's about all I can pass along.
Was there any doubt? OS X is severely lacking in graphics driver support. Apple never gave a rat's rear about this crucial aspect of gaming support. They are always late with drivers and with the latest OpenGL spec.
The recommendations / minimum requirements on Macs are discrete graphics with good drivers, though; i.e., no NVIDIA 7300/7600, ATI X1600/X1900, etc. The starting point is the 8600 GT, and obviously no integrated Intel graphics is enough there. OpenGL 3.2, or OpenGL 2.1 with extensions, should be fine for game developers, and the drivers handle it; NVIDIA and AMD can put in performance improvements if they get the feedback. They could even launch their own "game edition" card for the Mac Pro with their own drivers outside of Apple's distribution channel; NVIDIA releases drivers on their site from time to time. That said, both the game engine port and the drivers are a bit less optimized than their Windows and Direct3D counterparts. The drivers are quite robust and work well, but they might not be that fast. It's mainly a problem for the developers today, though, as most Macs have somewhat decent graphics with maintained drivers and support pretty much all the features you need anyway.
The OS is very dependent on OpenGL, so the support itself is decent and fairly up to date, even if it isn't OpenGL 4.2/3.3 yet. The latest OpenGL 4.2 isn't supported by much of the hardware Apple uses anyway: R700, R600, GF 8M, GF 9M and their desktop versions don't support more than OpenGL 3.3, which is itself a backport of as much of 4.x as possible, so 3.2 is a decent level there. Apple always supports the whole API in the software renderer too, so they get no joy chasing the latest features, though vendors can use any extensions they wish to add those features, and all the supported GPUs support the API as well. It's much better driver support than for, say, Intel graphics on Linux, and in some regards even on Windows: Intel's Windows drivers don't support OpenGL 3.2 yet, let alone 4.1/4.2.
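If you want to see exactly what your own driver exposes, here's a minimal sketch (assuming PyOpenGL with GLUT is installed; a real engine would create its context through its own path):

    # Create a throwaway GL context and print what the driver reports.
    from OpenGL.GL import glGetString, GL_VENDOR, GL_RENDERER, GL_VERSION
    from OpenGL.GLUT import glutInit, glutCreateWindow

    glutInit()
    glutCreateWindow(b"gl-check")  # glGetString needs a current context
    for name in (GL_VENDOR, GL_RENDERER, GL_VERSION):
        print(glGetString(name).decode())  # e.g. "2.1 NVIDIA-..." on 2012-era OS X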
Some people are severely space-limited; others are casual gamers and don't play enough to justify having two computers. As a very mass-market game, DIII will be selling a huge number of copies to people in the latter group.
Many people have both laptops and desktops, and regardless of the desktop obviously being superior for gaming, they'd still like to be able to game a bit on their laptop when they're out and about and don't have access to their desktop.
Then, there are people who don't have the space for a desktop, or simply prefer the freedom of being able to move around their home but would still like to play games.
The question is, why do other people's usage models bother you so much? You don't care about the gaming ability of laptops. Fine. Don't pay attention to articles about it. Meanwhile, there are plenty of others, such as myself, who are highly interested.
I am confused about the GT 540M vs. the GT 630M. Isn't the GT 630M just a rebadged GT 540M with higher clocks? Or is it the new GeForce architecture? I believe only the 640M and above have the new architecture.
The GT 630M is technically a straight up rebadge of GT 540M. However, the GT 630M in the ASUS N56VM that we have shows clocks that are quite a bit lower than other GT 540M GPUs. Strangely, those lowered clocks don't appear to matter much in Diablo III, so either NVIDIA's control panel (and GPU-Z) are reporting incorrect information, or the shader cores aren't as demanding as you might expect.
You did say the Acer was running hot. Maybe the CPU/GPU was throttling due to temps. Or maybe the quad-core in the Acer made a difference (not likely?).
Correct. The Acer is definitely not hitting Turbo speeds, let alone the base 2.3GHz clock. So the combination of faster CPU + slower GPU works out in favor of the N56VM in this instance. I really wish I had a good way to test the N56VM with the fully clocked GT 540M, though.
Google 'ThrottleStop'. Many of the Acer Aspire users that experience throttling problems in games use it to prevent the under-clocking. Note it will get hot though and you might want to raise the back of the laptop off the desk with a binder or something to improve airflow (intake).
Funny enough, I actually reran tests with ThrottleStop already and it just didn't seem to help. I'm not sure what's going on, but even with TS enabled and set to a 20x (or 21, 22, 23, Turbo) multiplier, the system still seems to just plug along at the 800-1500MHz range during testing. I've done everything I know of to make it run faster, and it just doesn't seem to matter. But this particular laptop has always been a bit quirky; I might try looking for a new BIOS again just to see if anything has changed, but I doubt it.
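For the curious, this is the gist of how I watch the clocks during a run, sketched in Python. The psutil call is an illustrative stand-in, not what I actually used; in practice, ThrottleStop's and CPU-Z's own monitors give you the same data:

    # Sample the OS-reported CPU frequency once a second during a run.
    # psutil here is an illustrative stand-in, not the actual tooling used.
    import time
    import psutil

    samples = []
    for _ in range(60):  # roughly one minute of the benchmark sequence
        samples.append(psutil.cpu_freq().current)  # MHz
        time.sleep(1)
    print(f"min {min(samples):.0f} / avg {sum(samples) / len(samples):.0f} / max {max(samples):.0f} MHz")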
And funny enough, after additional investigation, the issue isn't throttling on the Acer but rather a higher clock on the GT 630M compared to the GT 540M. NVIDIA's updated specs page for the 630M lists 800MHz as the clock, but oddly their control panel is only reporting 475MHz on the ASUS. According to GPU-Z's Sensors tab, however, it really is running an ~800MHz core clock (1600MHz shaders), which accounts for the higher performance compared to the 672MHz GT 540M. I've updated the text in the article to explain this.
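If anyone wants to double-check their own card the same way, here's a rough sketch of averaging the core clock out of a GPU-Z sensor log (the "Log to file" option on the Sensors tab). The column header varies by card and GPU-Z version, so treat that name as an assumption to verify against your own file:

    # Average the GPU core clock recorded in a GPU-Z sensor log.
    # CLOCK_COL is an assumed header name; check it against your log.
    import csv

    CLOCK_COL = "GPU Core Clock [MHz]"
    clocks = []
    with open("GPU-Z Sensor Log.txt", newline="") as f:
        for row in csv.DictReader(f, skipinitialspace=True):
            for key, value in row.items():
                if key and key.strip() == CLOCK_COL and value and value.strip():
                    clocks.append(float(value))
    print(f"{len(clocks)} samples: avg {sum(clocks) / len(clocks):.0f} MHz, peak {max(clocks):.0f} MHz")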
Suddenly those fancy, expensive ultrabooks (Apple or otherwise) seem like extremely poor deals for tech enthusiasts. Then again, they were always aimed at bloggers.
No offense, but enthusiasts (in the real sense of the word) are always more extreme than your average MBA-wielding blogger. If they wanted something light, they would spare no expense and go with a VAIO Z, some crazy lithium-alloy Japanese Fujitsu, or a modded Sony UX. PC hardware enthusiasm has nothing to do with Apple commodities that try to be as "safe" as possible.
Suddenly? They were never marketed as gaming rigs, most don't even have dGPUs, and Diablo 3 isn't even one of the 5 most demanding games this year. I dunno what you're getting at; ultrabooks are still great for the purpose they're meant for. Can you get just as much done with an uglier/thicker/heavier $700 laptop? Sure, and you might even get a dedicated GPU to go along with it... They're serving entirely different markets, though.
I have to stick to a subset of the possible resolution/detail settings or I'd be testing a single game 24/7 for a week. I've already spent probably 20 hours benchmarking Diablo III, and let me tell you: running the same three minute sequence at least a dozen times per laptop gets to be mighty damn tedious. I did run tests at some other settings, which I commented on I believe, but here's a bit more detail.
For example, on the N56VM, 1080p with all settings maxed but Shadow Quality set to Low results in performance of 20.1 FPS/18.5 FPS for our test sequences -- so that one setting boosted performance by over 50% compared to having all settings at High/Max. What's more, I also tested at max detail 1080p but with Shadow Quality set to Off, and the scores are 27.1/24.8 -- another 35% improvement over Low shadows. Everything else combined (e.g. 1080p but all other settings at low) only accounts for probably 20%. I could test that as well if you really want, but I have other things to do right now.
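In case anyone wants to sanity-check their own runs, this is roughly how a FRAPS frametimes log reduces to numbers like the above; a simplified sketch assuming the standard two-column "Frame, Time (ms)" CSV, with a hypothetical filename:

    # Reduce a FRAPS frametimes CSV (cumulative ms per frame) to average
    # FPS plus the worst single frame. The filename is hypothetical.
    import csv

    with open("d3 frametimes.csv", newline="") as f:
        reader = csv.reader(f)
        next(reader)  # skip the "Frame, Time (ms)" header row
        times = [float(row[1]) for row in reader if len(row) >= 2]

    avg_fps = (len(times) - 1) / ((times[-1] - times[0]) / 1000.0)
    worst_ms = max(b - a for a, b in zip(times, times[1:]))
    print(f"avg {avg_fps:.1f} fps; worst frame {worst_ms:.1f} ms (~{1000.0 / worst_ms:.1f} fps)")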
I'm mostly thinking that a large majority of laptops sold, even now, have 1366x768 displays. It looks like all of the non-Intel laptops handle playable framerates with low detail at that resolution, so I'm curious how that performance falls as the detail goes up.
In particular, can Llano and Trinity handle high detail at 1366x768? They are (or will be) sold in budget laptops that won't get high-res screens.
However, I understand the time constraints you're working under. Thanks for the comparison, anyway.
I agree with this. I understand time constraints, but honestly, the paradigm that's being followed here (and in a lot of reviews) is simply not representative of real-world usage. It's not the case that people play with low details at low resolutions and high details at high resolutions, *especially* when you're dealing with laptops. Generally, you're going to set the resolution to the display's native resolution and work the settings from there.
In any case, the article is still appreciated, and it's possible, at least, to make an educated guess at how the game will run at various resolutions and settings based on the presented info. Definitely going to grab myself a nice Trinity-powered laptop as soon as one meeting my desired specs comes out.
Also, yet again we see that HD4000 does not match Llano, let alone exceed it, as I've seen some people spreading around.
This is the whole purpose of having three different settings, discussing what settings we selected and why, etc. Consider the Value setting a "near-best-case" result while still looking decent; in this case, the only thing you can really do to further improve frame rates is to turn off shadows and/or lower the resolution further. If you look at our Mainstream results, you can see what happens as you start to turn up the dials, and the same goes for Enthusiast. I've discussed in the article exactly how much the various elements impact performance, going so far as to include additional results at "Enthusiast 1080p" but with Shadow Quality on Low/Off.
If someone can't get at least a decent idea of where to start in terms of settings and what to expect from their laptop hardware with the information in this article, I'm not sure what I could do to help the situation. Hold their hand and walk through each and every specific setting? Because that tends to come off sounding very condescending if I write that way, and I think most people who care enough to read our articles are much smarter than that.
Fair enough, I understand that. However, I'm not suggesting you write in a hand-holding, condescending manner. Just having three bars on the graph for each resolution (one each for the Value, Mainstream, and Enthusiast settings) would be fine; that would be the ideal. I understand the time constraints, though, as I said.
You won't have to worry about that for long on NVIDIA-equipped laptops, as NVIDIA is rolling out automatic optimal game settings in its drivers. That's going to be a wonderful thing for the majority of gamers and laptop users who don't have a clue about game settings; I hope it helps increase the user base so computer games overall gain strength.
AMD needs to follow suit quickly, to help all of us with a larger user base, instead of being stupid and lame on the driver side as usual. Of course, I'm scowling at the idea that AMD could possibly man up in that area.
Even though HD 4000 is barely playable, I am still impressed by how far Intel has come. HD 4000 is right on the heels of the HD 6620G, which I didn't expect to happen just 1.5 years ago.
Where are the Alienware laptop comparisons...? I have an M17x R3 with a 6900M and D3 is pretty smooth (sorry, haven't downloaded FRAPS yet), but I'd like to see some official numbers.
Higher difficulty levels get substantially more performance-intensive.
I'm using an HD 6970 at 1920x1200 with 6GB of RAM and an i7 920, and by Hell difficulty I've encountered packs of mobs that have brought my machine to its knees (sub-15FPS).
Blizzard really needs to work on whatever they did poorly for this game.
Thanks for that bit of info; ignore the fanboy and continue your observations, please. As we've already been told in the last card reviews by so many users here, they have 1920x1200 monitors, which are by no means rare, and "all real enthusiasts" have sought them.
So the information you have there is very valuable to all the AMD fans here who own 1920x1200 monitors, where their card only lost to the new NVIDIA flagship by 9% at that resolution instead of the 14% overall loss at 1920x1080, which AnandTech doesn't show.
Please ignore the sniping, cursing, rude person and continue the observations, as that one surprised me.
"What that means is cutting edge technologies like DirectX 11 aren’t part of the game plan; in fact, just like StarCraft II and World of WarCraft, DirectX 10 isn’t in the picture either. Diablo III is a DirectX 9 title, and there should be plenty of GPUs that can handle the game at low to moderate detail settings."
WoW got a major graphics upgrade with the Cataclysm expansion, and it is now one of the few DX11-capable MMOGs released. Your overall point is valid in that Blizzard makes games so that people with lower-priced systems can play them, but it's a bit out of date when it comes to WoW.
Anyone old enough will remember those articles about mobile graphics: how it sucked, how every year we were supposed to get a 50-100% performance improvement, how Quake 3 didn't work and we could only play SimCity 2000.
And by today's standards, Diablo 3 isn't even groundbreaking in terms of graphics. Yet most of these laptops don't even play the game at an acceptable frame rate (30fps), and we're already excluding any of the Act 3/4 loads in the game.
And we even have Retina Display resolutions coming. We are talking about 2-4x the pixel density.
I really do hope Haswell provides 3x the performance of the top HD 4000 numbers. That way, every time I select discrete graphics in a notebook, I'd be guaranteed at least decent graphics performance.
Ever heard of heat and battery life? Laptops are not for games! They actually do run games well when the games are 5+ years older than the laptop itself. :)
So play old games and be happy; they are mostly better than the current breed of graphics-intensive crap anyway.
BTW, that's the reason consoles are better than PCs for gaming. You invest once, and it guarantees (unless it's an Xbox 360 that red-rings) that you will be able to play all available games for as long as you have it. For the price of a console you cannot even buy a good graphics card.
Yes, but Diablo 3 is exactly like a game 3-5 years older than my current laptop (6 months old) with a 6630M.
We have laptops sold as desktop replacements, but most of those don't even run the game well.
And exactly like you said, consoles are MUCH better for gaming, which leads me to think we are getting less efficient at extracting performance out of GPUs.
What's your intended resolution and what level of detail are you willing to run at? If you're okay with 1366x768 and Low Shadow Quality, you should be able to play through at least Normal and Nightmare difficulty on any Llano, Trinity, or possibly (if you're tolerant) HD 4000 laptop. If you want higher quality settings or a higher resolution, you'll want probably something with at least a GT 630M level GPU.
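If it helps, here's the gist of that recommendation as a rough rule of thumb; the thresholds are my own paraphrase of the results above, not official requirements:

    # The recommendation above, restated as a rough rule of thumb.
    def suggested_gpu(width: int, height: int, low_detail: bool) -> str:
        if (width, height) <= (1366, 768) and low_detail:
            return "Llano/Trinity IGP (or HD 4000 if you're tolerant)"
        return "GT 630M-class dGPU or better"

    print(suggested_gpu(1366, 768, low_detail=True))    # IGP territory
    print(suggested_gpu(1920, 1080, low_detail=False))  # wants a dGPU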
It's a bit larger (and should therefore run a bit cooler/better) than the 3830TG used in the benchmarks for this article. At a price of $600, I don't see a Trinity A10 surpassing it any time soon, though I do suspect the 4830TG units currently available are all that's left, so they might go out of stock in the next month or so.
Secondly, many gamers now are on laptops, not by choice but by necessity. Quite a few gamers are, in fact, space-limited and simply don't have room for a full desktop setup; I am actually one of those at the moment. I will have more space in the future, and I'll get a desktop then, but right now a laptop is ideal for me. Also, a laptop has almost everything integrated: speakers, trackpad, keyboard, and screen are all in one unit, which makes it easy to be mobile. You can't lug a desktop around everywhere; if you're going to a friend's house or visiting somewhere, a laptop lets you game on the go.
Lastly, laptops are all about cooling. An Acer that's throttling is not going to cut it. The Act 1 benchmarks are not realistic; Act 3 or 4 with tons of mobs on screen will stress both the CPU and GPU a lot more. A properly cooled, properly designed laptop should be hitting max turbo speeds almost always and should not be throttling at all; at minimum it should be running at base clocks when hooked to the A/C adapter. Gaming on battery is a bad idea anyway; game plugged into the adapter whenever possible.
With an i5 or i7 hitting max turbo clocks, combined with a GeForce 540M or 630M, D3 should run smoothly at medium/high settings even in Act 3 or 4. If your laptop is throttling, then of course that's a different story. So in the end, it is possible to game pretty well on a laptop, as long as the laptop has strong cooling.
You'll note that with further investigation into the performance, it does not appear that the Acer is throttling. It simply isn't hitting max turbo during testing because the game doesn't require it. The performance of the much faster CPU in the N56VM is never more than 20% faster than the Acer, and that accounts for the GPU clock speed difference.
As for later acts, give us a bit of time and we'll return to the benchmarks with results from late in the game. We have some other stuff that's higher priority, but we are aware of the fact that the Act I numbers are not fully representative of Diablo III performance. It will probably be a couple weeks, though.
Hey, it pays to work in the industry. One of our hardware contacts managed to get me a code -- actually had to buy a box, open it up, photograph the key, and email that to me. Hahahaha... Something about the address on my Battle.net account not matching the billing address for the CC, so that was just easier than trying to figure it out. Thank goodness for that as well, as there's no way my wife (with a newborn) would be letting me buy Diablo III.
Well, just to give some perspective on the low-end laptops: I try to play D3 on an HD 3200 and a 2GHz dual-core AMD, and it's pretty horrible most of the time. I even have it set to 800x600! and get about 10-20fps... I'm looking for a better but inexpensive upgrade. I have a Q6600 & HD 4870 desktop that runs the game pretty well with all settings low or off at the highest resolution, and it looks great and runs OK. I'm wanting to get a laptop with a 6750M...
HD 3200 is sadly very old by today's standards; it's actually not much better than Intel's Arrandale HD Graphics (original Core i3/i5 dual-core laptops). HD 3200 was fine when it came out in early 2008, but then AMD didn't release a significant update (just similarly clocked HD 4200/4250/4290) until the launch of Llano last June. That's over three years without a real improvement in IGP performance, which is pretty much an eternity for GPUs.
mepenete - Saturday, May 26, 2012 - link
Thank you so much for this article. I'm looking to buy a new sub $500 laptop and was looking at the AMD A8 processors... and then the A10's got announced.Regardless, I was curious of how they would stack up. Really helpful seeing how midrange equipment stacks. Looks like I'm gonna pick up an A10 laptop.
JarredWalton - Saturday, May 26, 2012 - link
Hopefully prices come down a bit; right now the only A10-4600M laptops I can find are going for over $700. They're decent chips overall, but I'm not convinced they're better than a dual-core Sandy Bridge with GT 540M. The Acer I used is clearly not the best representative of that market, as the 13.3" chassis is quite thin and just can't cool the CPU+GPU well enough to avoid throttling; pretty much any 15.6" chassis should do better.frozentundra123456 - Saturday, May 26, 2012 - link
It seemed to me that the Llano laptops were marketed the same way. The A8 seemed to always be considerably more expensive than the A6 model, when the chips could not be that much different in cost. However, the A8 was usually better equipped with more ram.JKnows - Saturday, May 26, 2012 - link
Pretty cool from A10-4600M to providing the same performance as Sandy Bridge with GT 540M for half energy consumption. Is it two weeks after Trinity's launch? Too bad still not possible to find these laptops.CeriseCogburn - Saturday, June 2, 2012 - link
Right and the price is not going to be nice, so later in here a GT540 optimus laptop is recommended with a link by the author. $599 MC, $679 egg.So long gone is the hope the A10 comes in a cheap $350 or $400 walmart special like the (low end) brazos.
If amd is going to charge high prices for their cheapo chip, the rest of the laptop had better be awesome not some plastic creaking crud - and the screen had better be good.
I think what we'll see instead is junky cheap builds that cost a lot.
QuantumPion - Tuesday, May 29, 2012 - link
I got the Acer laptop with the core i5 and GT540M at newegg for $499 last year. That or a similar model would probably be a better bet.CeriseCogburn - Saturday, June 2, 2012 - link
Well that was a find, still hard to get at that price.narlzac85 - Saturday, May 26, 2012 - link
For people planning to play into Hell or Inferno difficulties. Be aware that elite monster packs will have 3 magical abilities on Hell and 4 on Inferno. You could also run into two or more packs of monsters at once, so you could be looking at 6, 8 or more magical effects. Also, some of the monster abilities cause them to replicate. I've definitely seen 30 or more monsters at once with the entire screen covered with fire, poison and lightning.I don't play on a laptop or use an IGP, but I assume this could have negative impact on performance. Normal mode and even Nightmare mode would probably not be too bad though.
cjb110 - Saturday, May 26, 2012 - link
I guess this is where the Low FX setting will be most useful then.Certainly a play through on normal doesn't seem to really get worse near the end...on average the mob size is slightly bigger, but that's about it.
Co-op would be another interesting test, that's probably the most demanding that the graphics would get.
Herald85 - Saturday, May 26, 2012 - link
I'm in Nightmare Act 4 and in the Keep (Tristram equivalent, the homebase) I get over 100fps. Taking on average mobs outside the Arreat Gate drops fps to 80. And that's on a 6950 , 1900x1200 everything maxed.On my laptop with a 8600m GT I could play fine in Normal until I got to Act4. I can still play but it's annoyingly choppy when there are lots of mobs. The 8600m GT is mentioned as supported 'Low' on the Bliz site.
I would LOVE to see a review done on Hell.
JarredWalton - Saturday, May 26, 2012 - link
Problem is, to test on Hell I have to play through all of Normal, then all of Nightmare. I know people who have already done that, sure, but I only got the game two days ago and I have a family and a life outside of playing games. Hence the disclaimer at the beginning. I'll update the text to mention slowdowns on later areas.kmmatney - Saturday, May 26, 2012 - link
Can you use someone else's save file?JarredWalton - Saturday, May 26, 2012 - link
Not that I know of; everything is stored on Blizzard's servers. The only way to access your characters is to login to Battle.net.Herald85 - Saturday, May 26, 2012 - link
If you want, you can use my account to test. You'll need to sign a contract I receive that MSI laptop if you abuse my account.Seriously though, I wouldn't mind you using my account (the greater good and all that). Only problem I see is I'm playing on Europe servers so the lag might spoil testing.
shank15217 - Saturday, May 26, 2012 - link
There is a basic reason why the game runs so well in Act 1 Normal.. play through Act 3 Hell then come back and redo your review. Only the 650M has a chance of playable frame rates in those levels and we haven't even covered multi-player. My 7870 OC to 1100 Mhz has some slowdowns in those levels under some high stress scenarios and basically the game becomes an absolute nut-fest in later difficulties. People will want to play through the later difficulties, its part of the game's progression. Now I get that its hard to benchmark through the randomness but you can make subjective comparisons or do several run throughs. I can say with absolute certainty, none of the apus have a chance in playable frame rates in scenarios where it will matter. D3 is a very unforgiving game, it can take a split second to die, smooth frame rates in non-normal difficulties is essential.dingetje - Saturday, May 26, 2012 - link
+1 on that !snakefist - Saturday, May 26, 2012 - link
agreed, act1 is much less demandingdid the test included multiplayer? game staggers when more than one player is in game on some configurations, while it's totally smooth in single player...
also, memory usage tends to increase greatly in later acts, may hurt performance if memory is shared....
JarredWalton - Saturday, May 26, 2012 - link
Tell you what, guys: email me your account login and password and don't play the game for a day, and give me instructions on a good stressful area to play on Hell difficulty, and then I can test that area. Otherwise, I simply don't have the 40+ hours needed to get to that point in the game in less than a week.And in case it's not clear, I'm mostly joking here. I've got several items I'm working on reviewing that are going to be higher priority than revisiting Diablo III performance in later acts. Perhaps this summer I'll have a chance to go back, but by then it won't really matter that much. So I'd suggest taking these figures as a way of getting relative performance from the various GPUs/IGPs, and then extrapolate from there. If you need to play on Hell difficulty on a laptop with maximum details enabled, you're probably going to want at least a GK107 dGPU (or perhaps Southern Islands).
dingetje - Saturday, May 26, 2012 - link
lol ;)snakefist - Sunday, May 27, 2012 - link
hehe, but it's not that hard - you don't even have to be on higher level difficulties - its only ACT 1 OF NORMAL, which is more like a tutorial and considerably less populated (and task manager is claiming ~300mb ram, which increases up to 1gb later, still on normal)all the things mentioned later, like having freezing monsters or duplicates or 100+ creeps on screen are happening on nightmare also, and even on late normal, so it shouldn't be that kind of bother...
on account topic user/password, use freejack/demise001xp, that's mine :)
(joking of course, but you could give me YOUR user/password and authenticate it with one of those mobile apps while on chat, and i could level you up pretty fast, playing since diablo1. being on normal, you don't have much to lose, i'll even leave you some nice gear to start nightmare with - seriously, talking about few hours job)
and all is in good-faith, since i don't play d3 on laptop anyway :)
DanNeely - Monday, May 28, 2012 - link
All joking about account sharing not withstanding, would AT buying a new D3 account for testing and letting a volunteer (not me) level it up for late game/hell testing be a viable option?JarredWalton - Monday, May 28, 2012 - link
We do have a couple people playing the game, so at some point we'll be able to test later levels. Give me a chance to: A) have a holiday (today), B) write a few other articles, C) enjoy the game without more benchmarking of it (which totally kills the fun!). Probably in a week or so I can come back with results from Act II or later.Unless you can talk Anand into your idea? ;-)
DanNeely - Tuesday, May 29, 2012 - link
My diplomacy skills are of the Europe 1914 level; the odds of my being able to sweet talk someone I don't know well into anything are slim to none.Better results in a week or so isn't that bad a delay. I'm just mildly frustrated since I've had a few people ask what sort of hardware they needed to play the game; and it seems that all the numbers I can find are from very early in the game and thus non-representative of what's really needed.
damianrobertjones - Saturday, May 26, 2012 - link
I'm hoping that Windows 8 Metro games bring a stable platform and for once I'm glad that at we'll at least have HD4000 as a base platform.dagamer34 - Saturday, May 26, 2012 - link
I'm wondering how the HD 4000 compares to the GPUs in ARM SoCs as that will be the actual low mark if slower.tipoo - Saturday, May 26, 2012 - link
I'm curious about that as well, SoC GPUs like the SGX 543MP4 are getting pretty complex and Intel themselves used to use integrated PowerVR GPUs in their chipsets.tipoo - Saturday, May 26, 2012 - link
The GMA 3600 is based on the PowerVR SGX545.JarredWalton - Saturday, May 26, 2012 - link
And the Windows drivers for it are crap right now. I'm just saying....Penti - Saturday, May 26, 2012 - link
Actually Metro/WinRT won't be used for gaming, If you want a restricted environment there already is XNA so. Games will be too difficult and too less of an incentive or anything to gain to port to the WinRT framework. Or Windows Runtime as they call it. Game developers will never target Metro/WinRT if they don't have to and they don't on x86 machines, desktop is there, you can still build for Windows 7 etc where most users are and so on. Won't happen that much here until next gen consoles either. Plus macs have gotten a whole lot better in the department and plenty of game engines are available now. Taking those kind of engines and porting to C++/WinRT isn't something taken lightly it probably won't actually be possible without a rewrite which defeats the purpose. The performance wouldn't be good. The sandbox is probably too restrictive. It also means in practice it is a more restrictive environment then the mobile sandboxed OS's, several mobile OS's run Firefox for example. WinRT never will. WinRT never will run even IE.oopyseohs - Saturday, May 26, 2012 - link
Did I miss the part where you talk about using an external monitor, or how else were you able to run all of these GPUs at all three resolutions? I'm not saying the data isn't important, as it could be relevant to different notebooks that use the same or similar hardware just with higher-res screens.Also, I've played this game on an old desktop with with GTX 285 @ 1080p and everything turned up. While that is fairly smooth and playable, I still get quite a few moments of "stuttering" in Hell difficulty. I also play on basically the same Acer book with the GT 540M, and even at the lowest possible graphics settings and resolution in normal mode, it's hard for me to characterize that performance as anything other than horrible in comparison to the desktop.
JarredWalton - Sunday, May 27, 2012 - link
Yes, all of the higher than 1366x768 results were done on an external LCD where required (which was the case for Llano, Trinity, VAIO C, TimelineX, and Vostro; the other laptops had 1080p displays, except for quad-core SNB which has a 1600x900 LCD and I didn't run the 1080p tests).PolarisOrbit - Saturday, May 26, 2012 - link
Good review for what it is, but I think it could have been a little more complete with some additional information:1) Use Act 3 Bastion's Keep for the "intensive" case instead of Act 1 Old Town. I think this would be better representative of the game's peak demand. (probably just a run through of the signal fires quest since it's easy to get to)
2) Include a brief section on how much of an impact additional players put on the game. I find it can actually be quite significant. This doesn't have to be full-depth review just a quick.
Overall, I'm using an A8-3500M + 6750M crossfire (overclocked to 2.2GHz) @1366x768 and my framerates during battles (ie. when it counts) average about 1/2 to 1/3 what the reviewer posts because the game gets much more intensive than Act 1, and having a party also slows it down significantly compared to solo.
Just some ideas to expand the review if you want =)
drkrieger - Saturday, May 26, 2012 - link
Hey folks, I've got an older Asus G71Gx which has a Nvidia GTX260M, I can play it on medium/low at about 40 fps @ 1920x1200.Hope this gives some idea of older mobile graphics stack up.
waldojim42 - Saturday, May 26, 2012 - link
I have been testing this out on my W520 for the sake of seeing what I can do to play diablo and maintain decent battery life.For what it is worth, turning off shadows, and playing @ 1366x768 on the HD 3000 results in roughly 28fps - more than enough to play the game through the first difficulty anyhow. I have been using this for some time now with 4 players in game. When running @ 1080P, it dips down into the low 20's, and occasionally is a problem in act 3 so I wouldn't suggest it.
Point is though, that anyone that has a notebook with SB and no video card CAN still play this game, even if it isn't ideal.
Zoolookuk - Saturday, May 26, 2012 - link
Given this is a cross platform game, it would have been interesting to provide Mac results with similar hardware. I play using a GT330m and i7 dual core, and it runs pretty well. I'd like to see how it stacks up to the latest AMD chips and HD3000 on a Mac.egtx - Saturday, May 26, 2012 - link
Yes I am interested in Mac results as well.ananduser - Saturday, May 26, 2012 - link
Provided the testing is done on a dual booting Apple machine, D3 under Windows will always run better.JarredWalton - Saturday, May 26, 2012 - link
Anecdotally, Brian and Anand have both commented that Diablo 3 on a MacBook Pro under OS X runs like crap. I'm not sure if they're running on latest generation MBP13 or something else, though, so that's about all I can pass along.ananduser - Sunday, May 27, 2012 - link
Was there any doubt? OSX is severely lacking in the graphical driver support. Apple never gave a rat's rear about this crucial aspect of gaming support. They are always late with drivers and with the latest OpenGL spec.Penti - Thursday, May 31, 2012 - link
The recommendations / minimum requirements on Macs are discrete graphics with good drivers though. I.e. no nvidia 7300 / 7600, ATi X1600 / X1900 etc. Starting point is 8600 GT. Obviously no integrated Intel graphics is enough there. OpenGL3.2 or OpenGL 2.1 with extensions should be fine for game developers and the drivers handle it, nVidia and AMD can put in performance improvements if they have the feedback. They could even launch their own "game edition" card for the Mac Pro with their own drivers outside of Apples distribution channel. Nvidia releases drivers on there site from time to time. That said both the game engine port and drivers are a bit less optimized then their Windows and Direct3D counterpart. They [drivers] are quiet robust and well working but might not be that fast. It's mainly a problem for the developers today though as most macs has somewhat decent graphics with maintained drivers and have pretty good driver support and support pretty much all the features you need any way.The OS is very dependent on OGL so the support it self is decent and fairly up to to date even if it is not OpenGL 4.2/3.3 yet. Latest OpenGL 4.2 is not even supported by much of any hardware that Apples uses either so. R700, R600, GF 8M, GF 9M and the desktop versions does not support more then OpenGL 3.3 any way which it self is a backport of as much as possible. 3.2 is a decent level there. Apple always support the whole API in the software renderer too so they have no joy hunting the latest features, though the vendors can use any extensions they wish to add those features, all the supported gpus supports the API too. Intel drivers on Windows do not have OpenGL 4.1/4.2 drivers. It's a lot better driver support then for say Intel graphics on Linux and in some regards even on Windows. Intel drivers on Windows don't support OpenGL 3.2 yet.
JarredWalton - Friday, June 1, 2012 - link
Anand commented to me the other day that "Windows HD 4000 results are faster than MBP15 with dGPU results", if that's anything to go by.Patflute - Saturday, May 26, 2012 - link
...DanNeely - Saturday, May 26, 2012 - link
Some people are severely space limited; others are casual gamers and don't play enough to justify having two computers. As a very mass market game; DIII will be selling a huge number of copies to people in the latter group.kyuu - Saturday, May 26, 2012 - link
Many people have both laptops and desktops, and regardless of the desktop obviously being superior for gaming, they'd still like to be able to game a bit on their laptop when they're out and about and don't have access to their desktop.Then, there are people who don't have the space for a desktop, or simply prefer the freedom of being able to move around their home but would still like to play games.
The question is, why do other people's usage models bother you so much? You don't care about the gaming ability of laptops. Fine. Don't pay attention to articles about it. Meanwhile, there are plenty of others, such as myself, who are highly interested.
frozentundra123456 - Saturday, May 26, 2012 - link
I am confused about the GT540M vs the GT630M. Isnt the GT630m just a re-badged GT540m with higher clocks? Or is it the new G-force architecture? I believe only the 640M and above have the new architecture.JarredWalton - Saturday, May 26, 2012 - link
The GT 630M is technically a straight up rebadge of GT 540M. However, the GT 630M in the ASUS N56VM that we have shows clocks that are quite a bit lower than other GT 540M GPUs. Strangely, those lowered clocks don't appear to matter much in Diablo III, so either NVIDIA's control panel (and GPU-Z) are reporting incorrect information, or the shader cores aren't as demanding as you might expect.frozentundra123456 - Saturday, May 26, 2012 - link
You did say the acer was running hot. Maybe the cpu/gpu was throttling due to temps. Or maybe the quad core in the Acer made a difference (not likely?).JarredWalton - Saturday, May 26, 2012 - link
Correct. The Acer is definitely not hitting Turbo speeds, let along the base 2.3GHz clock. So the combination of faster CPU + slower GPU works out in favor of the N56VM in this instance. I really wish I had a good way to test the N56VM with the fully clocked GT 540M, though.Arcquist - Saturday, May 26, 2012 - link
Google 'ThrottleStop'. Many of the Acer Aspire users that experience throttling problems in games use it to prevent the under-clocking. Note it will get hot though and you might want to raise the back of the laptop off the desk with a binder or something to improve airflow (intake).JarredWalton - Sunday, May 27, 2012 - link
Funny enough, I actually reran tests with ThrottleStop already and it just didn't seem to help. I'm not sure what's going on, but even with TS enabled and set to a 20x (or 21, 22, 23, Turbo) multiplier, the system still seems to just plug along at the 800-1500MHz range during testing. I've done everything I know of to make it run faster, and it just doesn't seem to matter. But this particular laptop has always been a bit quirky; I might try looking for a new BIOS again just to see if anything has changed, but I doubt it.JarredWalton - Sunday, May 27, 2012 - link
And funny enough, after additional investigation, the issue isn't throttling on the Acer but rather a higher clock on the GT 630M compared to the GT 540M. NVIDIA's updated specs page for the 630M lists 800MHz as the clock, but oddly their control panel is only reporting 475MHz on the ASUS. According to GPU-Z's Sensors tab, however, it really is running an ~800MHz core clock (1600MHz shaders), which accounts for the higher performance compared to the 672MHz GT 540M. I've updated the text in the article to explain this.ananduser - Saturday, May 26, 2012 - link
Suddenly those fancy expensive ultrabooks(Apple or otherwise) seem like extremely poor deals for tech enthusiasts. Then again they were always aimed at bloggers.DanNeely - Saturday, May 26, 2012 - link
... and enthusiasts who want an ultra portable that's a PC not a fondleslab, and which is faster than an atom.ananduser - Sunday, May 27, 2012 - link
No offense, enthusiasts(in the real sense of the word) are always more extreme than your average MBA wielding blogger. If they wanted something light they would spare no expense and would have gone with a VaioZ, or some crazy Japanese Fujitsu that is lithium made, or a moded Sony UX. PC hardware enthusiasm has nothing to do with Apple commodities that try to be as "safe" as possible.Impulses - Monday, May 28, 2012 - link
Suddenly? They were never marketed as gaming rigs, most don't even have dGPUs and Diablo 3 isn't even one of the 5 most demanding games this year. I dunno what you're getting at, ultrabooks are still great for the propose they're meant for. Can you get just as much done with an uglier/thicker/heavier $700 laptop? Sure, you might even get a dedicated GPU to go along with it... They're serving entirely different markets tho.ananduser - Monday, May 28, 2012 - link
Which is why I mentioned tech enthusiasts in my original comment. There's nothing that I dispute from your enumeration.futurepastnow - Saturday, May 26, 2012 - link
Or, perhaps I should say, a concern. You increase the detail setting and the resolution together.What about 1366x768 at high detail? Or 1920x1080 at low detail?
JarredWalton - Saturday, May 26, 2012 - link
I have to stick to a subset of the possible resolution/detail settings or I'd be testing a single game 24/7 for a week. I've already spent probably 20 hours benchmarking Diablo III, and let me tell you: running the same three minute sequence at least a dozen times per laptop gets to be mighty damn tedious. I did run tests at some other settings, which I commented on I believe, but here's a bit more detail.For example, on the N56VM, 1080p with all settings maxed but Shadow Quality set to Low results in performance of 20.1 FPS/18.5 FPS for our test sequences -- so that one setting boosted performance by over 50% compared to having all settings at High/Max. What's more, I also tested at max detail 1080p but with Shadow Quality set to Off, and the scores are 27.1/24.8 -- another 35% improvement over Low shadows. Everything else combined (e.g. 1080p but all other settings at low) only accounts for probably 20%. I could test that as well if you really want, but I have other things to do right now.
futurepastnow - Saturday, May 26, 2012 - link
I'm mostly thinking that a large majority of laptops sold, even now, have 1366x768 displays. It looks like all of the non-Intel laptops handle playable framerates with low detail at that resolution, so I'm curious how that performance falls as the detail goes up.In particular, can Llano and Trinity handle high detail at 1366x768? They are (or will be) sold in budget laptops that won't get high-res screens.
However, I understand the time constraints your working under. Thanks for the comparison, anyway.
kyuu - Saturday, May 26, 2012 - link
I agree with this. I understand time constraints, but honestly, the paradigm that's being followed here (and with a lot of reviews) is simply not representative of real-world usage. It's not the case that people play with low details at low resolutions and high details at high resolutions. *Especially* when you're dealing with laptops. Generally, you're going to have the resolution at the display's native resolution, and going to work with the settings from there.In any case, the article is still appreciated, and it's possible, at least, to make an educated guess at how the game will run at various resolutions and settings based on the presented info. Definitely going to grab myself a nice Trinity-powered laptop soon as one meeting my desired specs comes out.
Also, yet again we see that HD4000 does not match Llano, let alone exceed it, as I've seen some people spreading around.
JarredWalton - Saturday, May 26, 2012 - link
This is the whole purpose of having three different settings, discussing what settings we selected and why, etc. Consider the Value setting a "near-best-case" result while still looking decent; in this case, the only thing you can really do to further improve frame rates is to turn off shadows and/or lower the resolution further. If you look at our Mainstream results, you can see what happens as you start to turn up the dials, and the same goes for Enthusiast. I've discussed in the article exactly how much the various elements impact performance, going so far as to include additional results at "Enthusiast 1080p" but with Shadow Quality on Low/Off.If someone can't get at least a decent idea of where to start in terms of settings and what to expect from their laptop hardware with the information in this article, I'm not sure what I could do to help the situation. Hold their hand and walk through each and every specific setting? Because that tends to come off sounding very condescending if I write that way, and I think most people who care enough to read our articles are much smarter than that.
kyuu - Saturday, May 26, 2012 - link
Fair enough, I understand that. However, I'm not suggesting you write in a hand-holding, condescending manner. Just having three bars on the graph for each resolution (one bar for value, mainstream, and enthusiast settings) would be fine. I understand the time constraints, though, as I said. That would be the ideal, however.CeriseCogburn - Saturday, June 2, 2012 - link
You won't have to worry about that soon for nVidia chipped laptops as nVidia is rolling out that automatic best game play settings in their drivers.That's going to be a wonderful thing for the majority of gamers and laptop users who don't have a clue on game settings - I hope it helps increase the user base so computer games overall gain strength.
AMD needs to follow suit quickly, to help all of us with a larger user base, instead of being stupid and lame on the driver side as usual. Of course, I'm skeptical AMD could possibly man up in that area.
gamoniac - Saturday, May 26, 2012 - link
Even though HD4000 is barely playable, I am still impressed by how far Intel has come along. HD4000 is right on the heels of the HD 6620G, which I didn't expect to happen just 1.5 years ago.

geogerf - Saturday, May 26, 2012 - link
Where are the Alienware laptop comparisons...? I have an M17x R3 with a 6900M and D3 is pretty smooth (sorry, haven't downloaded FRAPS yet), but I'd like to see some official numbers. Thanks!
erple2 - Sunday, May 27, 2012 - link
Just as soon as that Alienware laptop comes down to sub-$700 prices, it'll show up in this comparo!

Oh, it's not that much at retail? I guess your Alienware doesn't qualify as a "mainstream" laptop, like these other ones.
geogerf - Monday, May 28, 2012 - link
Sorry, I don't get your sarcasm... where does it say in the article that only sub-$700 laptops were tested?

I'd think Alienware would fit under an "Enthusiast" machine, being gamer oriented and all...
Alchemist07 - Monday, May 28, 2012 - link
I think the ASUS laptop is $1300 or so (it was reviewed recently, iirc).

designerfx - Sunday, May 27, 2012 - link
Higher difficulty levels are substantially more performance-intensive. I'm using an HD 6970 at 1920x1200 with 6GB of RAM and an i7 920, and by Hell difficulty I've encountered packs of mobs that have brought my machine to its knees (sub-15 FPS).
Blizzard really needs to work on whatever they did poorly in this game.
Zingam - Monday, May 28, 2012 - link
Shut the F up and lower that resolution!

CeriseCogburn - Saturday, June 2, 2012 - link
Thanks for that bit of info; ignore the fanboy and continue your observations, please - as we've already been told by so many users here in the last card reviews, they have 1920x1200 monitors, which are by no means rare, and "all real enthusiasts" have sought them out.

So the information you have there is very valuable to all the AMD fans here who own 1920x1200 monitors, whose cards only lost to the new nVidia flagship by 9% at that resolution instead of the 14% overall loss at 1920x1080, which Anand doesn't show.
Please ignore the sniping, cursing, rude person and continue the observations, as that one surprised me.
Sabresiberian - Sunday, May 27, 2012 - link
"What that means is cutting edge technologies like DirectX 11 aren’t part of the game plan; in fact, just like StarCraft II and World of WarCraft, DirectX 10 isn’t in the picture either. Diablo III is a DirectX 9 title, and there should be plenty of GPUs that can handle the game at low to moderate detail settings."WoW got a major graphics upgrade for the expansion pack Cataclysm, and it is now one of the few DX11 capable MMOGs released. You're overall point is valid in that Blizzard makes games so that people with lower priced systems can play them, but a bit out-of-date when it comes to WoW.
;)
iwod - Monday, May 28, 2012 - link
Anyone old enough will remember those articles about mobile graphics: how they sucked, how every year we were supposed to get a 50-100% performance improvement, how Quake 3 didn't work and we could only play SimCity 2000.

And by today's standards, Diablo 3 isn't even groundbreaking in terms of graphics. Yet most of these laptops don't even run the game at an acceptable frame rate (30 FPS) - and that's already excluding ANY of the Act 3/4 loads in the game.

And we even have Retina Display resolutions coming. We are talking about 2-4x the pixel density.

I really do hope Haswell will provide 3x the performance of today's top HD4000 numbers. That way, any time I select discrete graphics in a notebook, I'd be guaranteed at least decent graphics performance.
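For a rough sense of what that jump means for GPUs, here's a back-of-envelope pixel-count comparison in Python; the panel resolutions below are illustrative picks on my part, not figures from the article:

# Back-of-envelope pixel counts: rendering cost scales roughly with
# the number of pixels drawn, so a "Retina"-class panel is a big jump.
# The panel resolutions here are illustrative assumptions.

panels = {
    "1366x768 (budget laptop)": (1366, 768),
    "1920x1080 (mainstream)": (1920, 1080),
    "2880x1800 (Retina-class)": (2880, 1800),
}

base = 1366 * 768
for name, (w, h) in panels.items():
    pixels = w * h
    print(f"{name}: {pixels / 1e6:.1f} MP, {pixels / base:.1f}x the pixels of 1366x768")

That works out to roughly 2x the pixels of 1366x768 for 1080p and nearly 5x for a 2880x1800 panel, which is why IGPs that are marginal at 1366x768 have no hope at Retina-class resolutions.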
Zingam - Monday, May 28, 2012 - link
Ever heard of heat and battery life? Laptops are not for games! They actually do run games well when the games are 5+ years older than the laptop itself. :)

So play old games and be happy - they're mostly better than the current breed of graphics-intensive crap anyway.
Zingam - Monday, May 28, 2012 - link
BTW, that's the reason why consoles are better than PCs for gaming. You invest once and it guarantees (unless it's an Xbox 360 with the red ring of death) that you will be able to play all available games for as long as you have it. For the price of a console you cannot even buy a good graphics card.

iwod - Wednesday, May 30, 2012 - link
Yes, but Diablo 3 is exactly like a game 3-5 years older than my current laptop (6 months old, with a 6630M).

We have laptops marketed as desktop replacements, but most of those don't even run the game well.

And exactly like you said, consoles are MUCH better for gaming. Which leads me to think we are getting less efficient at extracting performance out of GPUs.
Computex - Monday, May 28, 2012 - link
I would use it for school, since I can afford something like this on my own.

amanstay - Monday, May 28, 2012 - link
I wanna play D3 on a laptop. What laptop would be suitable, and what requirements do I need? Please help me.

JarredWalton - Monday, May 28, 2012 - link
What's your intended resolution, and what level of detail are you willing to run at? If you're okay with 1366x768 and Low Shadow Quality, you should be able to play through at least Normal and Nightmare difficulty on any Llano, Trinity, or possibly (if you're tolerant) HD 4000 laptop. If you want higher quality settings or a higher resolution, you'll probably want something with at least a GT 630M level GPU.

For best bang for the buck right now, I'd go with the Acer AS4830TG: http://www.microcenter.com/single_product_results....
It's a bit larger (and should therefore run a bit cooler/better) than the 3830TG used in the benchmarks for this article. At a price of $600, I don't see a Trinity A10 surpassing it any time soon, though I do suspect the number of 4830TG units currently available is all that's left, so they might go out of stock in the next month or so.
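The guidance above boils down to a simple lookup; here's a toy Python sketch encoding it, where the function name and tier strings are hypothetical, purely for illustration:

# Toy encoding of the guidance above: resolution + shadow quality ->
# a rough minimum GPU class for Diablo III. The thresholds paraphrase
# the comment; they are illustrative assumptions, not an official chart.

def min_gpu_class(width, height, shadow_quality):
    # 1366x768 with Low/Off shadows: Normal/Nightmare should be
    # playable on Llano or Trinity, tolerable on HD 4000.
    if width <= 1366 and height <= 768 and shadow_quality.lower() in ("off", "low"):
        return "Llano / Trinity IGP (HD 4000 if you're tolerant)"
    # Higher resolution or detail: step up to a discrete GPU.
    return "GT 630M-class discrete GPU or better"

print(min_gpu_class(1366, 768, "Low"))    # IGP territory
print(min_gpu_class(1920, 1080, "High"))  # discrete territory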
Dark_Archonis - Monday, May 28, 2012 - link
First things first: Acers are horrible laptops.

Secondly, many gamers now are on laptops not by choice but by necessity. There are quite a few gamers who are, in fact, space-limited and simply don't have the room for a full desktop setup. I am actually one of those at the moment. I will have more space in the future, at which point I will get a desktop, but I don't have enough space right now. That is why a laptop is ideal for me. Also, a laptop has almost everything integrated and makes it easy to be mobile: speakers, trackpad, keyboard, and screen are all in one unit. You can't be lugging a desktop around everywhere. If you're going to a friend's house or visiting somewhere, a laptop allows you to game on the go.
Lastly, laptops are all about cooling. An Acer that's throttling is not going to cut it. The Act 1 benchmarks are not realistic; in Act 3 or 4 with tons of mobs on screen, both the CPU and GPU are stressed a lot more. A properly cooled and properly designed laptop should be hitting max turbo speeds almost always, and should not be throttling at all - properly cooled, it should be running at minimum on base clocks when hooked to the A/C adapter. If you're gaming on the battery, then that's a bad idea; gaming should be done hooked up to the adapter when possible.
With an i5 or i7 hitting max turbo clocks, combined with a GeForce 540M or 630M, D3 should run smoothly at medium/high settings even in Act 3 or 4. If your laptop is throttling, then of course that's a different story. So in the end, it is possible to game pretty well on a laptop, as long as the laptop has strong cooling.
JarredWalton - Monday, May 28, 2012 - link
You'll note that with further investigation into the performance, it does not appear that the Acer is throttling. It simply isn't hitting max turbo during testing because the game doesn't require it. The performance of the much faster CPU in the N56VM is never more than 20% better than the Acer's, and that accounts for the GPU clock speed difference.

As for later acts, give us a bit of time and we'll return to the benchmarks with results from late in the game. We have some other stuff that's higher priority, but we are aware that the Act I numbers are not fully representative of Diablo III performance. It will probably be a couple weeks, though.
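As a rough cross-check, that ~20% figure is close to the GPU core clock gap between the two machines; here's a quick Python calculation, where the MHz values are commonly listed reference clocks and should be treated as assumptions, since OEM clocks vary:

# The ~20% performance gap roughly matches the GPU core clock gap
# between the two laptops. Clock values below are commonly listed
# reference figures and are assumptions here; OEM clocks can vary.

gt_540m_mhz = 672   # GT 540M (as in the Acer 3830TG)
gt_630m_mhz = 800   # GT 630M (as in the ASUS N56VM)

ratio = gt_630m_mhz / gt_540m_mhz
print(f"GT 630M vs GT 540M core clock: {ratio:.2f}x (~{(ratio - 1) * 100:.0f}% higher)")

If those clocks hold, the GT 630M runs about 19% faster than the GT 540M, which lines up with the observed gap being GPU-bound rather than a sign of CPU throttling.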
tacosRcool - Monday, May 28, 2012 - link
Pretty decent review, I wish more cards could be tested tho.

slagar - Tuesday, May 29, 2012 - link
"... is making plenty of news, and we managed to get a copy for testing purposes."Nice try, Jarred ;)
JarredWalton - Friday, June 1, 2012 - link
Hey, it pays to work in the industry. One of our hardware contacts managed to get me a code -- actually had to buy a box, open it up, photograph the key, and email that to me. Hahahaha... Something about the address on my Battle.net account not matching the billing address for the CC, so that was just easier than trying to figure it out. Thank goodness for that as well, as there's no way my wife (with a newborn) would be letting me buy Diablo III.

justsomedude84 - Wednesday, May 30, 2012 - link
Well, just to give some perspective on the low-end laptops: I try to play D3 on an HD 3200 & 2GHz dual-core AMD, and it's pretty horrible most of the time. I even have it set to 800x600! and get about 10-20 FPS... I'm looking for a better but inexpensive upgrade. I have a Q6600 & HD 4870 desktop that runs the game pretty well with all settings low or off at the highest resolution, and it looks great and runs OK. I'm wanting to get a laptop with a 6750M...

justsomedude84 - Wednesday, May 30, 2012 - link
BTW, I've managed to get to Hell Act 1 all by myself... lol. (Mostly on my desktop, but I have to do most of my gaming at work.)

JarredWalton - Friday, June 1, 2012 - link
HD 3200 is sadly very old by today's standards; it's actually not much better than Intel's Arrandale HD Graphics (original Core i3/i5 dual-core laptops). HD 3200 was fine when it came out in early 2008, but then AMD didn't release a significant update (just similarly clocked HD 4200/4250/4290) until the launch of Llano last June. That's over three years without a real improvement in IGP performance, which is pretty much an eternity for GPUs.