All-in-Ones – The AX24 and G24 GE

My view on all-in-ones as compute devices does not seem to align with what their manufacturers tell me. The only all-in-one that gets much attention online is Apple’s iMac, due to the role it has played in Apple’s line-up over the last five or so years. Nonetheless, the manufacturers tell me that the PC-behind-a-monitor concept is growing in sales, with enough margin and volume to justify dedicating research and development to building something customers want. I’ve not seen many AIOs in the wild – usually in stores as stock-check or help-desk terminals rather than in the home – but Intel has also been pushing the AIO strategy in recent keynotes, even to the point of portable all-in-ones (massive 28-inch devices with a battery). We’ve covered bits of the AIO market here and there, most often in news or at events such as the first 4K60p AIO at Computex last year, but we have not yet put resources into covering this element of the industry in depth. For companies like MSI, for whom gaming is a focus, gaming-branded all-in-ones look like a potentially viable market, and as a result there were a couple of new concepts on display.

The main problem with all-in-ones from a gaming perspective is typically graphics performance – strapping a desktop to the back of a monitor is not always the most space-efficient implementation, especially once you consider heat removal and power delivery. As a result, gaming on integrated graphics, particularly at 1080p or 4K, leaves something to be desired (although both Intel and AMD would argue, and our testing suggests, that you can still have a good experience at 1080p with integrated graphics on popular titles). The solution to this starts with equipping the AIO with an MXM graphics module, as in a laptop, which is exactly what the Gaming 24GE 2QE does.

As the name suggests, a 24-inch IPS panel is paired with a laptop-like configuration. An i7 mobile CPU on the HM87 chipset joins forces with a GTX 960M, which for the model in front of us, featuring a 1080p display, should provide sufficient horsepower for eSports titles at the highest settings or AAA games at mid-range settings. I quizzed MSI about a 4K display, and they said they would react to customer requests in that regard. Other hardware inside includes 16GB of DDR3L-1600, Killer networking, and the Nahimic audio software. Storage is provided by dual SSDs in RAID 0 (‘MSI Super RAID’), and the display is a supposed ‘anti-flicker’ screen with an anti-glare finish. MSI’s terminology for this revolves around ‘less blue light’, which is perhaps similar to the films and tinted glasses designed to reduce eye strain during at-monitor work. MSI is still working on a design identity for its gaming AIOs beyond a simple red-and-black livery and an MSI gaming logo.
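As a rough illustration of why MSI stripes the two SSDs in RAID 0: sequential throughput ideally scales with drive count, since stripes are read and written in parallel. The sketch below uses an assumed per-drive figure (roughly typical for a 2015-era SATA SSD), not MSI's published numbers.

```python
# Sketch of ideal RAID 0 (striping) sequential throughput scaling.
# The 500 MB/s per-drive figure is an assumption for illustration,
# not a spec quoted by MSI.

def raid0_sequential_throughput(per_drive_mb_s: float, drive_count: int) -> float:
    """Ideal case: stripes are accessed in parallel, so sequential
    throughput scales linearly with drive count (no redundancy)."""
    return per_drive_mb_s * drive_count

single_drive = 500.0  # MB/s, typical SATA SSD of the era
print(raid0_sequential_throughput(single_drive, 2))  # 1000.0
```

In practice controller overhead keeps real numbers below the ideal, and RAID 0 doubles the failure exposure as well as the speed, but the linear-scaling intuition is why dual-SSD striping is a common AIO/laptop marketing point.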

The GTX 960M, at the end of the day, is still a mid-range mobile component – arguably you could put one or two GTX 980Ms in such a device, much like the GT80 Titan laptop, and it would push some proper pixels. Rather than go down this route, MSI has gone a little mad and designed an AIO that supports a full-sized graphics card.

On the left-hand side we see the extra parts that implement the external graphics:

What we have here is essentially an AIO that loops in a full PCIe 3.0 x16 slot via a riser cable to a GPU mounted on the back of the panel. MSI’s design is not yet final, but they expect to be able to fit almost all reference designs in this bay with a mesh at the front for air intake (supporting blowers and other fan orientations) and a perforated edge to help with air removal:

As you can see, the GPU is mounted slightly away from the panel to allow airflow around at least three of its edges. The GPU section has its own power supply module, requiring a second power brick to feed it. MSI makes power bricks up to 330W, so I would imagine that any single-GPU card (i.e., no dual-GPU designs) would work here.
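To put that 330W brick in context, here is a back-of-the-envelope check against approximate board-power figures for 2015-era single-GPU cards. The TDP numbers and the overhead allowance are assumptions for illustration, not MSI's figures:

```python
# Rough budget check: which single-GPU cards could a 330 W brick feed?
# TDP values are approximate 2015-era board-power figures (assumptions),
# and the overhead term is a guessed allowance for fans/regulation losses.

GPU_TDP_W = {
    "GTX 960": 120,
    "GTX 980": 165,
    "GTX 980 Ti": 250,
    "R9 290X": 290,
}

BRICK_W = 330
OVERHEAD_W = 30  # assumed margin for the bay's fans and conversion losses

for gpu, tdp in GPU_TDP_W.items():
    fits = tdp + OVERHEAD_W <= BRICK_W
    print(f"{gpu}: {'fits' if fits else 'over budget'} ({tdp} W)")
```

Under these assumptions even a 290W-class card squeaks in, which matches the intuition that any single-GPU reference design should work, while 375W-class dual-GPU boards clearly would not.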

The AX24 is to be designed around Skylake, which, given other motherboards we have seen so far, suggests that multiple M.2 SSDs in PCIe 3.0 mode could be part of the package. MSI is also including Killer networking as usual, along with the new Nahimic audio processing software. They were coy about whether the system uses a soldered-down or socketed CPU, and about what TDP to expect. Though they are looking at 4K panels, adaptive sync technology was not mentioned. One of the interesting points here is that an AIO design is almost like a laptop design, so when the design is finalized we might see a separate button to enable the external GPU, or extra buttons for 100% fan speed.

MSI also exhibited the Pro 24 2M all-in-one, designed for a more enterprise look and feel. The specifications were pretty much as expected: the anti-flicker panel from the G24, but this time with Haswell-based processors, integrated graphics, a 1080p 23.6-inch panel, and SSD and HDD options.

That doesn’t sound like anything special, I admit, but there are a couple of non-standard features worth mentioning. The first is technically an older feature I’ve seen in years gone by: a physical sliding cover for the webcam:

Thus, for anyone paranoid about peeping, rather than taping paper over the webcam there is a physical switch. Alongside this, befitting the ‘Pro’ nature of the Pro 24 2M, there is a COM port on the rear:

It’s a bad picture, but to the left of the network port we can see it. The device also comes with USB 3.0 ports, HDMI out, a card reader and gigabit Ethernet.


  • medi03 - Tuesday, June 30, 2015 - link

    Fanboi bullshit.
    In fall 2013 the $399 R9 290 was on par with Nvidia's $999 GTX Titan.
    The R9 290X was the fastest single-GPU card ($550) until the 780 Titan ($699).

    It's still a damn good bang-for-the-buck card, staying within 10% of an Nvidia card that costs nearly twice as much.

    Fury X is a whole new story: a $1199 CrossFire setup beating $1998 of Titan X SLI.

    And then there is the $599 R9 295X2, which wipes the floor with any single card out there in pretty much any game that wasn't released just yesterday.
  • Antronman - Sunday, July 5, 2015 - link

    Ti stands for titanium.
  • Shadow7037932 - Tuesday, June 30, 2015 - link

    Mining on GPUs has been dead for quite a long time, esp. with ASICs available now.
  • will1956 - Thursday, July 9, 2015 - link

    I've got a Sapphire 7870 GHz OC and was waiting for the Fury (hoping they would release an air-cooled version), but now not a chance. I'm getting the 90 Ti.
    I've got a Silverstone FT03 with an H80i, so the water cooling kinda throws it out
  • will1956 - Thursday, July 9, 2015 - link

    *980 Ti.
  • TheJian - Monday, June 29, 2015 - link

    You're on 12 gpu related articles in the last 30 days (only one shared with brett), but can't manage a 300 series article for two weeks (and it's written BEFORE launch to up it at NDA release etc correct?) or FuryX article for AMD's two major launches this summer (and sick in summer, sorta odd). That's a pretty big pill to swallow. Quick, someone give me a Heimlich, I'm choking... ;)

    Still time to respond to comments too...Tweet etc...but a major review of a HUGE launch (considering the hype that is...seems like no fury at this point, no new era of gaming either) can wait for a week, never mind the 300's ignored too. Ok...

    Having said that I did notice Jarred now works at maximumpc...Pretty much insinuated AMD is shoveling them around so fast so nobody can thoroughly vet the card. Ouch.
    "We received a card for benchmarking… sort of. The whole of Future US, which includes Maximum PC, PC Gamer, and TechRadar, among others, received one Fury X for testing. We asked for a second, since our GPU testing is done at a different location, but to no avail."

    It gets worse, even mentioning others, and only 10 samples for all of europe which according to him is really odd. Not sure why people keep turning hairworks off when it's just amping up tessellation (ok, may notch it down from 64, but off? Developer wanted us to use it) which happens to run REALLY fast on maxwell. How many other games will use tessellation like this in the future?
    See Beyond3d benchmarks. Hairworks won't be the only thing doing this I'd say.
    "The Fury X still manages just over half the throughput of the GTX 980 Ti in TessMark. "
    Same with Polygon throughput. There are others amd leads in, but this surely shows it isn't Hairworks doing in AMD on witcher3 (or nasty stuff from project cars etc), AMD will just hurt in some stuff period as will NV I guess, but NV seems to get the best of AMD as far as what devs are really doing. Should we be turning stuff off to hide AMD's gpu issues? Would we turn down AMD stuff that highlighted their efficiency in some aspect?

    As techreport says:
    "At the end of the day, the results from these directed tests largely confirm the major contrasts between the Fury X and the GeForce GTX 980 Ti. These two solutions have sharply divergent mixes of resources on tap, not just on paper but in terms of measurable throughput."

    Why hide it when it shows for either? Gamers should know how data like this plays in the real world. It would appear games like Project Cars, WOW Warlords of Draenor, Witcher 3 (with hairworks on, IE tessellation up), Wolfenstein New Order, COD Adv Warfare, Dragon Age Inq etc show some of AMD's weaknesses (games showing ~20% advantage here basically even at 4K). AMD has a few themselves, but not as many and not this big of an advantage (mostly 1/2 NV's advantage in them meaning less than 10%). Some of them are losses at one site a win on another too, like Metro LL at toms a loss at 4K for NV, but win at techpowerup etc.

    "The game uses DirectX 11 without the conventional approach to tessellation. It uses a deferred rendering engine with a custom Ambient Occlusion technique."
    Techpowerup's comment on tessellation in project cars game. Again, this shows what I'm talking about. You can turn crap off to hide AMD sucking wind in some attribute. Are all of the games I mentioned doing some form of something we should turn off? NO.
    63fps vs. 45fps in Project Cars at 4K. Ouch. The dev wanted us to see their effects, not turn them off because AMD sucks at it. At 1080p/1440p the gap gets to 20% on more than I listed (GTA5 etc). Wolfenstein is doing something too with id's Tech 5 engine (over 20% too, and even a 970 topples Fury X at 1440p and lower). Even a 980 beats Fury X at 4K. Really the discussion should focus on 1440p since 95% of us are at or below this, which is even worse in these games and adds more, and this is all before OCing (which adds games like Thief at 4K etc at 21% for NV). Devs will more often than not program for the 75% share vs. 25% (AMD), just because that is what they are designing on (75-80% of the workstation market owned too, game designers etc).

    Back to your comment though, a 390x/FuryX release are clickbait articles you aren't interested in? ;) Your excuse is humorous at best, never mind what it is at worst. D. Lister, Chizow etc are correct. Can't wait for the clickbait FuryX article :) Hard to believe you put up a dozen gpu articles in a month but FuryX couldn't get the time of day over one of them...LOL.
  • nightbringer57 - Tuesday, June 30, 2015 - link

    Well, a hasty article rushed out to meet the deadline at any price would be kind of clickbait.

    When the dedicated reviewer is not available, it's quite honorable to give up the hype deadline and wait until he's well and ready to give us the great article we will certainly have.
  • mmrezaie - Tuesday, June 30, 2015 - link

    Thanks Ryan. I would prefer a better, in-depth review to an ad-like article. So get better soon.
  • D. Lister - Tuesday, June 30, 2015 - link

    Ah, well then, the silver lining may be a more in-depth analysis, considering the absence of a strict deadline. Get well soon mate.
  • just4U - Wednesday, July 1, 2015 - link

    One thing I'd like to see in a video card review right now is two 390s (not X) in CrossFire – no one has done that yet, and they look to be a very solid choice for 4K gaming on price/performance.

    On topic: I've been using a lot of MSI stuff of late. Really like how the company is handling things these days. I do wonder if they've got their support up to par; that used to be an issue with them.
