For those in the PC industry, it has been clear that MSI’s gaming laptop strategy is a rigorous one. The net is cast far and wide, covering the high end and mid-range while also investing in new technologies that may or may not be part of the future. Brett recently reviewed the 18.4-inch, 4.5 kg behemoth that is the MSI GT80 Titan, featuring two GTX 980M GPUs in SLI alongside a Broadwell CPU, which is one such direction MSI is taking. We saw a few models from MSI at CES, but for Computex the range is expanding, and we’ve chosen a few of the most interesting models here.

MSI GS30 Shadow Version 2 – A New Hope Dock

At CES, MSI announced the GS30 Shadow gaming laptop, a seemingly run-of-the-mill 13.3-inch design with a Crystal Well based Haswell CPU (featuring eDRAM), no discrete graphics card, 16GB of DRAM and RAID SSD storage. We reported that it looked half decent as a mobile gaming device, but the interesting element at the time was the external dock, which allowed the system to run a full discrete graphics card, supporting up to 450W. The GS30 docked into this directly via PCIe 3.0 x16 from the CPU, allowing a full bandwidth implementation, and the dock was expected to cost around $200 sans GPU. After Computex I visited Singapore, and in the Funan DigitaLife mall one retailer had the GS30 and dock on display. When I asked the shop owner about sales, he said that of the five units he had received in stock, he had sold three (with the dock) in six weeks, which is an interesting number.
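To put that "full bandwidth" claim in context, here is a quick back-of-the-envelope calculation using the standard PCIe 3.0 figures (nothing here is MSI-specific):

```python
# Rough sketch of what a full PCIe 3.0 x16 link offers, using standard
# spec numbers: 8 GT/s per lane with 128b/130b line encoding.

LANES = 16
GT_PER_S = 8e9          # PCIe 3.0: 8 gigatransfers per second per lane
ENCODING = 128 / 130    # 128b/130b encoding: ~1.5% line overhead

usable_bits = GT_PER_S * LANES * ENCODING    # usable bits per second
usable_gb_per_s = usable_bits / 8 / 1e9      # convert to gigabytes per second
print(round(usable_gb_per_s, 2))             # ~15.75 GB/s
```

A typical external graphics link over a cable runs at x4 or less, so the dock's direct x16 connection offers roughly four times the bandwidth of those solutions.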

There were some issues with the dock that we noted at the time. It was not particularly user friendly in standard laptop circumstances: due to its shoebox-like shape, the laptop sat a good six-plus inches off the desk and was not suitable for typing. The dock was also not hot-swappable, meaning the system had to be restarted when docking or undocking. Nor could the laptop’s own display use the discrete card, so the user needed an external display anyway, making the laptop little more than a mini-PC. A lot of this changes with version 2.

The dock goes angular, allowing for full use at a desk with minimal effort. As we can see on this side, the dock also comes with additional IO ports such as USB, Ethernet and a card reader, which improves the usability of the device considerably, either on or off the dock. The new design, with a Broadwell-H based Iris Pro GS30, will also allow the laptop’s display to use the graphics power under the hood. This is done by routing a video output from the GPU back into a video input on the laptop, rather than switching the GPU from integrated to discrete at the software level. Either way, there’s no need to also have an external monitor.

Hot swapping should be as simple as disconnecting the video-in and changing the output monitor via hotkeys, allowing users to seamlessly switch between outputs. The dock will be able to take a GPU up to 330W, and the deeper chassis might allow for some of the larger AIB manufacturer designs.

Standard specifications apply here: MSI’s Super RAID and Killer networking are base add-ins at this point, along with Nahimic audio. As always, MSI is finalizing the design and we should see it on the market in Q3, with exact configurations determined by the retailer. Hopefully MSI hasn’t produced too many of the original version, because now that this is on the horizon I suspect most review websites will be suggesting that buyers hold out for version 2.

GT72 with Tobii Eye-Tracking

To complement the discrete graphics technology, MSI had a new style of laptop on display in order to gauge opinion. As the sub-title suggests, we have a GT72 laptop with eye tracking technology:

At the hinge we see three red lights, which are part of the eye-tracking technology designed by Tobii. The software comes with a quick calibration tool that takes about a minute to run, after which a user can move the mouse by looking at parts of the screen. Note that this function doesn’t click (in menus, for example), but the demo provided showed Assassin’s Creed using the technology to adjust where the camera was pointing:

So in this case the WASD keys moved the character, while the technology tracked the eyes to determine where to look in the game. Even with only 30 seconds of playing with the technology, in this environment it seemed a little tough to get to grips with, especially if you looked down at the keyboard or at part of the HUD, though I presume 30 seconds is not enough to get used to it. I did, however, run the calibration tool and play the game in the first image, which involved moving my eyes to hit the grey targets. It was very intuitive, and I scored well at least.
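As an aside, those stray glances at the keyboard or HUD are a general problem for gaze input. The sketch below is a hypothetical illustration of how gaze-driven camera control is commonly smoothed; the names and thresholds are my own for illustration, not anything from Tobii’s SDK.

```python
# Hypothetical gaze-to-camera smoothing: a dead zone ignores gaze near the
# screen centre, and an exponential moving average stops one-frame glances
# (at the HUD or keyboard) from yanking the camera around.

DEAD_ZONE = 0.05   # ignore gaze offsets this close to centre (normalized -1..1)
ALPHA = 0.2        # smoothing factor: lower = steadier but laggier camera

def smooth_gaze(samples, alpha=ALPHA, dead_zone=DEAD_ZONE):
    """Turn raw normalized gaze offsets into smoothed camera-turn inputs."""
    state = 0.0
    out = []
    for x in samples:
        if abs(x) < dead_zone:                    # looking near centre: no turn
            x = 0.0
        state = alpha * x + (1 - alpha) * state   # exponential moving average
        out.append(state)
    return out

glance = [0.0, 0.9, 0.0, 0.0]   # one-frame glance far off-centre
stare  = [0.8, 0.8, 0.8, 0.8]   # sustained look to the side
print(max(smooth_gaze(glance)))   # small blip only
print(smooth_gaze(stare)[-1])     # ramps toward 0.8 over time
```

The trade-off is the usual one: a heavier filter makes glances less disruptive but adds lag to deliberate camera moves, which may be part of why the demo took some getting used to.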

Ultimately I think this is the sort of technology where eye-tracking might be more beneficial in indie-type games from the Windows Store. Those games would be better suited to a lower-end device than a GT72, but like most manufacturers, MSI’s goal with new technology is to offer it at the high end first, gauge uptake, and then filter it down. Depending on the cost, I could see it being brought down to desktop replacement/gaming type models, although something a bit smaller might be a stretch due to the added cost.

GT72 with G-Sync

Just before Computex, NVIDIA announced that G-Sync was now supported on mobile devices, which allowed the laptop manufacturers to show off their designs. Naturally MSI was in the mix, and similar to the Tobii model, we get a GT72 variant showing Mobile G-Sync:

What became interesting was the discussion around G-Sync, because as it turns out Optimus (the ability to switch dynamically between discrete and integrated graphics) will not work with G-Sync at this time. To counter this, MSI has added a GPU button to the GT72 in use:

This allows the system to switch from integrated to discrete graphics and vice versa, though if I remember correctly a restart is required in between. G-Sync should still work in both modes, but turning off the discrete card should aid battery life significantly when in remote locations. For gaming there is also a turbo mode for the fans, exchanging noise for cooling; with these devices, especially on location, most gaming is done with headphones anyway, so having this option helps keep temperatures down.
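For those unfamiliar with why variable refresh matters, here is a simplified illustration (my own numbers and a deliberately naive model, not NVIDIA’s implementation) of frame delivery with fixed v-sync versus a G-Sync style display:

```python
# Simplified model: with fixed v-sync, a finished frame waits for the next
# refresh boundary of a 60 Hz panel; with variable refresh, the panel
# refreshes the moment the frame is ready.

import math

REFRESH_HZ = 60
PERIOD_MS = 1000.0 / REFRESH_HZ   # ~16.67 ms per fixed refresh

def vsync_display_times(render_times_ms):
    """Each frame becomes visible at the first refresh boundary after it finishes."""
    t, shown = 0.0, []
    for r in render_times_ms:
        t += r                                            # frame finishes here
        shown.append(math.ceil(t / PERIOD_MS) * PERIOD_MS)
    return shown

def gsync_display_times(render_times_ms):
    """Each frame becomes visible the moment it finishes rendering."""
    t, shown = 0.0, []
    for r in render_times_ms:
        t += r
        shown.append(t)
    return shown

frames = [20.0] * 6   # a steady 50 fps render rate on a 60 Hz panel
print(vsync_display_times(frames))   # snaps to 16.67 ms boundaries: uneven cadence
print(gsync_display_times(frames))   # even 20 ms cadence, no judder
```

Even at a perfectly steady 50 fps, the fixed-refresh panel occasionally doubles a frame interval, which is the judder G-Sync eliminates.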

Red GS60

Part of MSI’s Computex event was held in collaboration with Square Enix for the PC release of Final Fantasy Type-0. As an avid Final Fantasy fan, it was perhaps a bit of a shock to see Square Enix and MSI collaborate on a joint launch, but alongside the game MSI was showing off a full-red GS60 (as well as a silver GS70).

Given the vast array of black laptops we see (or blue ones from HP and color-choice options from Dell), it was good to see something a little different in a premium chassis with the gaming aesthetic. With most gaming keyboard backlights being red, MSI takes a different tack here and we get a white/blue-ish result. Personally I think it looks better than average, although something to rival the Dell XPS 13 for bezel design would be preferable. That being said, I can’t wait for Type-0 to arrive on Steam: I never played the PSP version, and I’m currently going through the older titles on Steam, 10-15 years after I first completed them, with higher resolution texture packs. Final Fantasy 8’s story finally makes sense now, because it certainly didn’t when I was 14!

All-in-Ones: The AX24 and G24 GE Nightblade Mi and Mini-PCs
