Huawei’s GPU Turbo: Valid Technology with Overzealous Marketing
by Ian Cutress & Andrei Frumusanu on September 4, 2018 9:00 AM EST
One of the biggest announcements from Huawei this year is that of its new GPU Turbo technology. The claim that it could provide more performance at less power, without a hardware change, gave us quite a bit of pause. Internally, more than a few eyebrows were raised. As part of our discussions with Huawei at IFA this year, as well as some preliminary testing, we now have a base understanding of the technology, along with additional insight into some of the marketing tactics – not all of which are the most honest representations of the new feature.
GPU Turbo: A Timeline
GPU Turbo is billed as a new mechanism that promises significant performance and power improvements on both new and existing devices. The new ‘technology’ was first introduced in early June with the Chinese release of the Honor Play, and will be updated to a version ‘2.0’ with the launch of EMUI 9.0 later in the year.
Over the next few months, Huawei plans to roll the technology out to all of its mainstream devices, as well as working back through its catalogue. Huawei promises that all devices, irrespective of hardware, will be able to benefit.
|GPU Turbo Rollout||
|:---|:---|
||Mate 10 Pro, Mate 10 Porsche Design, Mate RS Porsche Design, Honor 9 lite, Mate 10 lite|
|September|Honor View 10, Mate 9 Pro|
|November|Honor 8 Pro|
In the weeks following the release of GPU Turbo on the first few devices, we saw quite a lot of hype and marketing effort from Honor and Huawei with the goal of promoting GPU Turbo. Across all the presentations, press releases, promoted articles, and fuzzy press analysis, one important thing was consistently missing: a technical explanation of what GPU Turbo actually is and how it works. Everything was about results, and nothing was about details. At AnandTech, it is the details that inform our understanding of whether a new feature is genuine or not.
Huawei, to its credit, did try to reach out to us, but neither the company nor its PR team ever really responded when we asked for a more technical briefing on the new mechanism. We’re not sure of the reason for this, as historically the company has often been open to technical discussions. On the plus side, at this year’s IFA we finally had the chance to meet with a team of Huawei’s hardware and software engineers and managers.
Through these discussions, we obtained detailed explanations that finally made sense of the past months’ marketing claims. The upside is that we now have a better understanding of what GPU Turbo actually does (and it makes sense), although it also consigns a chunk of the marketing slides to the ignore pile.
In this first piece on GPU Turbo, we’re going to go through the story of the feature in stages. First, we’ll look at Huawei’s initial claims about the technology, specifically the numbers. Second, we’ll go deeper into what GPU Turbo actually does. Third, we’ll examine the devices we do have with the technology, to see what differences we can observe. Finally, we’ll address the marketing, which really needs to be scrutinized.
It should be noted that, time and resources permitting, we want to go deeper into GPU Turbo. With the upcoming launch of the Mate 20 and the new Kirin 980 SoC inside it, we will want to do a more detailed analysis with more data. This is only the beginning of the GPU Turbo story.
Comments
Ian Cutress - Tuesday, September 4, 2018
In the past, those 'cheats' were often from not rendering parts of the scene. This is still doing the full render that any Mali GPU does, but in a more power-efficient way. The key to benchmarking is to test across several titles regardless, which is going to be important moving forward.
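Ian's point about testing across several titles can be made concrete with a sketch (illustrative function and data, not AnandTech's actual methodology): aggregating per-title FPS with a geometric mean, so that one suspiciously "optimized" title cannot dominate the overall score the way it would in a plain average.

```python
import math

def aggregate_fps(per_title_fps):
    """Combine per-title FPS results with a geometric mean; a single
    inflated title lifts the score far less than it would in an
    arithmetic mean."""
    logs = [math.log(fps) for fps in per_title_fps.values()]
    return math.exp(sum(logs) / len(logs))

# Hypothetical results: one title is suspiciously fast.
results = {"Title A": 30.0, "Title B": 35.0, "Title C": 120.0}
score = aggregate_fps(results)  # ~50 FPS, vs an arithmetic mean of ~62
```

The geometric mean is a common choice for cross-workload aggregates precisely because a vendor tuning for one benchmark gains little overall.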
Manch - Wednesday, September 5, 2018
Does Mali, or any mobile GPU, do culling of unseen objects? If not, can that be implemented to further reduce load?
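For reference on Manch's question: one basic form of this is back-face culling, which skips triangles whose normals point away from the viewer. A minimal sketch of the idea (purely illustrative; not how Mali or any specific GPU implements it, and real hardware does this in the rasterizer using screen-space winding order):

```python
def cross(a, b):
    # Cross product of two 3D vectors.
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def sub(a, b):
    return tuple(x - y for x, y in zip(a, b))

def is_front_facing(tri, view_dir):
    """A triangle faces the viewer if its normal points against the
    view direction (negative dot product)."""
    normal = cross(sub(tri[1], tri[0]), sub(tri[2], tri[0]))
    return dot(normal, view_dir) < 0

def cull_back_faces(triangles, view_dir):
    # Keep only triangles that could be visible from this direction;
    # roughly half the geometry of a closed mesh is discarded.
    return [t for t in triangles if is_front_facing(t, view_dir)]
```

For example, with a camera looking down -z, a counter-clockwise-wound triangle in the xy-plane survives while its reverse-wound twin is culled.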
The Hardcard - Tuesday, September 4, 2018
That isn’t a quandary, it solves the problem. The problem before was that the makers showed benchmark performance that they didn’t feel the device could handle in normal user apps. If this pans out and users can have it in everyday apps, it means no harm, no foul.
Having it be a special mode for apps that can use it, while turning it off when it is not necessary is exactly what is needed and what everyone is trying to do and should do.
If they do it properly, then it is on the developers to use it. Sure, older, unupdated apps will be left behind. That is the nature of advancing technology.
melgross - Tuesday, September 4, 2018
A benchmark cheat is just for benchmarks. There’s a reason for that: the SoC, and the device as a whole, can’t perform at that level commercially, otherwise something negative will happen, such as overheating or battery failure.
So, no, they can’t extend cheating to regular apps, and that’s the entire point to the cheat. If they could, then they would, and it wouldn’t be a cheat. This cheating is different from the turbo mode the article is about.
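melgross's point follows from a simple steady-state heat budget: at equilibrium, the device cannot draw more power than its chassis can dissipate at an acceptable skin temperature. A back-of-envelope sketch, with entirely illustrative numbers (the conductance and temperature limits are assumptions, not measured phone data):

```python
def sustainable_power_w(t_skin_limit_c, t_ambient_c, k_w_per_c):
    """Steady-state power budget: heat shed by the chassis equals
    conductance times the temperature delta over ambient."""
    return k_w_per_c * (t_skin_limit_c - t_ambient_c)

# Assumed values: ~0.25 W/degC effective chassis conductance,
# 43 degC skin-temperature limit, 25 degC ambient.
budget = sustainable_power_w(43.0, 25.0, 0.25)  # 4.5 W sustainable
```

A short GPU burst can exceed this budget by soaking heat into the chassis's thermal mass, which is exactly why a benchmark-length run can hit clocks that an hour-long gaming session cannot.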
s.yu - Monday, September 10, 2018
The only way this works is the apparent popularity of MMO games. They only plan on catering to low-end customers who only play whatever "everybody else" plays. I for one avoid them like the plague; IAP-rigged games are cheap stimulation, too cheap.
tipoo - Tuesday, September 4, 2018
Reminds me of the good old ATI vs Nvidia days when there were notable differences in render quality, usually with the edge to ATI. That all but went away at least as far back as the 8800, maybe before. Now for mobile to repeat that process.
Ian Cutress - Tuesday, September 4, 2018
Just to make sure you're aware, that's kind of orthogonal to GPU Turbo. It's Mali behaviour right now, which explains some of the perf differences, but GPU Turbo is something separate.
Lord of the Bored - Wednesday, September 5, 2018
Not ALWAYS to ATI, though. Sometimes they got a little aggressive in their "optimizations" too.
QUACK3 NEVER FORGET!
https://techreport.com/review/3089/how-ati-drivers... for the kiddos that never saw this one. Back when men were men, and PC gaming was the exclusive domain of nerds that knew what IRQ and DMA meant (but probably not PCMCIA. No one could remember PCMCIA).
Holliday75 - Friday, September 7, 2018
I recently found a PCMCIA 10mb NIC in one of my file cabinets and a 28.8k modem. I looked at them a second like wtf, then remembered what they were.
nils_ - Friday, September 7, 2018
People Can't Memorize Computer Industry Acronyms