Last year Qualcomm flew me out to their San Diego headquarters to talk about modems and transceivers, in what ended up being one of the biggest public disclosures they've made on the modem side. Since then the multimode LTE space has warmed up considerably: Intel is shipping its first multimode LTE product (XMM7160) in retail devices, NVIDIA's Tegra 4i with the first fruits of the Icera acquisition is nearing launch, Broadcom is showing off its multimode LTE products, and a few others (HiSilicon and Samsung) are making noise with multimode LTE products of their own. The modem is now a somewhat less veiled part of the mobile story, and especially now that the AP (CPU and GPU) side of the dialog has been fleshed out, there's been a surge in discussion about the modem as one of the major differentiators.

Qualcomm invited press and analysts out to a modem workshop this week, an attempt to create for modems an event roughly analogous to the blogger benchmarking events we've seen for APQ8064 and most recently MSM8974. Since benchmarking and testing modems is a lot more daunting than testing APs, Qualcomm set up a number of demos with comparisons to competitor products and also talked about its own capabilities. With the benchmarking events we usually have an opportunity to see final silicon before its arrival in a shipping device; unfortunately that wasn't the case here. We didn't catch any glimpses of the upcoming MDM9x25 Category 4 modem (although it's functionally the same as what's in MSM8974) or hear an announcement about its successor, nor were there any details about WTR1625L or its coprocessor, the upcoming successor to WTR1605L.

First Qualcomm showed modem throughput performance in both FDD (Frequency Division Duplex) and TDD (Time Division Duplex), each against a different competitor device. In the FDD scenario, two devices were connected to separate base station emulators (the same Anritsu MD8475A that was loaned to me for a while) running an LTE network on Band 3 with 10 MHz channels and AWGN (Additive White Gaussian Noise) set to two conditions, 27 dB and later a more challenging 14 dB. The devices under test were commercial devices with their names covered (although given the industrial designs it was immediately obvious what they really were), tethered over USB to a client. I believe the larger tablet was a Galaxy Tab 3 (10.1) with Intel's XMM7160 LTE solution, and the smaller one a Note 8 LTE with Qualcomm's MDM9x15. In both conditions, the Qualcomm-based solution sustained the throughput necessary to stream a 4K video without stuttering, while the Tab 3 paused a few times to re-buffer. I didn't catch the initial throughput, but saw that the 14 dB scenario showed Qualcomm's solution at around 16 Mbps and the other solution at 12 Mbps.

Update: I asked Qualcomm for specifics about the FDD-LTE throughput testing since I didn't catch all the details exactly the first time around. Apparently AWGN wasn't used; instead there was EVA 70 Hz (mobility) fading.

Scenario 1
- SNR: 27 dB
- 4K video: Landscape Scenery
- Fading: EVA 70 Hz

Results:
- QCOM: 40 Mbps
- Competitor: 28 Mbps

Scenario 2
- SNR: 14 dB
- 4K video: Sony Bravia
- Fading: EVA 70 Hz

Results:
- QCOM: 20 Mbps
- Competitor: 15 Mbps
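For rough context, the Shannon bound C = B · log2(1 + SNR) puts a ceiling on what a single-antenna link can carry in a 10 MHz channel at these SNRs. Real LTE links use MIMO and carry protocol overhead (and the demo used fading, not clean AWGN), so this is only a sanity check on the measured numbers, not a precise benchmark. A quick sketch:

```python
import math

def shannon_capacity_mbps(bandwidth_mhz, snr_db):
    """Shannon capacity C = B * log2(1 + SNR) for a single-stream
    AWGN channel -- an idealized ceiling, not what real modems hit."""
    snr_linear = 10 ** (snr_db / 10)
    return bandwidth_mhz * math.log2(1 + snr_linear)

# Measured Qualcomm throughputs from the two demo scenarios above.
for snr_db, measured_mbps in [(27, 40), (14, 20)]:
    bound = shannon_capacity_mbps(10, snr_db)
    print(f"{snr_db} dB SNR: single-stream Shannon bound ~{bound:.0f} Mbps, "
          f"measured {measured_mbps} Mbps")
```

The bound works out to roughly 90 Mbps at 27 dB and 47 Mbps at 14 dB, so the measured figures sit well inside the theoretical limit, as expected under mobility fading.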

Next was a comparison on a TDD-LTE network running Band 38 with 20 MHz channels. A similar test was run, although this time a WiFi hotspot with the competitor solution (I believe a Huawei with a Balong 710 inside) was pitted against the Qualcomm solution in what looked like a Galaxy S 4. The same video scenario was played, with the competitor stalling to rebuffer occasionally while the Qualcomm device played fine. I noted average throughputs of 38 Mbps on Qualcomm's demo and 30 Mbps on Huawei's.

Up next was a demo of voice call setup time on traditional 3G WCDMA versus 4G LTE with CSFB (circuit-switched fallback) to WCDMA. The point was to show that call setup over CSFB takes longer, but not appreciably so. Here two HTC Ones were cabled up, one with an LTE connection on Band 4 falling back to WCDMA on Band 2, the other on WCDMA on Band 2 throughout. The LTE (CSFB) call took 4.799 seconds to set up, while the WCDMA call took 3.989 seconds.
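Worked out, the demo's own numbers put the CSFB penalty at under a second:

```python
# CSFB call-setup overhead, computed from the times measured in the demo.
lte_csfb_setup_s = 4.799  # LTE call with circuit-switched fallback to WCDMA
wcdma_setup_s = 3.989     # native WCDMA call

overhead_s = lte_csfb_setup_s - wcdma_setup_s
overhead_pct = 100 * overhead_s / wcdma_setup_s
print(f"CSFB adds {overhead_s:.3f} s to call setup ({overhead_pct:.0f}% longer)")
```

That's 0.810 seconds of extra setup time, about 20% longer than the native WCDMA call, which is noticeable on a stopwatch but arguably not to a caller.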

A new feature Qualcomm showed off for the first time is a transmit antenna preprocessing function called TruSignal Uplink, which will apparently ship in a device shortly. This feature is designed to mitigate the kind of unintended attenuation we saw with the iPhone 4, specifically signal loss from physical impediments like a hand. Qualcomm is being tight-lipped about how the feature works, saying little more than that it is a kind of transmit diversity that doesn't require any network-side interaction. The demo showed a comparison between two test devices, both set up with 15 dB of loss and a file upload running on each. Although the demo was on WCDMA, I'm told there's no reason this uplink processing can't also apply to LTE.

Finally Qualcomm showed off power consumption comparisons between the same tablet from the FDD-LTE throughput comparison (likely a Tab 3 (10.1) LTE with XMM7160) and an LG G2 (which has MSM8974). This is no doubt a response to claims made by Intel that its solution is lower power than current competitors, although I suspect those claims might've been made in the context of the then-current MDM9x15 rather than the brand-new MSM8974 (and the soon-to-come MDM9x25 it shares an IP block with).

Regardless, the comparison looked at power consumption at a system level, measured at the battery terminals, similar to how we do it. The devices were off (screens dark) and connected to a 3.7 V power supply and a National Instruments data acquisition routine, and were shown first running an LTE data transfer, and later a 3G CS voice call.

In the LTE data transfer scenario, the Qualcomm solution drew an average of 184 mA, while the competitor drew 234 mA. On the voice call, Qualcomm pulled 115 mA versus 137 mA for the competitor.
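Since both platforms were fed from the same 3.7 V supply, the current figures convert directly into power draw and a relative savings percentage. A quick sketch using the numbers above:

```python
# Convert the measured average currents into power and relative savings.
SUPPLY_V = 3.7  # supply voltage used in the demo

def power_w(current_ma):
    """Power in watts at the demo's 3.7 V supply rail."""
    return SUPPLY_V * current_ma / 1000

scenarios = [("LTE data", 184, 234), ("3G CS voice", 115, 137)]
for label, qcom_ma, comp_ma in scenarios:
    q, c = power_w(qcom_ma), power_w(comp_ma)
    print(f"{label}: Qualcomm {q:.2f} W vs competitor {c:.2f} W "
          f"({100 * (c - q) / c:.0f}% lower)")
```

That works out to roughly a 21% system-level advantage on the LTE transfer and 16% on the voice call. Keep in mind these are whole-system figures measured at the battery terminals, so the delta includes more than the modem alone.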

It’s interesting to see modem discussion start to get serious now that more LTE competitors are entering a space once occupied almost solely by Qualcomm. I’d love to run some of these comparisons (and more) myself, and there’s a good chance we’ll be able to do so in the near future.


  • Khato - Friday, November 8, 2013 - link

    With respect to the power consumption comparison - how does a system level measurement do anything more than imply that one system is more efficient than the other? Were baseline idle measurements offered for each platform? Or are the numbers shown the delta versus idle already?

    While there's no question that the 'competitor' system used more power on the LTE test, given that the delta went from 22 mA to 50 mA, how do we know that such is attributable to the modem and not other system components?
  • Gondalf - Friday, November 8, 2013 - link

    Soooo, in short, Qualcomm is in a great embarrassment. XMM7160 is pretty competitive with the best, not-yet-shipped offerings. Obviously XMM7260 will be another big stone on the street. The worst thing is that the actual Intel modem seems better than the "actual" Qualcomm device.
    In these conditions, with the two companies neck and neck, only price will make the difference. IMO Intel will push on prices to gain market share fast.
  • name99 - Friday, November 8, 2013 - link

    I'd say Qualcomm are battling a fundamental psychological issue here --- and doing so badly.

    Qualcomm are a tech company, they are not Apple. They sell to engineers, and they sell on the basis of superior engineering. An event like this is supposed to maintain that image, that Qualcomm means superior engineering.
    The problem is: if you want to play that game, you HAVE to cater to the desires of your target audience. Which means you have to provide a constant stream of papers, public talks, white papers, etc describing and detailing the technology in your device. Engineers (and engineer wannabes and engineer fans) don't just want to know that you can hit a throughput of x Mbps under white gaussian noise of y dB --- they want to know what algorithm you're using, the micro-architecture of your device, what its flaws and limitations are, etc etc etc. If you're not willing to give them that, all these press junkets are just a waste of time.

    Or, to put it differently, what is Qualcomm's overall marketing strategy? If they are going for "Qualcomm inside" they need to come up with TV ads, a memorable four note sound phrase, stickers on Android phones, the whole Intel playbook. If they are going for wooing engineers, they need to do so by giving us ENGINEERING data, not canned demos.
  • Impulses - Friday, November 8, 2013 - link

    Qualcomm has already been doing TV ads and stickers on phones... Though the TV spots were limited, they were clearly geared at promoting the Qualcomm/Snapdragon brand (think it even included some sorta little CGI dragon). The stickers I've seen might be more of a regulatory or licensing thing, I've only seen them on CDMA devices I think and it looked more like an FCC badge than "Intel inside". Nonetheless, it's pretty clear Qualcomm's trying to push the brand while they lock up the majority of design wins over the last two years.
  • iwod - Friday, November 8, 2013 - link

    The price component breakdown for the mobile baseband receiver and SoC was an estimated $24 - $35. That is in comparison to $17 - $25 for the CPU/GPU SoC, and ~$40 for the display screen with glass.

    So basically it is the 3rd most expensive item on the BOM list, and it has risen from $10 - $15. There is a huge opportunity for cost savings for mobile phone makers if they can design and manufacture their own, as long as they have the volume. And yet those that could (well, two: Apple and Samsung) still don't do it themselves.

    I would really love to see an explanation for that.
  • lefty2 - Sunday, November 10, 2013 - link

    I'm just wondering is there any reason a Qualcomm modem can't be used with a Silvermont SoC? Does anyone know?
  • Shadowmaster625 - Monday, November 11, 2013 - link

    One of the pieces of equipment in my lab has a 386 controlling it. 16MHz I think.
  • Sesha_Giri - Wednesday, November 20, 2013 - link

    I am wondering where TruSignal Uplink will be useful; with the sizes phones are coming in, RF chip placement should be a usage-specific design.

    With small phones or low-end phones, would you want to keep this feature?
