Mobile Data Rates – Hype v. Reality

On June 17th, the FCC announced its intention to fine AT&T Mobility $100M for allegedly failing to notify customers that their mobile data rates were throttled to about 500 Kbps after certain data usage thresholds (3 GB of 3G data / 5 GB of 4G data) were exceeded.  In this context, it was widely reported that typical non-throttled AT&T Mobility data rates range from 15 to 20 Mbps.
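
For concreteness, here is a minimal Python sketch of the throttling policy as described above: once a plan-dependent usage threshold is crossed, the subscriber drops to roughly 500 Kbps for the remainder of the billing cycle. The function and field names are hypothetical illustrations, not AT&T's actual implementation.

```python
# Hypothetical illustration of the throttling policy described in the FCC
# announcement; names and structure are invented for this sketch.
THROTTLE_RATE_KBPS = 500
THRESHOLD_GB = {"3G": 3, "4G": 5}  # usage thresholds per network technology

def effective_rate_kbps(network: str, usage_gb: float, normal_rate_kbps: int) -> int:
    """Data rate a subscriber sees for the rest of the billing cycle."""
    if usage_gb > THRESHOLD_GB[network]:
        return THROTTLE_RATE_KBPS  # throttled to ~500 Kbps regardless of plan speed
    return normal_rate_kbps

print(effective_rate_kbps("4G", 6.0, 20_000))  # 500   (6 GB exceeds the 5 GB threshold)
print(effective_rate_kbps("4G", 2.0, 20_000))  # 20000 (under threshold, unthrottled)
```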

Last week, on July 1, the website 9to5Mac reported that the “iPhone 6s”, expected to be announced by Apple in September, will utilize the Qualcomm 9X35 Gobi LTE Cat6 modem.  The 9to5Mac reporter wrote:

“For end users, the most important new feature from the chip will be the potential for up to 300 Mbps download speeds, doubling the 150 Mbps download speeds found in the current generation iPhone 6 lineup. The new chip has the same 50 Mbps upload speed limit, however, and real-world performance is likely to be closer to 225 Mbps or lower, depending on the cellular network.”

http://9to5mac.com/2015/07/01/phone-6s-twice-as-fast-better-battery/

This enormous 15X delta between the typical downlink (DL) bandwidth delivered by one of the leading mobile carriers (20 Mbps) and the peak download bandwidth hyped in the tech press (300 Mbps) naturally leads to confusion and misplaced consumer expectations.
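
Some quick arithmetic makes the gap tangible in terms a consumer actually cares about: how long it takes to download a 1 GB file.

```python
# Download time for a 1 GB file at the hyped peak rate vs. a typical rate.
FILE_SIZE_BITS = 8e9  # 1 GB = 8e9 bits

for label, rate_mbps in [("Hyped Cat6 peak", 300), ("Typical AT&T LTE", 20)]:
    seconds = FILE_SIZE_BITS / (rate_mbps * 1e6)
    print(f"{label}: {rate_mbps} Mbps -> {seconds:.0f} s per GB")

# Hyped Cat6 peak: 300 Mbps -> 27 s per GB
# Typical AT&T LTE: 20 Mbps -> 400 s per GB (nearly 7 minutes)
```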

Just what are the key assumptions that underpin the peak DL rates for 3GPP Category 6 (Cat6) User Equipment such as the Samsung Galaxy S6 (and expected in the iPhone 6s)?  And why does the AT&T Mobility LTE-A network typically deliver data rates 15X lower?

To achieve 300 Mbps of DL bandwidth, both the mobile service provider’s (such as AT&T Mobility’s) network and the subscriber’s smartphone (the Cat6 modem / digital baseband processor, the RF transceiver, and the RF front-end) must support so-called 2xCA, providing 40 MHz of aggregate RF channel bandwidth.  2xCA is two-component-carrier aggregation; in this case, inter-band aggregation of two 20 MHz carriers.  Furthermore, the provider’s radio base transceiver stations and the subscriber’s smartphone must support two-spatial-stream (2x2) LTE downlink MIMO.  Finally, the subscriber must be close enough to the base station that the RX signal-to-noise ratio can support 64-QAM modulation.
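
These assumptions can be sanity-checked with a back-of-the-envelope calculation from the LTE frame structure. The 25% overhead haircut below (control channels, reference signals, coding) is my rough approximation, not a spec value, but it lands close to the nominal Cat6 figure of roughly 300 Mbps:

```python
# Back-of-the-envelope Cat6 peak DL rate: 2x20 MHz CA, 2x2 MIMO, 64-QAM.
RBS_PER_20MHZ      = 100  # resource blocks in one 20 MHz carrier
SUBCARRIERS_PER_RB = 12
SYMBOLS_PER_MS     = 14   # OFDM symbols per 1 ms subframe (normal cyclic prefix)
BITS_PER_SYMBOL    = 6    # 64-QAM carries 6 bits per modulation symbol
LAYERS             = 2    # two spatial streams (2x2 MIMO)
CARRIERS           = 2    # 2xCA: two aggregated 20 MHz component carriers

raw = (RBS_PER_20MHZ * SUBCARRIERS_PER_RB * SYMBOLS_PER_MS
       * BITS_PER_SYMBOL * LAYERS * CARRIERS) / 1e3  # bits per ms -> Mbps
print(f"Raw PHY rate:        {raw:.1f} Mbps")         # 403.2 Mbps
print(f"After ~25% overhead: {raw * 0.75:.1f} Mbps")  # ~302 Mbps, the 'Cat6 300 Mbps'
```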

In AT&T Mobility’s case, its FDD LTE spectrum holdings in the US are band 2, band 4, and band 17, and it currently supports carrier aggregation totaling only 15 MHz, using its band 4 and band 17 assets in major metropolitan areas only.  This constrains the peak DL data rate to about 100 Mbps.  AT&T Mobility has not implemented two-spatial-stream (2 x 2 TX x RX) downlink MIMO in its wireless network infrastructure equipment, which further constrains the peak data rate to about 50 Mbps.  Finally, in typical real-world usage, smartphones fall back to 16-QAM, taking the downlink data rate to about 30 Mbps for the most advanced Cat6 models such as the Samsung Galaxy S6 and the upcoming Apple iPhone 6s.  Older Cat4 models such as the iPhone 6 / 6 Plus are typically constrained to about 15 Mbps of DL bandwidth.
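
Plugging each of these constraints into the same rough model shows how the peak collapses step by step. The linear scaling in bandwidth, spatial streams, and modulation order is a simplification, and the resulting estimates only bracket the round numbers above, but the trend is the point:

```python
# Rough peak DL rate under the constraints discussed above; the 0.75
# efficiency factor is the same overhead approximation used earlier.
def est_dl_mbps(bw_mhz: float, layers: int, bits_per_symbol: int) -> float:
    # 100 RBs per 20 MHz * 12 subcarriers * 14 symbols/ms, scaled by bandwidth
    res_elements_per_ms = (bw_mhz / 20.0) * 100 * 12 * 14
    return res_elements_per_ms * bits_per_symbol * layers * 0.75 / 1e3

print(f"Cat6 ideal (40 MHz, 2 streams, 64-QAM): {est_dl_mbps(40, 2, 6):.0f} Mbps")  # ~302
print(f"AT&T 15 MHz CA, 2 streams, 64-QAM:      {est_dl_mbps(15, 2, 6):.0f} Mbps")  # ~113
print(f"15 MHz CA, single stream, 64-QAM:       {est_dl_mbps(15, 1, 6):.0f} Mbps")  # ~57
print(f"15 MHz CA, single stream, 16-QAM:       {est_dl_mbps(15, 1, 4):.0f} Mbps")  # ~38
```

And these are still single-user peaks: in a loaded cell, the scheduler divides that capacity across all active subscribers, which pushes typical rates lower still.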

The tech press would do better, in my view, to throttle the hype and strive to educate the public by setting realistic mobile bandwidth expectations.  Credibility counts!