Demystifying the 4G Phenomenon: Part 1
Mixes of technologies and marketing messages have led to some confusion as to what actually constitutes 4G and Long Term Evolution (LTE) in the wireless world. Sorting out the differences within an overarching terminology can help developers make better choices.
by Todd Mersch, RadiSys
Every time I flip on the television, I learn that yet another mobile operator is providing "4G" service. Strangely, though, the International Telecommunication Union (ITU) recently acknowledged only two technologies that fit its IMT-Advanced (i.e. 4G) framework:


  • Mobile WiMAX 2, or IEEE 802.16m

  • 3GPP LTE Advanced – LTE Release 10, supporting both paired Frequency Division Duplex (FDD) and unpaired Time Division Duplex (TDD) spectrum

Unfortunately, they slapped an asterisk on this ruling, stating that both the forerunners of these technologies (i.e. LTE Rel-8 / Rel-9 and WiMAX 1.0) and enhancements to existing 3G networks that provide a "substantial level of improvement in performance and capabilities" can be called 4G as well.


Translation: all network upgrades are being marketed as 4G.
However, never fear. It is still possible to organize the nebulous 4G cloud into discrete technology blocks. That way, the next time an executive says, "We should have a 4G strategy," you can respond on the spot.
The Problem with Marketing

This section title itself could be the topic of a whole separate article, but that's not why we are here. The problem with "4G" marketing is that once one operator claims to have moved from 3G to 4G, everyone else needs a position as well. This is truly what opened the door to a broader technical designation from the ITU and, ultimately, to the current confusion. Table 1 outlines the various wireless network technologies with 4G claims.

From the summary above, a few initial items should jump out at you. First, each of these technologies leverages similar techniques at the air interface, including Multiple Input Multiple Output (MIMO) antenna systems, advanced modulation schemes and, except for High Speed Packet Access+ (HSPA+), Orthogonal Frequency Division Multiplexing (OFDM).
Secondly, there are two primary standards bodies driving the definitions: the Third Generation Partnership Project (3GPP) and the Institute of Electrical and Electronics Engineers (IEEE). In this case, the 3GPP is the incumbent body driving standards for mobile networks in licensed spectrum, while the IEEE is the new player, primarily with WiMAX (the 802.16 series) growing out of the previously unlicensed-spectrum world of WiFi (the 802.11 series). To date, the majority of operators have embraced LTE, and in the future LTE-Advanced (LTE-A), as their path to 4G, but WiMAX remains a significant technology and the foundation of the Sprint / Clearwire network in the US.
Finally, one operator’s 4G may end up slower than the others’! An operator takes a big risk claiming HSPA+ as “4G” if their competitors are deploying LTE, LTE-A, or WiMAX 2. However, time-to-market may be the most valuable parameter for that operator. Unlike the others, HSPA+ builds on top of the existing deployed 3G HSPA network, operating in the same spectrum and Core Network.
4G Building Blocks

As was pointed out above, the essential techniques leveraged to increase radio-interface throughput to 4G speeds are shared across the different approaches. The key technical aspects covered here include MIMO, OFDM and higher-order modulation schemes (e.g. 64QAM vs. 128QAM).


Diversity and MIMO

Diversity and MIMO are both based on leveraging multiple antennas at the base station and user equipment (UE): diversity increases the signal-to-interference-plus-noise ratio (SINR), while MIMO multiplies throughput within the same spectrum. Diversity refers to using multiple transmit and receive antennas at the base station and UE, respectively. It protects against fading by essentially choosing which antenna to transmit or receive on based on which has the best signal strength. There are two primary configurations. The first is called Transmit Diversity (TxD), in which there are at least two Tx (Tx = transmit) antennas at the base station. The second is termed TxD and Receive Diversity (RxD), where there are at least two Tx antennas at the base station and two Rx antennas at the UE. Using both TxD and RxD provides the greatest protection from fading as well as an option to co-phase beams to maximize SINR. Figure 1 illustrates TxD and RxD. All of the 4G technologies exploit both transmit and receive diversity to deliver better performance and capacity.
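To make the diversity gain concrete, here is a minimal Python sketch (illustrative only, not from any standard): with two receive branches, the receiver always listens on whichever antenna currently has the stronger signal. The Rayleigh fading model and sample counts are assumptions for demonstration.

```python
import numpy as np

# Toy selection diversity: with two receive branches (RxD), always use
# whichever antenna currently has the stronger signal. Rayleigh fading
# gives exponentially distributed received power on each branch.
rng = np.random.default_rng(0)
trials = 10_000
branch_power = rng.exponential(scale=1.0, size=(trials, 2))
single_db = 10 * np.log10(branch_power[:, 0])          # one fixed antenna
selected_db = 10 * np.log10(branch_power.max(axis=1))  # best of two branches
print(f"mean SNR, single antenna:     {single_db.mean():+.2f} dB")
print(f"mean SNR, 2-branch selection: {selected_db.mean():+.2f} dB")
```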


MIMO takes this a step further by exploiting multiple downlink (DL) and uplink (UL) antennas to transmit multiple parallel data streams (spatial multiplexing) on both the DL and UL, multiplying the throughput achieved within the same spectrum allocation. Each antenna transmits its own data stream on the same carrier; the streams are separated and recombined at the destination to provide the overall effective increase in throughput. Figure 2 shows a DL MIMO example.
The more antennas employed, the greater the multiplier. As shown in Table 1, MIMO is a key component in candidate 4G networks, with the most advanced, LTE-A and WiMAX 2, using up to eight DL and eight UL antennas.
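As a back-of-the-envelope illustration of that multiplier, the sketch below scales an assumed single-stream peak rate by the number of spatial streams. The 75 Mbit/s figure is an assumption chosen for demonstration, not a value from the article.

```python
# Hypothetical per-stream peak rate (assumed figure for illustration)
PER_STREAM_MBPS = 75.0

# Ideal spatial multiplexing multiplies throughput by the stream count;
# 8x8 is the maximum antenna configuration cited above for LTE-A / WiMAX 2.
for streams in (2, 4, 8):
    peak = streams * PER_STREAM_MBPS
    print(f"{streams}x{streams} MIMO peak: ~{peak:.0f} Mbit/s")
```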
OFDM and OFDMA

Orthogonal Frequency Division Multiplexing (OFDM) is not a brand-new concept; in fact, it was first introduced in 802.11a to deliver 54 Mbit/s performance on WiFi. At a high level, OFDM allows for the transmission of modulated carriers (i.e. carriers with voice, data, etc. applied to them) close together but orthogonal to one another. Because the modulated carriers are orthogonal, there is no longer a need to space them apart to avoid interference and to allow for filtering at the receiving end. The net result of using OFDM is more data transmitted over less spectrum: higher capacity with better spectral efficiency. Figure 3 illustrates how the radio interface has evolved from early mobile networks through 4G.
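The orthogonality claim is easy to verify numerically. In the minimal sketch below (the symbol size is an arbitrary assumption), subcarriers spaced by exactly one cycle per symbol period have zero cross-correlation, which is why they can overlap spectrally without interfering.

```python
import numpy as np

N = 64                  # samples per OFDM symbol (arbitrary illustrative size)
n = np.arange(N)

def subcarrier(k):
    """Complex exponential on subcarrier index k."""
    return np.exp(2j * np.pi * k * n / N)

# Inner product of a subcarrier with itself vs. with its neighbor
same  = abs(np.vdot(subcarrier(3), subcarrier(3))) / N
cross = abs(np.vdot(subcarrier(3), subcarrier(4))) / N
print(f"|<c3, c3>| = {same:.3f}  (unit energy)")
print(f"|<c3, c4>| = {cross:.3e}  (orthogonal: effectively zero)")
```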


All of the 4G technologies (barring HSPA+) use a form of OFDM. Orthogonal frequency division multiple access (OFDMA) is used by WiMAX 1 and 2 in both the DL and UL, while LTE and LTE-A use OFDMA in the DL and a modified form, Single Carrier-FDMA (SC-FDMA), in the UL. These are defined as follows, with a sketch comparing the two after the list:


  • Orthogonal Frequency Division Multiple Access (OFDMA) extends the concept of OFDM to support multiple users (i.e. multiple access). Multiple Access is achieved by assigning subsets of subcarriers to individual users.

  • Single-Carrier FDMA (SC-FDMA) is leveraged in the UL in LTE and LTE-Advanced because it provides a lower peak-to-average power ratio (PAPR), thanks to pre-processing (DFT precoding) of the transmit symbols. The lowered PAPR on the UL allows for greater transmit power efficiency and lower terminal costs. This does not come without a cost: performance is lower than with OFDMA.
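The PAPR difference can be demonstrated with a toy DFT-precoded transmitter. The sketch below is illustrative only: the QPSK alphabet, subcarrier count and FFT size are assumptions, and real LTE numerology is not modeled.

```python
import numpy as np

rng = np.random.default_rng(1)
N_SC, N_FFT, N_SYM = 64, 512, 2000   # assumed sizes, not LTE numerology

def papr_db(x):
    """Peak-to-average power ratio of a complex waveform, in dB."""
    p = np.abs(x) ** 2
    return 10 * np.log10(p.max() / p.mean())

def tx_waveform(symbols, dft_precode):
    # SC-FDMA spreads the symbols with a DFT before subcarrier mapping
    data = np.fft.fft(symbols) / np.sqrt(N_SC) if dft_precode else symbols
    grid = np.zeros(N_FFT, dtype=complex)
    grid[:N_SC] = data               # localized subcarrier mapping
    return np.fft.ifft(grid)

qpsk = (rng.choice([-1.0, 1.0], (N_SYM, N_SC)) +
        1j * rng.choice([-1.0, 1.0], (N_SYM, N_SC))) / np.sqrt(2)
ofdma  = [papr_db(tx_waveform(s, dft_precode=False)) for s in qpsk]
scfdma = [papr_db(tx_waveform(s, dft_precode=True))  for s in qpsk]
print(f"median PAPR, OFDMA:   {np.median(ofdma):.1f} dB")
print(f"median PAPR, SC-FDMA: {np.median(scfdma):.1f} dB")
```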

All the 4G technologies utilize increasing levels of quadrature amplitude modulation (QAM). QAM is both an analog and a digital modulation scheme. In its digital form, which is used in mobile broadband, it transmits multiple digital bit streams by changing (i.e. modulating) the amplitudes of two carriers that are 90 degrees out of phase with each other. This can be visualized as a grid. An increasing level of QAM (i.e. 16, 64, 128, etc.) indicates a denser grid and, hence, greater data throughput. However, higher-order schemes do result in a higher bit error rate, so 4G systems dynamically switch the modulation scheme based on the quality of the signal: if the signal is high quality, a higher-order scheme is used, and vice versa.
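A minimal sketch of that adaptation logic appears below. The SINR thresholds and the constellation set are invented for illustration; they do not come from any standard's modulation and coding tables.

```python
import math

# Hypothetical SINR-to-modulation table (thresholds assumed for illustration)
MOD_TABLE = [        # (minimum SINR in dB, constellation size M)
    (22.0, 256),
    (16.0, 64),
    (10.0, 16),
]

def pick_modulation(sinr_db):
    """Denser QAM grid when the channel is good; QPSK fallback at the edge."""
    for threshold, m in MOD_TABLE:
        if sinr_db >= threshold:
            return m
    return 4  # QPSK

for sinr in (25.0, 14.0, 3.0):
    m = pick_modulation(sinr)
    print(f"SINR {sinr:>4} dB -> {m}QAM, {int(math.log2(m))} bits/symbol")
```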


What Makes a Network Really 4G?

All 4G technologies use the same baseline techniques. However, LTE-Advanced and WiMAX 2 not only utilize OFDM, MIMO and higher-order modulation, but also stretch things further to deliver increased data rates and, most importantly, improved performance at the cell edge.


The cell edge is where performance degrades to its lowest point, due primarily to interference, but new techniques are being employed to deliver the best possible performance there. The ITU IMT-Advanced standard, which governs the 4G designation, levies significant performance requirements on the network that can only be achieved through certain approaches. These include carrier aggregation (both contiguous and non-contiguous), coordinated multi-point (CoMP) and intelligent relays.
The first advance is carrier aggregation, which refers to combining multiple carriers (i.e. channels) to deliver greater throughput. The carriers are combined at the physical layer, so they are transparent to the upper layers of the base station. The real advance in true 4G technologies, however, is the ability to aggregate non-contiguous carriers. The biggest challenge in delivering on the promise of ever-increasing data rates in mobile networks remains the scarcity of spectrum, and carrier aggregation requires a lot of it. By leveraging non-contiguous aggregation, operators can pool previously disparate frequencies and use them to deliver a ubiquitous service throughout their coverage area.
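The arithmetic is straightforward, as the sketch below shows with assumed numbers: several non-contiguous component carriers behave as one logical channel whose capacity scales with the pooled bandwidth. Both the band holdings and the spectral-efficiency figure are assumptions.

```python
# Assumed non-contiguous spectrum holdings (MHz) in different bands
component_carriers = {"700 MHz": 10, "1.8 GHz": 20, "2.6 GHz": 10}
MBPS_PER_MHZ = 7.5      # assumed spectral efficiency, for illustration only

total_mhz = sum(component_carriers.values())
print(f"aggregated bandwidth: {total_mhz} MHz across "
      f"{len(component_carriers)} non-contiguous carriers")
print(f"approximate peak rate: ~{total_mhz * MBPS_PER_MHZ:.0f} Mbit/s")
```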
Secondly, and very important to performance at the cell edge, is CoMP. CoMP allows a UE to receive from and transmit to multiple base stations at the same time, essentially turning the interference found at the cell edge into usable signal. Figure 4 illustrates the concept. Implementing CoMP requires significant coordination across the base stations and very fast, very detailed feedback on the channel properties. Finally, there are two modes of CoMP, with a toy illustration after the list:


  • Joint simultaneous transmission of user data from multiple base stations to a single UE

  • Dynamic cell selection, with data transmission from a single eNB (eNodeB, the LTE base station)
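The toy calculation below (received-power values are assumptions) shows the cell-edge effect of joint transmission: the neighbor cell's power counts as interference without CoMP, but adds to the useful signal with it.

```python
import math

# Assumed received powers (mW) at a cell-edge UE, for illustration only
SERVING, NEIGHBOR, NOISE = 1.0, 0.8, 0.1

def sinr_db(signal_mw, interference_mw):
    return 10 * math.log10(signal_mw / (interference_mw + NOISE))

# Without CoMP the neighbor is interference; with joint transmission, signal
print(f"no CoMP:            {sinr_db(SERVING, NEIGHBOR):.1f} dB")
print(f"joint transmission: {sinr_db(SERVING + NEIGHBOR, 0.0):.1f} dB")
```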

The last tool used in LTE-A and WiMAX 2 is the relay. The intelligent relays in these standards are, again, targeted at improving performance at the cell edge. A relay receives, demodulates and decodes the data; applies any error correction; and then re-transmits a new signal. Essentially, it presents a base station interface toward the UE while appearing as a UE toward the donor base station. From the UE's perspective, this moves the "serving" base station closer and allows for increased performance.
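The decode-and-forward behavior can be sketched with a toy repetition code. Everything here is illustrative (BPSK, a 3x repetition code and an arbitrary noise level, not the standards' actual coding): because the relay fully recovers the bits before re-transmitting, it sends a clean signal rather than amplifying the accumulated noise the way a simple repeater would.

```python
import numpy as np

rng = np.random.default_rng(2)
bits = rng.integers(0, 2, size=8)

# First hop: encode with a 3x repetition code, BPSK-modulate, add noise
tx = np.repeat(2 * bits - 1, 3).astype(float)
rx = tx + rng.normal(0.0, 0.8, size=tx.size)

# Relay: demodulate, apply error correction (majority vote), re-encode
hard = (rx > 0).astype(int)
decoded = (hard.reshape(-1, 3).sum(axis=1) >= 2).astype(int)
clean_retx = np.repeat(2 * decoded - 1, 3)   # noise-free re-transmission

print("original bits:", bits)
print("relay decoded:", decoded)
```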


These intelligent relays come in two primary flavors. Type 1 relays appear as a distinct base station to the UE: they have their own cell identity and transmit their own synchronization channels and reference symbols. Type 2 relays appear as an extension of the primary base station; in this case, control information is sent directly from the primary base station while the relay processes and concentrates the user data.
Type 2 relays will likely be deployed first as a method to increase the data capacity, while Type 1 will likely come later, increasing both the signaling and data capacity as the number of subscribers increases.
Don’t miss Part 2, which will more closely examine the candidates for true 4G and look at how they are poised to develop in the future.
RadiSys, Hillsboro, OR. (503) 615-1100. [www.radisys.com]

