Abstract
This document provides an overview of the key measurements required for testing GSM transceivers. GSM mobile performance derivation is also discussed. The overview is intended to help an RF designer with no GSM RF system knowledge get up to speed with the GSM system. Various systems are compared (GSM900, DCS1800, PCS1900, CT2, and DECT). Key design issues such as phase error and frequency error, TX output power, Adjacent Channel Power (ACP), spectral splatter and spurious are discussed.
A Basic Introduction to GSM
Before going into the mobile handset requirements, this chapter gives an overview of GSM (the Global System for Mobile communications).
The GSM system was specified by a team of experts who met regularly as part of a body known as the European Telecommunications Standards Institute (ETSI). GSM has now truly become a global system for mobile communications, spanning Europe, Asia, Africa and much of South America, among other regions.
After launch, GSM evolved into GSM900, DCS1800 (also known as PCN) and PCS1900 (in the USA).
PCN started in the UK, with Mercury One-2-One and Hutchison (Orange) offering the first two networks to use DCS1800. It has since spread to other areas worldwide.
2.1. Technology differences
In this section we discuss questions frequently asked by engineers who are new to GSM.
2.1.1. How is GSM different from CT2 and DECT?
GSM900 and DCS1800 are cellular systems, whereas DECT and CT2 are cordless systems. GSM (like AMPS and TACS) allows users to make and receive calls over a wide geographic area. The system uses a register to log the position of all mobiles, allowing calls to be routed to the correct base station.
DECT and CT2, like other cordless systems, do not include this tracking capability. They operate in much the same way as conventional domestic cordless phones (in which calls can only be received when the mobile is within range of the handset's base station, but not at other locations).
2.1.2. How are GSM900, DCS1800 and PCS1900 different?
GSM900 is the original GSM system. It uses frequencies in the 900MHz band (channels numbered 1 to 124), and is designed for wide area cellular operation, with maximum output powers of 1W to 8W being allowed for mobile applications. A GSM cell radius can measure 35km, or even up to 60km, depending on antenna pattern.
DCS1800 is an adaptation of GSM900, but centred at 1.8GHz, and has a wider frequency band, which allows it to cope with higher user densities. DCS1800 mobiles are also designed for lower output powers (up to 1W), so cell sizes are inherently smaller than GSM900 cells, measuring about 20km in radius (this can also vary depending on antenna pattern).
In all other respects GSM900 and DCS1800 are the same. The GSM Phase 2 specification was subsequently written to allow additional bandwidth and channels to be allocated to GSM900, forming what is now called extended band GSM (E-GSM). In addition, lower power control levels for mobiles were made possible in this specification, allowing micro-cell operation.
PCS1900 (also known as DCS1900) was designed in the USA to operate around 1.9GHz. It is essentially GSM technology at 1.9GHz.
2.2. A GSM CELL
The most visible part of the GSM cell is the base station and its antenna tower. It's common for several cells to be sectored around a common antenna tower. The tower will have several directional antennas, each covering a particular area. This co-location of several antennas is sometimes called a cell-site, or just a base station, or a base transceiver station (BTS).
All BTSs produce a broadcast channel (BCH) which is on all the time, and can be viewed as a lighthouse beacon. The BCH signal is received by all mobiles in the cell, whether they are on call or not, in order to:
- allow mobiles to find the GSM network
- allow the network to identify which BTS is closest to a given mobile
- allow coded information such as the network identity (e.g. Vodafone, Mannesmann, etc.) to be broadcast
- allow paging messages to be sent to any mobile needing to accept a phone call, along with a variety of other information
The frequency channel used by the BCH is different in each cell. Channels can only be re-used by distant cells, where the risk of interference is low.
Mobiles on a call use a Traffic Channel (TCH), which is a two way channel (known as an uplink and a downlink) used to exchange speech data between the mobile and base station. GSM separates the uplink and the downlink into separate frequency bands.
It's interesting to note that while the TCH uses a frequency channel in both the uplink and the downlink, the BCH occupies a channel in the downlink band only. The corresponding channel in the uplink is effectively left clear, and can be used by mobiles for unscheduled, or random, access. When a mobile wants to grab the attention of a base station (perhaps to make a call), it uses this clear frequency channel to send a random access channel (RACH) burst.
2.3. GSM modulation
GSM uses a digital modulation format called 0.3GMSK (Gaussian minimum shift keying). The 0.3 describes the bandwidth of the Gaussian filter in relation to the bit rate (the filter's bandwidth-bit period product, BT, is 0.3).
GMSK is a special type of digital FM modulation. 1's and 0's are represented by shifting the RF carrier by plus or minus 67.708kHz. Modulation techniques which use two frequencies to represent one and zero are denoted FSK (frequency shift keying). In the case of GSM, the data rate of 270.833kbit/s is chosen to be exactly four times the RF frequency shift. This has the effect of minimizing the modulation spectrum and improving channel efficiency. FSK modulation in which the bit rate is exactly four times the frequency shift is called MSK (minimum shift keying). In GSM, the modulation spectrum is further reduced by applying a Gaussian pre-modulation filter. This slows down the rapid frequency transitions, which would otherwise spread energy into adjacent channels.
0.3GMSK is not phase modulation (i.e. information is not conveyed by absolute phase states, as in QPSK for example). It's the frequency shift, or change of phase state, which conveys information. GMSK can be visualized on an I/Q diagram. Without the Gaussian filter, if a constant stream of 1's is being transmitted, MSK will effectively stay 67.708kHz above the carrier centre frequency. If the carrier centre frequency is taken as a stationary phase reference, the 67.708kHz signal will cause a steady increase in phase. The phase will roll through 360 degrees at a rate of 67,708 revolutions per second. In one bit period (1/270.833kHz), the phase will get a quarter of the way round the I/Q diagram, or 90 degrees. 1's are seen as a phase increase of 90 degrees. Two 1's cause a phase increase of 180 degrees, three 1's 270 degrees, and so on. 0's cause the same phase change in the opposite direction.
The exact phase trajectory is very tightly controlled. GSM radios need to use digital filters and I/Q or digital FM modulators to accurately generate the correct trajectory. The GSM specification allows no more than 5 degrees rms and 20 degrees peak deviation from the ideal trajectory.
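To make the 90-degrees-per-bit picture concrete, the short Python sketch below accumulates the ideal MSK phase for a sample bit stream. It deliberately omits the Gaussian filtering and the differential encoding used in a real GSM modulator, so it illustrates the principle rather than a GSM-compliant modulator.

```python
import numpy as np

# Ideal MSK phase trajectory (no Gaussian filter, no differential encoding).
# Each bit shifts the carrier by +/-67.708kHz for one bit period
# (1/270.833kHz), so the phase advances or retards by 90 degrees per bit.
BIT_RATE_HZ = 270.833e3
FREQ_SHIFT_HZ = BIT_RATE_HZ / 4                        # 67.708kHz
DEG_PER_BIT = 360.0 * FREQ_SHIFT_HZ / BIT_RATE_HZ      # 90 degrees

bits = np.array([1, 1, 0, 1, 0, 0, 0, 1])
steps = np.where(bits == 1, DEG_PER_BIT, -DEG_PER_BIT)
phase_deg = np.cumsum(steps)    # phase at the end of each bit period

print(phase_deg)                # [ 90. 180.  90. 180.  90.   0. -90.   0.]
```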
2.4. TDMA and FDMA
GSM uses TDMA (time division multiple access) and FDMA (frequency division multiple access). The frequencies are divided into two bands: the uplink is for mobile transmission and the downlink is for base station transmission. Each band is divided into 200kHz channels, each identified by an ARFCN (absolute radio frequency channel number). As well as slicing up frequency, GSM slices up time. Each ARFCN is shared between 8 mobiles, each using it in turn. Each mobile uses the ARFCN for one timeslot (TS) and then waits for its turn to come round again. The combination of a TS number and an ARFCN is called a physical channel.
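As an illustration of the ARFCN numbering, the sketch below maps a primary-band GSM900 ARFCN to its uplink and downlink carrier frequencies using the standard relation (uplink = 890MHz + 0.2MHz × ARFCN, downlink = uplink + 45MHz). E-GSM and DCS1800 use different numbering relations, which are not covered here.

```python
def gsm900_frequencies_mhz(arfcn: int) -> tuple:
    """Return (uplink_MHz, downlink_MHz) for a primary-band GSM900 ARFCN (1-124).

    Uses the standard GSM900 relation: F_uplink = 890 + 0.2 * ARFCN MHz,
    with a fixed 45MHz duplex spacing to the downlink.
    """
    if not 1 <= arfcn <= 124:
        raise ValueError("primary GSM900 ARFCN must be in the range 1-124")
    uplink = 890.0 + 0.2 * arfcn
    return uplink, uplink + 45.0

print(gsm900_frequencies_mhz(1))      # (890.2, 935.2)
print(gsm900_frequencies_mhz(124))    # (914.8, 959.8)
```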
2.5. GSM mobile Power control
As the mobile moves around the cell, its transmitter power needs to be varied. When it's close to the base station, power levels are set low to reduce the interference to other users. When the mobile is further from the base station, its power level is increased to overcome the increased path loss.
All GSM mobiles are able to control their power in 2dB steps as commanded by the base station.
2.6. Timing advance
Timing advance is required in GSM because it uses time division multiple access (TDMA). Since a radio signal can take a finite period of time to travel from the mobile to the base station, there must be some way to make sure the signal arrives at the base station at the correct time.
Without timing advance, the transmitted burst from a user at the edge of a cell would arrive late and corrupt the signal from a user right next to the base station (unless a guard time, between time slots, greater than the longest signal travel time was used). By advancing the timing of the mobiles, their transmissions arrive at the base station at the correct time. As a mobile (MS) moves, the base station (BTS) will signal the MS to reduce its timing advance as it gets closer to the centre of the cell, and increase its timing advance as it moves away from the centre of the cell.
2.7. GSM TDMA power burst
Since GSM is a TDMA system and there are 8 users on a frequency pair, each user must only turn on their transmitter at the allowed time, and have their transmitter off in time so that they don't interfere with other users in the adjacent timeslots.
GSM specifies both an amplitude envelope for the RF burst in a timeslot, and the flatness over the active (useful) part of the timeslot. The amplitude envelope has greater than 70dB of dynamic range, and the power must stay within ±1dB over the active part of the timeslot. All of this happens within the 577µs period of a timeslot.
2.8. What happens when a GSM Mobile is switched on
When a mobile first turns on, it searches all 124 channels in the downlink for signals. It then orders the channels by received signal strength and checks each to determine whether it is a BCH (broadcast channel). Once the MS finds a BCH, it adjusts its internal frequency and timing from the frequency correction channel (FCH) and synchronization channel (SCH), then checks whether the BCH belongs to its public land mobile network (PLMN). This involves comparing the allowed network and country codes stored on the SIM card with the information encoded on the BCCH. The mobile repeats this cycle until a good broadcast channel is found. If the mobile recognizes that it's in a different cell from the last time it was used, it needs to tell the network where it is. The network has to keep track of where every mobile is so that it can route calls to the correct cell for the particular mobile. This process of telling the network "here I am" is called a location update.
Once the mobile has synchronized to the basestation, determined that it's allowed to use the network, (and if necessary done a location update), it's camped. Once camped the mobile is ready to send or receive calls.
GSM transceiver measurements
The GSM standards define a radio communications system that works properly only if each component part operates within precise limits. Essentially, mobiles and base stations must transmit enough power, with sufficient fidelity to maintain a call of acceptable quality, without transmitting excessive power into the frequency channels and timeslots allocated to others. Similarly, receivers must have adequate sensitivity and selectivity to acquire and demodulate a low level signal.
GSM mobile transmitter and receiver measurements originate from the ETSI/3GPP standard 3GPP TS 05.05 V8.12.0, "Radio access network; Radio transmission and reception (Release 1999)".
This chapter gives an overview of some of the key transmitter and receiver measurements required for testing GSM mobiles in order to ensure they conform to the GSM standard.
3.1. Transmitters
Performance is critical in three areas: in channel, out of channel, and out of band.
In channel measurements determine the link quality seen by the user in question. Measurements include:
- Phase error and mean frequency error
- Mean transmitted RF carrier power
- Transmitted RF carrier power versus time
Out of channel measurements determine how much interference the user causes other GSM users. These include:
- Spectrum due to modulation and wideband noise
- Spectrum due to switching
- Tx and Rx band spurious
Out of band measurements determine how much interference the user causes other, non-GSM users of the radio spectrum (e.g. the military, police, aviation, etc.). All other spurious (harmonics, wideband, etc.) are included here.
3.1.1. Phase error and frequency error
Phase error is one of the parameters used in GSM to characterize modulation accuracy. Poor phase error usually indicates a problem with the I/Q baseband generator, filters, modulator or amplifier in the transmitter circuitry.
Frequency error measurements indicate poor synthesizer/phase lock loop performance (e.g. the synthesizer may not be settling quickly enough as it shifts frequency between transmissions). In a GSM system, poor frequency error can cause the target receiver to fail to gain lock to the transmitted signal, and the transmitter could also cause interference to other users.
To measure phase and frequency error, a test set samples the transmitted output of the device under test in order to capture the actual phase trajectory. This is then demodulated, and the ideal phase trajectory is derived mathematically. Subtracting one from the other gives an error signal. The mean gradient of this signal (phase/time) gives the frequency error. The variation of this signal is the phase error, and is expressed in terms of root mean square (rms) and peak. The figure below demonstrates this test procedure:
The figure below shows a measurement on one transmitted burst, and how it relates to the limits set by the GSM standard.
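As a complement to the figures, the following sketch illustrates the subtraction numerically: given measured and ideal phase trajectories (assumed unwrapped, one sample per bit), it removes the best-fit linear drift to obtain the frequency error and reports the rms and peak of the residual as phase error. The sampling and fitting choices here are illustrative assumptions, not the exact algorithm of any particular test set.

```python
import numpy as np

def phase_and_freq_error(measured_phase_deg, ideal_phase_deg,
                         bit_rate_hz=270.833e3):
    """Illustrative phase/frequency error calculation (one sample per bit).

    The error trajectory is measured minus ideal phase (degrees, unwrapped).
    Its mean gradient gives frequency error; the residual gives phase error.
    """
    err = np.asarray(measured_phase_deg) - np.asarray(ideal_phase_deg)
    t = np.arange(err.size) / bit_rate_hz
    slope, intercept = np.polyfit(t, err, 1)      # linear fit, degrees vs seconds
    freq_error_hz = slope / 360.0                 # degrees/second -> Hz
    residual = err - (slope * t + intercept)
    return {
        "freq_error_hz": freq_error_hz,
        "phase_error_rms_deg": float(np.sqrt(np.mean(residual ** 2))),
        "phase_error_peak_deg": float(np.max(np.abs(residual))),
    }
```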
3.1.2. Mean transmitted output power
GSM systems use dynamic power control to ensure that each link is maintained sufficiently with a minimum of power. This allows overall system interference to be kept to a minimum, and in the case of a MS, battery life is maximized.
Out of specification power measurements usually indicate a fault in the power amplifier circuitry, the calibration tables, or the power supply. GSM mean output power is measured during the useful part of the GSM burst. In performing this measurement, GSM test equipment derives the correct timing reference by demodulating the incoming signal, and gating over the useful part of the GSM burst.
3.1.3. Transmitted RF carrier power versus time
In GSM systems, transmitters must ramp up and down within the TDMA structure, to prevent adjacent timeslot interference. If transmitters turn on too slowly, data at the beginning of the burst might be lost, degrading link quality, and if they turn off too slowly, the user of the next time slot in the TDMA frame will experience interference.
This measurement is therefore done to assess the envelope of carrier power in the time domain against a prescribed mask. It also checks that the transmitter's turn off is complete. If a transmitter fails the transmitted RF carrier power versus time measurement, this usually indicates a problem with the unit's PA or power control loop.
3.1.4. Adjacent channel power (ACP)
ACP is defined by two measurements:
- Spectrum due to modulation and wideband noise
- Spectrum due to switching
These two measurements are usually grouped together and called "output RF spectrum" (ORFS).
3.1.4.1. Spectrum due to modulation and wideband noise
The modulation process in a transmitter causes the continuous wave carrier to spread spectrally. The "spectrum due to modulation and wideband noise" measurement is used to ensure that the modulation process does not cause excessive spread, as this would cause interference to adjacent channel users.
To perform this measurement, an analyzer is tuned to a spot frequency, and then time gated across part of the modulated burst. Power is then measured using this mode, and then the analyzer is retuned to the next frequency, or offset of interest. This process continues until all offsets are measured and checked against permissible limits. What results from this is a set of frequency vs. power points that define the spectrum of the signal, however, spectral components that result from the effect of bursting do not appear because the ramps are gated out.
The test limits for this measurement are expressed in dBc, so the first step of the measurement is to take a reading of the centre frequency to which the transmitter is tuned.
3.1.4.2. Spectrum due to switching
GSM transmitters ramp RF power rapidly. The "transmitted RF carrier power vs. time" measurement described earlier ensures that this process happens at the correct times and happens fast enough. However, if RF power is ramped too quickly, undesirable spectral components exist in the transmission. This measurement ensures that these components stay below the acceptable level.
To perform a spectrum due to switching measurement, the analyzer is tuned to each offset frequency of interest and the power is measured in zero span mode, with no time gating in this case.
3.1.5. Spurious measurements
These are necessary to ensure GSM transmitters do not put energy into the wrong parts of the spectrum, as this would cause interference to other users of the spectrum.
Conducted spurious are discussed here. These are measured by connecting the test set directly to the antenna connector of the MS. Measurements for this parameter include:
- Tx and Rx band spurious
- Cross band spurious
- Out of band spurious
3.1.5.1. GSM Tx and Rx band spurious
The Tx band spurious measurement relates to spurious that fall within the 880 - 915MHz GSM (mobile) Tx band.
The Rx band spurious measurement, however, is a measure of how much energy the transmitter puts into the Rx band (925 - 960MHz). This test ensures that Tx spurious don't "jam" or desensitize nearby receivers (this specification is based on a 1m average distance between mobiles).
A Rx band pass filter is usually used in front of the analyzer input when this measurement is performed, in order to attenuate the Tx band signal.
3.1.5.2. Cross band spurious (e.g. GSM900 into DCS1800)
In some countries GSM900 and DCS1800 systems co-exist. For this reason the ETSI 3GPP standards require specific cross band performance to ensure GSM transmitters put minimum energy into the DCS1800 band and vice versa.
3.1.5.3. Out of band spurious
The out of band spurious is a series of spectrum analyzer measurements over a large frequency range, from 100kHz through to 12.75GHz. The 3GPP standards include wideband spurious limits to which an MS must conform.
3.2. Receivers
This section defines and discusses some of the key receiver performance parameters used in GSM receiver definition.
3.2.1. Sensitivity
Sensitivity is the fundamental measure of receiver performance. It specifies the minimum signal level for a specified percentage of errors in the demodulated information. The reported value for all receiver measurements is BER (bit error rate) or a variation thereof namely:
- FER (frame erasure rate): the percentage of erased frames compared to the total number of frames sent during an observation period
- RBER (residual bit error rate): when frames are erased, only the BER of the remaining (non-erased) frames is measured; this is the RBER
BER is the ratio of bits received erroneously to the total number of bits received. It is measured as follows. The test system outputs a signal carrying a known bit pattern (usually a pseudo-random bit sequence, PRBS). PRBS signals are usually labeled PNx, where x is the order of the sequence (e.g. a PN9 sequence is 2^9 - 1 = 511 bits long).
The receiver under test then attempts to demodulate and decode this pattern and via a return path (using a method known as loop back), sends the resultant bits back to the test system for comparison. The test system then calculates the required metrics. GSM handsets are tested using this loopback method.
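As a rough illustration of the PRBS/BER idea, the sketch below generates a PN9 sequence with a shift register and counts bit errors against a received copy. The generator polynomial x^9 + x^5 + 1 and the bit ordering are assumptions made for illustration; a real test set's PRBS generator and loopback framing may differ.

```python
def pn9(length, seed=0x1FF):
    """Generate `length` bits of a PN9 sequence (511-bit period).

    Assumes the common generator polynomial x^9 + x^5 + 1; check the
    polynomial your test equipment actually uses.
    """
    state = seed & 0x1FF
    bits = []
    for _ in range(length):
        newbit = ((state >> 8) ^ (state >> 4)) & 1   # taps at stages 9 and 5
        bits.append(state & 1)
        state = ((state << 1) | newbit) & 0x1FF
    return bits

def ber(sent, received):
    """Bit error rate: erroneous bits / total bits compared."""
    errors = sum(s != r for s, r in zip(sent, received))
    return errors / min(len(sent), len(received))

tx = pn9(511)
rx = tx.copy()
rx[10] ^= 1                 # inject a single bit error
print(ber(tx, rx))          # 1/511, about 0.00196
```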
3.2.2. Co-channel rejection
Most receivers are required to maintain a specified BER in the presence of an interfering signal within the channel. For GSM this parameter is measured as follows:
- the wanted signal is set 20dB above the reference sensitivity level
- fading is applied
- a GMSK modulated interferer is used
A digitally modulated wanted signal is set 20dB above receiver sensitivity, at the centre of the receiver's passband, and combined with a GMSK modulated interferer (on the same frequency as the wanted signal) and a fading profile. The combined signal is then injected into the antenna port of the receiver. The power level of the interfering signal is set to the specified nominal level, at which the BER of the receiver must not exceed the limit used for the receiver sensitivity specification. The difference in power levels between the two signals is the interference ratio.
3.2.3. Receiver Blocking
This parameter constitutes one of the out of channel receiver tests. Blocking tests verify correct receiver operation in the presence of out of channel signals and monitor the receiver's susceptibility to internally generated spurious responses. Three key tests define a receiver's blocking performance:
- Spurious immunity
- Intermodulation distortion
- Adjacent channel selectivity
3.2.3.1. Spurious immunity
This is the ability of the receiver to prevent single, out of channel, interfering signals from causing an undesired in channel response at the output of the receiver. Spurious may be generated within the receiver from power supply harmonics, system clock harmonics, or LO spurious.
3.2.3.2. Intermodulation immunity
This is a measure of the receiver's performance in the presence of distortion products that are generated when more than one tone is present at the input of the receiver and mixes non-linearly to form third order intermodulation products, which lie within the passband of the receiver.
3.2.3.3. Adjacent channel selectivity
This is a measure of the receiver's ability to process the desired modulated signal in the presence of a strong signal in the adjacent channel. Alternate channel selectivity is a similar test in which the interfering signal is two RF channels away from the passband of the receiver.
GSM mobile RF transceiver derivation
This analysis is split into three main sections, namely:
- Receiver analysis
- Transmitter analysis
- LO phase noise requirements analysis
4.1. Receiver analysis
4.1.1. Rx Noise figure / sensitivity
Receiver sensitivity is related to receiver NF according to the relation:
Sensitivity, S = -174 + 10log(Bi) + S/N + Gimp + NF ..........[eq. 1]
Where:
Bi = receiver bandwidth ( = 180KHz for GSM)
S/N = baseband signal to noise ratio
Gimp = RF and BB implementation gain
The GSM standard specifies a minimum sensitivity requirement of -102dBm. Given a worst case baseband S/N ratio of 9dB (the level required by a given baseband chipset to correctly decode a received signal) and a 2dB implementation margin, rearranging equation [1] gives the worst case NF for this receiver as:
NF = S + 174 - 10log(Bi) - S/N - Gimp
= -102 + 174 - 10log(180,000) - 9 - 2
≈ 8.5 dB
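A short sketch of this budget, using the numbers above (the -102dBm sensitivity is the GSM requirement; the 9dB S/N and 2dB implementation margin are the example values assumed in the text):

```python
import math

def worst_case_nf_db(sensitivity_dbm=-102.0, bw_hz=180e3,
                     snr_db=9.0, g_imp_db=2.0):
    """Rearranged form of eq. [1]: NF = S + 174 - 10log(Bi) - S/N - Gimp."""
    return sensitivity_dbm + 174.0 - 10.0 * math.log10(bw_hz) - snr_db - g_imp_db

print(worst_case_nf_db())   # ~8.45dB, i.e. the ~8.5dB worst case NF quoted above
```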
Given this worst case NF, the receiver designer can then investigate various front end gain and NF partition options according to the equation:
NF = 10log(F) = 10log[F1 + (F2 - 1)/G1 + (F3 - 1)/(G1·G2) + ...] ..........[eq. 2]
Where F is the receiver noise factor, Fi is the noise factor of the i'th block in the partition, and Gi is the gain of the i'th block (i = 1, 2, 3, ...)
Although equation 2 shows that the higher the gain of the first active stage, the lower the NF of the system would be, the receiver designer needs to ensure that the first active stage does not compress the stages after it, as this would degrade receiver linearity.
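A minimal cascade (Friis) calculation following eq. [2] is sketched below. The three stages and their gain/NF numbers are purely hypothetical examples, not a recommended partition:

```python
import math

def cascade_nf_db(stages):
    """Cascade noise figure per eq. [2]. `stages` is a list of
    (gain_dB, nf_dB) tuples in signal-flow order."""
    f_total = 0.0
    g_product = 1.0
    for i, (gain_db, nf_db) in enumerate(stages):
        f = 10 ** (nf_db / 10.0)
        f_total = f if i == 0 else f_total + (f - 1.0) / g_product
        g_product *= 10 ** (gain_db / 10.0)
    return 10.0 * math.log10(f_total)

# Hypothetical front end: switch+filter loss, LNA, mixer
stages = [(-3.5, 3.5), (15.0, 1.8), (8.0, 10.0)]
print(round(cascade_nf_db(stages), 2))   # ~6.0 dB for these example numbers
```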
This shows that system sensitivity is a compromise between receiver NF (dominated by the choice of front end components) and receiver linearity. Two receiver front end options are typically investigated: a single front end LNA (Option 1), or two cascaded LNA stages (Option 2).
The main benefit of Option 2 compared with Option 1 is that in Option 2 the individual LNA noise figure and gain requirements are significantly relaxed, whereas in Option 1 the single front end LNA would need to be tightly specified in order to achieve the same system NF.
The main disadvantage of Option 2 is typically increased cost and, potentially, the extra supply current required by the addition of a second LNA.
4.1.2. Rx blocking analysis
| Frequency band | MS blocking signal level | Description |
|---|---|---|
| 600kHz ≤ |f-fo| < 800kHz | -43dBm | In band blocking |
| 800kHz ≤ |f-fo| < 1.6MHz | -43dBm | |
| 1.6MHz ≤ |f-fo| < 3MHz | -33dBm | |
| 3MHz ≤ |f-fo| | -23dBm | |
| 900 - 915MHz | -5dBm | Out of band blocking |
| 0.1 - <915MHz and 980 - 12750MHz | 0dBm | |
A GSM receiver designer specifies a receive strip's compression points based on the above listed in band blocking signal level specifications, and uses the out of band blocking signal levels to define the filter rejection specifications, in order to avoid signal path compression.
For example, the in band blocking level at 3MHz offset (i.e. -23dBm) sets the compression point required for the front end. Assuming a 1dB loss switch and a 2.5dB loss filter prior to the LNA stage in the receive strip places a total of 3.5dB of loss ahead of the LNA. This means the LNA compression point must, in the worst case, handle -26.5dBm (i.e. -23dBm - 3.5dB).
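The arithmetic in this worst case example can be captured in a couple of lines; the 1dB switch and 2.5dB filter losses are the example values assumed in the text:

```python
def lna_blocker_level_dbm(blocker_dbm=-23.0, losses_db=(1.0, 2.5)):
    """Worst-case blocker level reaching the LNA input: the 3MHz in band
    blocking level minus the switch and filter losses ahead of the LNA."""
    return blocker_dbm - sum(losses_db)

print(lna_blocker_level_dbm())   # -26.5 dBm
```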
4.1.3. Rx intermodulation
GSM Receiver intermodulation performance is predominantly affected by the front end circuitry, if the IF filter chosen has good enough attenuation at ±800KHz and ±1600KHz (the offset frequencies for which this parameter is tested as specified by the GSM 05.05 standard).
The equation typically used to determine system IP3 requirement is:
IP3(min) = Pi + (Pi - Pu + C/I)/2
Where Pi = interfering signal level = -49dBm (from the GSM 05.05 spec)
Pu = useful signal level = GSM sensitivity level + 3dB = -102 + 3 = -99dBm
C/I = carrier to interference ratio for which the receiver is designed
For example, for an 8dB C/I the minimum GSM receiver input intercept point is given by: IP3(min) = -49 + (-49 + 99 + 8)/2 = -49 + 29 = -20dBm
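The same relation as a one-line function (the -49dBm interferer and -99dBm wanted levels come from the text; the 8dB C/I is the example design value):

```python
def rx_iip3_min_dbm(p_int_dbm=-49.0, p_wanted_dbm=-99.0, c_over_i_db=8.0):
    """Minimum input IP3 from IP3(min) = Pi + (Pi - Pu + C/I)/2."""
    return p_int_dbm + (p_int_dbm - p_wanted_dbm + c_over_i_db) / 2.0

print(rx_iip3_min_dbm())   # -20.0 dBm
```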
4.2. LO phase noise requirements analysis
Oscillator phase noise specification is one of the most critical parts of GSM receiver design. In general, the system designer puts the minimum permissible specification on the channel synthesizer, and then specifies all other VCOs in the system to be significantly better (e.g. 8 ~ 10dB better). Doing this ensures that the channel synthesizer's VCO phase noise dominates over any other VCO in affecting the system performance.
4.2.1. GSM RF VCO phase noise derivation
The "far out" RF VCO phase noise (i.e. phase noise out of the loop bandwidth) is determined by four main factors defined in the GSM 05.05 specification. These are:
- Modulation spectrum (see section 5.2.1.1 for 05.05 spec.)
- Spurious emissions (see section 5.2.1.2 for 05.05 spec.)
- Receiver blocking (see section 5.2.1.3 for 05.05 spec.)
- Adjacent channel performance (see section 5.2.1.4 for 05.05 spec.)
The close in phase noise of the oscillators will contribute directly to the system noise figure, by adding noise inside the system bandwidth. Some of the above specifications (in particular those related to the far out phase noise) overlap in frequency. The system designer therefore designs the VCO phase noise to meet the hardest requirement at a given offset (with margin), which then guarantees that all other requirements at the given offset are met.
The worst case phase noise needed to meet each performance requirement can then be compiled to give the final channel synthesizer VCO phase noise requirement.
The next sections show examples of how the VCO phase noise requirements are derived to meet the spectrum due to modulation and the transmit noise in receive band specifications.
4.2.1.1. Modulation Spectrum [GSM05.05 paragraph]
05.05 provides the specification in dBc per measurement bandwidth (dBc/BW), whereas VCO phase noise is specified in dBc/Hz. To convert dBc/BW to dBc/Hz, the formula used is:
Phase noise (dBc/Hz) = -[ |dBc value specified in 05.05| + 10log(bandwidth specified in 05.05) ] ..........[eq. 3]
For example, at 200kHz offset the specification for spectrum due to modulation is -30dBc/30kHz. According to eq. [3], this translates to a worst case VCO phase noise requirement at 200kHz offset of:
= -[30 + 10log(30,000)] dBc/Hz
≈ -75dBc/Hz
The above method is used to calculate the minimum VCO phase noise required to meet the spectrum due to modulation specification.
| Offset frequency | dBc/BW | Derived phase noise (dBc/Hz) |
|---|---|---|
| ±200kHz | -30dBc/30kHz | -75 |
| ±250kHz | -33dBc/30kHz | -78 |
| ±400kHz | -60dBc/30kHz | -105 |
| ±600 - 1200kHz | -60dBc/30kHz | -105 |
| ±1200 - 1800kHz | -60dBc/30kHz | -105 |
| ±1800 - 3000kHz | -63dBc/100kHz | -113 |
| ±3000 - 6000kHz | -65dBc/100kHz | -115 |
| > ±6000kHz | -71dBc/100kHz | -121 |
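The derived phase noise column above can be reproduced with a small conversion helper implementing eq. [3]:

```python
import math

def dbc_per_hz(dbc_in_bw, bw_hz):
    """Convert a 05.05 limit given in dBc per measurement bandwidth
    to an equivalent phase noise density in dBc/Hz (eq. [3])."""
    return dbc_in_bw - 10.0 * math.log10(bw_hz)

# Spectrum due to modulation at 200kHz offset: -30dBc in 30kHz
print(round(dbc_per_hz(-30.0, 30e3)))    # -75 (dBc/Hz)
print(round(dbc_per_hz(-63.0, 100e3)))   # -113 (dBc/Hz), the 1800 - 3000kHz row
```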
4.2.1.2. Spurious emissions
| Offset frequency | 05.05 specification (per measurement BW) |
|---|---|
| > ±1.8MHz | -30dBc/30kHz (in band) |
| > ±6MHz | -33dBc/100kHz (in band) |
| > ±2MHz | -60dBc/30kHz (out of band) |
| > ±5MHz | -60dBc/100kHz (out of band) |
| > ±10MHz | -60dBm/300kHz (out of band) |
| > ±20MHz | -63dBm/1MHz (out of band) |
| > ±30MHz | -65dBc/3MHz (out of band) |
| 10 ~ 20MHz (i.e. 925 ~ 935MHz) | -67dBm/100kHz (out of band) |
| > 20MHz (i.e. 935 ~ 960MHz) | -79dBm/100kHz (out of band) |
To calculate the phase noise required to meet the transmit noise in the receive band (i.e. 925 ~ 960MHz), relative to a 33dBm Tx output power, the following calculation is done:
- For 925 ~ 935MHz
The specification is -67dBm in a 100kHz BW, which translates to -117dBm/Hz. Relative to the 33dBm Tx power, the required VCO phase noise in the 925 ~ 935MHz band is therefore -(117 + 33)dBc/Hz = -150dBc/Hz.
Similarly
- For 935 ~ 960MHz
The specification is -79dBm in a 100kHz BW, which translates to -129dBm/Hz, or -162dBc/Hz relative to the 33dBm Tx power.
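Both figures follow from the same conversion, sketched below (the 33dBm Tx output power is the value used in the calculations above):

```python
import math

def tx_noise_dbc_per_hz(limit_dbm_in_bw, bw_hz, tx_power_dbm=33.0):
    """Convert an absolute transmit noise limit (dBm in a measurement BW)
    into a phase noise requirement in dBc/Hz relative to the Tx carrier."""
    return limit_dbm_in_bw - 10.0 * math.log10(bw_hz) - tx_power_dbm

print(round(tx_noise_dbc_per_hz(-67.0, 100e3)))   # -150 dBc/Hz (925 - 935MHz)
print(round(tx_noise_dbc_per_hz(-79.0, 100e3)))   # -162 dBc/Hz (935 - 960MHz)
```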
Conclusion
This article can be used by a new GSM system designer to get a better appreciation of GSM module specifications and how they can affect system performance.
Maxim Integrated offers world-leading, integrated solutions that combine GSM transceiver technology with other technologies, such as WCDMA, GPRS, and EDGE. Our solutions offer the lowest current consumption and smallest size.
Most of these solutions have been published on the Maxim website or have been announced as future products in Maxim's wireless design guides, also on the website.
A similar version of this article appeared in the June, 2003 issue of RF Design magazine.