Signal Statistics for OFDM

Here is something that recently piqued my interest (and kept me busy for the better part of last week). It could be (marginally) interesting for people who work on OFDM or Nyquist WDM and – as far as I know – no one has looked at this in more detail. In both OFDM and Nyquist WDM / OTDM$^1$, each sample of the compound signal is a superposition of multiple subchannel symbols (OFDM) or neighboring symbols (OTDM). When there are more than about 8 to 10 superposed symbols, the involved distributions begin to converge toward the Gaussian. However, they are not truly Gaussian. This article is about the differences between the true OFDM signal statistics and true Gaussian statistics.

First of all, for a finite number of subchannels, the probability distribution will be a discrete one, described by a probability mass function (PMF). Second, unlike the Gaussian, the supports of the involved distributions are bounded and there exists a maximum value that occurs with finite probability. So, what could be more interesting than to figure out the exact PMFs involved? Ultimately, we are interested in the statistics of the sample powers. Often, these are given as a peak-to-average-power ratio (PAPR) [1]. We will discuss why this has almost no meaning in just a little while.

We’ll start with OFDM in this article since it allows us to use some results from combinatorics to save some work. OTDM, on the other hand, will require a brute force – or comprehensive – approach, and we’ll save that for later.

Caveat emptor: As with almost all articles on this blog, the content presented is not just reproduced from some book, but derived with the help of the cited sources only. I cannot guarantee that it is correct or that it is not treated elsewhere in a more complete way. If you find it discussed somewhere not cited here, or if you find an error, please let me know in the comments.

OFDM Recap

In the article on OFDM Basics an OFDM symbol $C$ was described as

$$C(t) = \sum_{k=0}^{K-1} C_k(t) = \sum_{k=0}^{K-1} c_k \cdot \exp\bigl(i\omega_k t\bigr)\tag{1}$$

where the $c_k$ are the coefficients containing the data. Often, these are taken from a complex M-QAM alphabet which resembles a square in phase space when $M = 2^{2m}$ with integer $m$, e.g. $M=4$, $M=16$ or $M=64$. If we (initially) restrict our analysis to the case $t=0$, i.e. a particular sampling point in the duration of the symbol, then the exponential factors reduce to unity for all $k$ and all subchannel symbols $C_k$ use the same alphabet – without this restriction, the alphabet for each symbol would be rotated by a $t$- and $k$-dependent amount.

We start by looking at the signal statistics for a single quadrature (I or Q), since both are equal with square alphabets / constellation diagrams, and then combine these statistics into a PMF for the symbol power.

Single Quadrature

The signal in a single quadrature of an M-QAM symbol can take $N = \sqrt{M}$ different values. The exact possible values depend on our normalization. If we normalize to average power $\bigl\langle P \bigr\rangle = 1$ then we have the following alphabets

$$\begin{gather}
A_\mathrm{4-QAM} = \frac{1}{\sqrt{K} \sqrt{2}} \left\lbrace -1, 1 \right\rbrace \\
A_\mathrm{16-QAM} = \frac{1}{\sqrt{K} \sqrt{10}} \left\lbrace -3, -1, 1, 3 \right\rbrace \tag{2}\\
A_\mathrm{64-QAM} = \frac{1}{\sqrt{K} \sqrt{42}} \left\lbrace -7, -5, -3, -1, 1, 3, 5, 7 \right\rbrace
\end{gather}$$
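As a quick sanity check on (2) – a Python sketch of my own, not part of the original code – one can verify that each alphabet indeed yields unit average compound power: with $K$ subchannels and two quadratures, $2K\,\langle a^2\rangle$ should equal 1.

```python
import math

def quadrature_alphabet(M, K):
    """Normalized single-quadrature alphabet for square M-QAM, cf. (2)."""
    N = math.isqrt(M)                          # levels per quadrature, N = sqrt(M)
    norm = 1 / (math.sqrt(K) * math.sqrt(2 / 3 * (N**2 - 1)))
    return [norm * v for v in range(-(N - 1), N, 2)]

K = 8
for M in (4, 16, 64):
    A = quadrature_alphabet(M, K)
    mean_sq = sum(a * a for a in A) / len(A)   # <a^2> of one quadrature
    print(M, round(2 * K * mean_sq, 12))       # compound average power -> 1.0
```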

Since all coefficients $C_k(t=0)$ are chosen from the same alphabet, the order in which they appear in the sum $C(t=0)$ does not matter. We are thus reduced to the problem of figuring out the possible combinations of symbols from $A$ in a sequence of length $K$ and the probability of each possibility occurring. This is equivalent to the combinatorial problem of choosing $K$ elements from the alphabet $A$ with repetitions, but without order (since the order of occurrence of the various symbols is not relevant for their sum), and the number of combinations is given by the binomial coefficient [2]

$$\begin{pmatrix} N + K - 1 \\ K \end{pmatrix} \tag{3}$$

The probability of each such combination is then determined by the multinomial distribution [3]:

$$p = \begin{cases}\frac{K!}{\prod x_n!}\left(\frac{1}{N}\right)^K & \text{when} \; \sum_{n=1}^N x_n = K \\ 0\vphantom{\frac{1}{1}} & \text{otherwise}\end{cases} \tag{4}$$

where $x_n$ is the number of occurrences of symbol $n$ from alphabet $A$. In (4) we assumed that all symbols appear with equal probability $N^{-1}$. Figure 1 compares the number of combinations for various $N$ as described by (3) with those of a comprehensive approach, which has $N^K$ combinations. The comprehensive approach produces combination counts which are orders of magnitude larger even for small values of the sequence length $K$, and can only be analyzed using Monte Carlo techniques except for small $N$ and $K$.

Figure 1: Number of possible combinations of choosing K times from a set of size N (with repetitions); regarding order (solid lines, comprehensive) and disregarding order (dashed lines).
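The gap between the two counts is easy to reproduce numerically; a short Python illustration of the quantities plotted in Figure 1 (my own sketch):

```python
from math import comb

# Unordered multisets of size K from N symbols, eq. (3),
# versus the N**K ordered sequences of the comprehensive approach.
for N in (2, 4, 8):
    for K in (8, 16, 32):
        multisets = comb(N + K - 1, K)
        ordered = N**K
        print(f"N = {N}, K = {K:2d}: {multisets} multisets vs {ordered} sequences")
```

For example, N = 4 and K = 8 gives 165 multisets versus 65536 ordered sequences.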

The MATLAB code to compute the PMF of a single quadrature signal is quite straightforward$^2$:

clear variables
NoS = 8; % number of subchannels
NoV = 4; % number of possible values in each quadrature for 16-QAM - only powers of 2 allowed
normalization = 1 / sqrt(NoS) / sqrt(2/3*(NoV^2 - 1)); % alphabet normalization according to (2)
    % see http://www.dsplog.com/2007/09/23/scaling-factor-in-qam/
    % for the origin of the factor (2/3*(NoV^2 - 1))
probability = 1 / NoV; % (uniform) probability of each value to occur

values = 0:2:2*(NoV-1); values = values - NoV + 1; % generate integer part of the alphabet

support = min(values)*NoS:max(values)*NoS; % range of integers that the signal can take

[count, tuples] = nsumk(NoV, NoS);
    % listing of (ordered) NoV-tuples of non-negative integers adding up to NoS
    % see http://www.mathworks.com/matlabcentral/fileexchange/28340-nsumk for details on nsumk()

p = factorial(NoS) ./ prod(factorial(tuples), 2) * probability^NoS; % multinomial PDF of each tuple row
sums = sum(tuples .* repmat(values, [size(tuples, 1), 1]), 2); % sum of all symbols

% add PMFs of equal sum values but different addends (if any)
PMF = zeros(size(support));
for ii = 1:length(p)
    index = find(support == sums(ii));
    PMF(index) = PMF(index) + p(ii);
end

support = normalization * support;

and a typical PMF plot will look like this:

Figure 2: Probability mass functions for two exemplary combinations of signal levels (N) and number of subchannels (K). More signal levels will lead to more combination possibilities and thus a finer-grained PMF.
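For readers without MATLAB (or the nsumk helper), the same PMF can be computed with a Python sketch; the function and variable names here are my own. It enumerates the multisets directly via itertools.combinations_with_replacement instead of nsumk:

```python
from itertools import combinations_with_replacement
from math import factorial, prod
from collections import defaultdict

def single_quadrature_pmf(N, K):
    """PMF of the sum of K i.i.d. uniform symbols from an N-level alphabet.

    Returns a dict mapping each (unnormalized, integer) sum to its probability.
    """
    values = list(range(-(N - 1), N, 2))   # integer part of the alphabet
    pmf = defaultdict(float)
    for combo in combinations_with_replacement(values, K):
        counts = [combo.count(v) for v in values]
        # multinomial probability, eq. (4), with uniform symbol probability 1/N
        p = factorial(K) // prod(factorial(x) for x in counts) * (1 / N)**K
        pmf[sum(combo)] += p
    return dict(pmf)

pmf = single_quadrature_pmf(4, 8)
print(len(pmf))                        # 25 distinct sums for N = 4, K = 8
print(sum(pmf.values()))               # -> 1.0
```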

The PMF is unfortunately not a very clear tool due to the colored dots all over the place – often it is easier to compare distributions with discrete values by their cumulative distribution function (CDF), which is – as for continuous probability densities – the integral of the $\mathrm{PDF}_C(C)$:

$$\mathrm{CDF}_C(C) = \intop_{-\infty}^C \mathrm{PDF}_C(C’) \, dC’ \tag{5}$$

which in MATLAB can be done via

CDF = sum(tril(repmat(PMF, [size(PMF, 2), 1])), 2);
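In Python the same running sum is a one-liner via itertools.accumulate (a minimal sketch with a toy PMF of my own choosing):

```python
from itertools import accumulate

PMF = [0.25, 0.5, 0.25]        # toy PMF over an ordered support
CDF = list(accumulate(PMF))    # running sum, cf. (5)
print(CDF)                     # -> [0.25, 0.75, 1.0]
```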

The CDFs are step functions:



Figure 3: Cumulative distribution functions of single-quadrature signal levels for various QAM formats and K = 8 (top), K = 16 (middle) and K = 32 (bottom) subchannels, together with the CDF for a Gaussian distribution with equal power.

We can see that each QAM format and subchannel count will result in a slightly different distribution of values. As $N$ and $K$ increase, the CDFs (and thus PDFs) approach the Gaussian distribution. However, each $(N,K)$ pair has a distinctive minimum value $C_\mathrm{min} = K \cdot \min(A)$ and associated probability $\mathrm{PDF}_C(C_\mathrm{min})$ for which we will give a closed expression later. The CDF can be used to determine clipping probabilities and the CDF of the clipped portion of the signal (where we also have to pay attention to the positive side which does, however, have the same statistics due to symmetry).

Signal Power

For each in-phase amplitude $C_i$ with associated probability $\mathrm{PMF}_C(C_i)$ and quadrature amplitude $C_q$ with associated probability $\mathrm{PMF}_C(C_q)$ we can assign a sample power

$$P = C_i^2 + C_q^2\tag{6a}$$

with joint probability

$$\mathrm{PMF}_P(P) = \mathrm{PMF}_C(C_i) \cdot \mathrm{PMF}_C(C_q) \tag{6b}$$

since the data in both quadratures is independent. We simply calculate all power values and associated probabilities, clean the list by combining the probabilities of equal power values and sort the values$^2$:

ll = length(support);
power = repmat(support,[ll, 1]).^2 + repmat(support.',[1, ll]).^2; % all possible power values
p = repmat(PMF, [ll, 1]) .* repmat(PMF.', [1, ll]); % joint probability; PMF from the single-quadrature code above

support_P = unique(power(:)); % find unique power values and sort them
PMF_P = zeros(size(support_P));

for ii = 1:length(p(:))
    index = find(support_P == power(ii));
    PMF_P(index) = PMF_P(index) + p(ii);
end

The result looks like this:

Figure 4: Probability mass function of the sample power for N = 8 (64-QAM), K = 8 subchannels and unit mean power.
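The combine-and-merge step of (6) can also be illustrated with a minimal, self-contained Python sketch (my own; 16-QAM with K = 1, so the quadrature PMF is simply uniform over the four alphabet values):

```python
from collections import defaultdict
from math import sqrt

# single-quadrature alphabet and PMF for 16-QAM with K = 1, cf. (2)
values = [v / sqrt(10) for v in (-3, -1, 1, 3)]
pmf_c = {c: 0.25 for c in values}

# combine both independent quadratures into the sample-power PMF, eq. (6)
pmf_p = defaultdict(float)
for ci, pi in pmf_c.items():
    for cq, pq in pmf_c.items():
        pmf_p[round(ci**2 + cq**2, 12)] += pi * pq

for P in sorted(pmf_p):
    print(P, pmf_p[P])   # powers 0.2, 1.0, 1.8 with probabilities 0.25, 0.5, 0.25
```

Note the mean power is again 1, as required by the normalization in (2).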

Again, it’s more useful to plot the CDF. In particular, the complementary CDF can be used to determine the probability of the sample power being larger than some value of interest, which can be related to the PAPR.



Figure 5: Cumulative distribution functions of sample power for various QAM formats and K = 8 (top), K = 16 (middle) and K = 32 (bottom) subchannels, together with the CDF for a χ-squared distribution with two degrees of freedom and equal power.

While there are significant deviations from the $\chi^2$ distribution$^3$ visible in Figure 5, these occur mostly at low probabilities.

Other Sample Times

Generally, a single OFDM symbol comprising $K$ subchannels is made up of at least $K$ samples, more if there is oversampling and/or a cyclic prefix. Since sampling times for which $\Delta\omega \cdot t$ is not a multiple of $\pi/2$ result in different coefficient alphabets$^1$ for different subchannels, we cannot apply our previous simplifications and have to resort to comprehensive methods. We’ll discuss these in more detail in a separate post on Nyquist WDM signal statistics.

As an example, we’ll look at the highest sample power at various sampling instances of OFDM symbols with eight 16-QAM subchannels, obtained by exhaustively searching all combinations (which takes about half an hour per point to compute on my machine and which could probably be done much faster with a little more thinking). We would expect the maximum to occur at $\Delta\omega \cdot t = m \cdot \pi/2$ (quarter rotations of the constellation alphabet between neighboring subchannels) because only then do the constellation points with the highest powers in each subchannel potentially add in phase for all subchannels. Figure 6 confirms our assumption:

Figure 6: Maximum possible single-sample power depending on the position of the sample within the OFDM symbol for N = 4 and K = 8. To reduce calculation times, only the non-repeating part (solid) was actually calculated and then repeated (dashed). The symbols correspond to the colors in Figure 7.

The CDFs will of course also differ with sample timing, but not by much, as Figure 7 shows.

Figure 7: Cumulative distribution functions of sample power for the 16-QAM format and K = 8 for different sample times within the OFDM symbol, together with the CDF for a χ-squared distribution with two degrees of freedom and equal power.

Limit Probabilities and PAPR

Usually, it is computationally not feasible to calculate the CDFs when there are either a lot of QAM levels $M$ or subchannels $K$. We can, however, quite easily determine the probability that the highest possible sample power occurs. We know that it appears for $t=0$ and we can thus use the derivations made above. We also know that it is just the square of $K$ times the largest-amplitude signal in a single subchannel, in each of the two quadratures, or with (6),

$$\max(P) = \max\left(C_i^2 + C_q^2\right) = 2 K^2 \cdot \bigl[\max(A)\bigr]^2 \tag{7}$$

with the alphabet $A$ as in (2). Since our normalization is such that $\left\langle P \right\rangle=1$, $\max(P)$ is also the “true” PAPR of an OFDM signal.
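Equation (7) can be spot-checked by brute force for a small case; the following Python sketch (my own, with hypothetical variable names) compares the exhaustive maximum with (7) for 16-QAM and K = 2:

```python
from itertools import product
from math import sqrt

M, K = 16, 2                                   # small enough for brute force
N = int(sqrt(M))
norm = 1 / (sqrt(K) * sqrt(2 / 3 * (N**2 - 1)))
A = [norm * v for v in range(-(N - 1), N, 2)]  # alphabet as in (2)

# brute force: largest possible quadrature sum over all N**K coefficient choices
max_c = max(abs(sum(combo)) for combo in product(A, repeat=K))
max_P_brute = 2 * max_c**2                     # both quadratures at their maximum

max_P_formula = 2 * K**2 * max(A)**2           # eq. (7)
print(max_P_brute, max_P_formula)              # both approximately 3.6 here
```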

Let us check how probable an occurrence of $\max(P)$ actually is. We start with the probability for all coefficients $C_k$ being equal in a single quadrature, which corresponds to all $x_n$ in (4) being zero except for the one denoting that particular coefficient, which is equal to $K$ (it does not matter which one that is when all coefficients are equally likely):

$$p_\text{equal quadrature} = \left(\frac{1}{N}\right)^K\tag{8}$$

The probability that the other quadrature also has the same coefficient in each subchannel (but possibly a different one than the first quadrature) is

$$p_\text{equal symbol} = p^2_\text{equal quadrature} = \left(\frac{1}{N}\right)^{2K} = \left(\frac{1}{M}\right)^{K}\tag{9}$$

where $M$ is the number of QAM constellation points, since we still assume the data in both quadratures to be independent. This is the probability that any particular constellation point appears simultaneously in all subchannels. There are four constellation points which cause the symbol to have the maximum possible power – the corner points. Hence the probability for the maximum power to occur is

$$p_{\max(P)} = 4 \left(\frac{1}{M}\right)^{K} \tag{10}$$

This can be very, very small, as we can also see in Figure 5, easily down to the $10^{-50}$s. Furthermore, (10) is valid only for the sample times within the OFDM symbol for which $\Delta\omega \cdot t = m \cdot \pi/2$, as discussed above. For all other sample times, the probability for $\max(P)$ to occur is zero.
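To put numbers on (10), a quick Python evaluation (my own sketch):

```python
# probability of the maximum sample power occurring, eq. (10): p = 4 / M**K
for M in (4, 16, 64):
    for K in (8, 16, 32):
        print(f"M = {M:2d}, K = {K:2d}: p_max(P) = {4 / M**K:.2e}")
```

Already for 16-QAM and K = 8 this gives about $9 \cdot 10^{-10}$, and for 64-QAM with K = 32 it is down around $10^{-58}$.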

Such probabilities are utterly irrelevant for communication even at Terabit/s rates. If one just observes an OFDM signal for a while and reports the PAPR as the ratio involving the highest observed sample power, the result will be somewhat random and should only be given with proper confidence bounds. It would make more sense to define a power value (or rather, ratio) which is only exceeded with a probability lower than some value such as $10^{-3}$ or $10^{-6}$. At these probabilities, the differences between the different sample timings (cf. Figure 7) seem to be negligible. As far as I know, such a definition does not exist (yet).

Anyway, if PAPR is a serious issue in a system, one should maybe look at Nyquist WDM, which has nicer overall sample statistics (non-Gaussian).


1 Here, $\Delta\omega$ is the subchannel frequency separation. When $\Delta\omega \cdot t = \pi/2$, then all subchannel alphabets are rotated by multiples of a quarter circle (neighboring subchannels are rotated a quarter circle against each other, the next subchannels a half circle, and so forth), which results in identical alphabets for square QAM. The length of the “core” OFDM symbol is given by $\Delta\omega \cdot t = 2\pi$.

2 This is the fastest code I could hack together in a short time. Take it as a starting point.

3 The (central) $\chi^2$ distribution with 2 degrees of freedom describes the distribution of power in a signal with a two-dimensional Gaussian amplitude distribution [4].

[1] PAPR [Wikipedia]
[2] Combinations [Wikipedia]
[3] Multinomial Distribution [Wikipedia]
[4] $\chi^2$ distribution [Wikipedia]

