Orthogonality#

In geometry, vectors are considered orthogonal if they are perpendicular to each other, which is determined by their dot product being zero (i.e., the sum of the products of their corresponding elements equals zero).

$$\mathbf{x} \cdot \mathbf{y} = \|\mathbf{x}\|\,\|\mathbf{y}\|\cos(\theta) = \sum_{n=-\infty}^{\infty} x_n y_n = 0$$

This means that the angle $\theta$ between the vectors is $90^{\circ}$ (or $\pi/2$ radians), and the cosine of $\theta$ is zero.

Consequently, the measure of similarity of two orthogonal discrete-time signals equals zero:

$$c_{xy} = \frac{\mathbf{x} \cdot \mathbf{y}}{\|\mathbf{x}\|\,\|\mathbf{y}\|} = \frac{\sum_{n=-\infty}^{\infty} x_n y_n}{\sqrt{\sum_{k=-\infty}^{\infty} x_k^2}\,\sqrt{\sum_{k=-\infty}^{\infty} y_k^2}} = 0$$
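This similarity measure can be computed directly. The sketch below (plain Python, no libraries assumed; the two sequences are illustrative) evaluates $c_{xy}$ for a pair of simple orthogonal sequences:

```python
import math

def cosine_similarity(x, y):
    """Normalized similarity c_xy of two equal-length discrete signals."""
    dot = sum(a * b for a, b in zip(x, y))
    norm_x = math.sqrt(sum(a * a for a in x))
    norm_y = math.sqrt(sum(b * b for b in y))
    return dot / (norm_x * norm_y)

# Two simple orthogonal sequences: the element-wise products cancel out.
x = [1, 1, 1, 1]
y = [1, -1, 1, -1]
print(cosine_similarity(x, y))  # 0.0
```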

Orthogonality in the context of functions extends the concept of orthogonality in vectors. For continuous functions, orthogonality is determined by an inner product, where the functions are treated as if they had infinitely many components. In the "continuous world," the sum operator becomes an integral. The inner product of two functions is found by multiplying the functions together and integrating the product over a given interval.

Two real non-zero functions $f(x)$ and $g(x)$ are said to be orthogonal on an interval $a \le x \le b$ if their inner product on the interval equals zero:

$$\langle f(x), g(x) \rangle = \int_a^b f(x)\,g(x)\,dx = 0$$

Sine and cosine are among the most well-known and widely used orthogonal functions. Their orthogonality plays a crucial role in various mathematical, engineering, and scientific applications, including Fourier analysis.

As we will see later, sine and cosine functions form the basis of the Fourier series, which decomposes periodic functions into sums of sines and cosines. This decomposition is essential for analyzing and synthesizing periodic signals.

To prove the orthogonality of sine and cosine functions, we must show that their inner product over one period is zero. This is typically done over the interval [0,2π] or any interval of length 2π.

$$\int_0^{2\pi} \sin(\theta)\cos(\theta)\,d\theta$$

Let us use the trigonometric identity:

$$\sin(\alpha)\cos(\beta) = \frac{1}{2}\left[\sin(\alpha - \beta) + \sin(\alpha + \beta)\right]$$

$$\int_0^{2\pi} \sin(\theta)\cos(\theta)\,d\theta = \frac{1}{2}\int_0^{2\pi} \sin(2\theta)\,d\theta = -\frac{1}{4}\Big[\cos(2\theta)\Big]_0^{2\pi}$$

$$= -\frac{1}{4}\left(\cos(4\pi) - \cos(0)\right) = 0$$
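This result can also be checked numerically. The sketch below (plain Python; the midpoint rule and step count are arbitrary choices) approximates the integral of $\sin(\theta)\cos(\theta)$ over one period:

```python
import math

# Midpoint-rule approximation of the integral of sin(x)*cos(x) over [0, 2*pi].
num_steps = 100_000
h = 2 * math.pi / num_steps
integral = sum(math.sin((k + 0.5) * h) * math.cos((k + 0.5) * h) * h
               for k in range(num_steps))
print(integral)  # ~0 up to floating-point rounding
```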

Let us work through examples with sampled (discrete) sine and cosine functions.

Assume sine and cosine functions sampled at 2π/3 (120°) intervals.

Sampled sine and cosine - 3 samples

The table below shows the sampled values:

| $n$ | Angle ($\theta$) | $\sin(\theta)$ | $\cos(\theta)$ | $\sin(\theta)\cos(\theta)$ |
|---|---|---|---|---|
| 0 | $0$ | $0$ | $1$ | $0$ |
| 1 | $2\pi/3$ | $\sqrt{3}/2$ | $-1/2$ | $-\sqrt{3}/4$ |
| 2 | $4\pi/3$ | $-\sqrt{3}/2$ | $-1/2$ | $\sqrt{3}/4$ |

In the discrete domain, the integral becomes a summation.

$$\sum_{n=0}^{2} \sin(\theta_n)\cos(\theta_n) = 0 - \frac{\sqrt{3}}{4} + \frac{\sqrt{3}}{4} = 0$$
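The same summation can be reproduced in a few lines of Python (a minimal sketch; variable names are my own):

```python
import math

# Sine and cosine sampled at theta_n = 2*pi*n/3 for n = 0, 1, 2.
thetas = [2 * math.pi * n / 3 for n in range(3)]
sin_samples = [math.sin(t) for t in thetas]
cos_samples = [math.cos(t) for t in thetas]

dot = sum(s * c for s, c in zip(sin_samples, cos_samples))
print(dot)  # ~0 up to floating-point rounding
```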

Geometric Interpretation

In this example, the sine and cosine functions are sampled at three equally spaced points, resulting in two 3-dimensional vectors:

$$\sin(\theta) = \begin{bmatrix} 0 \\ \sqrt{3}/2 \\ -\sqrt{3}/2 \end{bmatrix}, \qquad \cos(\theta) = \begin{bmatrix} 1 \\ -1/2 \\ -1/2 \end{bmatrix}$$

As shown in the figure, the angle between these two vectors is $90^{\circ}$ (or $\pi/2$ radians), indicating that the sine and cosine vectors are orthogonal.

Sampled sine and cosine vectors

Now, let’s increase the sampling rate to 8 samples per period.

Sampled sine and cosine - 8 samples

The table below shows the sampled values:

| $n$ | Angle ($\theta$) | $\sin(\theta)$ | $\cos(\theta)$ | $\sin(\theta)\cos(\theta)$ |
|---|---|---|---|---|
| 0 | $0$ | $0$ | $1$ | $0$ |
| 1 | $\pi/4$ | $1/\sqrt{2}$ | $1/\sqrt{2}$ | $1/2$ |
| 2 | $\pi/2$ | $1$ | $0$ | $0$ |
| 3 | $3\pi/4$ | $1/\sqrt{2}$ | $-1/\sqrt{2}$ | $-1/2$ |
| 4 | $\pi$ | $0$ | $-1$ | $0$ |
| 5 | $5\pi/4$ | $-1/\sqrt{2}$ | $-1/\sqrt{2}$ | $1/2$ |
| 6 | $3\pi/2$ | $-1$ | $0$ | $0$ |
| 7 | $7\pi/4$ | $-1/\sqrt{2}$ | $1/\sqrt{2}$ | $-1/2$ |

The dot product is:

$$\sum_{n=0}^{7} \sin(\theta_n)\cos(\theta_n) = 0 + \frac{1}{2} + 0 - \frac{1}{2} + 0 + \frac{1}{2} + 0 - \frac{1}{2} = 0$$

As observed, the sine and cosine functions remain orthogonal even with increased sampling density.
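This holds for any number of equally spaced samples per period, which a short sketch can confirm (plain Python; the sample counts tried are arbitrary):

```python
import math

def sin_cos_dot(num_samples):
    """Dot product of sin and cos sampled num_samples times over one period."""
    thetas = [2 * math.pi * n / num_samples for n in range(num_samples)]
    return sum(math.sin(t) * math.cos(t) for t in thetas)

for n in (3, 8, 64, 1024):
    print(n, sin_cos_dot(n))  # every value is ~0 up to floating-point rounding
```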

Geometric Interpretation

In this example, the sine and cosine functions are sampled at eight equally spaced points, producing two 8-dimensional vectors:

$$\sin(\theta) = \begin{bmatrix} 0 \\ \tfrac{1}{\sqrt{2}} \\ 1 \\ \tfrac{1}{\sqrt{2}} \\ 0 \\ -\tfrac{1}{\sqrt{2}} \\ -1 \\ -\tfrac{1}{\sqrt{2}} \end{bmatrix}, \qquad \cos(\theta) = \begin{bmatrix} 1 \\ \tfrac{1}{\sqrt{2}} \\ 0 \\ -\tfrac{1}{\sqrt{2}} \\ -1 \\ -\tfrac{1}{\sqrt{2}} \\ 0 \\ \tfrac{1}{\sqrt{2}} \end{bmatrix}$$

These vectors, although hard to visualize due to their 8-dimensional nature, can be mathematically analyzed. The key point is that in the space defined by these eight dimensions, the angle between these vectors is $90^{\circ}$ (or $\pi/2$ radians). This implies that the sine and cosine vectors are orthogonal.

This example illustrates how mathematical abstraction extends beyond our physical intuition, enabling us to analyze properties in higher-dimensional spaces. The concept of multiplying two vectors and obtaining a zero result is generalized in mathematics as orthogonality.

Orthogonal Functions in Fourier Transform#

The following integral identities demonstrate the orthogonality of sine and cosine functions over a period $T$.

$$\int_0^T \sin(m\omega t)\cos(n\omega t)\,dt = 0$$

$$\int_0^T \sin(m\omega t)\sin(n\omega t)\,dt = \begin{cases} 0 & m \ne n \\ \dfrac{T}{2} & m = n \end{cases}$$

$$\int_0^T \cos(m\omega t)\cos(n\omega t)\,dt = \begin{cases} 0 & m \ne n \\ \dfrac{T}{2} & m = n \end{cases}$$

where $m$ and $n$ are positive integers.

The derivations for the above integrals are provided in Appendix B.

The integrals show that sine and cosine functions of different frequencies (mω and nω) are orthogonal, meaning their integrals over one period T are zero. This orthogonality is key to isolating individual frequency components in a signal when performing a Fourier transform.

On the other hand, the integrals for m=n provide the energy of the sine and cosine functions over a period T. This allows the Fourier transform to precisely quantify the contribution of each frequency to the signal's total energy.
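These relationships can be verified numerically. The sketch below approximates the inner products with a midpoint rule; the choice $\omega = 1$, the frequency indices $m = 2$, $n = 3$, and the step count are all illustrative assumptions:

```python
import math

def inner_product(f, g, T, num_steps=20_000):
    """Midpoint-rule approximation of the inner product of f and g over [0, T]."""
    h = T / num_steps
    return sum(f((k + 0.5) * h) * g((k + 0.5) * h) * h for k in range(num_steps))

w = 1.0               # fundamental angular frequency (illustrative choice)
T = 2 * math.pi / w   # one period

# Different frequencies (m = 2, n = 3): the inner products vanish.
sin_cos = inner_product(lambda t: math.sin(2 * w * t), lambda t: math.cos(3 * w * t), T)
sin_sin = inner_product(lambda t: math.sin(2 * w * t), lambda t: math.sin(3 * w * t), T)

# Same frequency (m = n = 2): the sin-sin inner product gives T/2.
sin_self = inner_product(lambda t: math.sin(2 * w * t), lambda t: math.sin(2 * w * t), T)

print(abs(sin_cos) < 1e-6, abs(sin_sin) < 1e-6, abs(sin_self - T / 2) < 1e-6)
```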

These orthogonal relationships are fundamental to the Fourier transform.

Other orthogonal functions#

Orthogonal functions are essential in many areas of science and engineering due to their unique properties and applications.

In communication systems, orthogonal codes are utilized in the Code-Division Multiple Access (CDMA) technique, allowing multiple signals to share the same channel or frequency band without causing mutual interference.
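As a minimal illustration (a sketch of the spreading codes only, not a full CDMA transmitter), the length-4 Walsh-Hadamard codes commonly used in CDMA are pairwise orthogonal:

```python
# Length-4 Walsh (Hadamard) codes: rows of the 4x4 Hadamard matrix.
walsh = [
    [1,  1,  1,  1],
    [1, -1,  1, -1],
    [1,  1, -1, -1],
    [1, -1, -1,  1],
]

# Every pair of distinct codes has a zero dot product; each code with
# itself gives the code length (its energy).
for i in range(4):
    for j in range(4):
        dot = sum(a * b for a, b in zip(walsh[i], walsh[j]))
        assert dot == (4 if i == j else 0)
print("all Walsh code pairs are orthogonal")
```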

Orthogonal Frequency Division Multiplexing (OFDM) is another communication technique that leverages orthogonality. OFDM uses closely spaced orthogonal sub-carrier signals to carry data. The orthogonality ensures that even though the sub-carriers overlap in the frequency domain, they have no mutual interference. This allows for efficient use of the available spectrum.
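The sub-carrier orthogonality can be sketched numerically: over one symbol of $N$ samples, the discrete sub-carriers $e^{j2\pi kn/N}$ have zero inner product for distinct $k$ (a minimal illustration; $N = 16$ and the indices 3 and 5 are arbitrary choices):

```python
import cmath

N = 16  # samples per OFDM symbol (illustrative choice)

def subcarrier(k):
    """Discrete complex sub-carrier k over one symbol: exp(j*2*pi*k*n/N)."""
    return [cmath.exp(2j * cmath.pi * k * n / N) for n in range(N)]

def inner(x, y):
    """Complex inner product: sum of x[n] * conj(y[n])."""
    return sum(a * b.conjugate() for a, b in zip(x, y))

print(abs(inner(subcarrier(3), subcarrier(5))))  # ~0: distinct sub-carriers
print(abs(inner(subcarrier(3), subcarrier(3))))  # N = 16: a sub-carrier with itself
```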