What Is Noise And Noise Cancellation?
Basic Adaptive Filters
Applications Of Adaptive Filters
Various Adaptive Algorithms For Noise Cancellation
Affine Projection Algorithm
WHAT IS NOISE AND NOISE CANCELLATION?
• Noise consists of unwanted waveforms that can interfere with communication.
• Noise can be internal or external to the system.
• Sound noise: interferes with normal hearing.
• White noise: additive white Gaussian noise (AWGN).
• NOISE CANCELLATION: a method to reduce or cancel out the undesirable components of a signal.
An adaptive filter is a filter that adapts itself to the input signal given to it.
It is non-linear and time-variant.
It is best suited when signal conditions change slowly.
It relies on a recursive algorithm.
It has an adaptation algorithm for adjusting the filter parameters.
It monitors the environment and varies the filter transfer function accordingly.
The basic operation of an adaptive filter involves two processes:
• A filtering process, which produces an output signal in response to a given input signal.
• An adaptation process, which aims to adjust the filter parameters to the changing environment.
Applications of Adaptive Filters
• Filtering: used to improve the SNR of a signal.
• Prediction: used to provide a prediction of the present value of a random signal.
• Inverse modeling: used to provide an inverse model of an unknown system.
• Interference cancellation: used to cancel unknown interference contained in a primary signal.
VARIOUS ADAPTIVE ALGORITHMS FOR NOISE CANCELLATION
Properties of an ideal algorithm:
• Practical to implement
• Adapts the coefficients quickly to minimize error
• Provides the desired performance
Different algorithms used are:
• The Least Mean Squares (LMS) algorithm
• The Normalized Least Mean Squares (NLMS) algorithm
• The Recursive Least Squares (RLS) algorithm
• The Affine Projection Algorithm (APA)
We take an input random signal of N samples as the reference.
We take random noise, i.e. additive white Gaussian noise (AWGN).
The noise signal is then added to the input signal.
The problem is to extract the input signal from the noisy output signal by eliminating the noise.
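The setup above can be sketched as follows; the signal shape, sample count, and noise amplitude are assumed for illustration, not taken from the slides:

```python
import numpy as np

rng = np.random.default_rng(0)

N = 1000                                  # number of samples (assumed)
t = np.arange(N)
signal = np.sin(2 * np.pi * 0.01 * t)     # hypothetical clean input signal
noise = 0.5 * rng.standard_normal(N)      # additive white Gaussian noise (AWGN)
noisy = signal + noise                    # observed signal = input + noise

# SNR before cancellation, in dB
snr_db = 10 * np.log10(np.mean(signal**2) / np.mean(noise**2))
```

The adaptive filters below then try to recover `signal` from `noisy` given a noise reference.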
• Adjusts the weights w(n) of the filter.
• Adaptively adjusts the filter tap weights according to the update equation:
  ŵ(n+1) = ŵ(n) + μ u(n) e*(n), where e(n) = d(n) − ŵ^H(n)u(n) and μ is the step size.
• Acts as negative feedback to minimize the error signal.
• It is robust in nature.
• Slow in convergence and sensitive to variations in step size.
• Requires a number of iterations equal to the dimensionality of the input space to converge.
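A minimal real-valued LMS sketch of the update above; the filter length M, step size mu, and the toy identification problem are assumed values for illustration:

```python
import numpy as np

def lms(x, d, M=8, mu=0.01):
    """LMS adaptive filter. x: reference input, d: desired (noisy) signal.
    M (filter length) and mu (step size) are assumed, not from the slides."""
    w = np.zeros(M)
    e = np.zeros(len(x))
    for n in range(M - 1, len(x)):
        u = x[n - M + 1:n + 1][::-1]   # tap-input vector u(n), newest sample first
        e[n] = d[n] - w @ u            # error e(n) = d(n) - w(n)^T u(n)
        w = w + mu * e[n] * u          # update: w(n+1) = w(n) + mu e(n) u(n)
    return e, w

# toy check: identify a known 3-tap path driven by white noise
rng = np.random.default_rng(1)
x = rng.standard_normal(5000)
h = np.array([0.5, -0.3, 0.1])         # hypothetical "unknown" noise path
d = np.convolve(x, h)[:len(x)]
e, w = lms(x, d, M=3, mu=0.05)
```

In the noiseless toy problem the weights converge toward h and the error decays toward zero, illustrating the negative-feedback behavior described above.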
• In structural terms, the NLMS filter is exactly the same as a standard LMS filter.
• From one iteration to the next, the weights of an adaptive filter should change in a minimal manner.
One of the drawbacks of LMS is the selection of the step size.
To solve this difficulty, we can use the NLMS (Normalized Least Mean Square) algorithm, in which the step-size parameter is normalized.
So the NLMS algorithm is a time-varying step-size algorithm, calculating the convergence factor μ as:
  μ(n) = α / (c + ||x(n)||²)
where ||x(n)||² is the squared Euclidean norm of the input vector x(n).
Here α is the adaptation constant, which optimizes the convergence rate of the algorithm.
Range of α: 0 < α < 2.
c is a constant term for normalization, with c < 1; it prevents division by zero when the input energy is small.
The updated filter weight is:
  ŵ(n+1) = ŵ(n) + μ(n) x(n) e*(n)
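A sketch of the normalized update, differing from LMS only in the per-sample step size; α, c, M, and the toy problem are assumed values:

```python
import numpy as np

def nlms(x, d, M=8, alpha=0.5, c=1e-3):
    """NLMS: same structure as LMS, but the step size is normalized by the
    input energy, mu(n) = alpha / (c + ||u(n)||^2), with 0 < alpha < 2."""
    w = np.zeros(M)
    e = np.zeros(len(x))
    for n in range(M - 1, len(x)):
        u = x[n - M + 1:n + 1][::-1]           # tap-input vector u(n)
        e[n] = d[n] - w @ u                    # error e(n)
        w = w + (alpha / (c + u @ u)) * e[n] * u   # normalized weight update
    return e, w

# same assumed toy identification problem as for LMS
rng = np.random.default_rng(1)
x = rng.standard_normal(3000)
h = np.array([0.5, -0.3, 0.1])
d = np.convolve(x, h)[:len(x)]
e, w = nlms(x, d, M=3, alpha=0.5)
```

Because the step size scales with input energy, the same α works across input levels, which is exactly the difficulty with fixed-μ LMS noted above.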
ADVANTAGES AND DISADVANTAGES:
• As μ here is normalized, this algorithm converges faster than LMS.
• The estimated error between the desired signal and the filter output is less than in LMS.
• But LMS is less complex than NLMS and more stable.
Recursively finds the filter coefficients that minimize a weighted linear least squares cost function relating to the input signals.
In this algorithm the filter tap weight vector is updated at every iteration.
Whitens the input data by using the inverse correlation matrix of the data.
The cost function C(n) to be minimized is:
  C(n) = Σ_{i=1..n} β(n,i) |e(i)|² + δ λ^n ||ŵ(n)||²
where β(n,i) = λ^(n−i) is the weighting factor and λ is the forgetting factor.
The first term is the sum of weighted error squares; the second, δ λ^n ||ŵ(n)||², is a regularizing term.
Let Φ(n) be the correlation matrix of the input u(i):
  Φ(n) = Σ_{i=1..n} λ^(n−i) u(i) u^H(i) + δ λ^n I
Then the cross-correlation vector z(n) satisfies:
  z(n) = Φ(n) ŵ(n), n = 1, 2, …
Using the matrix inversion lemma, we can find the inverse of the correlation matrix; let
  P(n) = Φ^(−1)(n)
The solution is expressed in terms of the gain vector k(n), where
  k(n) = P(n) u(n) = Φ^(−1)(n) u(n)
The tap weight vector ŵ(n) is then updated recursively.
From the above equations, we summarize the RLS algorithm as:
  π(n) = P(n−1) u(n)
  k(n) = π(n) / (λ + u^H(n) π(n))
  ξ(n) = d(n) − ŵ^H(n−1) u(n)
  ŵ(n) = ŵ(n−1) + k(n) ξ*(n)
  P(n) = λ^(−1) P(n−1) − λ^(−1) k(n) u^H(n) P(n−1)
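The recursion can be sketched directly in real-valued code; λ, δ, M, and the toy problem are assumed values:

```python
import numpy as np

def rls(x, d, M=8, lam=0.99, delta=0.01):
    """RLS following the summary above. P(n) is the inverse correlation
    matrix, initialized as P(0) = I/delta; lam is the forgetting factor."""
    w = np.zeros(M)
    P = np.eye(M) / delta
    e = np.zeros(len(x))
    for n in range(M - 1, len(x)):
        u = x[n - M + 1:n + 1][::-1]
        pi = P @ u                        # pi(n) = P(n-1) u(n)
        k = pi / (lam + u @ pi)           # gain vector k(n)
        e[n] = d[n] - w @ u               # a priori error xi(n)
        w = w + k * e[n]                  # w(n) = w(n-1) + k(n) xi(n)
        P = (P - np.outer(k, pi)) / lam   # P(n) = (P(n-1) - k(n) u^T P(n-1)) / lam
    return e, w

rng = np.random.default_rng(1)
x = rng.standard_normal(2000)
h = np.array([0.5, -0.3, 0.1])
d = np.convolve(x, h)[:len(x)]
e, w = rls(x, d, M=3)
```

Note that `np.outer(k, pi)` equals k u^T P(n−1) here because P stays symmetric for real data, so no explicit matrix inverse is ever formed.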
ADVANTAGES AND DISADVANTAGES OF RLS:
• RLS converges faster than LMS, NLMS and APA.
• Its noise cancellation capacity is the highest.
• It is the most complex algorithm of all the four.
AFFINE PROJECTION ALGORITHM
A generalization of the well-known normalized least mean square (NLMS) adaptive filtering algorithm.
Fast convergence compared to NLMS.
Computational complexity increases accordingly.
Convergence improves with increasing filter order N.
Faster tracking capabilities than NLMS.
Better steady-state mean square error (MSE) performance than the other algorithms.
A(n) = input data matrix [N×N]
A^H(n) = Hermitian transpose of the input data matrix [N×N]
d(n) = desired response vector [N×1]
The error can be computed as:
  e(n) = d(n) − A(n) ŵ(n)
The updated tap weight vector can be calculated as:
  ŵ(n+1) = ŵ(n) + μ A^H(n) (A(n) A^H(n))^(−1) e(n)
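A real-valued sketch of the projection update above; the projection order K, step size μ, regularization eps, and the toy problem are assumed values:

```python
import numpy as np

def apa(x, d, M=8, K=4, mu=0.5, eps=1e-4):
    """Affine projection. K = projection order (rows of A(n)); eps
    regularizes the K x K inverse. K, mu, eps are assumed values."""
    w = np.zeros(M)
    e_out = np.zeros(len(x))
    for n in range(M + K - 2, len(x)):
        # A(n): the K most recent tap-input vectors, one per row
        A = np.array([x[n - k - M + 1:n - k + 1][::-1] for k in range(K)])
        dv = np.array([d[n - k] for k in range(K)])   # matching desired samples
        e = dv - A @ w                                # e(n) = d(n) - A(n) w(n)
        e_out[n] = e[0]
        # w(n+1) = w(n) + mu A^T (A A^T)^(-1) e(n), solved without explicit inverse
        w = w + mu * A.T @ np.linalg.solve(A @ A.T + eps * np.eye(K), e)
    return e_out, w

rng = np.random.default_rng(1)
x = rng.standard_normal(2000)
h = np.array([0.5, -0.3, 0.1])
d = np.convolve(x, h)[:len(x)]
e, w = apa(x, d, M=3, K=2)
```

Setting K = 1 reduces this update to NLMS, which is the sense in which APA generalizes it.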
CONVERGENCE & STABILITY OF APA
The learning curve of an APA consists of a sum of exponential terms.
It converges at a rate faster than that of an NLMS filter.
As more delayed versions of the tap-input vector are used, the rate of convergence improves, but so does the computational complexity.
APA is less stable than the LMS and NLMS algorithms, but more stable than the RLS algorithm.
COMPARISON OF LMS, NLMS, APA AND RLS
• RLS converges faster than APA, APA converges faster than NLMS, and NLMS converges faster than LMS.
• RLS is the most complex algorithm among the four. Hence, computational complexity is inversely proportional to convergence time.
• The difference between the final and initial SNR is highest for RLS, then APA, then NLMS, then LMS.
We studied the behavior of LMS, NLMS, APA and RLS algorithms
by implementing them in the adaptive filter for noise cancellation.
LMS was the simplest and easiest to implement, but it converges at the slowest rate.
NLMS has a normalized step size, making it converge faster than LMS, but its complexity also increases along with the convergence rate.
APA is an improved version of NLMS, with increased convergence rate and computational complexity.
RLS is the fastest-converging algorithm, with the maximum computational complexity; it cancels the most noise by minimizing the error at the fastest rate.
So we are making a tradeoff between computational complexity and convergence rate to get the most noise-free signal.
RLS is the best algorithm in terms of convergence, as it is faster than the other three.