Multi-Kernel Recursive Least Squares in the Presence of Sparse Outliers
This is an era of data deluge: large volumes of data are generated in real time owing to the advent of online social media, the Internet, and global-scale communications. There is a consequent need for online, scalable optimization algorithms that can process the information contained in such data. For online learning, adaptive filtering is an efficient technique that operates through recursive algorithms. Recursive algorithms such as the Recursive Least Squares (RLS) algorithm have applications in linear signal processing, communications, and control. However, practical applications often involve nonlinearities, so kernel methods are used to devise nonlinear algorithms. The Kernel Recursive Least Squares (KRLS) algorithm is the nonlinear version of RLS; it recursively finds minimum mean-squared-error solutions to nonlinear least-squares problems. Despite the widespread use of KRLS in nonlinear signal processing, the presence of outliers in the estimation data deteriorates the resulting predictor's performance. In this thesis, an outlier-robust KRLS algorithm is therefore developed using both convex and nonconvex penalties. Moreover, the assumption that a suitable kernel is available prior to the adaptation process is unrealistic for nonstationary data; hence, a set of kernels is used to develop efficient nonlinear adaptive filtering techniques. The objective of this thesis is thus to develop "Multi-Kernel Recursive Least Squares in the Presence of Sparse Outliers."
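To illustrate the baseline the thesis builds on, the following is a minimal sketch of plain (regularized) KRLS with a Gaussian kernel, updating the inverse kernel matrix incrementally via a block-matrix inversion identity. This is not the thesis's algorithm: it omits dictionary sparsification, the outlier-robust convex/nonconvex penalties, and the multi-kernel combination; the class name, parameters (`reg`, `sigma`), and structure are illustrative assumptions.

```python
import numpy as np

def gaussian_kernel(x, y, sigma=1.0):
    # Gaussian (RBF) kernel between two input vectors
    return float(np.exp(-np.sum((x - y) ** 2) / (2.0 * sigma ** 2)))

class KRLS:
    """Minimal kernel RLS sketch: no sparsification, no outlier robustness."""

    def __init__(self, kernel=gaussian_kernel, reg=1e-3):
        self.kernel = kernel
        self.reg = reg      # ridge regularization lambda
        self.X = []         # stored inputs (the expansion "dictionary")
        self.alpha = None   # expansion coefficients: f(x) = sum_j alpha_j k(x_j, x)
        self.Qinv = None    # running inverse of (K + reg * I)

    def predict(self, x):
        if not self.X:
            return 0.0
        k = np.array([self.kernel(xi, x) for xi in self.X])
        return float(k @ self.alpha)

    def update(self, x, y):
        if not self.X:
            k0 = self.kernel(x, x) + self.reg
            self.Qinv = np.array([[1.0 / k0]])
            self.alpha = np.array([y / k0])
            self.X.append(x)
            return
        k = np.array([self.kernel(xi, x) for xi in self.X])
        z = self.Qinv @ k
        r = self.kernel(x, x) + self.reg - k @ z   # Schur complement
        # Grow (K + reg*I)^{-1} by one row/column using the block-inverse formula
        n = len(self.X)
        Qnew = np.zeros((n + 1, n + 1))
        Qnew[:n, :n] = self.Qinv + np.outer(z, z) / r
        Qnew[:n, n] = -z / r
        Qnew[n, :n] = -z / r
        Qnew[n, n] = 1.0 / r
        self.Qinv = Qnew
        # Coefficient update driven by the a-priori prediction error
        err = y - k @ self.alpha
        self.alpha = np.append(self.alpha - z * err / r, err / r)
        self.X.append(x)
```

A typical usage pattern is to stream (input, target) pairs through `update` and call `predict` on unseen inputs; because every sample is kept in the dictionary, practical variants prune or sparsify it, and it is exactly this least-squares error term that outliers corrupt, motivating the robust penalties developed in the thesis.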