
Time Series Analysis with Information Theoretic Learning and Kernel Methods

by Puskal P Pokharel




Institution: University of Florida
Department: Electrical and Computer Engineering
Year: 2007
Keywords: Electrical and Computer Engineering
Record ID: 1811219
Full text PDF: http://ufdc.ufl.edu/UFE0021482


Abstract

The major goal of our research is to develop simple and effective nonlinear versions of basic time series tools for signal detection, optimal filtering, and on-line adaptive filtering. These extensions are based on concepts being developed in information theoretic learning (ITL) and kernel methods. In general, all ITL algorithms can be interpreted through kernel methods, because ITL extracts higher-order information (beyond the second-order statistics captured by the autocorrelation function) directly from the data samples via nonparametric density estimation with translation-invariant kernel functions. ITL still lacks tools that exploit the time structure of the data, since the assumption of independently distributed samples is usually an essential requirement. Kernel methods, for their part, provide an elegant means of obtaining nonlinear versions of linear algorithms expressed in terms of inner products, using the so-called kernel trick and Mercer's theorem. This has given rise to a variety of algorithms in machine learning, but most are computationally very expensive, relying on a large Gram matrix whose dimension equals the number of data points. Because these large matrices are usually ill-conditioned, methods such as kernel regression require an additional regularization step.
Our goal is to design basic signal analysis tools for time signals that, like ITL, extract higher-order information directly from the data while avoiding the computational complexity of many kernel methods. We present new methods for time series analysis (matched filtering and optimal adaptive filtering) based on correntropy, a newly introduced ITL concept, together with kernel methods. Correntropy induces an RKHS that has the same dimensionality as the input space but is nonlinearly related to it, and it differs from conventional kernel methods in both scope and detail. This in effect allows us to derive elegant nonlinear versions of tools that form the basic building blocks of signal processing: the matched filter (correlation receiver), the Wiener filter, and the least mean square (LMS) filter.
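To make the central idea concrete, the sketch below estimates the correntropy function of a 1-D time series at a given lag: in place of the autocorrelation E[x(n) x(n - lag)], it averages a Gaussian kernel of the sample differences, which implicitly carries higher-order statistics of the data. This is a minimal illustration assuming a Gaussian kernel; the function name and the bandwidth parameter `sigma` are illustrative choices, not from the dissertation itself.

```python
import numpy as np

def correntropy(x, lag, sigma=1.0):
    """Sample estimate of the correntropy function at a given lag.

    Generalizes autocorrelation: instead of averaging the product
    x[n] * x[n - lag], it averages a Gaussian kernel of the difference
    x[n] - x[n - lag], so all even-order moments of the data contribute.
    `sigma` is the (assumed) kernel bandwidth.
    """
    x = np.asarray(x, dtype=float)
    if lag == 0:
        d = np.zeros_like(x)          # kernel of zero difference is 1
    else:
        d = x[lag:] - x[:-lag]        # lagged sample differences
    return np.mean(np.exp(-d**2 / (2.0 * sigma**2)))

# Example: correntropy of a noisy sinusoid at a few lags
rng = np.random.default_rng(0)
x = np.sin(np.linspace(0, 10 * np.pi, 500)) + 0.1 * rng.standard_normal(500)
values = [correntropy(x, lag) for lag in range(4)]
```

With a Gaussian kernel the estimate always lies in (0, 1] and equals 1 at zero lag, mirroring how autocorrelation peaks at the origin; unlike autocorrelation, however, it depends on the full distribution of the lagged differences rather than only their second moment.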