Mathematics Abstracts


Simultaneous Feature Selection and Parameter Estimation for Hidden Markov Models

by Stephen Adams

Institution: University of Virginia
Year: 2015
Keywords: informative priors; feature selection; hidden Markov models; parameter estimation
Posted: 02/05/2017
Record ID: 2101699
Full text PDF: http://libra.virginia.edu/catalog/libra-oa:9605


Abstract

Prior knowledge about a system is crucial for accurate modeling. Bayesian parameter estimation theory, specifically the use of informative prior distributions, offers one method for conveying to the learning algorithm prior knowledge that may not be present in a data set. This dissertation primarily focuses on the problem of feature selection for hidden Markov models with respect to the test cost of the individual features. Test costs include the financial cost of acquiring the feature, the difficulty of collecting it, and the time required to collect it. We propose using a feature saliency hidden Markov model (FSHMM) that simultaneously selects features and estimates model parameters. We assume that the number of states is known, and use the expectation-maximization algorithm for parameter estimation. Informative prior distributions are used to convey the test cost to the learning algorithm. Three formulations are derived for the FSHMM: a maximum likelihood formulation using no priors, and two maximum a posteriori (MAP) formulations using informative priors. These are compared to an existing formulation that uses non-informative priors and variational Bayesian methods for parameter estimation. The proposed formulations are extended to numerous conditional feature distributions, including the gamma distribution and the Poisson distribution, and to a semi-Markov model. The FSHMM is tested using synthetic data, a tool-wear data set, an activity-recognition data set, and an event-detection data set. We find that the MAP formulation using a truncated exponential distribution on the feature saliencies generally outperforms the other FSHMM formulations and conventional feature selection techniques in terms of both predictive performance and the selected feature subset.

Advisors/Committee Members: Beling, Peter (advisor).
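The abstract summarizes the feature saliency approach without stating the observation model explicitly. As a rough illustration only: feature saliency models of this kind typically treat each feature's saliency as the probability that the feature is generated by a state-dependent distribution rather than a state-independent noise distribution. The Python sketch below assumes Gaussian conditional feature distributions, and all names (fshmm_emission_probs, state_means, noise_means, rho, and so on) are invented for illustration; it is not code from the dissertation.

import numpy as np
from scipy.stats import norm

def fshmm_emission_probs(y, state_means, state_stds, noise_means, noise_stds, rho):
    # y           : (L,)   observation vector with L candidate features
    # state_means : (K, L) state-dependent ("relevant") Gaussian means
    # state_stds  : (K, L) state-dependent Gaussian standard deviations
    # noise_means : (L,)   state-independent ("irrelevant") Gaussian means
    # noise_stds  : (L,)   state-independent standard deviations
    # rho         : (L,)   feature saliencies in [0, 1]
    #
    # Each feature's density is a two-component mixture: with probability
    # rho[l] the feature follows the state-dependent distribution, otherwise
    # the state-independent one. Returns p(y | state k) for k = 1..K.
    relevant   = norm.pdf(y, loc=state_means, scale=state_stds)   # shape (K, L)
    irrelevant = norm.pdf(y, loc=noise_means, scale=noise_stds)   # shape (L,)
    per_feature = rho * relevant + (1.0 - rho) * irrelevant       # broadcasts to (K, L)
    return per_feature.prod(axis=1)                               # shape (K,)

# Example: 2 hidden states, 3 candidate features
y = np.array([0.4, -1.2, 2.0])
state_means = np.array([[0.0, -1.0, 2.0], [1.0, 1.0, 0.0]])
state_stds  = np.ones((2, 3))
noise_means = np.zeros(3)
noise_stds  = 2.0 * np.ones(3)
rho = np.array([0.9, 0.5, 0.1])   # near-zero saliency: feature is effectively dropped
print(fshmm_emission_probs(y, state_means, state_stds, noise_means, noise_stds, rho))

In the MAP formulations mentioned in the abstract, a prior such as a truncated exponential on each saliency, with a rate tied to that feature's test cost, would then penalize expensive features during the EM updates; the exact priors and update equations are given in the dissertation itself.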


Relevant publications

Proof in Alonzo Church's and Alan Turing's Mathema... Undecidability of First Order Logic
by Chimakonam, Jonathan Okeke

New Splitting Iterative Methods for Solving Multid...
by Tagoudjeu, Jacques

A Reusable Learning Object Design Model for Elemen...
by Reece, Amanda A.

Finding the Real Odds Attrition and Time-to-Degree in the FSU College of...
by Lightfoot, Robert C.

Modelling and Simulation of Stochastic Volatility ...
by Kahl, Christian

Radiative Transfer Using Boltzmann Transport Theor...
by Littlejohn, Carnell

Modeling Credit Risk and Pricing Credit Derivative...
by Wolf, Martin P.

Canonical Auto and Cross Correlations of Multivari...
by Bulach, Marcia Woolf