Abstract
Many real-world classification tasks involve sequence (neighbor) labeling,
that is, assigning a label to every sample of a signal while taking
into account the sequentiality (or neighborhood) of the samples.
This is normally approached by first filtering the data and then performing
classification, so the two processes are optimized separately, with no
guarantee of global optimality. In this work we use Bayesian modeling
and inference to jointly learn a classifier
and estimate an optimal filterbank. Variational Bayesian inference is
used to approximate the posterior distributions of all unknowns, resulting
in an iterative procedure to estimate the classifier parameters
and the filterbank coefficients. In the experimental section we show,
using synthetic and real data, that the proposed method compares
favorably with other classification/filtering approaches, without the
need for parameter tuning.
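
To make the joint-optimization idea concrete, the snippet below is a minimal NumPy sketch of jointly adjusting a filterbank and a classifier on a per-sample labeling task, rather than fixing the filters beforehand. It is not the paper's variational Bayesian derivation: the posterior updates are replaced by simple alternating MAP-style gradient steps under Gaussian priors, and every name and value in it (`K`, `L`, `lam_w`, `lam_h`, the synthetic signal) is an illustrative assumption, not something taken from the paper.

```python
# Minimal sketch (assumed setup, not the paper's model): alternate updates of
# a logistic classifier and a learnable FIR filterbank on a synthetic
# per-sample labeling problem.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic sequence-labeling data: a noisy 1-D signal whose per-sample label
# depends on a slow underlying component.
N, L, K = 2000, 15, 4                   # samples, filter length, number of filters
t = np.arange(N)
clean = np.sin(2 * np.pi * t / 200.0)   # slow component carrying the class
signal = clean + 0.8 * rng.standard_normal(N)
labels = (clean > 0).astype(float)      # binary label for every sample

# Lagged design matrix: row n holds signal[n-L+1 .. n] (zero-padded at the start),
# so filtering the signal is just S_lag @ h for a filter h.
padded = np.concatenate([np.zeros(L - 1), signal])
S_lag = np.stack([padded[n:n + L] for n in range(N)])   # shape (N, L)

# Unknowns: filterbank H (K filters x L taps) and classifier weights w (K,).
H = 0.1 * rng.standard_normal((K, L))
w = np.zeros(K)
lam_w, lam_h, lr = 1e-2, 1e-2, 0.05     # prior strengths and step size (arbitrary)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Alternating updates: classifier step, then filterbank step, both driven by
# the same labeling objective.
for it in range(300):
    X = S_lag @ H.T                     # filterbank outputs = per-sample features
    p = sigmoid(X @ w)                  # predicted class probabilities
    resid = labels - p
    # Classifier step: MAP logistic regression under a Gaussian prior on w.
    w += lr * (X.T @ resid / N - lam_w * w)
    # Filterbank step: gradient of the same objective w.r.t. the filter taps.
    H += lr * (np.outer(w, S_lag.T @ resid) / N - lam_h * H)

acc = np.mean((sigmoid(S_lag @ H.T @ w) > 0.5) == labels)
print(f"per-sample accuracy after joint training: {acc:.3f}")
```

In this sketch the filters end up shaped by the labeling objective itself (here they tend toward smoothing filters that expose the slow component), which is the qualitative behavior the joint formulation is after; the paper instead obtains the updates as variational posterior approximations, so no step sizes or regularization weights need hand tuning.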
Original language | English
---|---
Title of host publication | Proceedings of IEEE International Conference on Image Processing
State | Published - 2014
Event | Proceedings of IEEE International Conference on Image Processing, Paris, France
Period | 10/27/14 → …