Abstract
Undirected graphical models encode in a graph G the dependency structure of a random vector Y. In many applications, it is of interest to model Y given another random vector X as input. We refer to the problem of estimating the graph G(x) of Y conditioned on X = x as "graph-valued regression". In this paper, we propose a semiparametric method for estimating G(x) that builds a tree on the X space just as in CART (classification and regression trees), but at each leaf of the tree estimates a graph. We call the method "Graph-optimized CART", or Go-CART. We study the theoretical properties of Go-CART using dyadic partitioning trees, establishing oracle inequalities on risk minimization and tree partition consistency. We also demonstrate the application of Go-CART to a meteorological dataset, showing how graph-valued regression can provide a useful tool for analyzing complex data.
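For intuition only, the sketch below illustrates the general idea described in the abstract under simplifying assumptions: it recursively splits the X space with fixed-depth dyadic (midpoint) cuts and fits a sparse Gaussian graphical model to Y within each resulting cell, using scikit-learn's GraphicalLasso as the per-leaf graph estimator. This is not the paper's actual procedure, which selects the dyadic partition by penalized risk minimization; the function name, depth rule, and minimum-sample guard here are illustrative choices.

```python
import numpy as np
from sklearn.covariance import GraphicalLasso


def go_cart_sketch(X, Y, depth=2, alpha=0.1):
    """Illustrative sketch (not the paper's estimator): dyadic splits on X,
    graphical lasso on Y in each leaf. Returns (leaf_mask, precision) pairs."""

    def recurse(mask, bounds, d):
        # Stop when the depth budget is spent or the leaf has too few samples.
        if d == 0 or mask.sum() < 5 * Y.shape[1]:
            model = GraphicalLasso(alpha=alpha).fit(Y[mask])
            return [(mask, model.precision_)]
        # Dyadic cut: split the widest X dimension at the midpoint of its range.
        j = int(np.argmax(bounds[:, 1] - bounds[:, 0]))
        mid = bounds[j].mean()
        left = mask & (X[:, j] <= mid)
        right = mask & (X[:, j] > mid)
        lb, rb = bounds.copy(), bounds.copy()
        lb[j, 1] = mid
        rb[j, 0] = mid
        return recurse(left, lb, d - 1) + recurse(right, rb, d - 1)

    bounds = np.column_stack([X.min(axis=0), X.max(axis=0)])
    return recurse(np.ones(len(X), dtype=bool), bounds, depth)
```

The nonzero pattern of each returned precision matrix gives the estimated graph G(x) for all x falling in that leaf of the partition.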
| Original language | English (US) |
| --- | --- |
| Title of host publication | Advances in Neural Information Processing Systems 23 |
| Subtitle of host publication | 24th Annual Conference on Neural Information Processing Systems 2010, NIPS 2010 |
| State | Published - Dec 1 2010 |
| Event | 24th Annual Conference on Neural Information Processing Systems 2010, NIPS 2010 - Vancouver, BC, Canada. Duration: Dec 6 2010 → Dec 9 2010 |
Conference
| Conference | 24th Annual Conference on Neural Information Processing Systems 2010, NIPS 2010 |
| --- | --- |
| Country | Canada |
| City | Vancouver, BC |
| Period | 12/6/10 → 12/9/10 |
ASJC Scopus subject areas
- Information Systems