Abstract
Truth maintenance systems provide caches of beliefs and inferences that support explanations and search. Traditionally, the cost of using a TMS is monotonic growth in the size of this cache. In some applications this cost is too high; for example, intelligent learning environments may require students to explore many alternatives, which leads to unacceptable performance. This paper describes an algorithm for fact garbage collection that retains the explanation-generating capabilities of a TMS while eliminating the increased storage overhead. We describe the application context that motivated this work and the properties of applications that benefit from this technique. We present the algorithm, showing how to balance the tradeoff between maintaining a useful cache and reclaiming storage, and analyze its complexity. We demonstrate that this algorithm can eliminate monotonic storage growth, thus making it more practical to field large-scale TMS-based systems.
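The paper itself is not reproduced here, but the idea of a collectible fact cache can be illustrated with a small sketch. The following Python toy is an assumption for illustration only, not the authors' algorithm: it models a justification-based cache (`FactCache`, `Fact`, and the `roots` parameter are all invented names) and reclaims facts that are no longer needed to explain the beliefs the application still cares about, which is one way to see the cache-usefulness versus storage trade-off the abstract mentions.

```python
# Illustrative sketch only (not the algorithm from the paper): a toy
# justification-based fact cache with a mark-and-sweep style collection
# that keeps every fact still needed to explain the designated root
# beliefs and reclaims everything else.

class Fact:
    def __init__(self, name, antecedents=(), premise=False):
        self.name = name                        # human-readable label
        self.antecedents = list(antecedents)    # facts this one was derived from
        self.premise = premise                  # premises are never collected


class FactCache:
    def __init__(self):
        self.facts = {}                         # name -> Fact

    def assert_fact(self, name, antecedents=(), premise=False):
        fact = Fact(name, [self.facts[a] for a in antecedents], premise)
        self.facts[name] = fact
        return fact

    def explain(self, name):
        """Return the names of all facts supporting `name` (its explanation)."""
        seen, stack = set(), [self.facts[name]]
        while stack:
            f = stack.pop()
            if f.name not in seen:
                seen.add(f.name)
                stack.extend(f.antecedents)
        return seen

    def collect(self, roots):
        """Reclaim facts not needed to explain any fact in `roots`.

        `roots` models the beliefs the application still needs explanations
        for; choosing it larger or smaller shifts the balance between a
        useful cache and reclaimed storage.
        """
        live = set()
        for r in roots:
            live |= self.explain(r)
        live |= {n for n, f in self.facts.items() if f.premise}
        self.facts = {n: f for n, f in self.facts.items() if n in live}


# Usage: derive some facts, then keep only what explains the current goal.
cache = FactCache()
cache.assert_fact("a", premise=True)
cache.assert_fact("b", premise=True)
cache.assert_fact("c", antecedents=["a"])
cache.assert_fact("d", antecedents=["b", "c"])
cache.assert_fact("scratch", antecedents=["a"])   # no longer of interest
cache.collect(roots=["d"])
print(sorted(cache.facts))   # ['a', 'b', 'c', 'd'] -- 'scratch' reclaimed
```

In this toy, storage no longer grows monotonically: each `collect` call bounds the cache to the facts reachable from the chosen roots, while explanations for those roots remain intact.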
| Original language | English (US) |
| --- | --- |
| Title of host publication | Proceedings of the National Conference on Artificial Intelligence |
| Editors | Anon |
| Publisher | AAAI |
| Pages | 614-620 |
| Number of pages | 7 |
| Volume | 1 |
| State | Published - Dec 1 1996 |
| Event | Proceedings of the 1996 13th National Conference on Artificial Intelligence, AAAI 96. Part 1 (of 2) - Portland, OR, USA |
| Duration | Aug 4 1996 → Aug 8 1996 |
ASJC Scopus subject areas
- Software