Abstract
Many models, such as Long Short-Term Memory (LSTM) networks, Gated Recurrent Units (GRUs), and transformers, have been developed to classify time-series data under the assumption that the events in a sequence are totally ordered. Far fewer models have been developed for set-based inputs, where order does not matter. In several use cases, however, data arrives as partially ordered sequences because timestamps are coarse-grained or uncertain. We introduce a novel transformer-based model for such prediction tasks and benchmark it against extensions of existing order-invariant models. We also discuss how transition probabilities between events in a sequence can be used to improve model performance. We show that our transformer-based equal-time model outperforms extensions of existing set models on three datasets.
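The full paper is not reproduced on this page, so the following is only a minimal sketch of one way an "equal-time" transformer encoder could be realized: events that share a (coarse or uncertain) timestamp receive identical positional embeddings, and since self-attention is otherwise permutation-equivariant, the encoder becomes invariant to the order of events within each equal-time group. The class name, dimensions, and timestamp-ranking scheme below are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn


class EqualTimeEncoder(nn.Module):
    """Hypothetical sketch of an 'equal-time' transformer encoder.

    Position indices are derived from the rank of each event's
    timestamp, so events sharing a timestamp receive the same
    positional embedding and can be permuted freely within their group.
    """

    def __init__(self, num_events, d_model=64, nhead=4, num_layers=2, max_positions=512):
        super().__init__()
        self.event_emb = nn.Embedding(num_events, d_model)
        self.pos_emb = nn.Embedding(max_positions, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers)

    def forward(self, events, timestamps):
        # events, timestamps: (batch, seq_len) integer tensors.
        # Map raw timestamps to dense ranks; equal timestamps share a rank.
        positions = torch.zeros_like(timestamps)
        for b in range(timestamps.size(0)):
            _, inverse = torch.unique(timestamps[b], sorted=True, return_inverse=True)
            positions[b] = inverse
        x = self.event_emb(events) + self.pos_emb(positions)
        return self.encoder(x)


# Usage: events 7 and 1 share timestamp 5 and therefore the same
# position embedding, so their within-timestamp order is irrelevant.
model = EqualTimeEncoder(num_events=10)
events = torch.tensor([[3, 7, 1, 4]])
times = torch.tensor([[0, 5, 5, 9]])
out = model(events, times)  # shape: (1, 4, 64)
```

Deriving positions from timestamp rank rather than sequence index is one plausible design; the set-model extensions the paper benchmarks against would instead drop positional information entirely.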
Original language | English (US) |
---|---|
Title of host publication | Artificial Neural Networks and Machine Learning – ICANN 2021 - 30th International Conference on Artificial Neural Networks, Proceedings |
Editors | Igor Farkaš, Paolo Masulli, Sebastian Otte, Stefan Wermter |
Publisher | Springer Science and Business Media Deutschland GmbH |
Pages | 294-305 |
Number of pages | 12 |
ISBN (Print) | 9783030863616 |
State | Published - 2021 |
Event | 30th International Conference on Artificial Neural Networks, ICANN 2021 - Virtual, Online |
Duration | Sep 14 2021 → Sep 17 2021 |
Publication series
Name | Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) |
---|---|
Volume | 12891 LNCS |
ISSN (Print) | 0302-9743 |
ISSN (Electronic) | 1611-3349 |
Conference
Conference | 30th International Conference on Artificial Neural Networks, ICANN 2021 |
---|---|
City | Virtual, Online |
Period | Sep 14 2021 → Sep 17 2021 |
Keywords
- Recurrent Neural Networks
- Timeseries
- Transformers
ASJC Scopus subject areas
- Theoretical Computer Science
- Computer Science (all)