ArXiv Preprint
Irregular sampling occurs in many time series modeling applications, where it
presents a significant challenge to standard deep learning models. This work is
motivated by the analysis of physiological time series data in electronic
health records, which are sparse, irregularly sampled, and multivariate. In
this paper, we propose a new deep learning framework for this setting that we
call Multi-Time Attention Networks. Multi-Time Attention Networks learn an
embedding of continuous-time values and use an attention mechanism to produce a
fixed-length representation of a time series containing a variable number of
observations. We investigate the performance of our framework on interpolation
and classification tasks using multiple datasets. Our results show that our
approach performs as well as or better than a range of baseline and recently
proposed models, while offering significantly faster training times than current
state-of-the-art methods.
Satya Narayan Shukla, Benjamin M. Marlin
2021-01-25
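
To make the abstract's description concrete, the following is a minimal numerical sketch of the two ingredients it names: an embedding of continuous-time values and an attention mechanism that pools a variable number of observations into a fixed-length representation at a set of reference time points. This is an illustration only, written in plain numpy rather than a deep learning framework; the function names (`time_embedding`, `attention_pool`), parameter names (`w`, `b`, `t_ref`), and the exact form of the embedding (one linear dimension plus learnable sinusoids) are assumptions for the sketch, not the authors' released implementation.

```python
import numpy as np

def time_embedding(t, w, b):
    """Embed scalar times t (shape [n]) into d-dimensional vectors.

    Sketch of a continuous-time embedding: the first dimension is a
    linear function of time and the remaining dimensions are sinusoids
    with (notionally learnable) frequencies and phases. w, b: shape [d].
    """
    periodic = np.sin(np.outer(t, w[1:]) + b[1:])   # [n, d-1] periodic terms
    linear = (w[0] * t + b[0])[:, None]             # [n, 1] linear term
    return np.concatenate([linear, periodic], axis=1)  # [n, d]

def attention_pool(t_obs, x_obs, t_ref, w, b):
    """Pool an irregularly sampled series (t_obs, x_obs) onto fixed
    reference times t_ref, yielding a fixed-length representation.

    Queries are time embeddings of the reference points, keys are time
    embeddings of the observation times, and the observed values are
    the attended values. Parameter names here are illustrative.
    """
    q = time_embedding(t_ref, w, b)                 # [r, d] queries
    k = time_embedding(t_obs, w, b)                 # [n, d] keys
    scores = q @ k.T / np.sqrt(q.shape[1])          # [r, n] scaled similarities
    scores -= scores.max(axis=1, keepdims=True)     # numerical stability
    attn = np.exp(scores)
    attn /= attn.sum(axis=1, keepdims=True)         # softmax over observations
    return attn @ x_obs                             # [r] fixed-length output

# Toy usage: 5 irregular observations pooled onto 4 reference time points.
rng = np.random.default_rng(0)
d_emb = 8
w, b = rng.normal(size=d_emb), rng.normal(size=d_emb)
t_obs = np.array([0.1, 0.4, 0.45, 0.9, 1.7])
x_obs = np.sin(t_obs)
t_ref = np.linspace(0.0, 2.0, 4)
print(attention_pool(t_obs, x_obs, t_ref, w, b))
```

Because the output is indexed by the fixed reference points rather than by the observation times, its length is the same no matter how many observations the series contains or how irregularly they are spaced, which is what makes the representation usable by downstream interpolation and classification models.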