Time Series Contrastive Learning with Information-Aware Augmentations (conference paper)

Luo, D.; Cheng, W.; Wang, Y.; et al. (2023). Time Series Contrastive Learning with Information-Aware Augmentations. 37, 4534-4542.

cited authors

  • Luo, D; Cheng, W; Wang, Y; Xu, D; Ni, J; Yu, W; Zhang, X; Liu, Y; Chen, Y; Chen, H; Zhang, X

abstract

  • Various contrastive learning approaches have been proposed in recent years and have achieved significant empirical success. While effective and prevalent, contrastive learning has been less explored for time series data. A key component of contrastive learning is to select appropriate augmentations imposing some priors to construct feasible positive samples, such that an encoder can be trained to learn robust and discriminative representations. Unlike the image and language domains, where "desired" augmented samples can be generated with rules of thumb guided by prefabricated human priors, the ad-hoc manual selection of time series augmentations is hindered by their diverse and human-unrecognizable temporal structures. How to find the desired augmentations of time series data that are meaningful for given contrastive learning tasks and datasets remains an open question. In this work, we address the problem by encouraging both high fidelity and variety based upon information theory. A theoretical analysis leads to the criteria for selecting feasible data augmentations. On top of that, we propose a new contrastive learning approach with information-aware augmentations, InfoTS, that adaptively selects optimal augmentations for time series representation learning. Experiments on various datasets show highly competitive performance with up to 12.0% reduction in MSE on forecasting tasks and up to 3.7% relative improvement in accuracy on classification tasks over the leading baselines.
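To make the core mechanism the abstract describes concrete, here is a minimal sketch of time-series contrastive learning: two augmented views of each series (jitter and scaling, two common time-series augmentations) are encoded and pulled together by an InfoNCE-style loss, with other series in the batch serving as negatives. This is not the InfoTS method itself; the linear "encoder", the specific augmentations, and all parameter values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def jitter(x, sigma=0.1):
    # Additive Gaussian noise: a common time-series augmentation.
    return x + rng.normal(0.0, sigma, x.shape)

def scaling(x, sigma=0.2):
    # Multiply each series by a random scalar factor.
    factors = rng.normal(1.0, sigma, (x.shape[0], 1))
    return x * factors

def encode(x, W):
    # Stand-in encoder: a linear projection with L2 normalization
    # (a real method would use a learned neural encoder).
    z = x @ W
    return z / np.linalg.norm(z, axis=1, keepdims=True)

def info_nce(z1, z2, temperature=0.5):
    # Contrastive (InfoNCE) loss: row i of z1 and row i of z2 are a
    # positive pair; all other rows in the batch act as negatives.
    logits = (z1 @ z2.T) / temperature
    logits = logits - logits.max(axis=1, keepdims=True)  # numerical stability
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_prob))

batch = rng.normal(size=(8, 64))   # 8 univariate series of length 64
W = rng.normal(size=(64, 16))      # random weights for the toy encoder
loss = info_nce(encode(jitter(batch), W), encode(scaling(batch), W))
print(round(loss, 4))
```

An information-aware selector in the spirit of the paper would score candidate augmentations (jitter, scaling, warping, masking, ...) by how well they preserve the information of the original series (fidelity) while still differing from it (variety), and train the encoder with the best-scoring ones rather than a fixed hand-picked pair.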

publication date

  • June 27, 2023

start page

  • 4534

end page

  • 4542

volume

  • 37