
Self-Supervised Models are Continual Learners

Ricci, Elisa;
2022-01-01

Abstract

Self-supervised models have been shown to produce comparable or better visual representations than their supervised counterparts when trained offline on unlabeled data at scale. However, their efficacy is catastrophically reduced in a Continual Learning (CL) scenario where data is presented to the model sequentially. In this paper, we show that self-supervised loss functions can be seamlessly converted into distillation mechanisms for CL by adding a predictor network that maps the current state of the representations to their past state. This enables us to devise a framework for Continual self-supervised visual representation Learning that (i) significantly improves the quality of the learned representations, (ii) is compatible with several state-of-the-art self-supervised objectives, and (iii) needs little to no hyperparameter tuning. We demonstrate the effectiveness of our approach empirically by training six popular self-supervised models in various CL settings. Code: github.com/DonkeyShot21/cassle.
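As a minimal illustration of the mechanism described in the abstract, the sketch below shows how a predictor network can turn a self-supervised objective into a distillation loss between current and past representations. This is a hypothetical PyTorch-style sketch, not the repository's actual code: the cosine-similarity objective stands in as one example of a pluggable SSL loss, and all names (CassleDistillation, predictor, z_current, z_past) are assumed for illustration.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class CassleDistillation(nn.Module):
        # Maps current representations to the space of the frozen past
        # encoder, then applies a self-supervised loss as distillation.
        def __init__(self, dim: int = 2048, hidden: int = 2048):
            super().__init__()
            # Small MLP predictor g: current feature space -> past feature space.
            self.predictor = nn.Sequential(
                nn.Linear(dim, hidden),
                nn.ReLU(inplace=True),
                nn.Linear(hidden, dim),
            )

        def forward(self, z_current: torch.Tensor, z_past: torch.Tensor) -> torch.Tensor:
            # Cosine similarity is used here as an example objective
            # (BYOL/SimSiam style); any SSL loss could be plugged in.
            p = self.predictor(z_current)
            # Gradients flow only through the predictor and the current
            # encoder; past representations are treated as fixed targets.
            return -F.cosine_similarity(p, z_past.detach(), dim=-1).mean()

    # Illustrative usage within task t of a continual stream:
    #   z_cur = encoder_t(x)             # trainable current encoder
    #   z_old = frozen_encoder_prev(x)   # frozen snapshot from task t-1
    #   loss = ssl_loss + distill(z_cur, z_old)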
Year: 2022
ISBN: 978-1-6654-6946-3


Use this identifier to cite or link to this document: https://hdl.handle.net/11582/335792