
tinyCLAP: distilling contrastive language-audio pretrained models

Francesco Paissan; Elisabetta Farella
2024-01-01

Abstract

Contrastive Language-Audio Pretraining (CLAP) has become crucially important in the field of audio and speech processing, with applications ranging from sound event detection to text-to-audio generation. However, one of its main limitations is the considerable amount of data required for training and the overall computational complexity during inference. This paper investigates how to reduce the complexity of contrastive language-audio pre-trained models, yielding an efficient model that we call tinyCLAP. We derive a unimodal distillation loss from first principles and explore how the dimensionality of the shared, multimodal latent space can be reduced via pruning. tinyCLAP uses only 6% of the original Microsoft CLAP parameters with a minimal reduction (less than 5%) in zero-shot classification performance across the three sound event detection datasets on which it was tested.
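The abstract mentions a unimodal distillation loss but does not spell it out. A minimal sketch of the general idea, assuming (as is common for CLAP-style distillation) that a small student audio encoder is trained to match the frozen teacher's audio embeddings in the shared latent space via cosine similarity — the function name and exact formulation here are illustrative, not the paper's definition:

```python
import numpy as np

def unimodal_distillation_loss(student_emb, teacher_emb):
    """Mean cosine distance between student and (frozen) teacher embeddings.

    Both inputs are (batch, dim) arrays of audio embeddings; minimizing this
    pushes the student toward the teacher's representation, using audio only
    (no paired text), hence "unimodal" distillation.
    """
    s = student_emb / np.linalg.norm(student_emb, axis=-1, keepdims=True)
    t = teacher_emb / np.linalg.norm(teacher_emb, axis=-1, keepdims=True)
    return float(np.mean(1.0 - np.sum(s * t, axis=-1)))
```

Because the loss only needs the teacher's audio embeddings, unlabeled audio suffices for distillation; zero-shot classification then reuses the teacher's text encoder against the student's audio embeddings.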
Files in this item:
File: tinyCLAP_CR.pdf
Status: not available
License: NOT PUBLIC - Private/restricted access
Size: 235.68 kB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11582/348527