Forensics of high quality and nearly identical JPEG image recompression
Cecilia Pasquini
2016-01-01
Abstract
We address the known problem of detecting a previous compression in JPEG images, focusing on the challenging case of high and very high quality factors (>= 90), as well as repeated compression with identical or nearly identical quality factors. We first revisit the approaches based on Benford--Fourier analysis in the DCT domain and on block convergence analysis in the spatial domain, both of which were originally conceived for specific scenarios. Leveraging decision tree theory, we design a combined approach that complements their discriminatory capabilities, obtaining a set of novel detectors targeted at high quality grayscale JPEG images.
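As an illustration of the kind of statistic the Benford--Fourier approach builds on, the sketch below computes the first-significant-digit histogram of the block-DCT AC coefficients of a grayscale image, together with the classical Benford reference distribution. This is a minimal sketch assuming NumPy and SciPy; the function name `first_digit_histogram` is illustrative and not taken from the paper, whose detectors are considerably more elaborate (Fourier analysis of the digit statistics and a decision-tree combination with the block convergence features).

```python
import numpy as np
from scipy.fft import dctn


def first_digit_histogram(image, block=8):
    """First-significant-digit statistics of block-DCT AC coefficients.

    A single JPEG compression tends to leave the first digits of the AC
    coefficients close to a (generalized) Benford law; further compression
    perturbs this fit, which is the intuition behind Benford-based
    detectors. Illustrative feature extractor only, not the paper's method.
    """
    h, w = image.shape
    h -= h % block
    w -= w % block
    counts = np.zeros(9)
    for i in range(0, h, block):
        for j in range(0, w, block):
            blk = image[i:i + block, j:j + block].astype(np.float64) - 128.0
            coeffs = dctn(blk, norm='ortho').ravel()[1:]  # drop the DC term
            coeffs = np.abs(coeffs[np.abs(coeffs) >= 1.0])
            if coeffs.size:
                # first significant digit of x: floor(x / 10**floor(log10 x))
                digits = (coeffs // 10.0 ** np.floor(np.log10(coeffs))).astype(int)
                counts += np.bincount(digits, minlength=10)[1:10]
    return counts / counts.sum()


# Classical Benford reference distribution p(d) = log10(1 + 1/d), d = 1..9
benford = np.log10(1.0 + 1.0 / np.arange(1, 10))
```

In a simple experiment, one would compare the empirical histogram against the Benford reference (for instance with a chi-square statistic) for never-compressed versus previously compressed images; the difficulty addressed in the paper is that at quality factors of 90 and above, and for recompression at (nearly) identical quality, this deviation becomes very weak, which motivates combining it with spatial-domain block convergence cues.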
File | Size | Format | License
---|---|---|---
IH2016.pdf (authorized users only) | 1.71 MB | Adobe PDF | Not public - private/restricted access
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.