An effective fragile watermarking scheme for color image tampering detection and self-recovery

Javier Molina-Garcia, Beatriz P. Garcia-Salgado, Volodymyr Ponomaryov, Rogelio Reyes-Reyes, Sergiy Sadovnychiy, Clara Cruz-Ramos

Research output: Contribution to journal › Article › peer-review

16 Scopus citations


© 2019. In this paper, a fragile watermarking scheme for color-image authentication and self-recovery is proposed. The original image is divided into non-overlapping blocks, and for each i-th block, the watermarks used for recovery and authentication are generated and embedded into a different block according to an embedding sequence given by a permutation process. The designed scheme embeds the watermarks generated for each block within the 2 LSBs, after which a bit-adjustment phase is applied to increase the quality of the watermarked image. To increase the quality of the recovered image, a bilateral filter, which efficiently suppresses noise while preserving image edges, is applied in the post-processing stage. Additionally, high accuracy in the tamper detection process is achieved by employing a hierarchical tamper detection algorithm. Finally, to solve the tampering coincidence problem, three recovery watermarks are embedded in different positions to reconstruct a specific block, and a proposed inpainting algorithm regenerates the regions affected by this problem. Simulation results demonstrate that the watermarked images exhibit high quality and that the proposed scheme can reconstruct alterations at extremely high rates (up to 80%), recovering the altered regions with better visual quality than similar state-of-the-art schemes.
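The core embedding step described above can be illustrated with a minimal sketch: watermark bits are written into the two least-significant bits of each pixel, and a pseudo-random permutation determines which block receives each block's watermark. This is only an assumed simplification of the paper's method; the actual watermark generation, the bit-adjustment phase, and the hierarchical detection are omitted, and the helper names below are hypothetical.

```python
import numpy as np

def permute_blocks(num_blocks, key=42):
    """Placeholder for the paper's permutation process: a keyed
    pseudo-random embedding sequence over block indices."""
    rng = np.random.default_rng(key)
    return rng.permutation(num_blocks)

def embed_2lsb(image, watermark_bits):
    """Replace the 2 LSBs of each uint8 pixel with 2 watermark bits.
    (The paper additionally applies a bit-adjustment phase to reduce
    the embedding distortion, omitted here.)"""
    wm = watermark_bits.reshape(image.shape)
    return (image & 0xFC) | (wm & 0x03)

def extract_2lsb(watermarked):
    """Recover the embedded 2-bit watermark from each pixel."""
    return watermarked & 0x03
```

Because only the two low-order bits change, the per-pixel distortion is bounded by 3 gray levels, which is why 2-LSB schemes keep the watermarked image close to the original.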
Original language: American English
Journal: Signal Processing: Image Communication
State: Published - 1 Feb 2020


