Volume 16, Issue 4 (December 2020)                   IJEEE 2020, 16(4): 474-486



Fattahi A, Emadi S. Detection of Copy-Move Forgery in Digital Images Using Scale Invariant Feature Transform Algorithm and the Spearman Relationship. IJEEE 2020; 16 (4) :474-486
URL: http://ijeee.iust.ac.ir/article-1-1443-en.html
Abstract:
The increased popularity of digital media and image-editing software has led to widespread forgery of multimedia content for various purposes. Law and forensic-medicine experts undoubtedly require trustworthy, unaltered images in order to enforce rights. Copy-move forgery, the most common type of digital-image manipulation, copies a portion of an image and pastes it elsewhere in the same image, either to hide an area or to duplicate content. This paper presents a method for detecting copy-move forgery using the Scale-Invariant Feature Transform (SIFT) algorithm. The Spearman rank correlation and the Ward clustering algorithm are used to measure the similarity between keypoints and to increase the accuracy of forgery detection. The method is invariant to changes such as rotation, scaling, deformation, and illumination; it falls into the category of blind forgery-detection methods. Experimental results show that, owing to its high resistance to such apparent changes, the proposed method correctly detects 99.56 percent of the forged images in the dataset and reveals the forged areas.
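The pipeline the abstract describes (match keypoint descriptors by Spearman correlation, then group the matched keypoints with Ward clustering) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the SIFT descriptors, which in practice would come from a detector such as OpenCV's, are replaced here by synthetic 128-dimensional vectors, and the correlation threshold of 0.9 is an illustrative choice rather than a value taken from the paper.

```python
import numpy as np
from scipy.stats import spearmanr
from scipy.cluster.hierarchy import ward, fcluster
from scipy.spatial.distance import pdist

rng = np.random.default_rng(0)

# Synthetic stand-ins for 128-D SIFT descriptors: a "source" region,
# a copy-moved near-duplicate of it, and unrelated background keypoints.
orig = rng.random((5, 128))
copied = orig + rng.normal(0.0, 0.01, orig.shape)  # copy-move duplicates
noise = rng.random((10, 128))
desc = np.vstack([orig, copied, noise])
pts = rng.random((desc.shape[0], 2)) * 512  # hypothetical keypoint coords

# Matching step: flag keypoint pairs whose descriptors have a high
# Spearman rank correlation (threshold 0.9 is illustrative).
matches = []
for i in range(desc.shape[0]):
    for j in range(i + 1, desc.shape[0]):
        rho, _ = spearmanr(desc[i], desc[j])
        if rho > 0.9:
            matches.append((i, j))

# Localisation step: Ward-cluster the matched keypoint coordinates so
# that the copied region and its destination form separate groups.
matched_idx = sorted({k for pair in matches for k in pair})
labels = fcluster(ward(pdist(pts[matched_idx])), t=2, criterion="maxclust")
print(len(matches), "matched pairs across", len(matched_idx), "keypoints")
```

With the synthetic data above, only the five duplicate descriptor pairs survive the correlation threshold; the clustering then partitions their coordinates into two candidate regions, mirroring how the matched keypoints in a forged image would delineate the source and pasted areas.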
Full-Text [PDF 933 kb]
Type of Study: Research Paper | Subject: Image Processing
Received: 2019/02/26 | Revised: 2019/11/20 | Accepted: 2019/11/29

Rights and permissions
This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.

© 2022 by the authors. Licensee IUST, Tehran, Iran. This is an open access journal distributed under the terms and conditions of the Creative Commons Attribution-NonCommercial 4.0 International (CC BY-NC 4.0) license.