Identification and extraction of digital forensic evidence from multimedia data sources using multi-algorithmic fusion
Document Type
Journal Article
Publication Title
ICISSP 2019: 5th International Conference on Information Systems Security and Privacy
Publisher
SciTePress
School
Security Research Institute
RAS ID
31032
Abstract
With the enormous increase in the use and volume of photographs and videos, multimedia-based digital evidence has come to play an increasingly fundamental role in criminal investigations. However, given the increase in the volume of multimedia data, it is becoming time-consuming and costly for investigators to analyse the images manually. Therefore, a need exists for image analysis and retrieval techniques that are able to process, analyse and retrieve images efficiently and effectively. Outside of forensics, image annotation systems have become increasingly popular for a variety of purposes, and major software/IT companies such as Amazon, Microsoft and Google all offer cloud-based image annotation systems. The paper presents a series of experiments that evaluate commercial annotation systems to determine their accuracy and ability to comprehensively annotate images within a forensic image analysis context (rather than simply single-object imagery, which is typically the case). The paper further proposes and demonstrates the value of utilising a multi-algorithmic approach via fusion to achieve the best results. The results of these experiments show that, among the existing systems, the highest Average Recall was achieved by imagga at 53%, whilst the proposed multi-algorithmic system achieved 77% across the selected datasets. These results demonstrate the benefit of using a multi-algorithmic approach.
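The abstract does not detail the fusion mechanism itself; purely as an illustration, the minimal sketch below assumes a simple vote/union fusion of the tag sets returned by several annotation systems, together with the average-recall metric used in the evaluation. The function names and toy data are hypothetical, not taken from the paper.

```python
from typing import Dict, List, Set


def fuse_annotations(per_system_tags: List[Set[str]], min_votes: int = 1) -> Set[str]:
    """Fuse tag sets from several annotation systems.

    min_votes=1 is a plain union of all tags; higher values require
    agreement between systems (a simple voting scheme).
    """
    votes: Dict[str, int] = {}
    for tags in per_system_tags:
        for tag in tags:
            votes[tag] = votes.get(tag, 0) + 1
    return {tag for tag, count in votes.items() if count >= min_votes}


def average_recall(ground_truth: List[Set[str]], predictions: List[Set[str]]) -> float:
    """Mean per-image recall: fraction of ground-truth tags recovered."""
    recalls = [
        len(gt & pred) / len(gt)
        for gt, pred in zip(ground_truth, predictions)
        if gt
    ]
    return sum(recalls) / len(recalls) if recalls else 0.0


# Toy example: two images annotated by three hypothetical systems.
gt = [{"knife", "table", "hand"}, {"car", "street"}]
system_outputs = [
    [{"knife", "table"}, {"table", "hand"}, {"knife"}],  # image 1
    [{"car"}, {"car", "street"}, {"road"}],              # image 2
]
fused = [fuse_annotations(outputs) for outputs in system_outputs]
print(f"Average recall (fused): {average_recall(gt, fused):.2f}")
```

In this toy run, fusing the systems' tags recovers more of the ground-truth annotations per image than any single system would, which mirrors the kind of gain the paper reports for its multi-algorithmic approach.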
DOI
10.5220/0007399604380448
Access Rights
Free to read
Comments
Mashhadani, S., Clarke, N., & Li, F. (2019). Identification and extraction of digital forensic evidence from multimedia data sources using multi-algorithmic fusion. In ICISSP 2019: 5th International Conference on Information Systems Security and Privacy (pp. 438-448). Prague, Czech Republic: SciTePress.