Matthew Gaber: Peekaboo Transformer Models
Author Identifiers
Matthew Gaber
https://orcid.org/0000-0003-1684-1392
Mohiuddin Ahmed
https://orcid.org/0000-0002-4559-4768
Helge Janicke
Publication Date
2024
Document Type
Dataset
Publisher
Edith Cowan University
School or Research Centre
School of Science
Description
Finding automated AI techniques to proactively defend against malware has become increasingly critical. The ability of an AI model to correctly classify novel malware depends on the quality of the features it is trained with. In turn, the authenticity and quality of those features depend on the analysis tool and the dataset. Peekaboo, a Dynamic Binary Instrumentation tool, defeats evasive malware to capture its genuine behavior. Transformer models trained with Peekaboo data excel at detecting new malicious functions, outperforming prior approaches in novel ransomware detection.
This dataset contains the fine-tuned models and the Colab scripts used for training and testing.
Additional Information
The various fine-tuned models are located in the Models folder, and the scripts used for training and testing are located in the Scripts folder. The Peekaboo data is available at https://doi.org/10.25958/85p1-4w32
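As a rough illustration only, a fine-tuned model from the Models folder might be loaded and applied to an assembly instruction trace along the lines sketched below. The exact framework, tokenisation, folder names, and label mapping are not specified in this record; the sketch assumes the Hugging Face transformers library and a hypothetical model directory.

```python
# Hypothetical sketch: load a fine-tuned transformer from the Models folder
# and classify an assembly instruction trace. Paths, labels, and tokenisation
# are assumptions, not the authors' exact setup from the Colab scripts.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_dir = "Models/ransomware-detector"  # hypothetical folder name
tokenizer = AutoTokenizer.from_pretrained(model_dir)
model = AutoModelForSequenceClassification.from_pretrained(model_dir)

# Illustrative assembly instruction sequence captured by dynamic analysis
trace = "push ebp mov ebp esp call CreateFileW call WriteFile ret"

inputs = tokenizer(trace, truncation=True, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
predicted = logits.argmax(dim=-1).item()
print("malicious" if predicted == 1 else "benign")
```

The actual training and evaluation procedure is defined by the notebooks in the Scripts folder, which should be treated as the authoritative reference.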
DOI
10.25958/z82g-1e40
Research Activity Title
Assembly Language with Transformer Models for Novel Ransomware Detection
Start of data collection time period
2024
End of data collection time period
2024
Creative Commons License
This work is licensed under a Creative Commons Attribution-Noncommercial 4.0 License
Contact
Matthew Gaber
Citation
Gaber, M. G., Ahmed, M., & Janicke, H. (2024). Matthew Gaber: Peekaboo Transformer Models. Edith Cowan University. https://doi.org/10.25958/z82g-1e40