PROTECTION AGAINST ADVERSARIAL ATTACKS ON AUDIO AND IMAGES IN ARTIFICIAL INTELLIGENCE MODELS USING THE SGEC METHOD
The growing use of artificial intelligence (AI) increasingly exposes audio and image data to the risk of adversarial attacks. This article examines this problem and presents the SGEC method as a means of minimizing these risks. Various types of attacks on audio and images are discussed, including label manipulation, white-box and black-box attacks, leakage through trained models, and hardware-level attacks. The main focus is on the SGEC method, which encrypts data and ensures its integrity in AI models. The article also examines other approaches to protecting audio and images, such as dual verification and ensemble methods, access restriction and data anonymization, and the use of provably robust AI models.
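The article does not publish the internals of SGEC, so the following is only a minimal illustrative sketch of the general "encrypt, then verify integrity" idea the abstract attributes to it: audio bytes are encrypted before storage and carry an HMAC tag, so any adversarial tampering is detected before the data reaches a model. The function names and the toy SHA-256 keystream cipher are assumptions for illustration; a real system would use an authenticated cipher such as AES-GCM.

```python
# Illustrative sketch only: SGEC's actual algorithm is not disclosed in the article.
# Shows encrypt-then-MAC protection of audio bytes against tampering.
import hashlib
import hmac
import os

def _keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Derive a pseudo-random keystream from SHA-256 (toy cipher, not AES)."""
    out, counter = b"", 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def protect(audio: bytes, enc_key: bytes, mac_key: bytes):
    """Encrypt audio and append an integrity tag (encrypt-then-MAC)."""
    nonce = os.urandom(16)
    ciphertext = bytes(a ^ b for a, b in
                       zip(audio, _keystream(enc_key, nonce, len(audio))))
    tag = hmac.new(mac_key, nonce + ciphertext, hashlib.sha256).digest()
    return nonce, ciphertext, tag

def recover(nonce: bytes, ciphertext: bytes, tag: bytes,
            enc_key: bytes, mac_key: bytes) -> bytes:
    """Verify the tag before decrypting; reject tampered input."""
    expected = hmac.new(mac_key, nonce + ciphertext, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("integrity check failed: possible adversarial tampering")
    return bytes(a ^ b for a, b in
                 zip(ciphertext, _keystream(enc_key, nonce, len(ciphertext))))

enc_key, mac_key = os.urandom(32), os.urandom(32)
sample = b"\x00\x10\x20fake-pcm-audio-frames"
nonce, ct, tag = protect(sample, enc_key, mac_key)
assert recover(nonce, ct, tag, enc_key, mac_key) == sample
```

The encrypt-then-MAC ordering matters: the receiver authenticates the ciphertext before doing any decryption, so maliciously modified inputs are rejected rather than silently fed to the model.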
Gerasimov V.M., Maslova M.A., Khalilayeva E.I. Protection against adversarial attacks on audio and images in artificial intelligence models using the SGEC method // Research Result. Information Technologies. – 2023. – Vol. 8, No. 2. – P. 53-60. DOI: 10.18413/2518-1092-2022-8-2-0-7
1. Esmaeilpour M., Cardinal P., Koerich A.L. A robust approach for securing audio classification against adversarial attacks // IEEE Transactions on Information Forensics and Security. – 2019. – Vol. 15. – P. 2147-2159.
2. Xu H. et al. Adversarial attacks and defenses in images, graphs and text: A review // International Journal of Automation and Computing. – 2020. – Vol. 17. – P. 151-178.
3. Certificate of state registration of the computer program No. 2022663168 Russian Federation. SGEC-system "BIOM" for encrypting and hiding the voice data of users on the server: application No. 2022662279, filed June 27, 2022; published July 12, 2022 / V.M. Gerasimov, M.A. Maslova; applicant: Federal State Autonomous Educational Institution of Higher Education "Sevastopol State University". – EDN FJQWGB.
4. Clark D., Hunt S., Malacaria P. Quantitative analysis of the leakage of confidential data // Electronic Notes in Theoretical Computer Science. – 2002. – Vol. 59, No. 3. – P. 238-251.
5. Martin K. The penalty for privacy violations: How privacy violations impact trust online // Journal of Business Research. – 2018. – Vol. 82. – P. 103-116.
6. Yang J. et al. Msta-net: forgery detection by generating manipulation trace based on multi-scale self-texture attention // IEEE Transactions on Circuits and Systems for Video Technology. – 2021. – Vol. 32, No. 7. – P. 4854-4866.
7. Li G. et al. DeSVig: Decentralized swift vigilance against adversarial attacks in industrial artificial intelligence systems // IEEE Transactions on Industrial Informatics. – 2019. – Vol. 16, No. 5. – P. 3267-3277.
8. Meeßen S.M. et al. Trust is essential: positive effects of information systems on users' memory require trust in the system // Ergonomics. – 2020. – Vol. 63, No. 7. – P. 909-926.
9. Lupton M. Some ethical and legal consequences of the application of artificial intelligence in the field of medicine // Trends Med. – 2018. – Vol. 18, No. 4. – P. 100147.
10. Gerasimov V.M. Comprehensive system for protecting a biometric voice print from the effects of cyber fraudsters // XI Congress of Young Scientists: Collection of scientific papers, St. Petersburg, April 04-08, 2022. – St. Petersburg: Federal State Autonomous Educational Institution of Higher Education "National Research University ITMO", 2022. – P. 72-76. – EDN VTVBBS.
11. Gerasimov V.M., Maslova M.A. Possible threats and attacks on the user's voice identification system // Research Result. Information Technologies. – 2022. – Vol. 7, No. 1. – P. 32-37. – DOI 10.18413/2518-1092-2022-7-1-0-4. – EDN JBCXMF.
12. Ozhiganova M.I., Arvanova S.M., Abitov A.A., Unachev I.A. Development of a software module for a face recognition system using the Viola-Jones method // Digital Transformation of Science and Education: Collection of scientific papers of the II International Scientific and Practical Conference, Nalchik, October 01-04, 2021. – Nalchik, 2021. – P. 271-277. – EDN NRFFLF.
This work was carried out under Agreement No. 40469-21/2022-к dated June 30, 2022.