<?xml version='1.0' encoding='utf-8'?>
<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Publishing DTD v1.2 20190208//EN" "http://jats.nlm.nih.gov/publishing/1.2/JATS-journalpublishing1.dtd">
<article article-type="research-article" dtd-version="1.2" xml:lang="ru" xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink"><front><journal-meta><journal-id journal-id-type="issn">2518-1092</journal-id><journal-title-group><journal-title>Научный результат. Информационные технологии</journal-title></journal-title-group><issn pub-type="epub">2518-1092</issn></journal-meta><article-meta><article-id pub-id-type="doi">10.18413/2518-1092-2022-8-2-0-5</article-id><article-id pub-id-type="publisher-id">3145</article-id><article-categories><subj-group subj-group-type="heading"><subject>ИСКУССТВЕННЫЙ ИНТЕЛЛЕКТ И ПРИНЯТИЕ РЕШЕНИЙ</subject></subj-group></article-categories><title-group><article-title>МЕТОД ВЫЯВЛЕНИЯ КОСВЕННЫХ ПРИЗНАКОВ КОРРУПЦИОННЫХ ДЕЯНИЙ ПО ВИДЕОЗАПИСЯМ ВЫСТУПЛЕНИЙ ГОССЛУЖАЩИХ</article-title><trans-title-group xml:lang="en"><trans-title>THE METHOD OF IDENTIFYING INDIRECT SIGNS OF CORRUPTION ACTS BASED ON VIDEO RECORDINGS OF SPEECHES OF CIVIL SERVANTS</trans-title></trans-title-group></title-group><contrib-group><contrib contrib-type="author"><name-alternatives><name xml:lang="ru"><surname>Крайновских</surname><given-names>Вероника Игоревна</given-names></name><name xml:lang="en"><surname>Krainovskikh</surname><given-names>Veronika Igorevna</given-names></name></name-alternatives><email>vikraynova@edu.hse.ru</email></contrib><contrib contrib-type="author"><name-alternatives><name xml:lang="ru"><surname>Комарова</surname><given-names>Алёна Алексеевна</given-names></name><name xml:lang="en"><surname>Komarova</surname><given-names>Alyona Alekseevna</given-names></name></name-alternatives></contrib><contrib contrib-type="author"><name-alternatives><name xml:lang="ru"><surname>Басов</surname><given-names>Олег Олегович</given-names></name><name xml:lang="en"><surname>Basov</surname><given-names>Oleg
Olegovich</given-names></name></name-alternatives><email>oobasov@mail.ru</email></contrib></contrib-group><pub-date pub-type="epub"><year>2023</year></pub-date><volume>8</volume><issue>2</issue><fpage>0</fpage><lpage>0</lpage><self-uri content-type="pdf" xlink:href="/media/information/2023/2/ИТ_НР_8.2_5_Ql0RJfM.pdf" /><abstract xml:lang="ru"><p>Проблема противодействия коррупционным действиям в сфере государственной службы по-прежнему не теряет своей актуальности. Предполагается, что при совершении коррупционных деяний люди проявляют определенные вербальные и невербальные сигналы, с помощью которых можно выявить коррупционные признаки. В статье рассматривается проблема нарушения антикоррупционного законодательства в государственных и муниципальных учреждениях с точки зрения психоэмоционального состояния чиновников. Предлагается использовать методы машинного обучения для анализа видео- и аудиозаписей выступлений с неподготовленной речью чиновников различного уровня власти, где они отвечают на вопросы журналистов и общественности, с целью определения эмоций и выявления агрессии, неуверенности, а также уклончивости в ответах. Результаты исследования могут быть полезны для государственных органов, занимающихся борьбой с коррупцией, а также для общественности, заинтересованной в прозрачности и честности деятельности государственных служащих.</p></abstract><trans-abstract xml:lang="en"><p>The problem of countering corruption in the public service remains highly relevant.
It is assumed that people committing acts of corruption exhibit certain verbal and non-verbal signals that can be used to identify indirect signs of corruption. The article addresses violations of anti-corruption legislation in state and municipal institutions from the perspective of the psycho-emotional state of officials. It proposes using machine learning methods to analyze video and audio recordings of unprepared speeches by officials at various levels of government, in which they answer questions from journalists and the public, in order to recognize emotions and detect aggression, uncertainty, and evasiveness in their answers. The results of the study may be useful both to government agencies engaged in fighting corruption and to the public interested in the transparency and integrity of civil servants.</p></trans-abstract><kwd-group xml:lang="ru"><kwd>противодействие коррупции</kwd><kwd>государственное управление</kwd><kwd>машинное обучение</kwd><kwd>вербальный анализ</kwd><kwd>анализ аудио-видеозаписей</kwd><kwd>анализ психоэмоционального состояния</kwd></kwd-group><kwd-group xml:lang="en"><kwd>anti-corruption</kwd><kwd>public administration</kwd><kwd>machine learning</kwd><kwd>verbal analysis</kwd><kwd>analysis of audio-video recordings</kwd><kwd>analysis of psycho-emotional state</kwd></kwd-group></article-meta></front><back><ref-list><title>Список литературы</title><ref id="B1"><mixed-citation>Овсянникова В. В. К вопросу о классификации эмоций: категориальный и многомерный подходы // Финансовая аналитика: проблемы и решения. – 2013. – №. 37. – С. 43-48.</mixed-citation></ref><ref id="B2"><mixed-citation>Рогов Е. И. Настольная книга практического психолога: Учебное пособие // М.: ВЛАДОС. – 1998. – С. 134-142.</mixed-citation></ref><ref id="B3"><mixed-citation>Самигулин Т. Р., Смирнов И. З., Лаушкина А. А.
Определение маркеров агрессивного поведения человека на основе анализа аудио и текстового каналов // Научный результат. Информационные технологии. – 2022. – Т. 7. – №. 2. – С. 55-62.</mixed-citation></ref><ref id="B4"><mixed-citation>Bazarevsky V. et al. BlazeFace: Sub-millisecond neural face detection on mobile GPUs // arXiv preprint arXiv:1907.05047. – 2019.</mixed-citation></ref><ref id="B5"><mixed-citation>Burkhardt F. et al. A database of German emotional speech // Interspeech. – 2005. – Т. 5. – С. 1517-1520.</mixed-citation></ref><ref id="B6"><mixed-citation>Chow A., Louie J. Detecting lies via speech patterns. – 2017.</mixed-citation></ref><ref id="B7"><mixed-citation>Devyatkin D. A. et al. Intelligent analysis of manifestations of verbal aggressiveness in network community texts // Scientific and Technical Information Processing. – 2014. – Т. 41. – С. 377-389.</mixed-citation></ref><ref id="B8"><mixed-citation>Goupil L. et al. Listeners’ perceptions of the certainty and honesty of a speaker are associated with a common prosodic signature // Nature Communications. – 2021. – Т. 12. – №. 1. – С. 861.</mixed-citation></ref><ref id="B9"><mixed-citation>Gournay P., Lahaie O., Lefebvre R. A Canadian French emotional speech dataset // Proceedings of the 9th ACM Multimedia Systems Conference. – 2018. – С. 399-402.</mixed-citation></ref><ref id="B10"><mixed-citation>Korobov M. Morphological analyzer and generator for Russian and Ukrainian languages // Analysis of Images, Social Networks and Texts: 4th International Conference, AIST 2015, Yekaterinburg, Russia, April 9–11, 2015, Revised Selected Papers 4. – Springer International Publishing, 2015. – С. 320-332.</mixed-citation></ref><ref id="B11"><mixed-citation>Kossaifi J. et al.
SEWA DB: A rich database for audio-visual emotion and sentiment research in the wild // IEEE Transactions on Pattern Analysis and Machine Intelligence. – 2019. – Т. 43. – №. 3. – С. 1022-1040.</mixed-citation></ref><ref id="B12"><mixed-citation>Laushkina A., Smirnov I., Medvedev A., Laptev A., Sinko M. Detecting incongruity in the expression of emotions in short videos based on a multimodal approach // Cybernetics and Physics. – 2022. – С. 210-216.</mixed-citation></ref><ref id="B13"><mixed-citation>Luna-Jiménez C. et al. A Proposal for Multimodal Emotion Recognition Using Aural Transformers and Action Units on RAVDESS Dataset // Applied Sciences. – 2021. – Vol. 12. – № 1. – P. 327.</mixed-citation></ref><ref id="B14"><mixed-citation>Marcolla F., de Santiago R., Dazzi R. Novel Lie Speech Classification by using Voice Stress // Proceedings of the 12th International Conference on Agents and Artificial Intelligence. SCITEPRESS – Science and Technology Publications. – 2020. – С. 742–749.</mixed-citation></ref><ref id="B15"><mixed-citation>McFee B. et al. librosa: Audio and music signal analysis in Python // Proceedings of the 14th Python in Science Conference. – 2015. – Т. 8. – С. 18-25.</mixed-citation></ref><ref id="B16"><mixed-citation>Oviatt S. et al. (ed.). The Handbook of Multimodal-Multisensor Interfaces: Signal Processing, Architectures, and Detection of Emotion and Cognition – Volume 2. – Association for Computing Machinery and Morgan &amp; Claypool, 2018.</mixed-citation></ref><ref id="B17"><mixed-citation>Haq S., Jackson P. J. B. Multimodal Emotion Recognition / ed. Wang W. IGI Global, 2010. – С. 398–423.</mixed-citation></ref><ref id="B18"><mixed-citation>Saeed H. H., Shahzad K., Kamiran F.
Overlapping toxic sentiment classification using deep neural architectures // 2018 IEEE International Conference on Data Mining Workshops (ICDMW). – IEEE, 2018. – С. 1361-1366.</mixed-citation></ref><ref id="B19"><mixed-citation>Sinko M. et al. Method of constructing and identifying predictive models of human behavior based on information models of non-verbal signals // Procedia Computer Science. – 2022. – Т. 212. – С. 171-180.</mixed-citation></ref><ref id="B20"><mixed-citation>Tenney I., Das D., Pavlick E. BERT rediscovers the classical NLP pipeline // arXiv preprint arXiv:1905.05950. – 2019.</mixed-citation></ref><ref id="B21"><mixed-citation>Tsai Y. H. H. et al. Learning factorized multimodal representations // arXiv preprint arXiv:1806.06176. – 2018.</mixed-citation></ref><ref id="B22"><mixed-citation>Tsai Y. H. H. et al. Multimodal transformer for unaligned multimodal language sequences // Proceedings of the conference. Association for Computational Linguistics. Meeting. – NIH Public Access, 2019. – Т. 2019. – С. 6558.</mixed-citation></ref><ref id="B23"><mixed-citation>Tzirakis P. et al. End-to-end multimodal emotion recognition using deep neural networks // IEEE Journal of Selected Topics in Signal Processing. – 2017. – Т. 11. – №. 8. – С. 1301-1309.</mixed-citation></ref><ref id="B24"><mixed-citation>В МВФ оценили потери мировой экономики от коррупции [Электронный ресурс]. – Режим доступа: https://www.rbc.ru/economics/18/09/2017/59bfead89a794704d063c4f0 (дата обращения: 03.04.2023).</mixed-citation></ref><ref id="B25"><mixed-citation>Краткая характеристика состояния преступности в Российской Федерации за январь-июнь 2022 года.
[Электронный ресурс]. – Режим доступа: https://xn--b1aew.xn--p1ai/reports/item/31209853/ (дата обращения: 03.04.2023).</mixed-citation></ref><ref id="B26"><mixed-citation>НИУ ВШЭ предложил использовать искусственный интеллект для предотвращения коррупции в РФ [Электронный ресурс]. – Режим доступа: https://tass.ru/ekonomika/14304599 (дата обращения: 03.04.2023).</mixed-citation></ref><ref id="B27"><mixed-citation>Bert for Sequence Classification (Question vs Statement) [Электронный ресурс]. – Режим доступа: https://sparknlp.org/2021/11/04/bert_sequence_classifier_question_statement_en.html (дата обращения: 15.04.2023).</mixed-citation></ref></ref-list></back></article>