<?xml version='1.0' encoding='utf-8'?>
<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Publishing DTD v1.2 20190208//EN" "http://jats.nlm.nih.gov/publishing/1.2/JATS-journalpublishing1.dtd">
<article article-type="research-article" dtd-version="1.2" xml:lang="ru" xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink"><front><journal-meta><journal-id journal-id-type="issn">2518-1092</journal-id><journal-title-group><journal-title>Research result. Information technologies</journal-title></journal-title-group><issn pub-type="epub">2518-1092</issn></journal-meta><article-meta><article-id pub-id-type="doi">10.18413/2518-1092-2024-9-1-0-7</article-id><article-id pub-id-type="publisher-id">3407</article-id><article-categories><subj-group subj-group-type="heading"><subject>ARTIFICIAL INTELLIGENCE AND DECISION MAKING</subject></subj-group></article-categories><title-group><article-title>COMPARISON OF THE EFFICIENCY OF MACHINE LEARNING ALGORITHMS BY THE EXAMPLE OF FORECASTING THE AVERAGE ELECTRICITY CONSUMPTION OF INTEGRATED CONSUMER METERING DEVICES</article-title><trans-title-group xml:lang="en"><trans-title>COMPARISON OF THE EFFICIENCY OF MACHINE LEARNING ALGORITHMS BY THE EXAMPLE OF FORECASTING THE AVERAGE ELECTRICITY CONSUMPTION OF INTEGRATED CONSUMER METERING DEVICES</trans-title></trans-title-group></title-group><contrib-group><contrib contrib-type="author"><name-alternatives><name xml:lang="ru"><surname>Korzhavykh</surname><given-names>Vladislav Valerievi</given-names></name><name xml:lang="en"><surname>Korzhavykh</surname><given-names>Vladislav Valerievi</given-names></name></name-alternatives><email>Korzhavyh.VV@mrsk-1.ru</email></contrib></contrib-group><pub-date pub-type="epub"><year>2024</year></pub-date><volume>9</volume><issue>1</issue><fpage>0</fpage><lpage>0</lpage><self-uri content-type="pdf" xlink:href="/media/information/2024/1/ИТ_НР_9.1_7.pdf" /><abstract xml:lang="ru"><p>Finding and reducing electricity losses is one of the key activities by which network organizations improve their financial results.
Forecasting consumption from a large set of criteria and comparing the forecast with actual electricity consumption is the preferred way to detect losses. However, this process requires a high degree of automation. Therefore, this paper considers the use of three machine learning algorithms for this problem and compares their effectiveness. The author formed a training sample from the database of one electrical network district and conducted experiments on it with the following algorithms: k-nearest neighbors, linear regression and random forest. To compare the resulting models, the author used such performance indicators as mean square error (MSE), mean absolute error (MAE) and the coefficient of determination (R²). The results of the experiment showed that the random forest method outperformed the other algorithms considered.</p></abstract><trans-abstract xml:lang="en"><p>Finding and reducing electricity losses is one of the key activities by which network organizations improve their financial results. Forecasting consumption from a large set of criteria and comparing the forecast with actual electricity consumption is the preferred way to detect losses. However, this process requires a high degree of automation. Therefore, this paper considers the use of three machine learning algorithms for this problem and compares their effectiveness. The author formed a training sample from the database of one electrical network district and conducted experiments on it with the following algorithms: k-nearest neighbors, linear regression and random forest. To compare the resulting models, the author used such performance indicators as mean square error (MSE), mean absolute error (MAE) and the coefficient of determination (R²).
The results of the experiment showed that the random forest method outperformed the other algorithms considered.</p></trans-abstract><kwd-group xml:lang="ru"><kwd>machine learning</kwd><kwd>power loss</kwd><kwd>k-nearest neighbors algorithm</kwd><kwd>linear regression</kwd><kwd>random forest</kwd><kwd>mean square error</kwd><kwd>mean absolute error</kwd><kwd>coefficient of determination</kwd></kwd-group><kwd-group xml:lang="en"><kwd>machine learning</kwd><kwd>power loss</kwd><kwd>k-nearest neighbors algorithm</kwd><kwd>linear regression</kwd><kwd>random forest</kwd><kwd>mean square error</kwd><kwd>mean absolute error</kwd><kwd>coefficient of determination</kwd></kwd-group></article-meta></front><back><ref-list><title>References</title><ref id="B1"><mixed-citation>1. Box J. Time series analysis: Forecasting and control / Box J., Jenkins G. – M.: Mir, Vol. 1, 1974. – 406 p.</mixed-citation></ref><ref id="B2"><mixed-citation>2. Gavrilova T.A. Knowledge bases of intelligent systems: Textbook / Gavrilova T.A., Khoroshevsky V.F. – SPb.: Piter, 2000. – 384 p.</mixed-citation></ref><ref id="B3"><mixed-citation>3. Galushkin A.I. Neuromathematics (problems of development). – M.: Radiotekhnika, 2003. – 40 p.</mixed-citation></ref><ref id="B4"><mixed-citation>4. Donskoy D.A. Application of analytical technologies in control systems and informatics / Donskoy D.A., Sleptsov N.V., Shcherbakov M.A. – Penza, 2005.</mixed-citation></ref><ref id="B5"><mixed-citation>5. Zhelezko Yu.S. Calculation, analysis and rationing of electric power losses in electric networks / Yu.S. Zhelezko. – M.: NU ENAS, 2002. – 280 p.</mixed-citation></ref><ref id="B6"><mixed-citation>6. Ivanov V.L. Electronic textbook: knowledge control systems (in Russian) // Informatics and Education. – 2002. – № 1.</mixed-citation></ref><ref id="B7"><mixed-citation>7. Kazanskaya A.A.
The use of machine learning in investment activity / A.A. Kazanskaya, L.G. Mishura // Scientific Journal of NIU ITMO. Series: Economics and Environmental Management. – 2020. – № 2. – P. 23-34. – DOI: 10.17586/2310-1172-2020-13-2-23-34. – EDN MUJXYZ.</mixed-citation></ref><ref id="B8"><mixed-citation>8. Kaftannikov I.L. Problems of training sample formation in machine learning tasks / I.L. Kaftannikov, A.V. Parasich // Vestnik SUSU. Series "Computer technologies, management, radio electronics". – 2016. – Vol. 16, № 3. – P. 15-24. – DOI: 10.14529/ctcr160302</mixed-citation></ref><ref id="B9"><mixed-citation>9. Kudashev K. Commercial electricity losses without borders, 2017. URL: http://www.bigpowernews.ru/interview/document76022.phtml (date of access: 23.11.2023)</mixed-citation></ref><ref id="B10"><mixed-citation>10. Find Leakage, 2021. URL: https://www.kommersant.ru/doc/4877601 (date of access: 23.11.2023)</mixed-citation></ref><ref id="B11"><mixed-citation>11. Fuzzy linear regression in estimation problems / E.V. Vishnyakova, E.V. Ivanova, S.M. Kamalov [et al.] // Scientific Notes of Young Researchers. – 2015. – № 5. – P. 14-29.</mixed-citation></ref><ref id="B12"><mixed-citation>12. Jones T. Programming of Artificial Intelligence in Applications / Translated from English by Osipov A.I. – M.: DMK Press, 2011. – 312 p.</mixed-citation></ref><ref id="B13"><mixed-citation>13. Toady T. Transforming categorical data: A practical guide to handling non-numeric variables for machine learning algorithms, 2023. URL: https://dev-gang.ru/article/preobrazovanie-kategorialnyh-dannyh-prakticzeskoe-rukovodstvo-po-obrabotke-neczislovyh-peremennyh-dlja-algoritmov-mashinnogo-obuczenija-buyh1q4ttt/</mixed-citation></ref><ref id="B14"><mixed-citation>14. Tricoz D.V. Neural networks: how to do it?
// Computers + Programs. – 1993. – № 4(5). – P. 14-20.</mixed-citation></ref><ref id="B15"><mixed-citation>15. Flach P. Machine learning / P. Flach. – M.: DMK Press, 2015. – P. 25.</mixed-citation></ref><ref id="B16"><mixed-citation>16. Haykin S. Neural networks: a complete course / S. Haykin. – M.: Dialectics, 2019. – 1104 p.</mixed-citation></ref><ref id="B17"><mixed-citation>17. Cichocki A. Neural Networks for Optimization and Signal Processing [Text] / A. Cichocki, R. Unbehauen. – J. Wiley and Sons Ltd, 1993. – 526 p.</mixed-citation></ref><ref id="B18"><mixed-citation>18. Hyndman R.J., Koehler A.B. Another look at measures of forecast accuracy // International Journal of Forecasting. – 2006. – № 22(4). – P. 679-688.</mixed-citation></ref><ref id="B19"><mixed-citation>19. Shcherbakov M.V., Brebels A. Outliers and anomalies detection based on neural networks forecast procedure // Proceedings of the 31st Annual International Symposium on Forecasting (ISF 2011). – Prague: International Institute of Forecasters, 2011. – P. 21-22. URL: http://www.forecasters.org/isf/pdfs/ISF11_Proceedings.pdf</mixed-citation></ref><ref id="B20"><mixed-citation>20. Yu C.H. Exploratory data analysis in the context of data mining and resampling // International Journal of Psychological Research. – 2010. – Vol. 3.</mixed-citation></ref></ref-list></back></article>