<?xml version='1.0' encoding='utf-8'?>
<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Publishing DTD v1.2 20190208//EN" "http://jats.nlm.nih.gov/publishing/1.2/JATS-journalpublishing1.dtd">
<article article-type="research-article" dtd-version="1.2" xml:lang="ru" xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink"><front><journal-meta><journal-id journal-id-type="issn">2518-1092</journal-id><journal-title-group><journal-title>Research result. Information technologies</journal-title></journal-title-group><issn pub-type="epub">2518-1092</issn></journal-meta><article-meta><article-id pub-id-type="doi">10.18413/2518-1092-2025-10-3-0-4</article-id><article-id pub-id-type="publisher-id">3903</article-id><article-categories><subj-group subj-group-type="heading"><subject>ARTIFICIAL INTELLIGENCE AND DECISION MAKING</subject></subj-group></article-categories><title-group><article-title>NEURAL INFERENCE OF OBJECT LOCALIZATION FROM DISCONTINUOUS TIME-OF-ARRIVAL SEQUENCES</article-title><trans-title-group xml:lang="en"><trans-title>NEURAL INFERENCE OF OBJECT LOCALIZATION FROM DISCONTINUOUS TIME-OF-ARRIVAL SEQUENCES</trans-title></trans-title-group></title-group><contrib-group><contrib contrib-type="author"><name-alternatives><name xml:lang="ru"><surname>Minina</surname><given-names>Anna Valerievna</given-names></name><name xml:lang="en"><surname>Minina</surname><given-names>Anna Valerievna</given-names></name></name-alternatives><email>minina.annette@gmail.com</email></contrib><contrib contrib-type="author"><name-alternatives><name xml:lang="ru"><surname>Nikulin</surname><given-names>Rostislav Ruslanovich</given-names></name><name xml:lang="en"><surname>Nikulin</surname><given-names>Rostislav Ruslanovich</given-names></name></name-alternatives><email>1470970@bsuedu.ru</email></contrib><contrib contrib-type="author"><name-alternatives><name xml:lang="ru"><surname>Sidorenko</surname><given-names>Igor Alexandrovich</given-names></name><name xml:lang="en"><surname>Sidorenko</surname><given-names>Igor
Alexandrovich</given-names></name></name-alternatives></contrib></contrib-group><pub-date pub-type="epub"><year>2025</year></pub-date><volume>10</volume><issue>3</issue><fpage>0</fpage><lpage>0</lpage><self-uri content-type="pdf" xlink:href="/media/information/2025/3/ИТ_НР_10_3_4.pdf" /><abstract xml:lang="ru"><p>The article addresses the problem of recovering the coordinates of a moving object in conditions of temporary signal loss or degradation (blackout), typical for complex radio environments. A method based on a Long Short-Term Memory (LSTM) recurrent neural network is proposed for interpolating and predicting positions using sequences of Time Difference of Arrival (TDoA) measurements from three fixed anchors. The developed model was trained on synthetic trajectories simulating object movement and demonstrates high robustness to missing data, with a median error of less than 30 meters even under significant signal degradation. A comparative analysis with alternative methods, including GRU, TCN, and Kalman Filter, confirms the superior performance of the LSTM architecture in unstable environments with limited measurements. The results indicate the proposed approach is promising for real-time applications in autonomous navigation and positioning systems.</p></abstract><trans-abstract xml:lang="en"><p>The article addresses the problem of recovering the coordinates of a moving object in conditions of temporary signal loss or degradation (blackout), typical for complex radio environments. A method based on a Long Short-Term Memory (LSTM) recurrent neural network is proposed for interpolating and predicting positions using sequences of Time Difference of Arrival (TDoA) measurements from three fixed anchors. The developed model was trained on synthetic trajectories simulating object movement and demonstrates high robustness to missing data, with a median error of less than 30 meters even under significant signal degradation. 
A comparative analysis with alternative methods, including GRU, TCN, and Kalman Filter, confirms the superior performance of the LSTM architecture in unstable environments with limited measurements. The results indicate the proposed approach is promising for real-time applications in autonomous navigation and positioning systems.</p></trans-abstract><kwd-group xml:lang="ru"><kwd>positioning</kwd><kwd>TDoA</kwd><kwd>LSTM</kwd><kwd>blackout</kwd><kwd>neural networks</kwd><kwd>localization</kwd><kwd>navigation</kwd><kwd>coordinate regression</kwd></kwd-group><kwd-group xml:lang="en"><kwd>positioning</kwd><kwd>TDoA</kwd><kwd>LSTM</kwd><kwd>blackout</kwd><kwd>neural networks</kwd><kwd>localization</kwd><kwd>navigation</kwd><kwd>coordinate regression</kwd></kwd-group></article-meta></front><back><ref-list><title>References</title><ref id="B1"><mixed-citation>Seysenbayev B. I., Esmagulova A. M. Positioning Algorithms in Wireless Networks. – Almaty: KazNTU, 2020. – 123 p.</mixed-citation></ref><ref id="B2"><mixed-citation>Goodfellow I., Bengio Y., Courville A. Deep Learning. – Moscow: Williams, 2018. – 720 p. (Translated from English)</mixed-citation></ref><ref id="B3"><mixed-citation>Liu H., Darabi H., Banerjee P., Liu J. Survey of wireless indoor positioning techniques and systems // IEEE Transactions on Systems, Man, and Cybernetics. – 2007. – Vol. 37. – No. 6. – P. 1067–1080.</mixed-citation></ref><ref id="B4"><mixed-citation>Hochreiter S., Schmidhuber J. Long short-term memory // Neural Computation. – 1997. – Vol. 9. – No. 8. – P. 1735–1780.</mixed-citation></ref><ref id="B5"><mixed-citation>Gers F. A., Schmidhuber J., Cummins F. Learning to forget: Continual prediction with LSTM // Neural Computation. – 2000. – Vol. 12. – No. 10. – P.
2451–2471.</mixed-citation></ref><ref id="B6"><mixed-citation>Graves A. Supervised sequence labelling with recurrent neural networks. – Springer, 2012. – 150 p.</mixed-citation></ref><ref id="B7"><mixed-citation>Cho K. et al. Learning phrase representations using RNN encoder–decoder for statistical machine translation // arXiv preprint arXiv:1406.1078, 2014.</mixed-citation></ref><ref id="B8"><mixed-citation>Kingma D., Ba J. Adam: A method for stochastic optimization // Proceedings of ICLR, 2015.</mixed-citation></ref><ref id="B9"><mixed-citation>Zhang Z. et al. TDOA-based localization using LSTM networks in multipath environments // Sensors. – 2021. – Vol. 21. – No. 11. – P. 3851.</mixed-citation></ref><ref id="B10"><mixed-citation>Mazuelas S. et al. Robust indoor positioning provided by real-time RSSI values in unmodified WLAN networks // IEEE Journal on Selected Areas in Communications. – 2009. – Vol. 27. – No. 6. – P. 1091–1102.</mixed-citation></ref><ref id="B11"><mixed-citation>Alarifi A. et al. Ultra wideband indoor positioning technologies: Analysis and recent advances // Sensors. – 2016. – Vol. 16. – No. 5. – P. 707.</mixed-citation></ref><ref id="B12"><mixed-citation>Ioffe S., Szegedy C. Batch normalization: Accelerating deep network training by reducing internal covariate shift // Proceedings of ICML, 2015.</mixed-citation></ref><ref id="B13"><mixed-citation>LeCun Y., Bengio Y., Hinton G. Deep learning // Nature. – 2015. – Vol. 521. – P. 436–444.</mixed-citation></ref><ref id="B14"><mixed-citation>Yassin M. et al. Recent advances in indoor localization: A survey on theoretical approaches and applications // IEEE Communications Surveys &amp; Tutorials. – 2017. – Vol. 19. – No. 2. – P.
1327–1346.</mixed-citation></ref><ref id="B15"><mixed-citation>Zhao M., Adib F., Katabi D. Emotion recognition using wireless signals // Communications of the ACM. – 2018. – Vol. 61. – No. 9. – P. 91–100.</mixed-citation></ref></ref-list></back></article>