<?xml version='1.0' encoding='utf-8'?>
<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Publishing DTD v1.2 20190208//EN" "http://jats.nlm.nih.gov/publishing/1.2/JATS-journalpublishing1.dtd">
<article article-type="research-article" dtd-version="1.2" xml:lang="ru" xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink"><front><journal-meta><journal-id journal-id-type="issn">2518-1092</journal-id><journal-title-group><journal-title>Научный результат. Информационные технологии</journal-title></journal-title-group><issn pub-type="epub">2518-1092</issn></journal-meta><article-meta><article-id pub-id-type="doi">10.18413/2518-1092-2019-4-2-0-1</article-id><article-id pub-id-type="publisher-id">1698</article-id><article-categories><subj-group subj-group-type="heading"><subject>КОМПЬЮТЕРНОЕ МОДЕЛИРОВАНИЕ</subject></subj-group></article-categories><title-group><article-title>АВТОМАТИЗИРОВАННАЯ СИСТЕМА МОНИТОРИНГА, ОЦЕНКИ И ПРОГНОЗИРОВАНИЯ РОСТА И РАЗВИТИЯ РАСТЕНИЙ В УСЛОВИЯХ IN VITRO</article-title><trans-title-group xml:lang="en"><trans-title>AUTOMATED SYSTEM FOR MONITORING, ESTIMATION AND PREDICTING THE GROWTH AND DEVELOPMENT OF PLANTS IN VITRO</trans-title></trans-title-group></title-group><contrib-group><contrib contrib-type="author"><name-alternatives><name xml:lang="ru"><surname>Маслаков</surname><given-names>Юрий Николаевич</given-names></name><name xml:lang="en"><surname>Maslakov</surname><given-names>Yuriy Nikolayevich</given-names></name></name-alternatives><email>maslakov.yn@gmail.com</email></contrib><contrib contrib-type="author"><name-alternatives><name xml:lang="ru"><surname>Бережной</surname><given-names>Владислав Александрович</given-names></name><name xml:lang="en"><surname>Berezhnoy</surname><given-names>Vladislav Alexandrovich</given-names></name></name-alternatives><email>vaber93@mail.ru</email></contrib><contrib contrib-type="author"><name-alternatives><name xml:lang="ru"><surname>Иващук</surname><given-names>Ольга Александровна</given-names></name><name xml:lang="en"><surname>Ivashchuk</surname><given-names>Olga 
Alexandrovna</given-names></name></name-alternatives><email>ivaschuk@bsu.edu.ru</email></contrib><contrib contrib-type="author"><name-alternatives><name xml:lang="ru"><surname>Яценко</surname><given-names>Владимир Михайлович</given-names></name><name xml:lang="en"><surname>Yatsenko</surname><given-names>Vladimir Mikhaylovich</given-names></name></name-alternatives><email>vowwva@mail.ru</email></contrib></contrib-group><pub-date pub-type="epub"><year>2019</year></pub-date><volume>4</volume><issue>2</issue><fpage>0</fpage><lpage>0</lpage><self-uri content-type="pdf" xlink:href="/media/information/2019/2/ИТ_1.pdf" /><abstract xml:lang="ru"><p>В статье рассматриваются подходы к решению проблемы точной и объективной регистрации и оценки роста и развития растений (и их отдельных частей) на разных по составу питательных средах и на различных стадиях развития при выращивании в условиях in vitro (в пробирке), что позволяет получить высококачественный безвирусный посадочный материал. Процесс фиксации параметров роста растений при натурных измерениях нарушает микроклимат, сформированный для оптимального развития растения, вносит серьезные погрешности из-за влияния человеческого фактора и, кроме того, связан с необходимостью обработки больших объемов разнородных данных. Все это определяет перспективность использования современных информационных технологий, методов и средств автоматизации и моделирования, построения на этой основе автоматизированной системы мониторинга, оценки и прогнозирования роста и развития растений в условиях in vitro. На сегодняшний день существуют различные аппаратно-программные комплексы, предлагаемые зарубежными компаниями для фотографирования растений и оценки их морфометрических показателей, однако они не учитывают условий нахождения растений в пробирке за стеклом, что вносит в изображение искажения из-за запотевания и дефектов стекла, нелинейность на границах пробирки, блики и отражения окружающих объектов. 
В данной работе предложен прототип автоматизированной системы, при помощи которой осуществляются фотосъемка растений, сбор объективной информации об их морфометрических параметрах в процессе роста и оценка состояния растений на основе построения их объемной реконструкции.</p></abstract><trans-abstract xml:lang="en"><p>The paper discusses approaches to the accurate and objective registration and estimation of the growth and development of plants (and their individual parts) grown in vitro on nutrient media of different composition and at various stages of development, which makes it possible to obtain high-quality virus-free planting material. Manual measurement of plant parameters disturbs the microclimate formed in vitro for optimal plant development, introduces serious errors due to the human factor and, in addition, requires processing large volumes of heterogeneous data. These problems can be solved by building an automated system based on modern information technologies and on methods and tools of automation and modeling. The various existing hardware and software systems for photographing plants and evaluating their morphometric parameters do not take the in vitro conditions into account: photos of plants taken through the tube glass may be distorted by a condensate layer and various glass defects. 
This paper proposes a prototype of an automated system whose main aim is to collect photos of plants and to calculate their morphometric parameters taking the in vitro conditions into account.</p></trans-abstract><kwd-group xml:lang="ru"><kwd>реконструкция изображения</kwd><kwd>калибровка изображения</kwd><kwd>сегментация изображения</kwd><kwd>облако точек</kwd><kwd>сегментация облака точек</kwd><kwd>извлечение признаков</kwd></kwd-group><kwd-group xml:lang="en"><kwd>image reconstruction</kwd><kwd>image calibration</kwd><kwd>image segmentation</kwd><kwd>point cloud</kwd><kwd>point cloud segmentation</kwd><kwd>feature extraction</kwd></kwd-group></article-meta></front><back><ref-list><title>Список литературы</title><ref id="B1"><mixed-citation>Smith, R.H. (2013), Plant tissue culture: techniques and experiments, Academic Press, Texas, TX.</mixed-citation></ref><ref id="B2"><mixed-citation>Reekie, E. and Bazzaz, F.A. (2011), Reproductive allocation in plants, Academic Press, San Diego, CA.</mixed-citation></ref><ref id="B3"><mixed-citation>Gibbs, J., Pound, M.P., Wells, D.M., Murchie, E.H., French, A.P. and Pridmore, T.P. (2014), Three-dimensional reconstruction of plant shoots from multiple images using an active vision system, Plant Physiology, 166 (4), 1688-1698.</mixed-citation></ref><ref id="B4"><mixed-citation>Kumar, P., Connor, J. and Miklavcic, S. (2014), High-throughput 3D reconstruction of plant shoots for phenotyping, 13th International Conference on Control Automation Robotics &amp; Vision (ICARCV), Singapore, 211-216.</mixed-citation></ref><ref id="B5"><mixed-citation>Pound, M.P., French, A.P., Murchie, E.H. and Pridmore, T.P. (2014), Automated recovery of 3D models of plant shoots from multiple colour images, Plant Physiology, 114.</mixed-citation></ref><ref id="B6"><mixed-citation>Güneş, E.O. and Aygün, S. 
(2017), Growth monitoring of plants using active contour technique, 6th International Conference on Agro-Geoinformatics, Fairfax, USA, 1-5.</mixed-citation></ref><ref id="B7"><mixed-citation>Gai, J. (2016), “Plants detection, localization and discrimination using 3D machine vision for robotic intra-row weed control”, Ph.D. Thesis, Iowa State University, Ames, Iowa.</mixed-citation></ref><ref id="B8"><mixed-citation>Krainin, M., Henry, P., Ren, X. and Fox, D. (2010), Manipulator and object tracking for in-hand model acquisition, Proceedings of the IEEE International Conference on Robotics and Automation, Funchal, Portugal.</mixed-citation></ref><ref id="B9"><mixed-citation>Gehan, M. and Fahlgren, N. (2018), “Summary of Output Measurements” [Online], available at: https://plantcv.readthedocs.io/en/latest/output_measurements/ (Accessed 23.06.2019)</mixed-citation></ref><ref id="B10"><mixed-citation>Zhang, T.Y. and Suen, C.Y. (1984), A fast parallel algorithm for thinning digital patterns, Communications of the ACM, 27(3).</mixed-citation></ref><ref id="B11"><mixed-citation>Lee, T.C., Kashyap, R.L. and Chu, C.N. (1994), Building skeleton models via 3-D medial surface/axis thinning algorithms, Computer Vision, Graphics, and Image Processing, 56(6), 462-478.</mixed-citation></ref><ref id="B12"><mixed-citation>Abu-Ain, W., Abdullah, S.N., Bataineh, B., Abu-Ain, T. and Omar, K. (2013), Skeletonization algorithm for binary images, Procedia Technology, 11, 704-709.</mixed-citation></ref><ref id="B13"><mixed-citation>Furukawa, Y. and Hernández, C. (2015), Multi-view stereo: A tutorial, Foundations and Trends® in Computer Graphics and Vision, Now Publishers Inc., Hanover, MA.</mixed-citation></ref><ref id="B14"><mixed-citation>Furukawa, Y. and Ponce, J. 
(2010), Accurate, dense, and robust multiview stereopsis, IEEE Transactions on Pattern Analysis and Machine Intelligence, 32(8), 1362-1376.</mixed-citation></ref><ref id="B15"><mixed-citation>Torok, M.M. (2012), “Autonomous sample collection using image-based 3D reconstructions”, Ph.D. Thesis, Virginia Polytechnic Institute and State University, Blacksburg, Virginia.</mixed-citation></ref><ref id="B16"><mixed-citation>Matusik, W., Buehler, C., Raskar, R., Gortler, S.J. and McMillan, L. (2000), Image-based visual hulls, 27th Annual Conference on Computer Graphics and Interactive Techniques, New York, NY, 369-374.</mixed-citation></ref><ref id="B17"><mixed-citation>Laurentini, A. (1994), The visual hull concept for silhouette-based image understanding, IEEE Transactions on Pattern Analysis and Machine Intelligence, 16(2), 150-162.</mixed-citation></ref><ref id="B18"><mixed-citation>Slabaugh, G., Schafer, R., Malzbender, T. and Culbertson, B. (2001), A survey of methods for volumetric scene reconstruction from photographs, Volume Graphics 2001, Springer, Vienna.</mixed-citation></ref><ref id="B19"><mixed-citation>Vogiatzis, G., Esteban, C.H., Torr, P.H. and Cipolla, R. (2007), Multiview stereo via volumetric graph-cuts and occlusion robust photo-consistency, IEEE Transactions on Pattern Analysis and Machine Intelligence, 29(12), 2241-2246.</mixed-citation></ref><ref id="B20"><mixed-citation>Gupta, P. (2007), “Gray code composite pattern structured light illumination”, Ph.D. Thesis, University of Kentucky, Lexington, Kentucky.</mixed-citation></ref><ref id="B21"><mixed-citation>Young, M., Beeson, E., Davis, J., Rusinkiewicz, S. and Ramamoorthi, R. (2007), Coded structured light, IEEE Conference on Computer Vision and Pattern Recognition, 2007, Minneapolis, MN, 1-8.</mixed-citation></ref><ref id="B22"><mixed-citation>Kawasaki, H., Furukawa, R., Sagawa, R. 
and Yagi, Y. (2008), Dynamic scene shape reconstruction using a single structured light pattern, IEEE Conference on Computer Vision and Pattern Recognition, 2008, Anchorage, Alaska, 1-8.</mixed-citation></ref><ref id="B23"><mixed-citation>Li, B., Heng, L., Koser, K. and Pollefeys, M. (2013), A multiple-camera system calibration toolbox using a feature descriptor-based calibration pattern, 2013 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Tokyo, Japan, 1301-1307.</mixed-citation></ref><ref id="B24"><mixed-citation>Vogiatzis, G. and Hernández, C. (2010), “Automatic camera pose estimation from dot pattern” [Online], available at: http://george-vogiatzis.org/calib/ (Accessed 23.06.2019)</mixed-citation></ref><ref id="B25"><mixed-citation>Tagliasacchi, A., Zhang, H. and Cohen-Or, D. (2009), Curve skeleton extraction from incomplete point cloud, ACM Transactions on Graphics, 28(3), 71.</mixed-citation></ref></ref-list></back></article>