COMPARISON OF THE STRUCTURE, EFFICIENCY, AND SPEED OF OPERATION OF FEEDFORWARD, CONVOLUTIONAL, AND RECURRENT NEURAL NETWORKS
This article examines the efficiency of fully connected, convolutional, and recurrent neural networks in the context of building a simple weather-forecasting model. It reviews the architecture and operating principles of fully connected networks, the structure of one-dimensional and two-dimensional convolutional networks, and the architecture, features, advantages, and disadvantages of recurrent networks: simple recurrent networks, LSTM, and GRU, together with the bidirectional variants of each. Drawing on this theoretical material, simple neural networks were built to compare the efficiency of each architecture, using training time and prediction error as criteria and temperature, wind speed, and atmospheric pressure as training data. Training speed and the minimum and mean error were measured for the fully connected network, the convolutional network, the simple recurrent network, LSTM, and GRU, as well as for the bidirectional recurrent networks. The results were analyzed to identify possible reasons for the effectiveness of each architecture, and graphs were plotted relating processing speed to error magnitude for the three datasets examined: temperature, wind speed, and atmospheric pressure. Conclusions are drawn about the suitability of particular models for forecasting time series of meteorological data.
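The comparison described in the abstract can be reproduced in outline with a short script. The sketch below is a minimal illustration only, assuming a Keras/TensorFlow setup; the window length, layer widths, epoch count, and the synthetic sine-plus-noise series are placeholder assumptions and not the authors' actual configuration or meteorological data, which the abstract does not specify.

# Minimal sketch: compare training time and validation error of the
# architecture families discussed in the article (fully connected,
# 1D-convolutional, recurrent) on a univariate time series.
# All hyperparameters and the synthetic data are illustrative assumptions.
import time
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

WINDOW = 24  # assumed number of past observations per input window

def make_windows(series, window=WINDOW):
    """Slice a 1-D series into (window -> next value) training pairs."""
    X = np.stack([series[i:i + window] for i in range(len(series) - window)])
    y = series[window:]
    return X[..., None], y  # add a channel axis for Conv1D/RNN layers

def build_models():
    """One small model per architecture family compared in the article."""
    return {
        "dense": keras.Sequential([
            keras.Input(shape=(WINDOW, 1)),
            layers.Flatten(),
            layers.Dense(32, activation="relu"),
            layers.Dense(1),
        ]),
        "conv1d": keras.Sequential([
            keras.Input(shape=(WINDOW, 1)),
            layers.Conv1D(32, kernel_size=3, activation="relu"),
            layers.GlobalAveragePooling1D(),
            layers.Dense(1),
        ]),
        "simple_rnn": keras.Sequential([
            keras.Input(shape=(WINDOW, 1)),
            layers.SimpleRNN(32),
            layers.Dense(1),
        ]),
        "lstm": keras.Sequential([
            keras.Input(shape=(WINDOW, 1)),
            layers.LSTM(32),
            layers.Dense(1),
        ]),
        "gru": keras.Sequential([
            keras.Input(shape=(WINDOW, 1)),
            layers.GRU(32),
            layers.Dense(1),
        ]),
        "bi_lstm": keras.Sequential([
            keras.Input(shape=(WINDOW, 1)),
            layers.Bidirectional(layers.LSTM(32)),
            layers.Dense(1),
        ]),
    }

if __name__ == "__main__":
    # Synthetic stand-in for a meteorological series (e.g. temperature).
    t = np.arange(3000, dtype="float32")
    series = np.sin(2 * np.pi * t / 365) + 0.1 * np.random.randn(len(t))
    X, y = make_windows(series)

    for name, model in build_models().items():
        model.compile(optimizer="adam", loss="mae")
        start = time.perf_counter()
        history = model.fit(X, y, epochs=5, batch_size=64, verbose=0,
                            validation_split=0.2)
        elapsed = time.perf_counter() - start
        print(f"{name:10s}  train time: {elapsed:6.1f}s  "
              f"val MAE: {min(history.history['val_loss']):.4f}")

The same loop can be rerun on wind speed and atmospheric pressure series to obtain the per-variable speed-versus-error comparisons the abstract refers to; bidirectional wrappers around SimpleRNN and GRU can be added analogously.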
Shapalin V.G., Nikolayenko D.V. Comparison of the structure, efficiency, and speed of operation of feedforward, convolutional, and recurrent neural networks // Research result. Information technologies. – Vol. 9, No. 4, 2024. – P. 21-35. DOI: 10.18413/2518-1092-2024-9-4-0-3