
Scientific edition of Bauman MSTU

SCIENCE & EDUCATION

Bauman Moscow State Technical University. El № FS 77-48211. ISSN 1994-0408

Time Series Prediction based on Hybrid Neural Networks

# 12, December 2016
DOI: 10.7463/1216.0852597
Article file: SE-BMSTU...o246.pdf (1359.63Kb)
Authors: S.A. Yarushev 1, A.V. Fedotova 2,*, V.B. Tarasov 2, A.N. Averkin 3



1 Dubna State University, Dubna, Russia

2 Bauman Moscow State Technical University, Moscow, Russia

3 Dorodnicyn Computing Centre of the Russian Academy of Sciences, Moscow, Russia

Time series prediction is a vast and rapidly developing field. It is driven by the constantly changing situation in all areas of life, including economics, politics, and many other domains that directly affect everyone's life. Forecasting methods, in turn, evolve in step with changing market conditions and ever-new challenges. Changes in industrial and economic processes, changes in the legislation that regulates them, and the emergence of new processes in production and in the social sphere all give rise to inherently short time series, since the corresponding indicators were previously not subject to statistical record-keeping. Nonlinear processes and noisy time series also pose difficulties for forecasting. In this situation it is advisable to develop new forecasting methods, and studies show that hybrid methods cope with these problems best.
The idea of modular neural networks is based on the principle of decomposing a complex task into simpler ones, much as the biological nervous system is organized. Such a system has a very important property: if one module fails, the others continue to work correctly. The modular construction of hybrid neural network systems makes it possible to build universal and robust systems.
Hybrid and modular neural network architectures have a wide range of advantages over traditional neural networks. Among them is the ability to extend such a network without retraining it as a whole, which otherwise causes considerable problems for developers: it suffices to retrain a single module, and the network can continue to operate. Hybrid networks are also much more robust to noise, they train much faster, and their learning process is simpler. These are only some of the characteristics that the paper sets out in detail.
Several modifications of modular neural networks based on self-organizing Kohonen maps are proposed, such as the Vector-Quantized Temporal Associative Memory (VQTAM), the Recurrent SOM (RSOM), and the modular SOM. A detailed description of these neural network architectures is given.
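To make the VQTAM idea concrete, the sketch below shows its core mechanism: each codebook vector is split into an input part (a window of past values) and an output part (the next value); the best-matching unit is found using the input part only, and the forecast is the output part of the winner. This is a minimal illustration, not the authors' implementation; the toy sine series, 1-D grid size, window length, and decay schedules are all illustrative assumptions.

```python
import numpy as np

# Minimal VQTAM (Vector-Quantized Temporal Associative Memory) sketch.
# Assumptions (illustrative, not from the paper): a 1-D SOM grid of 25 units,
# a Gaussian neighborhood, a toy sine series, and window length p = 3.

rng = np.random.default_rng(0)

# Toy time series and sliding-window regressors.
series = np.sin(np.linspace(0, 8 * np.pi, 400))
p = 3                                     # autoregressive window length
X_in = np.array([series[t - p:t] for t in range(p, len(series))])
X_out = series[p:]                        # one-step-ahead targets

n_units = 25
W_in = rng.normal(scale=0.1, size=(n_units, p))    # input parts of the codebooks
W_out = rng.normal(scale=0.1, size=n_units)        # output parts of the codebooks
grid = np.arange(n_units)

n_epochs = 30
for epoch in range(n_epochs):
    lr = 0.5 * (0.01 / 0.5) ** (epoch / n_epochs)      # decaying learning rate
    sigma = 5.0 * (0.5 / 5.0) ** (epoch / n_epochs)    # decaying neighborhood width
    for x, y in zip(X_in, X_out):
        # Key VQTAM idea: the best-matching unit is found on the INPUT part only.
        bmu = np.argmin(np.sum((W_in - x) ** 2, axis=1))
        h = np.exp(-((grid - bmu) ** 2) / (2 * sigma ** 2))
        # Both codebook parts are updated with the same neighborhood function.
        W_in += lr * h[:, None] * (x - W_in)
        W_out += lr * h * (y - W_out)

def predict(window):
    """One-step forecast: the output part of the BMU for the given input window."""
    bmu = np.argmin(np.sum((W_in - window) ** 2, axis=1))
    return W_out[bmu]

preds = np.array([predict(x) for x in X_in])
print("mean absolute error:", np.mean(np.abs(preds - X_out)))
```

Because the winner search ignores the output part, the trained map acts as a piecewise-constant associative memory from past windows to next values; RSOM and the modular SOM extend this scheme with recurrent and modular structure, respectively.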
This work was supported by RFBR projects № 16-37-50023 and 14-07-00603.

Phone: +7 (915) 336-07-65 (strictly: Wednesday; Friday from 11:00 to 17:00)
  RSS
© 2003-2017 «Science and Education» («Наука и образование»)
Reprinting of the journal's materials without the editorial board's approval is prohibited