15th European Conference on Artificial Intelligence
July 21-26, 2002, Lyon, France
Rafal Mikolajczak, Jacek Mandziuk
This paper presents an experimental comparison of selected neural architectures on the chaotic time series prediction problem. Several feed-forward architectures (Multilayer Perceptrons) are compared with partially recurrent nets (Elman, extended Elman, and Jordan) with respect to convergence rate, prediction accuracy, training time requirements, and stability of results. Results for the chaotic logistic map series presented in the paper indicate that the prediction accuracy of MLPs with two hidden layers is superior to that of the other tested architectures. Although the potential superiority of MLPs needs to be confirmed on other chaotic time series before any general conclusions can be drawn, it is conjectured here that, contrary to common belief, in several cases feed-forward nets may be better suited for short-term prediction tasks than partially recurrent nets. It is worth noting that a significant improvement in prediction accuracy for all tested networks was achieved by rescaling the data from the interval (0, 1) to (0.2, 0.8). Moreover, it is experimentally shown that with a proper choice of learning parameters all tested architectures produce stable (repeatable) results. Open problems left for future research include verification of the above results on other chaotic time series and on financial or business data.
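The two data-related ingredients mentioned in the abstract, generating the logistic map series and rescaling it from (0, 1) to (0.2, 0.8), can be sketched as follows. This is a minimal illustration, not the authors' code; the map parameter r = 4.0 (the fully chaotic regime) and the seed x0 are assumptions, since the abstract does not specify them.

```python
def logistic_map_series(n, x0=0.1, r=4.0):
    """Iterate the logistic map x_{k+1} = r * x_k * (1 - x_k).

    With r = 4.0 (an assumed value; the standard fully chaotic regime)
    the iterates stay in the open interval (0, 1).
    """
    series = []
    x = x0
    for _ in range(n):
        x = r * x * (1.0 - x)
        series.append(x)
    return series


def rescale(series, lo=0.2, hi=0.8):
    """Affine rescaling from (0, 1) to (lo, hi) -- here (0.2, 0.8),
    the transformation the abstract reports as improving accuracy."""
    return [lo + (hi - lo) * x for x in series]


data = rescale(logistic_map_series(1000))
```

Squeezing the targets into (0.2, 0.8) keeps them away from the saturated tails of a sigmoid output unit, which is a plausible reason such rescaling helps all of the tested networks.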
Keywords: Time Series Prediction, Logistic Map Series, Neural Networks
Citation: Rafal Mikolajczak, Jacek Mandziuk: Chaotic Time Series Prediction with Neural Networks - Comparison of Several Architectures. In F. van Harmelen (ed.): ECAI 2002, Proceedings of the 15th European Conference on Artificial Intelligence, IOS Press, Amsterdam, 2002, pp. 493-497.