
Recurrent Neural Networks for Temporal Sequences Recognition

2000

Time is central to many human tasks: talking, listening, reading, and writing are all time-related activities. Integrating a notion of time into neural networks is therefore important for dealing with such tasks. This report presents various tasks based on temporal pattern processing and the different neural network architectures simulated to tackle them. We examine the main components of connectionist models that process time-varying patterns: the memory that records past information, the pattern of connectivity among the units of the network, and the rule used to update connection strengths during training. We explore two network architectures, one proposed by Elman [8] and the other by Stornetta et al., and analyze their ability to learn and recognize a finite state machine. Variants of these architectures are also explored to determine whether better results can be reached. Finally, we compare the results obtained by these architectures on the particular task of learning the contingencies implied by a finite state machine.
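
To make the kind of architecture discussed above concrete, the following is a minimal sketch (not the report's code) of an Elman-style simple recurrent network trained to predict the next symbol emitted by a small finite state machine. The hidden layer is copied into context units and fed back as extra input at the next time step; the alphabet, transition table, layer sizes, and learning rate below are illustrative assumptions, not taken from the report.

```python
# Sketch of an Elman simple recurrent network on a toy finite state machine.
# All names, sizes, and the grammar are hypothetical; Elman [8] and
# Stornetta et al. used different, richer task setups.
import numpy as np

rng = np.random.default_rng(0)

# Toy finite state machine: each state emits one symbol, then moves on.
SYMBOLS = ["B", "T", "P", "E"]                              # hypothetical alphabet
TRANSITIONS = {0: (0, 1), 1: (1, 2), 2: (2, 3), 3: (3, 0)}  # state -> (symbol index, next state)

def generate_sequence(length):
    state, seq = 0, []
    for _ in range(length):
        sym, state = TRANSITIONS[state]
        seq.append(SYMBOLS[sym])
    return seq

def one_hot(i, n=len(SYMBOLS)):
    v = np.zeros(n)
    v[i] = 1.0
    return v

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Elman network: input -> hidden, context -> hidden, hidden -> output.
n_in, n_hidden, n_out = len(SYMBOLS), 8, len(SYMBOLS)
W_ih = rng.normal(0, 0.1, (n_hidden, n_in))      # input weights
W_ch = rng.normal(0, 0.1, (n_hidden, n_hidden))  # context (recurrent) weights
W_ho = rng.normal(0, 0.1, (n_out, n_hidden))     # output weights
lr = 0.1

seq = generate_sequence(2000)
context = np.zeros(n_hidden)
for t in range(len(seq) - 1):
    x = one_hot(SYMBOLS.index(seq[t]))
    target = one_hot(SYMBOLS.index(seq[t + 1]))

    hidden = sigmoid(W_ih @ x + W_ch @ context)
    output = sigmoid(W_ho @ hidden)

    # One-step gradient descent: the context is treated as a fixed extra
    # input, so errors are not propagated back through time.
    err_out = (output - target) * output * (1 - output)
    err_hid = (W_ho.T @ err_out) * hidden * (1 - hidden)
    W_ho -= lr * np.outer(err_out, hidden)
    W_ih -= lr * np.outer(err_hid, x)
    W_ch -= lr * np.outer(err_hid, context)

    context = hidden  # copy hidden activations into the context units
```

After training on such a sequence, the output unit with the highest activation at each step should correspond to the symbol the finite state machine will emit next, which is the sense in which the network has learned the machine's contingencies.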