Neural Networks

2007, Chapman & Hall/CRC Computer & Information Science Series

AI-generated Abstract

Artificial neural networks have been used extensively in machine learning, proving effective in applications such as robotics, vision, and pattern recognition. These networks learn complex associations and encode information through adjustable weights, offering advantages such as parallel computation and noise tolerance. Theoretical advances have provided insights into their computational capabilities and learning processes, particularly for continuous activation functions and backpropagation methodologies.

Key takeaways

  • In subsequent discussions, we distinguish between two types of neural networks, commonly known as "feedforward" and "recurrent" neural nets.
  • Threshold circuits, i.e., feedforward nets with threshold activation functions, have been studied extensively, and upper and lower bounds have been obtained for their computation of various Boolean functions (see, for example, [21,22,42,43,46,53] among many other works).
  • However, the complexity of the loading problem for sigmoidal neural nets remains open, though some partial results for somewhat restricted nets have appeared in references such as [24].
  • The key value of this conceptualization is that it allows one to regard the problem of training a recurrent network as the corresponding problem of training a feedforward neural network with certain constraints imposed on its weights.
  • Recurrent nets include feedforward nets as a special case, and thus results for feedforward nets apply to recurrent nets as well.
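The threshold activation functions mentioned above can be illustrated with a toy sketch (a hypothetical example, not taken from the chapter; the function names `threshold_gate` and `majority3` are made up for illustration): a threshold unit outputs 1 exactly when the weighted sum of its inputs reaches its threshold, and a single such gate already computes a nontrivial Boolean function such as 3-input majority.

```python
def threshold_gate(inputs, weights, theta):
    """Threshold unit: output 1 iff the weighted input sum reaches theta."""
    return 1 if sum(w * x for w, x in zip(weights, inputs)) >= theta else 0

def majority3(x1, x2, x3):
    """3-input majority via one threshold gate: unit weights, threshold 2."""
    return threshold_gate([x1, x2, x3], [1, 1, 1], 2)
```

Threshold circuits in the sense of the chapter are feedforward networks built from such gates; the upper and lower bounds cited concern how many gates, and in how many layers, are needed for a given Boolean function.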
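The view of recurrent-net training as constrained feedforward training can be sketched as follows (a minimal hypothetical illustration, not the chapter's formulation; the scalar state, `tanh` unit, and function names are assumptions made for brevity): unrolling the recurrent computation over T time steps yields a T-layer feedforward net, and the constraint is that every layer must share the same weights.

```python
import math

def step(h, x, w_h, w_x):
    """One unit: new state from weighted old state plus weighted input."""
    return math.tanh(w_h * h + w_x * x)

def recurrent_net(xs, w_h, w_x):
    """Recurrent net: the same unit, with the same weights, at every step."""
    h = 0.0
    for x in xs:
        h = step(h, x, w_h, w_x)
    return h

def unrolled_feedforward(xs, layer_weights):
    """Feedforward unrolling: one layer per time step, each layer with its
    own (w_h, w_x) pair taken from layer_weights."""
    h = 0.0
    for x, (w_h, w_x) in zip(xs, layer_weights):
        h = step(h, x, w_h, w_x)
    return h

# The weight-sharing constraint: all layers use the same (w_h, w_x) pair,
# making the unrolled feedforward net compute exactly the recurrent net.
xs = [1.0, -0.5, 0.25]
tied = [(0.8, 0.3)] * len(xs)
assert recurrent_net(xs, 0.8, 0.3) == unrolled_feedforward(xs, tied)
```

Training the recurrent net then amounts to training the unrolled feedforward net while keeping the tied-weights constraint satisfied, which is the correspondence the takeaway above describes.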