2007, Chapman & Hall/CRC Computer & Information Science Series
Artificial neural networks have been utilized extensively in machine learning, demonstrating effectiveness in various applications such as robotics, vision, and pattern recognition. These networks, characterized by their ability to learn complex associations and their function of encoding information through adjustable weights, offer advantages like parallel computation and noise tolerance. Theoretical advancements have provided insights into their computational capabilities and learning processes, particularly regarding continuous activation functions and backpropagation methodologies.
Preface: We have prepared this report on the topic of neural networks and have tried our best to elucidate all the relevant details to be included in the report. At the beginning we have tried to give a general view of the topic. Our efforts, and the wholehearted cooperation of everyone involved, have ended on a successful note. We express our sincere gratitude to MR UGWUNNA CHARLES O., who has been there for us throughout the preparation of this topic. We thank
A STUDY, 2018
The first step towards AI was taken by Warren McCulloch, a neurophysiologist, and the mathematician Walter Pitts. They modelled a simple neural network with electrical circuits, obtained very accurate results, and demonstrated the remarkable ability of neurons to extract information from complicated and imprecise data. During the present study it was observed that a trained neural network, expert at analysing information, also provides other advantages such as adaptive learning, real-time operation, self-organisation, and fault tolerance. Unlike conventional computing, neural networks use many processing units (neurons) operating in parallel with each other. These need not be programmed; they function much like the human brain. We give the network examples of problems to solve, and these examples must be selected carefully so that time is not wasted. At present we use a combination of neural networks and conventional programming to achieve maximal efficiency, but neural networks may eventually take over. We introduce artificial neural networks, in which electronic models are used to mimic the neural structure of the brain. Computers can store data in ledgers and the like but have difficulty recognising patterns, whereas the brain stores information as patterns. Artificial neural networks contain artificial neurons that act like real neurons and perform similar functions. They are used for speech, hearing, recognition, storing information as patterns, and many other functions that a human brain can perform. Biological neural networks are combined and dynamically self-combining, which is not true of any artificial network. Neurons work in groups and subdivide a problem in order to resolve it. They are arranged in layers, and it is the art of engineering to make them solve real-world problems.
The most important element is the connections between the neurons: they are the glue of the system and implement the excitation-inhibition process. With the input held constant, one neuron excites while another inhibits, much as in an addition-subtraction process. Basically, all ANNs share the same structure: an input layer, feedback or hidden layers, and an output layer.
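The input/hidden/output structure and the excitation-inhibition role of the connection weights described above can be sketched in a few lines. This is a minimal illustration, not code from any of the papers listed here; the weight values are arbitrary assumptions chosen only to show one excitatory and one inhibitory hidden neuron.

```python
import math

def forward(x, w_hidden, w_out):
    """One pass through a tiny input -> hidden -> output network.

    Positive weights act as excitation, negative weights as inhibition;
    tanh squashes each hidden neuron's summed input."""
    hidden = [math.tanh(sum(wi * xi for wi, xi in zip(row, x))) for row in w_hidden]
    return sum(wo * h for wo, h in zip(w_out, hidden))

# Two inputs, two hidden neurons: the first row excites, the second inhibits.
w_hidden = [[0.8, 0.4], [-0.6, -0.3]]
w_out = [1.0, 1.0]
y = forward([1.0, 0.5], w_hidden, w_out)
```

With the input held constant, the two hidden neurons pull the output in opposite directions, which is exactly the subtraction-like interplay the abstract describes.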
Computers & Mathematics with Applications, 1996
The presented technical report is a preliminary English translation of selected revised sections from the first part of the book Theoretical Issues of Neural Networks [75] by the first author, which represents a brief introduction to neural networks. This work does not provide a complete survey of neural network models; the exposition is focused on the original motivations and on a clear technical description of several basic models. It can be understood as an invitation to a deeper study of this field. Thus, the respective background is prepared for those who have not met this phenomenon yet, so that they can appreciate the subsequent theoretical parts of the book. In addition, it can also be profitable for those engineers who want to apply neural networks in their area of expertise. The introductory part does not require deep preliminary knowledge; it contains many pictures, and the mathematical formalism is reduced to the lowest degree in the first chapter and is used only for the technical description of neural network models in the following chapters. We will come back to the formalization of some of these introduced models within their theoretical analysis. The first chapter makes an effort to describe and clarify the neural network phenomenon. It contains a brief survey of the history of neurocomputing, and it explains the neurophysiological motivations which led to the mathematical model of a neuron and neural network. It shows that a particular neural network model can be determined by means of the architectural, computational, and adaptive dynamics that describe the evolution of the specific neural network parameters in time. Furthermore, it introduces neurocomputers as an alternative to the classical von Neumann computer architecture, and the appropriate areas of their application are discussed. The second chapter deals with the classical models of neural networks.
First, the historically oldest model, the network of perceptrons, is briefly mentioned. Further, the model most widely applied in practice, the multi-layered neural network with the back-propagation learning algorithm, is described in detail. The description covers various variants of this model and contains implementation comments as well. An explanation of the linear model MADALINE, adapted according to the Widrow rule, follows. The third chapter concentrates on the neural network models that are exploited as autoassociative or heteroassociative memories. The principles of adaptation according to the Hebb law are explained on the example of the linear associator neural network. The next model is the well-known Hopfield network, motivated by physical theories, which is a representative of the cyclic neural networks. The analog version of this network can be used for heuristic solving of optimization tasks (e.g., the traveling salesman problem). By physical analogy, a temperature parameter is introduced into the Hopfield network, and thus a stochastic model, the so-called Boltzmann machine, is obtained. The information from this part of the book can be found in any monograph or survey article concerning neural networks. For its composition we drew mainly on the works [16, 24, 26, 27, 35, 36, 45, 73].
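The Hebb-rule adaptation and Hopfield recall mentioned in this abstract can be sketched compactly. This is an illustrative toy, not the book's implementation: it assumes bipolar (+1/-1) patterns, the standard Hebb outer-product rule with a zero diagonal, and a simple synchronous sign update rather than the asynchronous dynamics usually analysed in theory.

```python
def hebb_train(patterns):
    """Hebb rule: W[i][j] = sum over patterns of p[i]*p[j], with zero diagonal."""
    n = len(patterns[0])
    W = [[0.0] * n for _ in range(n)]
    for p in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:
                    W[i][j] += p[i] * p[j]
    return W

def recall(W, state, steps=5):
    """Synchronous update s <- sign(W s); a noisy cue relaxes toward a stored pattern."""
    s = list(state)
    for _ in range(steps):
        s = [1 if sum(W[i][j] * s[j] for j in range(len(s))) >= 0 else -1
             for i in range(len(s))]
    return s

pattern = [1, -1, 1, -1]
W = hebb_train([pattern])
noisy = [1, -1, -1, -1]      # one flipped bit
restored = recall(W, noisy)  # relaxes back to the stored pattern
```

This is the autoassociative-memory behaviour the chapter describes: the network is cued with a corrupted pattern and settles on the stored one.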
Neural networks are relatively crude electronic models based on the neural structure of the brain. The brain basically learns from experience; this is natural proof that some problems beyond the scope of current computers are indeed solvable by small, energy-efficient packages. In this paper we present the fundamentals of neural network topologies, activation functions, and learning algorithms based on the flow of information in one or both directions. We outline the main features of a number of popular neural networks and provide an overview of their topologies and their learning capabilities.
Corr, 2005
The Artificial Neural Network (ANN) is a functional imitation of a simplified model of biological neurons, and its goal is to construct useful 'computers' for real-world problems and to reproduce intelligent data-evaluation techniques such as pattern recognition, classification, and generalization by using simple, distributed, and robust processing units called artificial neurons. ANNs are fine-grained parallel implementations of non-linear static or dynamic systems. The intelligence of an ANN and its capability to solve hard problems emerge from the high degree of connectivity that gives the neurons their computational power through a massively parallel, distributed structure. The current resurgence of interest in ANNs is largely because ANN algorithms and architectures can be implemented in VLSI technology for real-time applications. The number of ANN applications has increased dramatically in the last few years, fired by both theoretical and applied successes in a variety of disciplines. This paper presents a survey of the research and explosive development of many ANN-related applications. A brief overview of ANN theory, models, and applications is presented. Potential areas of application are identified, and future trends are discussed.
Automatic Learning Techniques in Power Systems, 1998
International Journal for Research in Applied Science & Engineering Technology (IJRASET), 2021
The purpose of this study is to familiarise the reader with the foundations of neural networks. Artificial Neural Networks (ANNs) are algorithm-based systems that are modelled after Biological Neural Networks (BNNs). Neural networks are an effort to use the human brain's information processing skills to address challenging real-world AI issues. The evolution of neural networks and their significance are briefly explored. ANNs and BNNs are contrasted, and their qualities, benefits, and disadvantages are discussed. The drawbacks of the perceptron model and their improvement by the sigmoid neuron and ReLU neuron are briefly discussed. In addition, we give a bird's-eye view of the different Neural Network models. We study neural networks (NNs) and highlight the different learning approaches and algorithms used in Machine Learning and Deep Learning. We also discuss different types of NNs and their applications. A brief introduction to Neuro-Fuzzy and its applications with a comprehensive review of NN technological advances is provided.
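The progression this abstract mentions, from the perceptron's hard threshold to the sigmoid neuron and then the ReLU, comes down to the activation function. A minimal sketch of the three (standard definitions, not taken from the paper itself):

```python
import math

def step(x):
    """Perceptron activation: a hard threshold. Its output is flat almost
    everywhere, which is the main drawback for gradient-based learning."""
    return 1.0 if x >= 0 else 0.0

def sigmoid(x):
    """Sigmoid neuron: smooth and differentiable everywhere, but it
    saturates for large |x|, which slows learning (vanishing gradients)."""
    return 1.0 / (1.0 + math.exp(-x))

def relu(x):
    """ReLU: non-saturating for x > 0 and cheap to compute, which is why
    it largely replaced the sigmoid in deep networks."""
    return max(0.0, x)
```

Each function maps a neuron's weighted input sum to its output; the comments note the property that motivated each successive improvement.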
International Journal of Linguistics and Computational Applications, 2015
Journal of Complexity, 1990
International Journal of Creative Research Thoughts, 2017
Models of Neurons and Perceptrons: Selected Problems and Challenges, 2018
Journal of the Society of Dyers and Colourists, 1998
Journal of Chemical Education, 1994