1. Introduction:
Since time immemorial, one part of the human body has intrigued scientists, anatomists, engineers, psychoanalysts and many others: what lies inside this hard, oblong skull that is capable of performing so many complex functions simultaneously? This ceaseless inquisitiveness led man to study this CPU of the human body ever more thoroughly and deeply, so much so that he took the titanic step of simulating and implementing this brain artificially. It may be only a millionth part, but Artificial Neural Networks are definitely a positive step in this direction.
Neural Networks, also called Neurocomputing or brain-like computation, are based on the hope that we can reproduce at least some of the flexibility and power of the human brain by artificial means. One obvious solution is to simply implement Artificial Neural Networks on any of today's general-purpose computing platforms, including PCs, workstations, and supercomputers. In some cases this is acceptable; however, it is seldom practical or economical for large-scale Artificial Neural Networks with real-time operational constraints. Therefore, in order to apply Neural Network technology to real-world applications, there is a need for small, low-cost Neural Network hardware that takes full advantage of the parallel nature of Neural Networks. In real-world applications, ANNs can be realized using analog, digital, or hybrid dedicated electronic or optical hardware.
The main requirements for implementing Artificial Neural Networks are:
o Considerable computational power
o Sufficient capacity for large-scale networks
o Sufficient flexibility to handle various Neural Network configurations
o Potential to simulate a variety of neuron models including new models
o Fault tolerance
2. Biological Neuron:
The human brain consists of approximately 10 billion individual nerve cells called neurons. All human activities and behavior can ultimately be traced to the activity of these tiny cells. Each neuron is interconnected with many other neurons, forming a densely connected network called a Neural Network. These massive interconnections provide exceptionally large computing power and memory. Figure 1.1 shows the schematic diagram of a Biological Neuron.
As can be seen from figure 1.1, a neuron consists of the following major parts:
a. The cell body : Also called the soma, it collects and combines incoming information received from other neurons.
b. The axon : A single fiber through which a neuron transmits information to other neurons.
c. The dendrites : Input signals from other neurons enter the cell by way of the dendrites, a bushy branching structure emanating from the cell body.
d. The synapses : The junction point of an axon with a dendrite of another neuron is called a synapse. It provides the memory of past accumulated experience or knowledge.
Now we briefly describe the action potential and neuron firing, two mechanisms that are important in explaining the working of a Biological Neuron.
(i) Action potential :
Each neuron can be thought of as a tiny biological battery, full of ions and ready to be discharged. Neurons are filled with, and surrounded by, fluids containing dissolved chemicals. Both inside and around the soma are sodium (Na+), calcium (Ca++), potassium (K+), and chloride (Cl-) ions. Na+ and K+ ions are largely responsible for generating the active neural response called an Action Potential, also known as a nerve impulse. K+ ions are concentrated inside the cell of the neuron, whereas Na+ ions are concentrated outside the cell membrane.
The process of generating an action potential, either in the neuron (where the processing of information takes place) or in the axon (through which the transmission of information takes place), is caused by an exchange of ions (K+ and Na+) due to a change in the permeability of the cell membrane.
If the soma is electrically stimulated by a voltage greater than a certain threshold, there is an exchange of ions. This movement of ions causes the soma to change its internal state. More specifically, the flow of Na+ ions into the nerve membrane and of K+ ions out of the membrane generates the action potential of the neuron.
In summary, action potentials are traveling positive charges generated by the flow of charged ions (Na+, K+) across the axon membrane.
(ii) Neuron Firing :
The neuron structure in the central nervous system is a very complex structure for the processing and transmission of information. Both processing and transmission take place through the flow of ions across the axon and neuron membranes. The neuron is basically a computing node that receives information (signals) and does some processing. The axon of one neuron is connected to the dendrite of another neuron through a junction called a synapse. The synapse employs a chemical transmitter substance to convey a signal across the boundary of the junction. At the synaptic junction, the action potentials conducted along the axon are converted to a voltage called the post-synaptic potential (PSP). The PSP is proportional to the amount of transmitter substance released, which, in turn, is proportional to the frequency of the axonic action potential (AAP).
The action potential is basically an electrical signal, whereas communication between neurons is primarily a chemical process. Neurotransmitters may excite the neuron or inhibit it, depending on the type of transmitter released and the nature of the dendrite membrane.
Each soma receives on average about 10,000 excitatory and/or inhibitory inputs. The role of the soma is to perform a spatio-temporal summation of the excitatory and inhibitory slow potentials by means of a weighted average. If this weighted average exceeds a threshold, it is converted at the axon hillock into action potentials with an appropriate output frequency, and the action potentials are transmitted along the axon to other nerve cells in order to repeat the process.
Thus there are three components involved in the transmission of information from one neuron to another, namely:
1. Action potential,
2. Synaptic operation,
3. Somatic operation, as explained above.
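The summation-and-threshold behaviour of the soma described above can be sketched in a few lines of Python. The weights, threshold, and inputs here are purely illustrative values, not biological measurements; inhibitory inputs are modeled simply as negative weights.

```python
def neuron_fires(inputs, weights, threshold):
    """Return True if the weighted sum of inputs exceeds the threshold."""
    # Somatic operation: weighted summation of excitatory (positive
    # weight) and inhibitory (negative weight) post-synaptic potentials.
    psp_sum = sum(x * w for x, w in zip(inputs, weights))
    # The axon hillock generates action potentials only when the summed
    # potential exceeds the firing threshold.
    return psp_sum > threshold

# Two excitatory inputs and one inhibitory input.
print(neuron_fires([1.0, 1.0, 1.0], [0.6, 0.5, -0.3], threshold=0.5))  # True
# A stronger inhibitory input keeps the sum below threshold.
print(neuron_fires([1.0, 1.0, 1.0], [0.6, 0.5, -0.9], threshold=0.5))  # False
```

This is, of course, only the crudest caricature of a real neuron: it ignores the temporal dynamics and output-frequency coding described above.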
3. Artificial Neuron:
The Artificial Neuron was designed to mimic the first-order characteristics of the Biological Neuron. Due to the complexity and diversity of the properties of biological neurons, the task of compressing their complicated characteristics into a model is extremely difficult. Toward this goal, a model of the biological neuron, called an Artificial Neuron, has been developed as the basic unit of the Neural Network model. Figure 1.2 shown below represents a simple model of an Artificial Neuron [1].
The neuron receives inputs from a number of other neurons or from the external world. Each input is multiplied by a corresponding weight, analogous to a synaptic strength; thus the set of inputs applied to the Artificial Neuron is analogous to the signals arriving at the synapses of a Biological Neuron. All of the weighted inputs are then summed to determine the activation level of the Artificial Neuron. Mathematically, the function of the Artificial Neuron can be modeled as
OUT = f(NET)
where
NET = x1w1 + x2w2 + x3w3 + ... + xnwn
(x1, x2, ..., xn) represent the neural inputs,
(w1, w2, ..., wn) represent the synaptic weights,
f[.] represents the nonlinear activation function.
The activation function is the function by which the cell output (OUT) is obtained from the NET input. Different types of activation functions and their mathematical forms are shown in figure 1.3 below.
5. The potential benefits of Neural Networks:
1. Fault Tolerance:
Neural Network models have many neurons (the computational units) linked via adaptive (synaptic) weights arranged in a massively parallel structure. Because of this high parallelism, the failure of a few neurons does not significantly affect overall system performance. This characteristic is known as fault tolerance [2].
2. Learning:
The main strength of Neural Network structures lies in their learning and adaptive abilities. Neural Networks can modify their behavior in response to their environment. Shown a set of inputs (perhaps with desired outputs), they self-adjust to produce consistent responses. A wide variety of training algorithms has been developed, each with its own strengths and weaknesses [2].
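As one concrete sketch of such self-adjustment, the classic perceptron learning rule nudges each weight in proportion to the output error. This is only one of the many training algorithms mentioned above; the dataset (the logical AND function), learning rate, and epoch count are illustrative choices.

```python
def train_perceptron(samples, epochs=20, lr=0.1):
    """samples: list of ((x1, x2), target) pairs with targets 0 or 1."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), target in samples:
            # Forward pass: hard-limiting activation on the weighted sum.
            out = 1 if (w[0] * x1 + w[1] * x2 + b) >= 0 else 0
            err = target - out
            # Perceptron rule: adjust each weight in proportion to the
            # error and the input that contributed to it.
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

# Learn the logical AND function from input/target examples.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(data)
outputs = [1 if (w[0] * x1 + w[1] * x2 + b) >= 0 else 0
           for (x1, x2), _ in data]
print(outputs)  # [0, 0, 0, 1] -- matches the AND targets after training
```

Because AND is linearly separable, the perceptron rule is guaranteed to converge here; for harder tasks, multi-layer networks and gradient-based algorithms are needed.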
3. Generalization:
It is possible to develop a network that can generalize on the tasks for which it is trained, enabling the network to provide the correct answer when presented with a new input pattern that differs from the inputs in the training set. It is important to note that a Neural Network generalizes automatically as a result of its structure [2].
4. Abstraction:
Neural Networks are capable of abstracting the essence of a set of inputs. For example, a network can be trained on a sequence of distorted versions of an input pattern. After adequate training, presenting such a distorted example will cause the network to produce a perfectly formed output. In one sense, the network has learned to produce something it has never seen before. For this it does not require any mathematical model of the relationship that might exist between the network's inputs and outputs [2].