Artificial Neural Network
Yük. Bilg. Müh. (M.Sc. Computer Engineer) Mustafa Aadel Mashjal
158229001009
Background
- Neural Networks can be:
  - Biological models
  - Artificial models
- We wish to produce artificial systems capable of complex computation, similar to the human brain.
The Transmission of Information: Some Main Ideas
• The brain consists of a mass of interconnected neurons
  – each neuron is connected to many other neurons
• Neurons transmit signals to each other
• Whether a signal is sent depends on the strength of the bond (synapse) between two neurons
How Does the Brain Work? (1)
NEURON
- The neuron is the cell that performs information processing in the brain.
- Nervous tissue consists of neurons, which receive and transmit impulses.
Each neuron consists of:
SOMA, DENDRITES, AXON, and SYNAPSE.
How Does the Brain Work? (2)
Brain vs. Digital Computers (1)
- Computers require hundreds of cycles to simulate the firing of a single neuron.
- The brain can fire all of its neurons in a single step.
Parallelism
- Serial computers require billions of cycles to perform some tasks that the brain completes in less than a second,
e.g. face recognition.
Definition of Neural Network
A Neural Network is a system that consists of many simple processing elements operating in parallel, which can acquire, store, and use experiential knowledge.
What is an Artificial Neural Network?
Neurons vs. Units (1)
- Each element of an NN is a node called a unit.
- Units are connected by links.
- Each link has a numeric weight.
Neurons vs. Units (2)
• ANNs incorporate the two fundamental components of biological neural nets:
1. Neurons (nodes)
2. Synapses (weights)
1- A set of connecting links, each link characterized by a weight: W1,
W2, …, Wm
2- An adder function (linear combiner) which computes the weighted
sum of the inputs:
3- Activation function (squashing function) for limiting the amplitude of
the output of the neuron.
net_j = Σ_{i=1}^{m} w_{ji} x_i   (the weighted sum of the m inputs into unit j)
Structure of a Node
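As a concrete illustration of the three components above, here is a minimal Python sketch of a single node: the adder (weighted sum of the inputs) followed by an activation function. The function names and the numbers in the example are illustrative, not taken from the slides.

```python
import math

def neuron_output(x, w, activation):
    """One unit: an adder (weighted sum of the inputs) followed by an
    activation (squashing) function. x and w are equal-length lists."""
    net = sum(w_i * x_i for w_i, x_i in zip(w, x))  # net = sum_i w_i * x_i
    return activation(net)

def sigmoid(net):
    """A common squashing function that limits the output to (0, 1)."""
    return 1.0 / (1.0 + math.exp(-net))

# Illustrative example: 3 inputs and 3 weights
print(neuron_output([1.0, 0.5, -1.0], [0.2, 0.4, 0.1], sigmoid))
```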
Activation Functions
- Use different functions to obtain different models.
- The 3 most common choices:
1) Step function
2) Sign function
3) Sigmoid function
- An output of 1 represents the firing of a neuron down the axon.
Step Function Perceptrons
3 Activation Functions
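A minimal Python sketch of the three activation functions named above; the default threshold value is an assumption for illustration.

```python
import math

def step(net, threshold=0.0):
    """Step function: output 1 (the unit fires) if net >= threshold, else 0."""
    return 1 if net >= threshold else 0

def sign(net):
    """Sign function: output +1 for non-negative net input, -1 otherwise."""
    return 1 if net >= 0 else -1

def sigmoid(net):
    """Sigmoid function: a smooth squashing of net into the range (0, 1)."""
    return 1.0 / (1.0 + math.exp(-net))

for f in (step, sign, sigmoid):
    print(f.__name__, f(-2.0), f(2.0))
```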
Feed-Forward Neural Network
Architectures
The feed-forward neural network was the first and simplest type of artificial neural network devised. In this network the information moves in only one direction, forward: from the input nodes, data goes through the hidden nodes (if any) and on to the output nodes. There are no cycles or loops in the network.
• Two different classes of network architectures:
– single-layer feed-forward
– multi-layer feed-forward
(in both, neurons are organized in acyclic layers)
• The architecture of a neural network is linked with the learning algorithm used to train it.
Single Layer Feed-forward
The simplest kind of neural network is a single-layer perceptron network, which
consists of a single layer of output nodes; the inputs are fed directly to the outputs
via a series of weights. In this way it can be considered the simplest kind of feed-
forward network. The sum of the products of the weights and the inputs is calculated
in each node, and if the value is above some threshold the neuron fires and takes
the activated value; otherwise it takes the deactivated value. Neurons with this kind
of activation function are also called artificial neurons or linear threshold units.
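A minimal sketch of the single-layer network just described: each output node compares the weighted sum of the inputs against a threshold and takes the activated or deactivated value. The weight layout and the numbers are illustrative assumptions.

```python
def single_layer_forward(x, W, thresholds):
    """Single-layer feed-forward pass: inputs x feed directly into each
    output node; node j fires (1) if its weighted sum exceeds thresholds[j].
    W[j][i] is the weight from input i to output node j."""
    outputs = []
    for w_row, theta in zip(W, thresholds):
        net = sum(w * x_i for w, x_i in zip(w_row, x))
        outputs.append(1 if net >= theta else 0)  # activated / deactivated value
    return outputs

# Two inputs, two output nodes (illustrative weights and thresholds)
print(single_layer_forward([1.0, 0.0], [[0.5, 0.5], [-0.3, 0.8]], [0.4, 0.2]))
```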
This class of networks consists of multiple layers of computational units, usually interconnected in a feed-forward way. Each neuron in one layer has directed connections to the neurons of the next layer. In many applications the units of these networks apply a sigmoid function as an activation function.
Multi Layer Feed-forward
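A minimal sketch of a multi-layer feed-forward pass with sigmoid units, following the description above; the layer sizes and weights are illustrative.

```python
import math

def sigmoid(net):
    return 1.0 / (1.0 + math.exp(-net))

def mlp_forward(x, layer_weights):
    """Multi-layer feed-forward pass. layer_weights is a list of weight
    matrices, one per layer; row j of a matrix holds the weights into
    unit j from the previous layer. Each unit applies the sigmoid."""
    activations = x
    for W in layer_weights:
        activations = [sigmoid(sum(w * a for w, a in zip(row, activations)))
                       for row in W]
    return activations

# 2 inputs -> 2 hidden units -> 1 output unit (illustrative weights)
hidden_W = [[0.5, -0.4], [0.3, 0.8]]
output_W = [[1.0, -1.0]]
print(mlp_forward([1.0, 0.0], [hidden_W, output_W]))
```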
Supervised Learning Algorithm
• A learning algorithm falls under this category if the desired output for the network is provided along with the input while training the network. By providing the neural network with both an input and an output pair it is possible to calculate an error based on its target output and actual output. It can then use that error to make corrections to the network by updating its weights.
• A single-layer neural network can be trained by a simple learning algorithm that is usually called the delta rule.
• A multi-layer neural network can be trained by a learning algorithm that is usually called backpropagation.
Delta rule
 The delta rule is a specialized version of backpropagation's learning rule, for use with single-layer neural networks.
 It calculates the error between the calculated output and the sample output data, and uses this to create a modification to the weights, thus implementing a form of gradient descent:
Δw_i = μ β x_i, where β = t - o
t (target), o (output)
μ (mu, learning rate), β (beta, error), x (input)
The old weight is combined with the new weight modification, thus changing the network's weights:
w_i ← w_i + Δw_i
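A minimal Python sketch of one delta-rule update using the notation above (mu, beta = t - o); the numbers in the example are illustrative.

```python
def delta_rule_update(w, x, t, o, mu):
    """One delta-rule step for a single-layer unit.
    beta = t - o is the error; each weight change is mu * beta * x_i,
    and the old weight is combined with that modification."""
    beta = t - o
    delta_w = [mu * beta * x_i for x_i in x]
    return [w_i + dw_i for w_i, dw_i in zip(w, delta_w)]

# One training step: target 1, actual output 0, learning rate 0.1 (illustrative)
print(delta_rule_update([0.2, -0.1], [1.0, 0.5], t=1, o=0, mu=0.1))
```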
Unsupervised Learning Algorithm
In this paradigm the neural network is only given a set of
inputs and it's the neural network's responsibility to find
some kind of pattern within the inputs provided without
any external aid.
Hebb Rule
Hebb's rule is used for learning in networks that use unsupervised learning, and each weight is modified by the equation:
Δw_i = μ o x_i
μ (mu, learning rate), o (output), x (input)
The modification is then combined with the initial weight:
w_i ← w_i + Δw_i
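A minimal sketch of the Hebbian update described above: the weight change mu * o * x is added to the initial weight. The values are illustrative.

```python
def hebb_update(w, x, o, mu):
    """Hebb rule (unsupervised): each weight grows by mu * output * input,
    and the change is added to the initial weight."""
    return [w_i + mu * o * x_i for w_i, x_i in zip(w, x)]

# Illustrative example: two weights, learning rate 0.05
print(hebb_update([0.1, 0.1], [1.0, 0.5], o=1.0, mu=0.05))
```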
Introduction to Backpropagation
- In 1969 a method for learning in multi-layer networks, backpropagation, was invented by Bryson and Ho.
- Backpropagation is a generalization of the delta rule to multi-layered feed-forward networks.
- Backpropagation is a common method of training artificial neural networks, used in conjunction with an optimization method such as gradient descent (a first-order iterative optimization algorithm). It calculates the gradient of a loss function with respect to all the weights in the network; the gradient is fed to the optimization method, which in turn uses it to update the weights in an attempt to minimize the loss function.
Backpropagation Learning Details
• Use gradient descent to minimize the error
– propagate deltas to adjust for errors backward from
outputs to hidden layers to inputs
Backpropagation Algorithm – Main
Idea – error in hidden layers
The ideas of the algorithm can be summarized as follows (a sketch in code follows this list):
1. Compute the error term for the output units using the observed error.
2. Starting from the output layer, repeat
   - propagating the error term back to the previous layer, and
   - updating the weights between the two layers
   until the earliest hidden layer is reached.
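The sketch below implements those two steps for a network with a single hidden layer of sigmoid units. The architecture, the weight layout W[j][i], and the use of squared error are illustrative assumptions; the slides do not fix these details.

```python
import math

def sigmoid(net):
    return 1.0 / (1.0 + math.exp(-net))

def backprop_step(x, target, W_hid, W_out, eta):
    """One backpropagation step for a 1-hidden-layer sigmoid network.
    Returns updated copies of the hidden and output weight matrices."""
    # Forward pass
    h = [sigmoid(sum(w * xi for w, xi in zip(row, x))) for row in W_hid]
    o = [sigmoid(sum(w * hi for w, hi in zip(row, h))) for row in W_out]

    # Step 1: error term for each output unit (sigmoid derivative * observed error)
    delta_out = [oi * (1 - oi) * (ti - oi) for oi, ti in zip(o, target)]

    # Step 2: propagate the error term back to the hidden layer ...
    delta_hid = [h[j] * (1 - h[j]) *
                 sum(delta_out[k] * W_out[k][j] for k in range(len(W_out)))
                 for j in range(len(h))]

    # ... and update the weights between the layers
    new_W_out = [[w + eta * d * hi for w, hi in zip(row, h)]
                 for row, d in zip(W_out, delta_out)]
    new_W_hid = [[w + eta * d * xi for w, xi in zip(row, x)]
                 for row, d in zip(W_hid, delta_hid)]
    return new_W_hid, new_W_out

# Illustrative call: 2 inputs, 2 hidden units, 1 output unit
print(backprop_step([1.0, 0.0], [1.0], [[0.5, -0.4], [0.3, 0.8]], [[1.0, -1.0]], eta=0.5))
```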
Forward Propagation of Activity
• Step 1: Initialise weights at random, choose a
learning rate η
• Until network is trained:
• For each training example i.e. input pattern and
target output(s):
• Step 2: Do forward pass through net (with fixed
weights) to produce output(s)
– i.e., in Forward Direction, layer by layer:
• Inputs applied
• Multiplied by weights
• Summed
• ‘Squashed’ by sigmoid activation function
• Output passed to each neuron in next layer
– Repeat above until network output(s) produced
XOR Example
Step 3: Back-propagation of Error
Backprop output layer
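Putting the forward pass (Steps 1-2) and back-propagation of error (Step 3) together, here is a small end-to-end Python sketch that trains a 2-2-1 sigmoid network on XOR with online backpropagation. The bias handling, learning rate, epoch count, and random seed are illustrative assumptions; convergence depends on the initialization and may need many epochs.

```python
import math, random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def forward(x, W_hid, W_out):
    """Forward pass with a constant 1 appended as a bias input to each layer."""
    xb = x + [1.0]
    h = [sigmoid(sum(w * v for w, v in zip(row, xb))) for row in W_hid]
    hb = h + [1.0]
    o = sigmoid(sum(w * v for w, v in zip(W_out, hb)))
    return h, o

random.seed(0)
W_hid = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(2)]  # 2 hidden units
W_out = [random.uniform(-1, 1) for _ in range(3)]                      # 1 output unit
data = [([0.0, 0.0], 0.0), ([0.0, 1.0], 1.0), ([1.0, 0.0], 1.0), ([1.0, 1.0], 0.0)]
eta = 0.5

for epoch in range(20000):
    for x, t in data:
        h, o = forward(x, W_hid, W_out)
        xb, hb = x + [1.0], h + [1.0]
        d_out = o * (1 - o) * (t - o)                          # output error term
        d_hid = [hi * (1 - hi) * d_out * W_out[j] for j, hi in enumerate(h)]
        W_out = [w + eta * d_out * v for w, v in zip(W_out, hb)]
        W_hid = [[w + eta * dj * v for w, v in zip(row, xb)]
                 for row, dj in zip(W_hid, d_hid)]

for x, t in data:
    print(x, round(forward(x, W_hid, W_out)[1], 3), "target", t)
```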
bias neuron in input layer
Bias Neurons in Backpropagation Learning
Least-Mean-Square (LMS) Algorithm
Least mean squares (LMS) algorithms are a class of adaptive filter used to mimic a desired filter by finding the filter weights that minimize the mean square of the error signal (the difference between the desired and the actual signal). The algorithm was invented in 1960 by Stanford University professor Bernard Widrow and his Ph.D. student Ted Hoff.
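A minimal sketch of an LMS adaptive filter in Python: at each step the filter output is compared with the desired signal, and the error drives the weight update. The tap count, step size, and the signals in the example are illustrative assumptions.

```python
def lms_filter(x, d, num_taps, mu):
    """LMS adaptive filter: slide a window of num_taps samples over x,
    compute the filter output, and update the weights with
    w <- w + mu * error * window. Returns final weights, outputs, errors."""
    w = [0.0] * num_taps
    outputs, errors = [], []
    for n in range(num_taps - 1, len(x)):
        window = x[n - num_taps + 1:n + 1][::-1]   # most recent sample first
        y = sum(wi * xi for wi, xi in zip(w, window))
        e = d[n] - y                               # difference from desired signal
        w = [wi + mu * e * xi for wi, xi in zip(w, window)]
        outputs.append(y)
        errors.append(e)
    return w, outputs, errors

# Illustrative use: adapt toward a 2-tap target filter with weights [0.7, 0.2]
x = [1.0, 0.5, -0.3, 0.8, 0.2, -0.6, 0.9, 0.1]
d = [0.0] + [0.7 * a + 0.2 * b for a, b in zip(x[1:], x[:-1])]
print(lms_filter(x, d, num_taps=2, mu=0.3)[0])
```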
Examples and Applications of ANN
Neural Network in Practice
NNs are used for classification and function approximation or mapping problems that:
- are tolerant of some imprecision,
- have lots of training data available, and
- cannot easily be captured by hard and fast rules.
NETalk (1987)
• Mapping character strings into phonemes so they
can be pronounced by a computer
• A neural network was trained to pronounce each letter of a word in a sentence, given a window of the three letters before and the three letters after it
• Output was the correct phoneme
• Results
– 95% accuracy on the training data
– 78% accuracy on the test set
Other Examples
• Neurogammon (Tesauro & Sejnowski, 1989)
– Backgammon learning program
• Speech Recognition (Waibel, 1989)
• Character Recognition (LeCun et al., 1989)
• Face Recognition (Mitchell)
ALVINN
• Steer a van down the road
– 2-layer feedforward network
• using backpropagation for learning
– Raw input is a 480 x 512 pixel image, 15 frames per second
– Color image preprocessed into 960 input units
– 4 hidden units
– 30 output units, each representing a steering direction
Neural Network Approaches
ALVINN - Autonomous Land Vehicle In a Neural Network
Learning on-the-fly
• ALVINN learned as the vehicle
traveled
– initially by observing a human
driving
– learns from its own driving by
watching for future corrections
– never saw bad driving
• so it didn't know what was dangerous or incorrect
• computes alternate views of
the road (rotations, shifts, and
fill-ins) to use as “bad”
examples
– keeps a buffer pool of 200 older examples to avoid overfitting to only the most recent images
Feed-forward vs. Interactive
Nets
• Feed-forward
– activation propagates in one direction
– We usually focus on this
• Interactive
– activation propagates forward & backwards
– propagation continues until equilibrium is reached in
the network
– We do not discuss these networks here; training is complex and may be unstable.
Ways of learning with an ANN
• Add nodes & connections
• Subtract nodes & connections
• Modify connection weights
– current focus
– can simulate the first two
• I/O pairs:
– given the inputs, what should the output be?
[“typical” learning problem]
More Neural Network Applications
- May provide a model for massively parallel computation.
- A more successful approach than "parallelizing" traditional serial algorithms.
- Can compute any computable function.
- Can do everything a normal digital computer can do.
- Can do even more under some impractical assumptions.
Neural Network Approaches to driving
- Developed in 1993.
- Performs driving with
Neural Networks.
- An intelligent VLSI image
sensor for road following.
- Learns to filter out image
details not relevant to
driving.
(Diagram: input units, hidden layer, output units)
• Uses special hardware: ASIC, FPGA, or analog
Neural Network Approaches
(Diagram: input array, hidden units, output units)
Actual Products Available
Ex1. Enterprise Miner:
- A single multi-layered feed-forward neural network.
- Provides business solutions for data mining.
Ex2. Nestor:
- Uses the Nestor Learning System (NLS).
- Several multi-layered feed-forward neural networks.
- Intel has made such a chip, the NE1000, in VLSI technology.
Ex1. Software Tool - Enterprise Miner
- Based on the SEMMA (Sample, Explore, Modify, Model, Assess) methodology.
- Statistical tools include:
  clustering, decision trees, linear and logistic regression, and neural networks.
- Data preparation tools include:
  outlier detection, variable transformation, random sampling, and partitioning of data sets (into training, testing and validation data sets).
Ex2. Hardware Tool - Nestor
- With low connectivity within each layer.
- Minimized connectivity within each layer results in rapid training and efficient memory utilization, ideal for VLSI.
- Composed of multiple neural networks, each specializing in a subset of information about the input patterns.
- Real-time operation without the need for special computers or custom DSP hardware platforms.
• Software implementations exist.
Summary
- A neural network is a computational model that simulates some properties of the human brain.
- The connections and nature of the units determine the behavior of a neural network.
- Perceptrons are feed-forward networks that can only represent linearly separable functions.
Summary
- Given enough units, any function can be represented by multi-layer feed-forward networks.
- Backpropagation learning works on multi-layer
feed-forward networks.
- Neural Networks are widely used in developing
artificial learning systems.
References
- Russell, S. and P. Norvig (1995). Artificial Intelligence: A Modern Approach. Upper Saddle River, NJ: Prentice Hall.
- Sarle, W.S., ed. (1997). Neural Network FAQ, part 1 of 7: Introduction. Periodic posting to the Usenet newsgroup comp.ai.neural-nets. URL: ftp://ftp.sas.com/pub/neural/FAQ.html
Sources
Eddy Li
Eric Wong
Martin Ho
Kitty Wong
Editor's Notes
1. Teacher values were Gaussian with variance 10.