Implementation of back-propagation using the neuralnet package
The backward propagation of errors, or back-propagation, is a common method of training artificial neural networks, used in conjunction with an optimization method such as gradient descent.
The algorithm repeats a two-phase cycle: propagation and weight update.
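As a minimal sketch of this two-phase cycle, here is a single linear unit trained by gradient descent in Python (the post's neuralnet package is for R; all names here are illustrative and assume a squared-error loss, not the package's API):

```python
# One training step for a single weight w of a linear unit,
# assuming squared-error loss E = 0.5 * (target - output)**2.
def train_step(w, x, target, lr=0.1):
    # Phase 1: propagation - compute the unit's output for input x
    output = w * x
    # Phase 2: weight update - move w against the error gradient
    # dE/dw = -(target - output) * x for the squared-error loss
    grad = -(target - output) * x
    return w - lr * grad

w = 0.0
for _ in range(50):
    w = train_step(w, x=2.0, target=4.0)
# w converges toward 2.0, since target = 2.0 * x
```

Repeating the cycle shrinks the error at each step, which is the essence of gradient-descent training.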
When an input vector is presented to the network, it is propagated forward through the network, layer by layer.
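The layer-by-layer forward pass can be sketched as follows in Python with NumPy (the network shape, sigmoid activation, and variable names are illustrative assumptions, not taken from the neuralnet package):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, weights, biases):
    """Propagate input x through the network, one layer at a time."""
    activations = [x]
    for W, b in zip(weights, biases):
        x = sigmoid(W @ x + b)      # each layer's output feeds the next
        activations.append(x)
    return activations              # last entry is the network output

# A small 2-3-1 network with random illustrative weights
rng = np.random.default_rng(0)
weights = [rng.standard_normal((3, 2)), rng.standard_normal((1, 3))]
biases = [np.zeros(3), np.zeros(1)]
acts = forward(np.array([0.5, -0.2]), weights, biases)
```

Keeping every layer's activations, as `forward` does here, is what later lets the backward pass compute gradients for each weight.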
The goal of back-propagation is to optimize the weights so that the neural network learns to correctly map arbitrary inputs to outputs.
Back-propagation is an algorithm for supervised learning of artificial neural networks using gradient descent.
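Putting the pieces together, here is a minimal supervised-learning example in Python: back-propagation with gradient descent on the XOR task (a standard illustration, not code from the neuralnet package; the hidden-layer size, learning rate, and iteration count are assumptions chosen for this sketch):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# XOR: a classic supervised task that requires a hidden layer
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

rng = np.random.default_rng(1)
W1 = rng.standard_normal((2, 4))   # input -> hidden weights
b1 = np.zeros(4)
W2 = rng.standard_normal((4, 1))   # hidden -> output weights
b2 = np.zeros(1)
lr = 0.5

for _ in range(5000):
    # Forward pass: propagate inputs layer by layer
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # Backward pass: propagate the error back through the layers
    err = out - y
    d_out = err * out * (1 - out)          # output-layer delta
    d_h = (d_out @ W2.T) * h * (1 - h)     # hidden-layer delta
    # Weight update: gradient-descent step on every parameter
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0)

print((out > 0.5).astype(int).ravel())  # predicted classes after training
```

Each iteration is one propagation/weight-update cycle; over many iterations the squared error on the training pairs shrinks, which is what it means for the network to learn the input-to-output mapping.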