Backpropagation algorithm 
Backpropagation is an algorithm used to train feed-forward artificial neural networks. It works by presenting the network with a set of input data and the corresponding ideal outputs, computing the actual outputs, and then propagating the error (the difference between the ideal and actual outputs) backwards through the network, adjusting the weights by gradient descent. This lets the network learn specific patterns of input and output data so it can reproduce them later, even from slightly different or incomplete inputs. Because they generalize from the patterns they are trained on, neural networks can approximate a wide range of input-output functions.
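
To make the idea concrete, here is a minimal sketch in Python of the forward pass, error backpropagation, and gradient-descent weight updates, assuming a single hidden layer, sigmoid activations, and a squared-error loss; the XOR dataset, layer sizes, and learning rate are arbitrary choices for illustration, not anything prescribed by the post.

# Minimal backpropagation sketch: one hidden layer, sigmoid units,
# trained on XOR with plain gradient descent (illustrative values only).
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_deriv(y):
    # Derivative written in terms of the sigmoid output y = sigmoid(x).
    return y * (1.0 - y)

# Toy dataset: inputs and ideal outputs (XOR).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

rng = np.random.default_rng(0)
W1 = rng.normal(size=(2, 4))   # input  -> hidden weights
b1 = np.zeros(4)
W2 = rng.normal(size=(4, 1))   # hidden -> output weights
b2 = np.zeros(1)
lr = 0.5                       # learning rate

for epoch in range(10000):
    # Forward pass: compute the actual outputs.
    H = sigmoid(X @ W1 + b1)   # hidden activations
    Y = sigmoid(H @ W2 + b2)   # network outputs

    # Error between actual and ideal outputs.
    error = Y - T

    # Backward pass: propagate the error layer by layer.
    delta_out = error * sigmoid_deriv(Y)                # output-layer delta
    delta_hid = (delta_out @ W2.T) * sigmoid_deriv(H)   # hidden-layer delta

    # Gradient-descent weight updates.
    W2 -= lr * H.T @ delta_out
    b2 -= lr * delta_out.sum(axis=0)
    W1 -= lr * X.T @ delta_hid
    b1 -= lr * delta_hid.sum(axis=0)

print(np.round(Y, 3))   # outputs approach [0, 1, 1, 0] after training

After training, feeding the network an input it has seen (or a slightly noisy version of it) produces an output close to the ideal target, which is the generalization behaviour described above.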

