The backpropagation learning algorithm extends to bidirectional training of multilayer neural networks. The bidirectional operation gives a form of backward chaining or backward inference from a network output. We first prove
that a fixed three-layer network of threshold neurons can
exactly represent any finite permutation function and its
inverse. The forward pass gives the function value. The
backward pass through the same network gives the inverse
value. We then derive and test a bidirectional version of
the backpropagation algorithm that can learn bidirectional
mappings or their approximations.
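The abstract's key claim can be illustrated with a minimal sketch. The paper's construction uses a three-layer threshold network; the simplified version below uses a single weight matrix on one-hot coded inputs, exploiting the fact that a permutation matrix's inverse is its transpose. All names and the one-hot coding are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np

# Illustrative sketch (not the paper's three-layer construction):
# one weight matrix encodes a finite permutation on one-hot inputs.
# The forward pass applies the permutation; the backward pass through
# the SAME weights (transpose) recovers the inverse value.

perm = [2, 0, 3, 1]                   # permutation: i -> perm[i]
n = len(perm)
W = np.zeros((n, n))
for i, j in enumerate(perm):
    W[j, i] = 1.0                     # route unit i to unit j

def forward(i):
    x = np.zeros(n); x[i] = 1.0       # one-hot input
    y = (W @ x > 0.5).astype(float)   # threshold neurons
    return int(np.argmax(y))

def backward(j):
    y = np.zeros(n); y[j] = 1.0       # one-hot output
    x = (W.T @ y > 0.5).astype(float) # backward pass, same network
    return int(np.argmax(x))

# Backward pass inverts the forward pass for every input.
assert all(backward(forward(i)) == i for i in range(n))
```

Because each column and row of `W` holds exactly one nonzero entry, the thresholded forward and backward passes are exact, matching the representation claim for finite permutations.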
Date/Time Interval
July 25, 2016 -
Location
Las Vegas
Additional Document Info
Conference
International Conference on Advances in Big Data Analytics