Abstract:
In this paper we first show that the standard BP algorithm does not yield a uniform
distribution of information over the neural-network architecture. A measure of sensitivity is
defined to evaluate the fault tolerance of a neural network, and we then show that the sensitivity
of a link is closely related to the amount of information that passes through it. Based on this
assumption, we prove that the output error caused by s-a-0 (stuck-at-0) faults
in an MLP network follows a Gaussian distribution. The UDBP (Uniformly Distributed
Back Propagation) algorithm is then introduced to minimize the mean and variance of the
output error. Simulation results show that UDBP has the lowest sensitivity and the highest
fault tolerance among algorithms such as WRTA, N-FTBP and ADP. An MLP
neural network trained with UDBP then contributes to an Algorithm-Based Fault Tolerance
(ABFT) scheme that protects a nonlinear data-processing block. The neural network is trained to
produce an all-zero syndrome sequence in the absence of faults. A systematic real
convolution code guarantees that faults representing errors in the processed data result
in notable nonzero values in the syndrome sequence. A majority-logic decoder can easily detect
and correct single faults by observing the syndrome sequence. Simulation results
demonstrating the error detection and correction behavior against random s-a-0 faults are
also presented.
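The syndrome-based detection idea summarized above can be illustrated with a minimal sketch. This is not the paper's implementation: the process block, the checksum weights `c`, and the injected fault location are all illustrative assumptions. A checksum of the fault-free output serves as the reference; a stuck-at-0 fault on one output forces the syndrome away from zero, which is what the decoder observes.

```python
import numpy as np

# Toy stand-in for the nonlinear data-processing block: y = tanh(W x).
# (W, x and the checksum weights c are illustrative, not from the paper.)
W = np.array([[ 0.5, -0.2,  0.1],
              [ 0.3,  0.4, -0.6],
              [-0.7,  0.2,  0.5],
              [ 0.1, -0.3,  0.8]])
c = np.ones(4)                       # checksum weights (assumed)

def process(x, fault_idx=None):
    y = np.tanh(W @ x)
    if fault_idx is not None:        # inject a stuck-at-0 (s-a-0) fault
        y[fault_idx] = 0.0
    return y

x = np.array([1.0, -0.5, 0.25])
y_check = c @ process(x)             # reference checksum, computed fault-free

# Syndrome: zero when fault-free, notably nonzero under an s-a-0 fault.
s_ok    = c @ process(x) - y_check
s_fault = c @ process(x, fault_idx=2) - y_check
print(s_ok, s_fault)
```

In the actual scheme the reference checksum is produced by the UDBP-trained network and the code is a systematic real convolution code rather than a single checksum row, but the detection principle — an all-zero syndrome in the absence of faults — is the same.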
Type of Study: Research Paper | Received: 2008/10/12 | Revised: 2009/09/08 | Accepted: 2009/09/08