What is it about?

A varying-parameter ZNN (VPZNN) neural design is defined for approximating various generalized inverses of complex matrices and expressions involving such inverses. The proposed model, termed $CVPZNN(A,F,G)$, is defined on the basis of an error function built from three appropriately chosen matrices $A$, $F$, $G$. The $CVPZNN(A,F,G)$ evolution design subsumes previously defined VPZNN models for computing generalized inverses and also generates a number of matrix expressions involving these inverses. Global and super-exponential convergence of the proposed model, as well as the behaviour of its equilibrium state, are investigated.
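
To make the evolution design concrete, the following Python sketch illustrates the generic varying-parameter Zhang design $\dot E(t) = -\gamma(t)E(t)$ on the simplest special case, the error function $E(X)=AX-I$ for the ordinary inverse of a nonsingular matrix $A$. It is not the paper's $CVPZNN(A,F,G)$ model: the test matrix, the gain $\gamma(t)$ and the solver settings are illustrative assumptions only.

import numpy as np
from scipy.integrate import solve_ivp

# Illustrative data: a small nonsingular matrix (not taken from the paper).
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
n = A.shape[0]
I = np.eye(n)

def gamma(t):
    # Hypothetical increasing gain; the paper's own gain schedules are not reproduced here.
    return 1.0 + np.exp(t)

def rhs(t, x):
    X = x.reshape(n, n)
    E = A @ X - I                              # error function E(X) = A X - I
    dX = -gamma(t) * np.linalg.solve(A, E)     # enforces dE/dt = -gamma(t) * E(t)
    return dX.ravel()

sol = solve_ivp(rhs, (0.0, 3.0), np.zeros(n * n), rtol=1e-8, atol=1e-10)
X = sol.y[:, -1].reshape(n, n)
print("final residual ||A X - I|| =", np.linalg.norm(A @ X - I))

With an increasing gain $\gamma(t)$, the error norm decays as $\exp(-\int_0^t \gamma(\tau)\,d\tau)$, which is the mechanism behind the super-exponential convergence mentioned above.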


Why is it important?

(1) A complex-valued varying-parameter $ZNN(A,F,G)$ model, termed $CVPZNN(A,F,G)$, is defined. The model is based on a very general error function, able to generate outer generalized inverses, combined with the CVP-CDNN neural design.
(2) Guided by numerical experience, some new choices of the time-varying scaling parameter are proposed and investigated.
(3) Global and super-exponential convergence of the proposed model is verified (see the note after this list), and the behaviour of its equilibrium state is investigated.
(4) The most important particular cases of the defined model are identified and numerically tested in order to demonstrate the generality of the proposed design.
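
To indicate where the super-exponential rate comes from (a standard consequence of the design formula, not a restatement of the paper's proofs): with a linear activation, $\dot E(t) = -\gamma(t)E(t)$ integrates to $\|E(t)\| = \|E(0)\|\,\exp\!\big(-\int_0^t \gamma(\tau)\,d\tau\big)$. For instance, a hypothetical increasing gain $\gamma(t)=\gamma_0 e^{t}$ gives $\|E(t)\| = \|E(0)\|\,\exp\!\big(-\gamma_0(e^{t}-1)\big)$, which decays faster than the fixed-parameter rate $e^{-\gamma_0 t}$; the particular varying parameters studied in the paper are not restated here.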

Perspectives

One possibility for further research is the definition of a varying-parameter, integration-enhanced and noise-tolerant $IEZNN(A,F,G)$ model, which could be termed $CVPIENTZNN(A,F,G)$. Such an approach would combine two dominant directions in ZNN research: the noise-tolerant ZNN design and the varying-parameter neural design. Another possibility is to apply a similar approach to solving systems of linear equations.
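
For orientation only: in the IEZNN literature the fixed-parameter integration-enhanced (noise-tolerant) design is usually written as $\dot E(t) = -\gamma E(t) - \lambda \int_0^t E(\tau)\,d\tau$, and a $CVPIENTZNN(A,F,G)$ variant would presumably replace the constant gains $\gamma$ and $\lambda$ with time-varying ones. This formula is quoted from the general ZNN literature, not from the paper itself.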

Predrag S. Stanimirovic, Ph.D.
University of Nis, Faculty of Sciences and Mathematics

Read the Original

This page is a summary of: Varying-parameter Zhang neural network for approximating some expressions involving outer inverses, Optimization Methods and Software, March 2019, Taylor & Francis,
DOI: 10.1080/10556788.2019.1594806.
You can read the full text via the DOI above.

