Cite this article as:

Slepovichev I. I. Algebraic Properties of Abstract Neural Network. Izv. Saratov Univ. (N. S.), Ser. Math. Mech. Inform., 2016, vol. 16, iss. 1, pp. 96-103. DOI:

UDC 519.68:007.5; 512.5

Algebraic Properties of Abstract Neural Network


The current state of neuroinformatics allows artificial neural networks to be applied to a wide variety of practical problems. However, many neural network methods used in practice lack a rigorous formal mathematical justification and remain heuristic algorithms, which imposes certain restrictions on the development of neural network problem-solving methods. At the same time, there is a broad class of mathematical models that are well studied within disciplines such as the theory of abstract algebras, graph theory, and automata theory. The ability to apply results obtained in these disciplines to neural network models can substantially aid the study of artificial neural networks, their properties, and their functionality. This paper gives formulations and definitions of neural network models from the standpoint of universal algebra and graph theory, and states the main theorems of universal algebra in neural network terms. It also proposes a formal description of a neural network by graph schemes, which makes the results of graph theory available for the analysis of neural network structures.
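To illustrate the graph-scheme viewpoint described above, the following is a minimal sketch (not taken from the article) of how an abstract neural network can be encoded as a weighted directed acyclic graph and evaluated in topological order; the function name `evaluate` and the edge-dictionary encoding are illustrative assumptions.

```python
# Illustrative sketch: an abstract neural network viewed as a weighted
# directed acyclic graph, evaluated node by node in topological order
# (Kahn's algorithm). Not the article's formalism, only an example of
# treating a network as a graph structure.
from collections import defaultdict

def evaluate(edges, inputs, activation=lambda s: s):
    """edges: dict mapping (src, dst) -> weight.
    inputs: dict mapping input node -> value.
    Returns a dict with the computed value of every node."""
    succ = defaultdict(list)   # adjacency: node -> list of (successor, weight)
    indeg = defaultdict(int)   # in-degree of each node, for topological sort
    nodes = set()
    for (u, v), w in edges.items():
        succ[u].append((v, w))
        indeg[v] += 1
        nodes.update((u, v))

    values = dict(inputs)      # known values (input nodes)
    acc = defaultdict(float)   # weighted sums accumulated at each node
    queue = [n for n in nodes if indeg[n] == 0]
    while queue:
        u = queue.pop()
        if u not in values:    # hidden/output node: apply the activation
            values[u] = activation(acc[u])
        for v, w in succ[u]:
            acc[v] += w * values[u]
            indeg[v] -= 1
            if indeg[v] == 0:  # all predecessors of v are now evaluated
                queue.append(v)
    return values

# Tiny example: two input nodes feeding one output neuron, identity activation.
net = {("x1", "y"): 0.5, ("x2", "y"): -1.0}
out = evaluate(net, {"x1": 2.0, "x2": 1.0})
# out["y"] == 0.5*2.0 + (-1.0)*1.0 == 0.0
```

Because the network is represented purely as a graph, standard graph-theoretic notions (reachability, acyclicity, connectivity) apply to it directly, which is the point of the graph-scheme description proposed in the article.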


