
Structural complexity and neural networks / A. Bertoni, B. Palano (Lecture Notes in Computer Science). - In: Neural Nets / edited by M. Marinaro, R. Tagliaferri. - Berlin : Springer, 2002. - ISBN 9783540442653. - pp. 190-216. Paper presented at the 13th Italian Workshop on Neural Nets, held in Vietri sul Mare, 2002 [10.1007/3-540-45808-5_21].

Structural complexity and neural networks

A. Bertoni (first author); B. Palano (last author)
2002

Abstract

We survey some relationships between computational complexity and neural network theory. Here, only networks of binary threshold neurons are considered. We begin by presenting some contributions of neural networks to structural complexity theory. In parallel complexity, the class TC0_k of problems solvable by feed-forward networks with k levels and a polynomial number of neurons is considered. Separation results are recalled, and the relation between TC0 = ∪_k TC0_k and NC1 is analyzed. In particular, under the conjecture TC0 ≠ NC1, we characterize the class of regular languages accepted by feed-forward networks with a constant number of levels and a polynomial number of neurons. We also discuss the use of complexity theory to study computational aspects of learning and combinatorial optimization in the context of neural networks. We consider the PAC model of learning, emphasizing some negative results based on complexity-theoretic assumptions. Finally, we discuss some results in the realm of neural networks related to a probabilistic characterization of NP.
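As an illustrative sketch (not taken from the chapter), the binary threshold neuron underlying the classes TC0_k can be made concrete: a gate that outputs 1 exactly when a weighted sum of its Boolean inputs reaches a threshold. The function names below are hypothetical, chosen only for this example; the classic MAJORITY function shows a problem solvable by a single such gate, i.e. by a network of depth 1.

```python
# Sketch of a binary threshold neuron, the basic unit of the
# feed-forward networks defining TC0_k (illustrative, not the authors' code).

def threshold_gate(weights, threshold, inputs):
    """Binary threshold neuron: output 1 iff sum_i w_i * x_i >= threshold."""
    return int(sum(w * x for w, x in zip(weights, inputs)) >= threshold)

def majority(bits):
    """MAJORITY of n bits via one threshold gate (a depth-1 network):
    all weights equal to 1, threshold ceil(n/2)."""
    n = len(bits)
    return threshold_gate([1] * n, (n + 1) // 2, bits)

print(majority([1, 0, 1, 1, 0]))  # 1: three of the five inputs are 1
print(majority([1, 0, 0, 1, 0]))  # 0: only two of the five inputs are 1
```

Depth-k networks of polynomially many such gates give the classes TC0_k surveyed in the chapter; MAJORITY is a standard example separating threshold gates from AND/OR gates of the same depth.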
structural complexity; neural networks; finite state automata; learning; combinatorial optimization
Field INF/01 - Computer Science
2002
Book Part (author)
Files in this product:
File: pubblicato.pdf (restricted access)
Type: Publisher's version/PDF
Size: 342.22 kB
Format: Adobe PDF
Documents in IRIS are protected by copyright, and all rights are reserved unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/2434/175815
Citations
  • PMC: ND
  • Scopus: 7
  • Web of Science: 5