IEEE TRANSACTIONS ON NEURAL NETWORKS, vol. 7, no. 4, pp. 889-896, 1996 (SCI-Expanded)
In this paper a new multilayer perceptron (MLP) structure is introduced to simulate nonlinear transformations on infinite-dimensional function spaces. This extension is achieved by replacing discrete neurons with a continuum of neurons, summations with integrations, and weight matrices with kernels of integral transforms. Variational techniques are employed for the analysis and training of the infinite-dimensional MLP (IDMLP). The training problem of the IDMLP is solved by the Lagrange multiplier technique, yielding coupled state and adjoint-state integro-difference equations. A steepest descent-like algorithm is used to construct the required kernel and threshold functions. Finally, results are presented to demonstrate the performance of the new IDMLP.
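To make the construction concrete, the sketch below discretizes one IDMLP layer of the assumed continuous form y(t) = σ(∫ K(t, s) x(s) ds + θ(t)) on a uniform grid, so the integral transform becomes a Riemann sum and the kernel K becomes a matrix. The grid size, Gaussian kernel initialization, tanh activation, squared-error loss, and the simple gradient update are all illustrative assumptions standing in for the paper's adjoint-based variational scheme, not its exact construction.

```python
import numpy as np

def idmlp_layer(x, K, theta, ds):
    """One discretized IDMLP layer: the integral transform
    y(t) = sigma( integral K(t, s) x(s) ds + theta(t) )
    approximated by a Riemann sum on a uniform grid."""
    pre = K @ x * ds + theta      # integration -> weighted summation
    return np.tanh(pre)           # sampled sigmoidal neuron continuum

def steepest_descent_step(x, K, theta, ds, y_target, lr=0.5):
    """Steepest-descent-like update of the kernel and threshold for a
    squared-error loss; a stand-in for the paper's adjoint-state scheme."""
    y = idmlp_layer(x, K, theta, ds)
    delta = (y - y_target) * (1.0 - y ** 2)   # dL/d(pre-activation) for tanh
    K = K - lr * np.outer(delta, x) * ds      # discretized gradient w.r.t. K
    theta = theta - lr * delta                # gradient w.r.t. theta
    return K, theta

# Example: learn to map one sampled function to another on [0, 1].
n = 64
s = np.linspace(0.0, 1.0, n)
ds = s[1] - s[0]
x = np.sin(np.pi * s)                                  # input function
K = np.exp(-50.0 * (s[:, None] - s[None, :]) ** 2)     # smooth initial kernel
theta = np.zeros(n)
y_target = 0.5 * np.cos(np.pi * s)                     # target function
for _ in range(200):
    K, theta = steepest_descent_step(x, K, theta, ds, y_target)
```

With a single input–output function pair the threshold alone can fit the target pointwise, so the iteration converges quickly; the point of the sketch is only how summation, weights, and thresholds carry over to their grid-sampled functional counterparts.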