NEURAL NETWORK RESEARCH SEMINAR

                               Paul Kainen
                        Department of Mathematics
                  email: kainenp@gusun.georgetown.edu
                                ICOS 324
 

    1. What is a neural network?

    * The basic concept and some of its history
    * Behavior of the elementary units (a code sketch follows this list)
    * Feedforward vs. fully interconnected topologies
    * Connectionism and computation
    * Functional approximation and representation
    * Finding the weights
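
    As a first illustration of the elementary units above, the following Python sketch computes a single sigmoidal neuron and a tiny two-layer feedforward pass. All weights and the architecture are arbitrary choices made for illustration, not material from the seminar notes.

        import math

        def sigmoid(t):
            # Standard logistic activation: squashes the reals into (0, 1).
            return 1.0 / (1.0 + math.exp(-t))

        def unit(weights, bias, inputs):
            # An elementary unit: sigmoid of an affine function of the inputs.
            return sigmoid(sum(w * x for w, x in zip(weights, inputs)) + bias)

        # A tiny feedforward network: two hidden units feeding one output unit.
        # All weights below are arbitrary, chosen only for illustration.
        x = [0.5, -1.0]
        h1 = unit([1.0, 2.0], 0.1, x)
        h2 = unit([-1.5, 0.5], -0.2, x)
        y = unit([0.7, -0.3], 0.0, [h1, h2])
        print(y)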

    2. Analysis and neural networks.

    * Polynomial approximation (illustrated in the sketch after this list)
    * Weierstrass and Chebyshev contributions
    * Normed linear spaces
    * Dense subsets as universal approximation and representation
    * Finding the weights
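
    A small illustration of polynomial approximation in the Weierstrass-Chebyshev spirit: the sketch below fits a degree-5 Chebyshev expansion to exp(x) on [-1, 1] and estimates the uniform error on a grid. The target function and the degree are arbitrary illustrative assumptions.

        import numpy as np
        from numpy.polynomial import chebyshev as C

        # Sample the target function on [-1, 1].
        x = np.linspace(-1.0, 1.0, 200)
        y = np.exp(x)

        # Least-squares fit in the Chebyshev basis, degree 5.
        coeffs = C.chebfit(x, y, deg=5)

        # Estimate the uniform (sup-norm) error on the sample grid.
        err = np.max(np.abs(C.chebval(x, coeffs) - y))
        print("max |exp(x) - p(x)| on grid:", err)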

    3. Geometry and neural networks.

    * Affine geometry and uniqueness of parameterization
    * Geometry of balls, cubes and hyperoctahedra - implications for approximation (see the sketch after this list)
    * Geometry of the unit ball and best approximation
    * Convexity
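
    One concrete fact behind the geometry items above: the unit ball fills a vanishing fraction of its circumscribed cube as the dimension grows, since vol(B^n) / vol([-1,1]^n) = pi^(n/2) / (Gamma(n/2 + 1) * 2^n). The sketch below tabulates this ratio; the chosen dimensions are arbitrary.

        import math

        def ball_to_cube_ratio(n):
            # Volume of the unit n-ball: pi^(n/2) / Gamma(n/2 + 1).
            ball = math.pi ** (n / 2) / math.gamma(n / 2 + 1)
            # Volume of the circumscribed cube [-1, 1]^n.
            cube = 2.0 ** n
            return ball / cube

        for n in (1, 2, 5, 10, 20):
            print(n, ball_to_cube_ratio(n))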

    4. Statistics and neural networks.

    * Probabilistic neurons (a sketch follows this list)
    * Adding noise to the weights
    * Adding noise to the activation function
    * Pattern recognition
    * Quasi-orthogonality
    * Signal Processing
    * Quantum computing
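
    A hedged sketch of one reading of the first two items above: the unit below fires with probability given by a sigmoid of its weighted input, and Gaussian noise can optionally be added to the weights. The weights and noise level are illustrative assumptions, not the seminar's definitions.

        import math
        import random

        def sigmoid(t):
            return 1.0 / (1.0 + math.exp(-t))

        def stochastic_unit(weights, bias, inputs, weight_noise=0.0):
            # Optionally perturb each weight with Gaussian noise.
            noisy = [w + random.gauss(0.0, weight_noise) for w in weights]
            p = sigmoid(sum(w * x for w, x in zip(noisy, inputs)) + bias)
            # Fire with probability p (a Bernoulli draw).
            return 1 if random.random() < p else 0

        # Empirical firing rate under weight noise (all values illustrative).
        x = [1.0, -0.5]
        trials = 10000
        rate = sum(stochastic_unit([2.0, 1.0], -0.3, x, weight_noise=0.5)
                   for _ in range(trials)) / trials
        print("firing rate:", rate)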

    5. Graph theory and neural networks.

    * Implementation constraints and bounded vertex degree
    * Graph invariants relevant to neural network design
    * Random graphs and their properties (a sketch follows this list)
    * Topology of graphs and implications for implementation
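
    A small sketch tying together random graphs and bounded vertex degree: it generates an Erdős–Rényi graph G(n, p), in which each possible edge appears independently with probability p, and reports the maximum degree, the quantity constrained in hardware implementations. The parameters n and p are arbitrary illustrative choices.

        import random

        def random_graph_degrees(n, p):
            # Erdős–Rényi G(n, p): each of the C(n, 2) edges appears
            # independently with probability p; track vertex degrees.
            degree = [0] * n
            for i in range(n):
                for j in range(i + 1, n):
                    if random.random() < p:
                        degree[i] += 1
                        degree[j] += 1
            return degree

        deg = random_graph_degrees(n=100, p=0.05)
        print("max degree:", max(deg))
        print("mean degree:", sum(deg) / len(deg))  # expected about p*(n-1) = 4.95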
 

There will be no text for the course, which will be based on the proposer's notes, supplemented with photocopied portions of various books and papers.  This is intended to be a research seminar for undergraduates. Students are assumed to have "mathematical maturity", but there are no other prerequisites.

Neural networks have arisen independently in computer science, cognitive science and mathematics.  Much of the recent work in quantum computing can be rephrased in terms of neural networks and amounts to yet another independent rediscovery, this time by physicists.  Of course, neuroscience, philosophy and psychology have also studied the concept.

Neural networks have a pedigree in mathematics that includes Hilbert, Arnold and Kolmogorov, and they turn out to involve and connect quite a few rather disparate areas - notably analysis, geometry, graph theory and statistics.  Further, neural networks provide a new perspective on nonlinear approximation and will likely have applications to nonlinear optimization.

Therefore, in addition to the rising practical interest in this topic, neural networks are an appropriate subject for theoretical mathematics.  Conversely, those from other disciplines who wish to investigate neural networks will be well served by having knowledge of the foundations. Finally, it is to be hoped that the "glamour" of the subject will motivate a new group of people to study math.

For further information, please contact Paul Kainen, Department of Mathematics, Georgetown University.  Phone: 202-687-2703