Prestructuring Neural Networks for Pattern Recognition Using Extended Dependency Analysis
Systems Science Faculty Publications and Presentations
  • George G. Lendaris, Portland State University
  • Thaddeus T. Shannon, Portland State University
  • Martin Zwick, Portland State University
Document Type
Presentation
Publication Date
1-1-1999
Subjects
  • Neural networks -- Structure
  • Pattern recognition
  • Fourier transformations
  • Information theory
Abstract
We consider the problem of matching domain-specific statistical structure to neural-network (NN) architecture. In past work we have considered this problem in the function approximation context; here we consider the pattern classification context. General Systems Methodology tools for finding problem-domain structure suffer exponential scaling of computation with respect to the number of variables considered. Therefore we introduce the use of Extended Dependency Analysis (EDA), which scales only polynomially in the number of variables, for the desired analysis. Based on EDA, we demonstrate a number of NN pre-structuring techniques applicable for building neural classifiers. An example is provided in which EDA results in significant dimension reduction of the input space, as well as capability for direct design of an NN classifier.
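The abstract does not give the details of Extended Dependency Analysis, which is a systems-theoretic reconstructability technique; as a toy illustration of the general idea of dimension reduction via dependency analysis, the sketch below scores each discrete input variable by its mutual information with the class label and keeps only the informative ones as NN inputs. All function names and the threshold are hypothetical, not from the paper.

```python
import numpy as np

def mutual_information(x, y):
    """Mutual information (in bits) between two discrete variables."""
    mi = 0.0
    for xv in np.unique(x):
        for yv in np.unique(y):
            pxy = np.mean((x == xv) & (y == yv))  # joint probability estimate
            if pxy > 0:
                px = np.mean(x == xv)
                py = np.mean(y == yv)
                mi += pxy * np.log2(pxy / (px * py))
    return mi

def select_inputs(X, labels, threshold=0.1):
    """Keep input variables whose MI with the class label exceeds threshold.

    The surviving indices would define the (reduced) input layer of the
    classifier; the threshold is an illustrative choice, not from EDA.
    """
    return [j for j in range(X.shape[1])
            if mutual_information(X[:, j], labels) > threshold]

# Toy data: variable 0 determines the class, variable 1 is independent noise.
rng = np.random.default_rng(0)
x0 = rng.integers(0, 2, 1000)
x1 = rng.integers(0, 2, 1000)
X = np.column_stack([x0, x1])
labels = x0
print(select_inputs(X, labels))  # only variable 0 is selected: [0]
```

EDA proper goes further than this pairwise screen, identifying multivariate dependency structures among subsets of variables, which is what permits direct design of the classifier's connectivity rather than just pruning its inputs.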
Description

Invited paper presented at Applications and Science of Computational Intelligence II, AeroSense '99, Orlando, FL.

Persistent Identifier
http://archives.pdx.edu/ds/psu/16510
Citation Information
Lendaris, G., Shannon, T., and Zwick, M. (1999). "Prestructuring Neural Networks for Pattern Recognition Using Extended Dependency Analysis." Invited paper, Applications and Science of Computational Intelligence II, AeroSense '99, Orlando, FL: SPIE.