Metamorphosis Networks: An Alternative to Constructive Methods

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

1 Scopus citation

Abstract

Given a set of training examples, determining the appropriate number of free parameters is a challenging problem. Constructive learning algorithms attempt to solve this problem automatically by adding hidden units, and therefore free parameters, during learning. We explore an alternative class of algorithms, called metamorphosis algorithms, in which the number of units is fixed but the number of free parameters gradually increases during learning. The architecture we investigate is composed of RBF (radial basis function) units on a lattice, which imposes flexible constraints on the parameters of the network. Virtues of this approach include variable subset selection, robust parameter selection, multiresolution processing, and interpolation of sparse training data.
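The abstract's core idea, a fixed lattice of RBF units whose number of free parameters grows during learning, can be sketched in code. The following is a minimal illustration under assumed details, not the paper's actual algorithm: the block-tying of output weights, the halving schedule, and the least-squares fit are all simplifications chosen to show how releasing parameter ties on a fixed lattice increases capacity without adding units.

```python
import numpy as np

def rbf_features(x, centers, width):
    # Gaussian RBF activations of inputs x against a fixed lattice of centers
    return np.exp(-((x[:, None] - centers[None, :]) ** 2) / (2 * width ** 2))

def fit_tied_weights(x, y, centers, width, group_size):
    """Least-squares fit in which output weights are tied across blocks of
    `group_size` consecutive lattice units, so the number of free parameters
    is len(centers) // group_size rather than len(centers)."""
    phi = rbf_features(x, centers, width)
    n_units = len(centers)
    n_groups = n_units // group_size
    # Tying matrix T expands n_groups free parameters into n_units unit weights
    T = np.kron(np.eye(n_groups), np.ones((group_size, 1)))
    A = phi @ T  # design matrix over the free (group-level) parameters
    free, *_ = np.linalg.lstsq(A, y, rcond=None)
    return T @ free  # per-unit weights, constant within each tied block

# Toy regression: 16 RBF units on a fixed lattice; a metamorphosis-style
# schedule halves the tying block size, growing the free-parameter count
# (2 -> 4 -> 8 -> 16) while the number of units never changes.
x = np.linspace(0, 1, 200)
y = np.sin(2 * np.pi * x)
centers = np.linspace(0, 1, 16)
width = 0.1

errors = []
for group_size in (8, 4, 2, 1):
    w = fit_tied_weights(x, y, centers, width, group_size)
    pred = rbf_features(x, centers, width) @ w
    errors.append(np.mean((y - pred) ** 2))

print(errors)
```

Because each coarser tying is a subspace of the next finer one, the training error is non-increasing as ties are released, which mirrors the gradual growth in capacity the abstract describes.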

Original language: English
Title of host publication: Advances in Neural Information Processing Systems 5, NIPS 1992
Editors: Stephen Jose Hanson, Jack D. Cowan, C. Lee Giles
Publisher: Neural Information Processing Systems Foundation
Pages: 131-138
Number of pages: 8
ISBN (Electronic): 1558602747, 9781558602748
State: Published - 1992
Externally published: Yes
Event: 5th Advances in Neural Information Processing Systems, NIPS 1992 - Denver, United States
Duration: Nov 30, 1992 - Dec 3, 1992

Publication series

Name: Advances in Neural Information Processing Systems
Volume: 5
ISSN (Print): 1049-5258

Conference

Conference: 5th Advances in Neural Information Processing Systems, NIPS 1992
Country/Territory: United States
City: Denver
Period: 11/30/92 - 12/3/92
