Newsgroups: comp.ai.neural-nets
From: jimmy@ecowar.demon.co.uk (Jimmy Shadbolt)
Path: cantaloupe.srv.cs.cmu.edu!das-news2.harvard.edu!news2.near.net!news.mathworks.com!europa.eng.gtefsd.com!howland.reston.ans.net!news.sprintlink.net!demon!ecowar.demon.co.uk!jimmy
Subject: Re: Degrees of freedom in a net 
Distribution: world
References: <Cz294r.LJt@cs.dal.ca>
Organization: Econostat
Reply-To: jimmy@ecowar.demon.co.uk
X-Newsreader: Simple NEWS 1.90 (ka9q DIS 1.21)
Lines: 24
Date: Wed, 23 Nov 1994 10:40:34 +0000
Message-ID: <785587234snz@ecowar.demon.co.uk>
Sender: usenet@demon.co.uk

In article <Cz294r.LJt@cs.dal.ca> ab340@cfn.cs.dal.ca writes:

> 
>A problem is that each weight is really not an independent parameter.
>Is it feasibly possible, then, to determine f?  Also, once the net 
>is trained, it may be discovered that some of the connection weights 
>are very close to zero.  Theoretically, these connections could be 
>removed with little effect on the output value produced by the trained 
>net.  Thus, it is possible to design a new network topology with fewer 
>degrees of freedom that produces output values comparable to the 
>original net.
> 
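The pruning step described in the quoted paragraph can be sketched in a few
lines (a minimal sketch; the weight matrix and the cutoff threshold are
purely illustrative, not from any particular trained net):

```python
import numpy as np

# Hypothetical trained weight matrix (values are illustrative)
W = np.array([[ 0.8,   -0.001,  1.2   ],
              [ 0.002, -0.5,    0.0005]])

threshold = 0.01  # assumed cutoff for "very close to zero"

# Prune: zero out small-magnitude weights, i.e. remove those connections
mask = np.abs(W) >= threshold
W_pruned = W * mask

# Effective degrees of freedom = number of surviving connections
print(int(mask.sum()))  # 3 of the original 6 weights survive
```

The surviving connections define the smaller topology with fewer degrees of
freedom that the poster describes.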
... Isn't this really a functional analysis problem? The network
*approximates* functions (in a Riesz basis), so the question comes down to
the properties/complexity of that basis. Smoothness issues are also addressed
in inverse problem theory. Check out Girosi and Poggio's regularisation
(RBF-like) network model.

        It would be nice to have an equivalent of vector quantisation theory
in continuous domains, plus some VC/PAC learning-theoretic results ...

        Cheers

        Drago

