Newsgroups: comp.ai.neural-nets
Path: cantaloupe.srv.cs.cmu.edu!das-news2.harvard.edu!news.dfci.harvard.edu!camelot.ccs.neu.edu!nntp.neu.edu!news.eecs.umich.edu!news.radio.cz!newsbastard.radio.cz!news.radio.cz!CESspool!hammer.uoregon.edu!arclight.uoregon.edu!news.mathworks.com!newsgate.duke.edu!interpath!news.interpath.net!news.interpath.net!sas!newshost.unx.sas.com!saswss
From: saswss@hotellng.unx.sas.com (Warren Sarle)
Subject: Re: kick-out
Originator: saswss@hotellng.unx.sas.com
Sender: news@unx.sas.com (Noter of Newsworthy Events)
Message-ID: <E6HHxI.2IB@unx.sas.com>
Date: Mon, 3 Mar 1997 20:22:30 GMT
X-Nntp-Posting-Host: hotellng.unx.sas.com
References:  <5e7jbb$asq@abel.ic.sunysb.edu>
Organization: SAS Institute Inc.
Lines: 22


In article <5e7jbb$asq@abel.ic.sunysb.edu>, dmyers@terra (David S. Myers) writes:
|> I have read an article that describes a procedure to minimize weight 
|> oscillation, and claims 10-20% improvement in training times.
|> They call it the "kick-out" algorithm, which entails finding weights
|> that are oscillating and damping that.
|> Has anyone else used this, or a similar procedure?
|> 
|> ref: Ochiai, et al. IEEE ICNN 1995, p1182

I have not read that particular article, but the problem of oscillating
weights (characteristic of steepest-descent type algorithms applied to
ill-conditioned problems) has been solved in a variety of ways. See
references on RPROP, conjugate gradients, etc. in the Neural Network
FAQ, part 2 of 7: Learning, at ftp://ftp.sas.com/pub/neural/FAQ2.html
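For those who haven't seen RPROP: here is a rough sketch of the idea (this is
*not* the Ochiai et al. kick-out method; the constants are the usual defaults
from Riedmiller & Braun, and the sign-flip handling follows one common
variant). Each weight keeps its own step size, which grows while the gradient
sign is stable and shrinks when it flips -- the sign flip being exactly the
oscillation symptom.

```python
import numpy as np

# Sketch of RPROP: per-weight step sizes delta, grown while the gradient
# sign is stable, shrunk when it flips. Only the SIGN of the gradient is
# used, so ill-conditioning (wildly different curvatures) doesn't matter.
def rprop(grad, w, n_iter=200, delta0=0.1,
          eta_plus=1.2, eta_minus=0.5, delta_min=1e-6, delta_max=50.0):
    delta = np.full_like(w, delta0)     # per-weight step sizes
    g_prev = np.zeros_like(w)
    for _ in range(n_iter):
        g = grad(w)
        same = g * g_prev               # >0: sign stable, <0: sign flipped
        delta = np.where(same > 0, np.minimum(delta * eta_plus, delta_max), delta)
        delta = np.where(same < 0, np.maximum(delta * eta_minus, delta_min), delta)
        g = np.where(same < 0, 0.0, g)  # after a flip, skip one update
        w = w - np.sign(g) * delta
        g_prev = g
    return w

# Ill-conditioned quadratic f(w) = 0.5*(w0^2 + 100*w1^2): plain steepest
# descent with a fixed learning rate oscillates along w1, but RPROP
# ignores the factor-of-100 curvature mismatch and both weights converge.
grad = lambda w: np.array([w[0], 100.0 * w[1]])
w = rprop(grad, np.array([5.0, 5.0]))
```

The FAQ references above cover the real algorithms and their variants in detail.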
-- 

Warren S. Sarle       SAS Institute Inc.   The opinions expressed here
saswss@unx.sas.com    SAS Campus Drive     are mine and not necessarily
(919) 677-8000        Cary, NC 27513, USA  those of SAS Institute.
 *** Do not send me unsolicited commercial or political email! ***