Newsgroups: comp.ai.neural-nets
Path: cantaloupe.srv.cs.cmu.edu!rochester!udel!gatech!howland.reston.ans.net!news.sprintlink.net!redstone.interpath.net!sas!mozart.unx.sas.com!saswss
From: saswss@hotellng.unx.sas.com (Warren Sarle)
Subject: Re: What do you mean by "epoch"? (abused term ?)
Originator: saswss@hotellng.unx.sas.com
Sender: news@unx.sas.com (Noter of Newsworthy Events)
Message-ID: <D4voxI.7DG@unx.sas.com>
Date: Fri, 3 Mar 1995 19:03:18 GMT
X-Nntp-Posting-Host: hotellng.unx.sas.com
References: <3ig1hh$v1@uuneo.neosoft.com> <3ig6ak$9ut@cantaloupe.srv.cs.cmu.edu> <D4FHw9.4qG@unx.sas.com> <rrg.1211.000AB1DF@aber.ac.uk>
Organization: SAS Institute Inc.
Lines: 44


In article <rrg.1211.000AB1DF@aber.ac.uk>, rrg@aber.ac.uk (Roy Goodacre) writes:
|> ...
|> I agree with Scott:
|>
|> One complete calculation in the network is called an epoch.  This is
|> equivalent to one complete pass through all the training data, calculating for
|> each member of the training set.  ...

The source of my definition of "epoch", which I finally tracked down, is
Masters, T. (1993), _Practical Neural Network Recipes in C++_, p. 95
(brackets and ellipses are in the original):

   One pass through this [present a subset of the training set ...
   measure error .. update weights] cycle is called an _epoch_. The
   size of the subset (number of training samples used per weight
   update) is called the _epoch size_.

If "epoch" and "epoch size" are not the appropriate terms for these
concepts, then some other terms are needed, so I will know what to
call the option in my neural net software that specifies the "epoch
size".
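To make Masters's definition concrete, here is a minimal sketch (mine, not from his book) of the cycle he describes: present a subset of the training set, measure error, update the weights. All names here (`epoch_size`, `w`, `lr`) are illustrative, not anyone's actual software option names.

```python
import random

random.seed(0)
# Toy training set: learn y = 2*x with a single weight.
data = [(x, 2.0 * x) for x in range(1, 9)]

w = 0.0          # the single weight
lr = 0.01        # learning rate
epoch_size = 4   # samples used per weight update -- Masters's "epoch size"

for epoch in range(50):
    subset = random.sample(data, epoch_size)   # present a subset
    # Measure error: gradient of mean squared error over the subset
    grad = sum(2 * (w * x - y) * x for x, y in subset) / epoch_size
    w -= lr * grad                             # update weights once per epoch

print(round(w, 2))   # w converges toward 2.0
```

Under this definition, one "epoch" is one pass through the subset-error-update cycle, so with epoch_size less than the training set size, an epoch is *not* a full pass through the data.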

|> >We statisticians do have some strange terminology (e.g., regression,
|> >normal deviations) but at least _we_ usually know what _we_ mean.  My
|>                                       ^^^^^^^
|> I am glad you inserted this word Warren   ;-)

Statisticians are trained to avoid stating things with certainty in
the presence of uncertainty. :-)

|> [Warren's *personal* biased (oh no what do I mean by bias) view of nets
|> field chomped]

But I forgot the best example of confused neural net terminology!
Is an MLP with 1 hidden layer called a 2-layer net or a 3-layer net?
There was a thread on this a year or two ago and the consensus was that
there was no consensus.
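The ambiguity is easy to see if you write the architecture down. A hypothetical sketch: the same network yields 3 if you count layers of units, but 2 if you count layers of weights (connections).

```python
# An MLP with one hidden layer: 4 inputs, 3 hidden units, 1 output.
units_per_layer = [4, 3, 1]                 # input, hidden, output
unit_layers = len(units_per_layer)          # 3 -- counting layers of units
weight_layers = len(units_per_layer) - 1    # 2 -- counting layers of weights

print(unit_layers, weight_layers)   # 3 2
```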

-- 

Warren S. Sarle       SAS Institute Inc.   The opinions expressed here
saswss@unx.sas.com    SAS Campus Drive     are mine and not necessarily
(919) 677-8000        Cary, NC 27513, USA  those of SAS Institute.
