Newsgroups: comp.ai.neural-nets
Path: cantaloupe.srv.cs.cmu.edu!rochester!udel!news.mathworks.com!europa.eng.gtefsd.com!howland.reston.ans.net!news.sprintlink.net!redstone.interpath.net!sas!mozart.unx.sas.com!saswss
From: saswss@hotellng.unx.sas.com (Warren Sarle)
Subject: Re: What do you mean by "epoch"? (abused term ?)
Originator: saswss@hotellng.unx.sas.com
Sender: news@unx.sas.com (Noter of Newsworthy Events)
Message-ID: <D4FHw9.4qG@unx.sas.com>
Date: Thu, 23 Feb 1995 01:09:45 GMT
X-Nntp-Posting-Host: hotellng.unx.sas.com
References: <3ig1hh$v1@uuneo.neosoft.com> <3ig6ak$9ut@cantaloupe.srv.cs.cmu.edu>
Organization: SAS Institute Inc.
Lines: 48


In article <3ig6ak$9ut@cantaloupe.srv.cs.cmu.edu>, sef@CS.CMU.EDU (Scott Fahlman) writes:
|>
|> In article <3ig1hh$v1@uuneo.neosoft.com> hav@neosoft.com writes:
|>
|>    I was just wondering what folks mean by the term "epoch" for it seems
|>    to be somewhat abused.
|>
|>    As I understand it, the term epoch refers to a collection of measurements
|>    all made from the same origin - so when I say epoch, in relation to NN training,
|>    I mean the number of patterns processed between applications of weight updates
|>    (errors are summed over the epoch - all are measured from the same origin on
|>    the surface being searched).
|>
|> I believe the term "epoch" is more commonly used to mean a single pass
|> through a fixed training set that is going to be used repeatedly.  You
|> may update the weights only at the end of each epoch or after each
|> training case goes by -- it's still referred to as an epoch.

A single pass through a fixed training set is what I would call an
"iteration". I have been using "epoch" as Horace defined it, in
distinction to "iteration". If "epoch" is not an appropriate term for
the number of patterns processed per weight update, what _do_ you call
that?
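To make the distinction concrete, here is a sketch (my own illustration, not
from either poster; the names `train` and `update_every` are hypothetical) in
which one full pass over the training set is an "epoch" in Fahlman's sense,
while `update_every` is the quantity Horace described - the number of patterns
processed per weight update:

```python
def train(patterns, n_epochs, update_every):
    """Count weight updates under different update schedules.

    update_every == len(patterns): batch updating, one update per pass.
    update_every == 1:             per-pattern (online) updating.
    Either way, each pass over `patterns` is one epoch in Fahlman's sense.
    """
    updates = 0
    accumulated = 0.0
    for epoch in range(n_epochs):
        for i, p in enumerate(patterns, start=1):
            accumulated += p          # stand-in for a per-pattern error gradient
            if i % update_every == 0:
                updates += 1          # apply the accumulated weight change here
                accumulated = 0.0
    return updates
```

With 100 patterns and 10 passes, `train(range(100), 10, update_every=100)`
performs 10 updates, while `update_every=1` performs 1000 - yet both runs
comprise 10 "epochs" under Fahlman's usage, which is where the confusion lies.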

David DeMers wrote recently:
|> Most fields have their own jargon and arcana which set up barriers to
|> outsiders. Statistics, in particular, seems dedicated to preserving
|> impenetrability [ok, that's a bit provocative; save the flames and reply
|> with counterexamples...]

We statisticians do have some strange terminology (e.g., regression,
normal deviates) but at least _we_ usually know what _we_ mean.  My
impression of the neural network field is that there is considerable
confusion over terminology. Witness the above question on "epoch".
Suppose someone refers to training a net with 100 "inputs"; sometimes
that means there are 100 input nodes, sometimes it means there are 100
training cases. And how about what statisticians call a "population",
i.e. the set of all cases one wants a network to generalize to; this is
a fundamental concept, but I haven't found any term for it in the neural
net literature. And if anyone is dedicated to "preserving
impenetrability", it's the ART folks!  :-)

-- 

Warren S. Sarle       SAS Institute Inc.   The opinions expressed here
saswss@unx.sas.com    SAS Campus Drive     are mine and not necessarily
(919) 677-8000        Cary, NC 27513, USA  those of SAS Institute.
