Newsgroups: uk.finance,misc.invest.futures,comp.ai.genetic,comp.ai.fuzzy
Path: cantaloupe.srv.cs.cmu.edu!das-news2.harvard.edu!news2.near.net!news.mathworks.com!usenet.eel.ufl.edu!news-feed-1.peachnet.edu!gatech!psuvax1!uwm.edu!fnnews.fnal.gov!gw1.att.com!gw2.att.com!pacbell.com!well!miwok!nbn!news
From: Ron Macken <rmacken@calon.com>
Subject: Re: Financial Neural Networks Home Page
Message-ID: <D5ALvu.Fp5@nbn.com>
Sender: news@nbn.com
Organization: North Bay Network's news posting service - not responsible for content
References: <hacker-0403951257010001@dcc00171.slip.digex.net> <3jdl2s$4bm@deadmin.ucsd.edu> <3jeqju$ovq@sndsu1.sedalia.sinet.slb.com> <3jfd8n$raa@deadmin.ucsd.edu>
Date: Sat, 11 Mar 1995 20:21:29 GMT
Lines: 13
Xref: glinda.oz.cs.cmu.edu comp.ai.genetic:5176 comp.ai.fuzzy:4172

> >> ...  Time and time again it
> >> has been demonstrated that optimizing on past data leads to disaster
> >> when applied to the present tense using real money....

Much has been made of the dangers of optimizing on past data.
I'm working on a genetic programming system that is trained on
several sets of data and then tested on separate, held-out sets of data.

A model is only considered successful if it performs well on
all of the held-out test sets.

While this still carries risks as a predictor of the future,
it seems that the risks are greatly reduced. Am I missing something?
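For what it's worth, the acceptance criterion can be sketched in a few lines
of Python. Everything here is illustrative: a toy momentum rule stands in for
the evolved programs, random-walk series stand in for market data, and the
zero-profit threshold is an assumption, not a detail of my actual system.

```python
import random

random.seed(42)

def make_series(n=200):
    """Generate a toy price series (random walk) -- stand-in for market data."""
    prices = [100.0]
    for _ in range(n):
        prices.append(prices[-1] + random.gauss(0.05, 1.0))
    return prices

def profit(lookback, prices):
    """Total profit of a simple momentum rule: long if the last
    `lookback` bars rose, short if they fell."""
    total = 0.0
    for t in range(lookback, len(prices) - 1):
        signal = 1 if prices[t] > prices[t - lookback] else -1
        total += signal * (prices[t + 1] - prices[t])
    return total

train_sets = [make_series() for _ in range(3)]
test_sets = [make_series() for _ in range(3)]

# "Evolve" by brute force over a tiny parameter space: pick the lookback
# with the best total profit across the training sets.
best = max(range(1, 20),
           key=lambda r: sum(profit(r, s) for s in train_sets))

# The acceptance criterion: the model must work on ALL held-out test
# sets, not merely on the data it was optimized on.
accepted = all(profit(best, s) > 0.0 for s in test_sets)
print("best lookback:", best, "accepted on all test sets:", accepted)
```

The point is the last line: a candidate that merely fit the training data
gets rejected unless it also clears the bar on every test set it never saw.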
