Newsgroups: comp.ai.neural-nets
Path: cantaloupe.srv.cs.cmu.edu!rochester!udel!news.mathworks.com!news.kei.com!simtel!col.hp.com!news.dtc.hp.com!hpscit.sc.hp.com!hp-apd-news!hpsgm1!ang
From: ang@hpsgrt1.sgp.hp.com (Teck-Hua ANG)
Subject: Backprop written with integer/fixed pt maths ??
Sender: news@news.sgp.hp.com (NEWS ADMIN)
Message-ID: <D5s0nE.56F@news.sgp.hp.com>
Date: Tue, 21 Mar 1995 05:59:37 GMT
Nntp-Posting-Host: hpsgrt1.sgp.hp.com
Organization: Hewlett-Packard Singapore 
X-Newsreader: TIN [version 1.2 PL2]
Lines: 22

I would like to hear comments from anyone who has written or used
integer/fixed-point maths in their back-propagation neural net
implementations.

My reasons for considering it are speed and storage requirements:
a neural network using integer arithmetic is usually faster on a PC
than one using floating point, even with a maths co-processor.

The drawbacks of integer maths are reduced precision and greater
difficulty in coding. Any comments, anyone?

Does anyone know of any code available that implements back-prop using
integer arithmetic?

Responses via email or post are welcome.

Thanks and best regards

Teck H. ANG



