Newsgroups: comp.ai.philosophy
Path: cantaloupe.srv.cs.cmu.edu!das-news2.harvard.edu!news4.ner.bbnplanet.net!news3.near.net!paperboy.wellfleet.com!news-feed-1.peachnet.edu!gatech!news.sprintlink.net!noc.netcom.net!netcom.com!shankar
From: shankar@netcom.com (Shankar Ramakrishnan)
Subject: Re: about the future
Message-ID: <shankarDBH04r.Bz5@netcom.com>
Reply-To: shankar@vlibs.com
Organization: VLSI Libraries Incorporated
References: <DB8A1I.6uy@newssv1.pcvan.or.jp> <3tk6co$3d3@sndsu1.sedalia.sinet.slb.com>
Date: Sun, 9 Jul 1995 22:36:27 GMT
Lines: 52
Sender: vlsi_lib@netcom13.netcom.com

In article <3tk6co$3d3@sndsu1.sedalia.sinet.slb.com> Jacob Dreyer <jdreyer@houston.geoquest.slb.com> writes:
>>Today, I see computers making progress so rapidly.  For instance, 7
>>years ago, my computer with an 8 MHz 8-bit CPU and 192K RAM was so
>>cool to me.  Now it's just crap.  What we have now is a 120 MHz
>>Pentium CPU and 16000K RAM.
>>
>>Will it ever stop?  Like some mathematical functions, will it
>>converge?  If so, what do you think it will be?  Do you think it
>>will keep improving forever?
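
Doing the arithmetic on those numbers: 8 MHz to 120 MHz is a 15x
increase in clock rate over 7 years, i.e. a doubling roughly every
1.8 years, and 192K to 16000K of RAM is about 83x, a doubling almost
every year.  So far the growth has been exponential, not converging.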
>
>
>There are physical limitations on both the speed of a computer and
>its storage capacity (at least capacity per area).  As far as I know,
>we are not even close to any of these, though it would be interesting
>if anyone has details on this.  What about other kinds of computers:
>massively parallel, biological, quantum mechanical, etc.?

One thing that might help future computing is 3-D connectivity (like in
the brain). Of course, massively parallel computing would be the way
to go, since we are already seeing the limitations of serial machines.
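
To make "massively parallel" concrete, here is a minimal sketch in C
(assuming POSIX threads; the constants and names are just
illustrative) of splitting one serial loop across several workers:

#include <pthread.h>
#include <stdio.h>

#define NTHREADS 4
#define N 100000

static double data[N];
static double partial[NTHREADS];

/* Each thread sums its own contiguous slice of the array. */
static void *sum_slice(void *arg)
{
    int id = *(int *)arg;
    int lo = id * (N / NTHREADS);
    int hi = lo + (N / NTHREADS);
    double s = 0.0;
    int i;

    for (i = lo; i < hi; i++)
        s += data[i];
    partial[id] = s;
    return NULL;
}

int main(void)
{
    pthread_t tid[NTHREADS];
    int ids[NTHREADS];
    double total = 0.0;
    int i;

    for (i = 0; i < N; i++)
        data[i] = 1.0;
    for (i = 0; i < NTHREADS; i++) {
        ids[i] = i;
        pthread_create(&tid[i], NULL, sum_slice, &ids[i]);
    }
    for (i = 0; i < NTHREADS; i++)
        pthread_join(tid[i], NULL);
    for (i = 0; i < NTHREADS; i++)
        total += partial[i];        /* serial reduction of partial sums */
    printf("total = %f\n", total);  /* expect 100000.000000 */
    return 0;
}

The serial reduction at the end is the part that does not parallelize,
which is exactly where Amdahl's law bites.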
>
>
>It is an interesting question; we've all become used to the
>ever-improving software that improvements in hardware have made
>possible.

Also to more code bloat.
>
>What will happen the day this stops?  First, we will (again) focus
>on algorithms, and we will see slight improvements to quicksorts,
>Bresenhams, ray tracing, etc.

These would be inconsequential to the general framework of AI.
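
Still, for concreteness, this is the kind of tight inner loop being
talked about: a minimal sketch of Bresenham's line algorithm in C,
restricted to the first octant (0 <= dy <= dx), with plot() as a
stand-in for a real pixel write:

#include <stdio.h>

static void plot(int x, int y)
{
    printf("(%d,%d)\n", x, y);  /* stand-in for a framebuffer write */
}

/* Integer-only line drawing; err tracks the distance to the ideal
   line so no floating point is needed inside the loop. */
static void bresenham(int x0, int y0, int x1, int y1)
{
    int dx = x1 - x0;
    int dy = y1 - y0;
    int err = 2 * dy - dx;
    int x;
    int y = y0;

    for (x = x0; x <= x1; x++) {
        plot(x, y);
        if (err > 0) {
            y++;
            err -= 2 * dx;
        }
        err += 2 * dy;
    }
}

int main(void)
{
    bresenham(0, 0, 10, 4);
    return 0;
}

A "slight improvement" here means shaving cycles from that loop, not
changing what it computes.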

>Then performance-dependent programs will be rewritten from C++ to C
>and then to assembly language.  After a period where hardware has
>become more and more 'generalized', we will again see extreme
>hardware made for specific purposes.

If you are talking about graphics controller cards, there never was a
period of generalization.  And the enormous complexity of future
programs would rule out hand-coding in assembly.  However, we should
be able to develop super-smart compilers that do a better job than
most experienced assembly-language coders.
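
For instance, a routine like the one below is the sort of thing
people used to drop into hand-written assembly.  A good optimizing
compiler will keep the accumulator in a register, unroll, and
schedule the loop at least as well as most humans, while the source
stays portable (a minimal sketch):

#include <stdio.h>

/* Dot product: a classic hand-assembly candidate, left in plain C. */
static double dot(const double *a, const double *b, int n)
{
    double s = 0.0;
    int i;

    for (i = 0; i < n; i++)
        s += a[i] * b[i];
    return s;
}

int main(void)
{
    double a[4] = { 1, 2, 3, 4 };
    double b[4] = { 5, 6, 7, 8 };

    printf("%f\n", dot(a, b, 4));   /* 1*5 + 2*6 + 3*7 + 4*8 = 70 */
    return 0;
}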
>
>When do we reach the hardware limit for the computers of today?
>My guess is 50-100 years from now!
>
One never knows!

Shankar