CS 578 Syllabus - Fall '94

Neural Networks and Connectionist Computing

TuTh 9:35-10:50am, 2241 SFLC

Tony Martinez, 3334 TMCB, Office Hours: TuTh 3:00 - 4:00pm or by appointment

Goals: Introduce and study the philosophy, utility, and models of connectionist computing, such that students are able to propose original research with potential follow-up in a graduate research program. Expand the creativity of the students in all aspects of computing.

Text: J. McClelland and D. Rumelhart, Explorations in Parallel Distributed Processing: A Handbook of Models, Programs, and Exercises. A prepared packet of papers accompanies each section of the notes. You will be expected to read the assigned literature before, and optionally after, the scheduled lecture.

Prerequisites: Senior or graduate standing, computer architecture, calculus, creativity.

Lab (3346 TMCB): 4 Mac IIs with 5MB RAM and 40MB hard drives, 2 DS5000 workstations with 32MB RAM and 1GB disks, and 3 high-speed HP workstations. (These may be used when available, but researching graduate students have priority on these machines.) Software for simulations, projects, etc. will be made available.

Literature: I have placed interesting and representative papers for reference in the periodicals room of the HBLL library. There are two separate packets (2 copies of the first), both under my name. As needed, I will place more packets in the library. I also have more papers in my office which can be looked over and copied under the constraint of the 15-minute rule. I can also send for almost any paper you wish through interlibrary loan (and will do so), but it usually takes 2-3 weeks, so plan ahead.

Grading (~): Simulations and Homeworks: 30%, Midterm: 22.5%, Project: 22.5%, Final: 25% (Tue., Dec. 14, 7am-10am). Grading is on a curve, and some amount of subjectivity is allowed for attendance, participation, perceived effort, etc. If you think, you'll be all right.

Late assignments: Assignments are expected on time (beginning of class on the due date). Late papers will be marked off at 5% per school day late. However, if you have any unusual circumstances (illness, travel, something distinct from what all the other students face, etc.) and you inform me of them, then I will not take off any late points. Nothing will be accepted after the last day of class instruction.

Project: An in-depth effort on a particular aspect of neural networks. A relatively extensive literature search in the area is expected, with a subsequent bibliography. Good projects are typically as follows:
Best: Some of your own original thinking and proposal of a network, learning paradigm, system, etc. This (and other projects) typically benefits from some computer simulation to bear out its potential.
Very Good: Starting from an in-depth study of some current model, strive to extend it through some new mechanisms.
Not Bad: A study of a current model with an in-depth analysis of its strengths, weaknesses, potential, and suggested research.
Not Good: A description of a current model.
The earlier you start, the better. Note that in a semester course like this, you will have to choose a topic when we have covered only half of the material. That does not mean your project must cover items related to the first half of the semester. You should use your own initiative and the resources available (library literature, texts, me, etc.) to peruse and find any topic of interest to you, regardless of whether we have or will cover it in class. Interesting models which we will probably not have time to cover in depth in class include: Feldman nets, genetic algorithms, Kohonen maps, HOTLUs, BAMs, CMAC, ASN, Cognitron, Neo-Cognitron, BoltzCONS, Michie Boxes, Cauchy machines, Counterpropagation, Madaline II, associative networks, RCE, etc.

Topics and Reading Assignments
1. Intro to Neural Networks (1) *
2. Brain and Nervous System (3) Your Neural Network
3. Computation, VN Bottleneck, and NN Goals (1)
4. Definitions, Theory, Learning, Applications, and General Mechanisms of Neural Networks (2)
5. Delta Rule Models - Linear associators, Perceptron, Adaline, Quadric Machines, Higher Order networks, Committee Machines, Delta rule Simulation and separability issues (4)
6. Back-Propagation (2) Backpropagation Sim.
7. ASOCS (Adaptive Self-Organizing Concurrent Systems) (6)
8. Midterm (1) Project Abstract
9. Hopfield Networks (2)
10. Boltzmann Machine (1)
11. Competitive Learning, Adaptive Resonance Theory (2) CL Simulation
12. Survey of other models, implementation, future research (2)
13. Oral Presentations (2) Final Project Paper
*As a general rule, read all of the papers at the end of a section of notes before the lecture.
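To give a flavor of the simulation assignments above (topics 5 and 6), here is a minimal delta-rule (perceptron-style) training sketch in Python. The dataset (logical AND), learning rate, and epoch count are illustrative assumptions, not course materials; the course simulations use the software provided in the lab.

```python
# Minimal delta-rule training sketch: a single threshold unit learns the
# logical AND function. All constants here are illustrative choices.
import random

def train_delta(samples, epochs=50, lr=0.1, seed=0):
    rng = random.Random(seed)
    # w[0] is the bias weight; w[1:] are the input weights.
    w = [rng.uniform(-0.5, 0.5) for _ in range(3)]
    for _ in range(epochs):
        for x, target in samples:
            xs = [1.0] + list(x)                      # prepend bias input
            out = 1 if sum(wi * xi for wi, xi in zip(w, xs)) > 0 else 0
            err = target - out                        # delta-rule error term
            w = [wi + lr * err * xi for wi, xi in zip(w, xs)]
    return w

AND = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w = train_delta(AND)

def predict(x):
    return 1 if w[0] + w[1] * x[0] + w[2] * x[1] > 0 else 0
```

Because AND is linearly separable, the weight updates converge to a separating threshold; XOR, by contrast, would not converge with a single unit, which motivates the multi-layer backpropagation simulation that follows in the course.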