ISSNNet Newsletter -- Vol. 1, No. 1 -- Spring 1990
The International Student Society for Neural Networks (ISSNNet)

                            W E L C O M E !!!!!

We are happy to introduce the first issue of the ISSNNet newsletter. This is an important stepping stone for what promises to be a great student initiative.

The ISSNNet has the following goals: to facilitate dissemination of relevant information to and among members; to facilitate interaction between members and professionals from academia and industry; to encourage student support from academia and industry; to ensure that the interests of students are considered by and protected within other societies and institutions that promote neural networks; and to foster a spirit of cooperation as the rapidly expanding field of neural networks matures into a self-contained discipline.

We are developing a set of bylaws that reflects our commitment to these ideals, and full elections will be held later this year for all positions on the Governing Board. In the meantime we are working on a number of useful services, including a resume service, job listings, academic program listings, and fund-raising. We hope to provide funds for student members presenting papers at conferences, through our fund-raising events and by encouraging companies to donate sponsorship money. And these are only some of the ideas that we have to assist students in the field of neural networks!

We are very pleased with the overwhelming response that we have received from our preliminary announcements. We appreciate the interest of nearly 400 prospective members from over 20 countries! We also appreciate the strong encouragement that we have received from academia and industry worldwide.
Thank you for your support; we hope you will continue to follow the development of this enterprise.

------------------------------------------------------------------------------
                 ||--|| Stimulus Traces -- Chris Pribe ||--||
------------------------------------------------------------------------------

This is the first issue of the official newsletter of ISSNNet. It is our hope that you will find this newsletter both lively and informative. The newsletter is meant to keep you informed of news about the Society and the field of neural networks. It is meant to serve as a vehicle to facilitate your access to information that is timely and useful to your research and to your career.

The newsletter features articles and columns by a number of ISSNNet collaborators. Paolo Gaudiano's article inaugurates the newsletter with his view of the modern history of the field of neural networks, presented in light of the historical tension between the need to specialize and the need for interdisciplinary awareness. Karen Haines opens the Employment Forum with a commentary based on discussions with executives in this field. Tom Edwards's column highlights the accessibility of neural network research to undergraduate students. Andrew Worth moderates a column that provides information about existing graduate programs. This information will be useful to prospective graduate students, and also to current graduate students seeking to identify and learn about their peers. Our "Action Potentials" section caps off the newsletter with informative announcements on Society matters and job openings.

We invite all of you to join what promises to be one of the largest societies in the field of neural networks!

-------------------------------------------------------------------------------
Those of you who wish to contribute to future issues of this newsletter should send e-mail to , or to the editor of the specific column for which you wish to write.
Future issues may include a column for editorial comments submitted by readers. All inquiries and comments related to the newsletter in general should also be addressed to . All inquiries and comments regarding the Society, including requests to be added to this mailing list, should be addressed to . We hope that future issues of the newsletter will contain many of your contributions, to reflect the ISSNNet ideal of promoting cooperation and an open exchange of information. All surface mail (except where otherwise noted) should go to: ISSNNet, 111 Cummington Street, Room 244, Boston, MA 02215.
-------------------------------------------------------------------------------

------------------------------------------------------------------------------
            ||-- Historical Perspective by P. Gaudiano --||
------------------------------------------------------------------------------

Before the turn of the century it was typical for individual scientists to make significant contributions to a number of different, sometimes unrelated disciplines. In the last century alone, one could find interdisciplinary scholars such as Gauss, Maxwell, Helmholtz, and Mach. Of these nineteenth-century scientists, Helmholtz was arguably the one whose eclectic work most directly contributed to modern-day neural network research. He formulated the principle of conservation of energy through his studies of electrical conduction in nerve axons. He used his knowledge of optics to make contributions to our understanding of the visual system, and his knowledge of acoustics to advance theories of audition. Helmholtz's work was held in such high esteem that his theory on the generation of missing fundamentals through nonlinear distortion, although incorrect, went unchallenged for nearly one hundred years.
The beginning of the twentieth century brought about a growing segregation of psychology and biology from the formalism of mathematics and physics, and the interdisciplinary scientist became an endangered species. This segregation came about as a result of two fundamental trends. First, rapid advances in physics, reinforced by concomitant advances in mathematics, led scientists to look toward theories about the real world and abandon the less precise, introspective realm of psychology. Second, mathematical formalism in psychology and biology encountered analytical problems that could not be solved with existing techniques. The work of Helmholtz in particular pointed to the need for tools that could unravel the nonlinear and nonlocal nature of cerebral computation. The lack of such tools led psychologists and biologists to abandon mathematical rigor in favor of more intuitive, albeit less rigorous, approaches.

It was not until the middle of this century that scientists once again attempted to reunite mathematical formalism with the cognitive sciences. McCulloch and Pitts's work (1943) brought together physiological and mathematical insights. Rosenblatt's work on the Perceptron (1959) was perhaps the forefather of neural networks. The allure of brain-like machines came to an abrupt end with the publication of Minsky and Papert's 1969 book, which challenged the usefulness of such approaches. The tide turned against those scientists attempting to obtain useful results from mechanisms inspired by biology. Evidence of the profound effect of this book permeates much of the ensuing literature in related areas. For example, in his 1975 work on a perceptron-like robot controller, J. Albus writes: "Early attempts along these lines were notoriously unsuccessful in producing any significant results and the subsequent disillusionment has strongly prejudiced the intellectual community against seeking any guidance from the numerous existence theorems provided by nature."
During the sixties and seventies the study of neural networks was limited to isolated individuals such as Grossberg, von der Malsburg, Kohonen, Amari, Anderson, and Caianiello. Their work received little public attention and went mostly unnoticed by the majority of the scientific community. The importance and extent of much of this work is only now starting to be recognized.

The late seventies and early eighties witnessed the emergence of a number of techniques to attack those problems that are intractable for a linear perceptron. Of these techniques, the backpropagation learning rule for nonlinear multi-layer perceptrons received the greatest publicity, through a number of attention-grabbing demonstrations. The result was a tidal wave of renewed interest in models based on putative biological mechanisms, and more applications of these models to pattern recognition, associative learning, and other problems. Scientists approaching neural networks could now rely on the theoretical and technological tools that had been developed by scholars of many different disciplines since the turn of the century.

The multifarious nature of neural network research has given birth to a new type of interdisciplinary scientist--one who must be trained in a number of different fields to provide insights toward unified theories of brain function. The demand for knowledge in this area was confirmed by the immediate success of the first International Conference on Neural Networks (ICNN) in 1987 and the creation of the International Neural Network Society (INNS) and its first conference in 1988. The willingness of the intellectual community to accept this line of scientific pursuit is reflected in the recent USA law that declares the 1990s "the decade of the brain." The rapid growth of interest in this field, along with the benefits gained from the renewed excitement, has brought with it some of the unpleasant side effects that can afflict any rapidly growing structure.
Many large conferences are featuring parallel sessions clustered around established disciplines, thus promoting segregation instead of collaboration between the constituent fields of neural network research. A similar phenomenon is seen in the sudden proliferation of neural network journals based more on a need to supply information to experts from existing fields than on the desire to make those fields accessible to others. Also, the competitive requirements imposed by funding agencies promote application of existing techniques to new problems instead of the development of sounder theories. As a result, most scholars approaching this field are unable to devote time to interdisciplinary training, and the emphasis is placed instead on short-term results.

Today, as students, we have the opportunity to be trained as interdisciplinary scientists in the unified field of neural networks. Because of the competitive atmosphere in this young field, it is difficult to find academic programs that will actually train us in all the relevant disciplines. Most students are trained by faculty members from Computer Science, Psychology, Mathematics, Engineering, or other departments that can only provide them with an understanding of specific aspects of neural networks. Only a few interdisciplinary academic programs on neural networks have begun to emerge in recent years. Those of us who are lucky enough to belong to such a program have the responsibility of collaborating with others to ensure that similar opportunities are generated around the world. The ISSNNet will provide a medium for the exchange of information to supplement the few existing academic programs; it will foster a spirit of international kinship among students from different fields; and it will do so within an environment that is free of the pressure of the financial and political struggles that exist in today's neural network community.
We hope that you will be willing to contribute to the realization of these ideals: only through sincere effort can we ensure that years from now there will be a field of neural networks in which to graduate.

------------------------------------------------------------------------------
                  ||--|| EMPLOYMENT FORUM - K. Haines ||--||
------------------------------------------------------------------------------

One of the goals of this society is to facilitate interaction between student members and professionals from academia and industry. This forum provides a place for open discussion on the subject of finding employment in the neural network community. By presenting this forum, we hope that members will provide a retrospective view of their experiences while seeking employment. In particular, this is not a forum for gripe sessions: the newsletter seeks constructive input of genuine value to students seeking employment in neural network research and applications. Possible topics might include seeking employment in foreign countries (e.g., obtaining visas), establishing contacts, or discussing a company's particular area of interest.

We open this forum with comments concerning the status of neural network research jobs, based on discussions with several executives in the field. The outlook for job availability is bleak and the competition is tough. Not only are recent graduates competing against each other for the few research jobs, but they are also competing with more experienced researchers--often with a minimum of 5 years' experience--who are attempting to switch over to neural networks. As a result, those with little or no experience in neural networks may feel they have a decreased chance of getting jobs. One thing that can be done to avoid the typical catch-22 encountered by recent graduates trying to get their foot in the door is to be creative.
An example of a creative approach is proposing that your prospective employer hire you for a training period. The training period may be defined as an agreement that for the first 3 months of employment you would be willing to work for 3/4 of the annual salary. It may not guarantee you the job, but it's more likely to put you in the running. Another possibility might be to seek out (by talking with people) a particular technical problem that a company or project is having, one that you think may be a viable application of neural nets. You could then offer to work on the problem for a limited time, say 6 months, as a short-term research project. This provides the company with an alternative approach and limited risk. If you can arrange this situation, you have got a winner: you and the company are both in a position to discover useful information. In the worst case (assuming you only do what you set out to do), you gain 6 months of experience; in the best case, you've created a job for yourself.

In addition to the lack of experience working against you, smaller companies with limited resources may not want to incur relocation expenses, and therefore may not even grant you an interview. If you receive a "No thank you, try at a later time" notice from a company out of town, try resubmitting your application using the local address and phone number of a friend in the city where the company is located. A caveat: if you use this approach, be prepared to hop on a plane for the interview and to pay your own relocation expenses.

------------------------------------------------------------------------------
                  |-| Undergraduate Issues -- T. Edwards |-|
------------------------------------------------------------------------------

The ISSNNet encourages undergraduate students who are interested in studying neural networks to join the society. Neural networks are well suited to undergraduate study.
Most neural models are fairly easily understood by those who have taken two or three semesters of calculus (and perhaps differential equations). Typical simulations require no more equipment than a microcomputer (although one might have to let a CPU-intensive model run for a few days). In addition, learning about neural networks will hone your ability to track down information on current research and to understand research papers and technical reports.

This "Undergraduate Issues" column is a way of showing our commitment to undergraduates and of facilitating their inclusion in a field dominated by graduate students and professionals. In future issues we hope to present undergraduate research opportunities, highlight papers written by undergraduates, and present other material to complement the general-interest topics that appear in other columns. What appears here, though, is in large part up to you. Are you an undergraduate involved in neural network research? Do you have neural net research/programming positions available for undergraduates? If so, send e-mail to , or write to: 331 East University Parkway, Baltimore, MD 21218, USA.

------------------------------------------------------------------------------
                    ||-|| Academic Programs -- A. Worth ||-||
------------------------------------------------------------------------------

>>>> Moderator's note: Readers are encouraged to submit descriptions of their university programs that involve Neural Networks, Parallel Distributed Processing, Connectionism, etc. These submissions should include a short (1/2 page) description of the program, personal comments, and contacts for obtaining additional information (official descriptions, applications, etc.). Some additional information on these and other programs can be obtained by request to . If you contact the schools directly, please be sure to tell them you saw their description in the ISSNNet Newsletter, and drop us a note ().
<<<<<
.............................................................................
Cognitive & Neural Systems Program (M.A. & Ph.D.)
Boston University, 111 Cummington St., Boston, MA 02215, USA. (617) 353-7857.
by Andrew J. Worth

The official description of this program is along the lines of "neural and computational principles, mechanisms, and architectures that underlie human and animal behavior, and the application of neural network architectures to the solution of outstanding technological problems." Classes are composed of lectures, extensive readings, and computer simulations. Readings draw from areas such as neuroanatomy and physiology, psychology, and virtually all of the various "camps" of neural networks, with a (not surprising) emphasis on the work done at BU's Center for Adaptive Systems. The core faculty is made up of Daniel Bullock, Gail Carpenter, Michael Cohen, Stephen Grossberg, John Merrill, and Ennio Mingolla.

The core curriculum includes eight classes. Neural Modeling I & II provide a thorough exposure to neural networks in general and Stephen Grossberg's work in particular. Courses on Vision, Motor Control, Memory, Speech, and Learning provide in-depth explorations of each of these specific areas of neural networks. The program is topped off with "Advanced Topics in Neural Modeling," which lets students sink their teeth into doing research at an advanced level. Each individual's studies are rounded out by classes from related departments: Biology, Computer Science, Engineering, Mathematics, Medical Sciences, and Psychology. In addition to classes, there are numerous opportunities (both at the Center for Adaptive Systems and in the greater Boston area) to attend lectures, symposiums, etc., on neural, computational, and neural network issues, and to meet outstanding researchers in these fields.

The M.A. requirements include 8 courses (at least 4 core courses), competence in a foreign language, and a comprehensive examination. Ph.D.
students are required to take 16 courses: 8 core courses and 8 in an approved area (a previous Master's degree can take the place of the latter 8 courses). They must also demonstrate competence in a foreign language, pass a Ph.D. qualifying examination, and complete a dissertation and oral examination.

I feel the strengths of the program lie in its interdisciplinary nature and in its analysis of computational theories by simulating them as mathematical models. Also, two important hallmarks of the work of Stephen Grossberg and the Center for Adaptive Systems are (1) the models' motivation and constraint by psychological data, physiological data, and gedanken (thought) experiments, and (2) an emphasis on nonlinear dynamical systems. The disadvantages I can think of are that the program is still fairly new, and the facilities have been slow to keep up with its rapid growth (40+ students in its second year). Further information about the program can be obtained through the address above. If you have any comments or questions about my description of the program, please contact me by e-mail (worth@bucasb.bu.edu) or at the CAS address above.

...........................................................................
GRADUATE STUDENT RESEARCH IN NEURAL NETWORKS/CONNECTIONISM AT THE UNIVERSITY OF COLORADO, Department of Computer Science, Boulder, CO 80309.
by Olivier Brousse

The BOULDER CONNECTIONIST RESEARCH GROUP consists of:

ADVANCED GRADUATE STUDENTS: Jonathan Bein, Rick Blumenthal, Olivier Brousse, Rich Fozzard, Phil Gardner, Doug Joseph, Patrick Lynn, Don Mathis, Clayton McMillan, Beth Richards, Dennis Sanger, Bill Skaggs, Don Wolfe.
POST-DOCS: David Goldberg, Yoshiro Miyata.
PROFESSORS: Gary Bradshaw (Psychology), Michael Mozer (CS), Paul Smolensky (CS), Kelvin Wagner (Optical Neurocomputing).

The group is also engaged in collaborative work with Bruce McNaughton (Neuroscience/Psychology), Dana Anderson (Optical Computing), and Kristina Johnson (Optical Computing).
CONNECTIONIST COURSES: Courses in connectionism include a two-semester sequence, CS5522 Introduction to Connectionist AI and CS5622 Advanced Connectionist Modeling, as well as ECEN 5811 Neuroelectric Signals, ECEN 5821 Neuroelectric Systems, ECEN 5831 Brain, Mind and Computers, and ASEN 5028 Neural Modeling.

SOFTWARE TOOLS: A number of connectionist simulation tools are available to the group. These include P3, a connectionist simulator running on the Symbolics Lisp machines; SunNet (developed by Yoshiro Miyata) and the Rochester Simulator, running on the Sun workstations; PDP3, running on all UNIX machines; and Mactivation, developed by Mike Kranzdorf, running on the Macintosh. CONNEX, a general interface to connectionist simulators, is currently being developed by Michael Mozer and student members of the group.

ADVANCED GRADUATE STUDENTS:
* JONATHAN BEIN: Implementation of a connectionist model for inductive information retrieval.
* RICK BLUMENTHAL: Recursive competitive learning network for data compression.
* OLIVIER BROUSSE: Investigating rapid learning and explosive generalization in combinatorial connectionist networks.
* RICH FOZZARD: Building a connectionist expert system for predicting solar flares.
* PHIL GARDNER: Simulating connectionist learning algorithms on parallel distributed architectures.
* DOUG JOSEPH: Applying numerical optimization techniques to connectionist learning algorithms.
* PATRICK LYNN: Biologically motivated models of cognitive processes.
* DON MATHIS: Connectionist learning of heuristics in problem-solving environments.
* CLAYTON MCMILLAN: Linguistic analysis of a connectionist model.
* DENNIS SANGER: Analyzing the hidden units of back-propagation networks: contribution analysis.
* BETH RICHARDS: Representation transformations in connectionist learning: applications to speech recognition.
* DON WOLFE: Connectionist models for visual representation and analogical reasoning.

Requests about publications or more information should be made to: .
Nicer and longer formatted descriptions can also be made available.

...........................................................................
School of Psychology, University of New South Wales, Australia
by Trevor Phelps

[Moderator's note: Trevor, one of our Australian representatives, has provided this survey of academics with interests in neural networks in the School of Psychology at UNSW. A future newsletter will feature a similar list for the Electrical Engineering and Computer Science departments.]

1. Dr. Sally Andrews - primarily interested in neural network applications to word recognition. Does not do any modelling.

2. Assoc. Prof. Jim Kehoe - involved in connectionist modelling of the rabbit's nictitating membrane response. His model looks in particular at the "learning-to-learn" phenomenon. Has had a number of publications in the area, and a number of researchers throughout the world are implementing his model. The model, effectively, is a neural network model of classical conditioning.

3. Trevor Phelps - MATHEMATICAL AND NEURAL MODELLING OF HUMAN MEMORY AND LEARNING: developing an adaptive neural model that will present a possible mechanism for information storage of abstracted generalizations of many exemplars (prototypes). The research assumes that concepts are ill-defined and that an integration of exemplars and prototypes (or rules) is stored in the brain. The research will consist of human behavioural experiments and computer simulations. Preliminary experiments already carried out have shown that the ability of human subjects to discriminate between two prototype patterns is a function of noise, interprototype variance, and the quantity of information defining the prototypes. Future experiments will be conducted to investigate other dynamical properties of human information processing.

4. Dr. Marcus Taft - interested in how neural network models can be implemented in lexical processes. He also does not do any modelling.

5.
Although associated with Electrical Engineering, Dr. Peter Neilson is developing an adaptive filter neural model of motor control. The reason for including him with the other behavioural researchers is that Dr. Neilson is applying his model to the rehabilitation of motor behaviour in people suffering from cerebral palsy. He already has a clinic at one of the local hospitals targeted toward this purpose.

------------------------------------------------------------------------------
            ||-|| Action Potentials: ISSNNet Announcements ||-||
------------------------------------------------------------------------------

******* Moderator's note: Please submit all short announcements (max. ten lines) to , or send surface mail to: ISSNNet, 111 Cummington Street, Room 244, Boston, MA 02215, USA. *******

>>> ISSNNet goes public! <<<
We are happy to announce that the organizers of both the San Diego IJCNN conference (June 17-21) and the Paris INNC conference (July 9-13) have donated a booth in the exhibit hall for ISSNNet. We will feature a GLOSSY copy of this newsletter, T-shirts, the resume service, and other events. Student members of ISSNNet may submit short papers or other items suitable for booth presentation. Contact for details.

>>> Resume Service <<<
There is still time to get in on the ISSNNet resume service. For information contact .

>>> Conference Volunteers! <<<
What a great way to go! You get FREE CONFERENCE REGISTRATION and FREE PROCEEDINGS in exchange for help with the San Diego and Paris conferences and tutorials. For IJCNN (San Diego) volunteer information, contact Nina Kowalski (, (301) 889-0587). For INNC (Paris) volunteer information, contact Karen Haines ((412) 362-8675 (USA), ).

>>> JOBS JOBS JOBS <<<
[Note: Some of these jobs were advertised on different mailing lists or newsgroups. The date and place of the original announcement are included at the top of each ad. Some ads have been shortened from the original.]

- Companies: Are you hiring?
Are you thinking about getting into neural networks? ISSNNet can help both you and our members! Send e-mail to , or surface mail to ISSNNet Job Program, 111 Cummington Street, Room 244, Boston, MA 02215, USA.

- Attention Undergraduates and Graduates: The Connection Machine Facility at the Naval Research Lab (Washington, D.C.) is looking for students interested in parallel programming on the Connection Machine (a SIMD parallel computer with 16K processors). The job involves working with NRL scientists to research models and algorithms on the Connection Machine. There is some opportunity for neural network research. Contact Dr. Hank Dardy at the lab via e-mail (dardy@cmsun.nrl.navy.mil).

- [Mar 16 1990, on various lists] The component of the Food and Drug Administration responsible for regulating medical devices has an opening for a research scientist. This is a permanent civil service position available for someone interested in modeling the neural activity of the hippocampus. The candidate will focus his/her research on improving the safety and effectiveness of electroconvulsive therapy devices. The candidate must have a PhD in one of the physical sciences. Any additional training in the biological sciences is highly desirable. For more information call or write to: Dr. C. L. Christman, (301) 443-3840, FDA, HFZ-133, 12721 Twinbrook Pkwy, Rockville, MD 20857.

- [Mar 9 1990, misc.jobs.offered] >>> Undergrad 1990 Summer work! <<< This "research experience for undergraduates" (REU) project is based on a grant from the NSF. The focus of the project will be on "intelligent systems" (including neural nets, expert systems, and communicating agents). You will be experimenting with software development tools and developing your own software and intelligent systems. We have several grad students doing neural net research, so there is a possibility of collaborative work. In addition, there will be opportunities to publish papers about the research carried out.
Contact: Dr. Larry Medsker, Computer Science and Information Systems Dept., The American University, 4400 Massachusetts Ave. NW, Washington, DC 20016. For more information, you can call (202) 885-1470 and ask for Dr. Medsker or Dr. Anita Lasalle.

- [Mar 26 1990, comp.ai.vision] Applications are sought from suitably qualified persons for a research fellowship in the Department of Geography at Reading University. The person appointed will work on the development of algorithms for the segmentation and interpretation of satellite images of the earth's surface. The position initially runs until the end of December 1992; related projects are likely to be in place at the end of this time, providing the possibility of continued employment. A good programming background is essential, and experience of computer vision or related subjects will be a definite advantage. For further details please contact Jeff Settle. [Note: From ARPANET you may try ]

>>>>>> CALL FOR HELP <<<<<<
>>>>>> How YOU can help the Society! <<<<<<

- SPONSORS: We are in the process of becoming a non-profit organization. We need monetary and material support. All our profits are used to support students (with minimal overhead, thanks to much volunteer work). We would like to receive travel and registration money for students presenting papers at conferences. If you want to sponsor an individual paper, or wish to contribute in other ways, send e-mail to , or write to: ISSNNet Sponsorship Program, 111 Cummington Street, Room 244, Boston, MA 02215.

- NNet CONFERENCE ORGANIZERS: Show your support for students! Offer low student rates or help our Society. Send e-mail to .

- We are setting up a network of Governors and Ambassadors to represent national and international groups on the Governing Board of the Society, and also to coordinate local mailing lists and memberships to minimize costs and e-mail traffic.
If you can represent at least ten fellow Members, have easy access to e-mail and printing facilities, and want to be an active part of the Society, send mail to for details. We want to make the Society reflect the needs of its members, and not the other way around! We already have governors from France, England, Canada, Australia, Germany, Ireland, and Spain, but we always need more!

- COMPUTER NEEDED: We are trying to set up an ftp site for storage of Society-related information. This would include official program descriptions, old newsletters, simulation code, student articles, class notes, and any other useful materials. We would love someone to give us access to a computer with up to 100MB of disk space and easy access to the various networks. We have experienced volunteers who could set up anonymous ftp, and maybe a newsgroup or mailing list. Send e-mail to .

>>>>>>>>>>>>>>>>>> BECOME AN ISSNNet MEMBER !!!!! <<<<<<<<<<<<<<<<<<<<<<<<<<<

The single most important way to show your support for ISSNNet is to fill out the membership form below. Your money is used in part to defray the cost of printing this newsletter for distribution at conferences, and to support other ISSNNet programs. All profits go to student support. Membership includes a subscription to this newsletter. There are several types of membership; all cost US$5. Students, prospective students, and recent graduates are eligible for regular memberships. Only regular members may vote or hold office. Faculty and staff of recognized academic institutions may join as academic members. All others may become affiliate members. ISSNNet bylaws and other details are available upon request.

NOTE FOR NON-USA MEMBERS: if your country has a Governor, his or her name will appear below. Make your payment to that person's name in your local currency.
Cut here ----------------------------------------------------------- Cut here

|||| 1990 ISSNNet Membership Form ||||

We encourage all friends of neural networks to join the society. Please send this completed form and a check for $5 (in US dollars) to ISSNNet. This is a one-year membership that expires June, 1991.

Name:    _________________________________     mail to:
Address: _________________________________     ISSNNet Memberships
         _________________________________     111 Cummington St, Rm. 244
         _________________________________     Boston, MA 02215
School:  _________________________________     USA
e-mail (include network): _________________________________
(PLEASE PRINT CLEARLY TO AVOID LOST MAIL!)

Membership Type:  ( ) Regular   ( ) Academic   ( ) Affiliate

1. May we include this information in our membership directory?  ( ) yes ( ) no
2. May we put you on other mailing lists to fund the society?    ( ) yes ( ) no
3. To help defray printing and mailing costs, may we send you electronic mail
   only?  ( ) yes ( ) no
3a. If you marked "yes" to question 3, what formats do you prefer?
    PostScript ( )  LaTeX ( )  Troff ( )  Plain text ( )  other_______________
    (We may be able to accommodate all these and other formats. No binary
    formats.)

signature:____________________________  date:_________________________