Perceptron lecture

- explain basics of perceptron units

- p. 95: decision boundary is perpendicular to the weight vector

- without a bias connection, decision boundary must pass through the origin
 (show this with a figure of the x1, x2 intercepts)
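
  Both geometric facts follow from the boundary equation (notation here is
  assumed, check against the text's):

```latex
\text{Boundary:}\quad \mathbf{w}\cdot\mathbf{x} = \theta .
\qquad
\mathbf{x}_a,\ \mathbf{x}_b \ \text{on the boundary}
\;\Rightarrow\;
\mathbf{w}\cdot(\mathbf{x}_a - \mathbf{x}_b) = 0 ,
```

  so w is normal to every direction lying in the boundary, i.e. perpendicular
  to it; and with no bias (theta = 0) the point x = 0 satisfies the boundary
  equation, so the boundary must contain the origin.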

- explain the "threshold as bias connection" trick
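
  A minimal sketch of the trick (values are illustrative, not from the course
  program): a unit with threshold theta is equivalent to a zero-threshold unit
  with one extra input clamped to -1 whose weight is theta.

```python
import numpy as np

def step(z):
    # Hard threshold at zero.
    return 1 if z >= 0 else 0

w = np.array([0.5, -0.3])   # illustrative weights
theta = 0.2                 # threshold
x = np.array([1.0, 1.0])    # an input pattern

# Unit with an explicit threshold: fires when w.x >= theta.
y_threshold = step(w @ x - theta)

# Same unit with the threshold absorbed as a bias weight:
# append a constant -1 input and make theta its weight.
w_aug = np.append(w, theta)
x_aug = np.append(x, -1.0)
y_bias = step(w_aug @ x_aug)

assert y_threshold == y_bias
```

  The payoff is that the learning rule can then treat the threshold as just
  another weight.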

- units can be binary, continuous, or stochastic

  (explain how stochastic units work)

  you can approximate a probability distribution
  using stochastic binary units
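
  A sketch of a stochastic binary unit (the sigmoid firing probability is one
  standard choice; parameters are illustrative): the unit outputs 1 with
  probability given by its net input, and averaging many samples recovers that
  probability, which is how a distribution can be approximated.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def stochastic_unit(net, rng):
    # Fires (outputs 1) with probability sigmoid(net), else outputs 0.
    return int(rng.random() < sigmoid(net))

# The unit's outputs are Bernoulli with parameter sigmoid(net);
# the empirical firing rate approaches that probability.
net = 0.5
samples = [stochastic_unit(net, rng) for _ in range(20000)]
empirical = sum(samples) / len(samples)
assert abs(empirical - sigmoid(net)) < 0.02
```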

- derive LMS rule
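
  One standard derivation, assuming a linear output and squared error (the
  Widrow-Hoff / delta-rule form; notation assumed, check against the text's):

```latex
E(\mathbf{w}) = \tfrac{1}{2}\sum_p \bigl(t^p - \mathbf{w}\cdot\mathbf{x}^p\bigr)^2 ,
\qquad
\frac{\partial E}{\partial \mathbf{w}}
  = -\sum_p \bigl(t^p - \mathbf{w}\cdot\mathbf{x}^p\bigr)\,\mathbf{x}^p ,
\qquad
\Delta\mathbf{w} = \eta\,\bigl(t^p - \mathbf{w}\cdot\mathbf{x}^p\bigr)\,\mathbf{x}^p ,
```

  the last being the per-pattern step down the gradient with learning rate eta.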

- show output of perceptron program

- prove perceptrons can't learn XOR
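
  The impossibility argument can be written as four constraints on a single
  threshold unit y = step(w1 x1 + w2 x2 - theta):

```latex
\begin{aligned}
(0,0)\mapsto 0 &: \quad 0 < \theta \\
(1,0)\mapsto 1 &: \quad w_1 \ge \theta \\
(0,1)\mapsto 1 &: \quad w_2 \ge \theta \\
(1,1)\mapsto 0 &: \quad w_1 + w_2 < \theta
\end{aligned}
```

  Adding the middle two lines gives w1 + w2 >= 2 theta > theta, contradicting
  the last line, so no weight vector satisfies all four.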

- batch vs. stochastic update
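
  A sketch contrasting the two update schedules on LMS with a linear unit
  (data, target, and learning rate are illustrative): batch accumulates the
  gradient over all patterns before updating; stochastic updates after each
  pattern.

```python
import numpy as np

X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
t = np.array([0., 1., 1., 2.])   # a linearly representable target: t = x1 + x2
eta = 0.1

def batch_epoch(w):
    # Sum the gradient over all patterns, then take one step.
    err = t - X @ w
    return w + eta * X.T @ err

def stochastic_epoch(w, rng):
    # Take a step after each pattern, visiting patterns in random order.
    for p in rng.permutation(len(X)):
        err = t[p] - X[p] @ w
        w = w + eta * err * X[p]
    return w

rng = np.random.default_rng(0)
w_b = np.zeros(2)
w_s = np.zeros(2)
for _ in range(200):
    w_b = batch_epoch(w_b)
    w_s = stochastic_epoch(w_s, rng)
# Both schedules converge near the true weights (1, 1) on this
# consistent data set.
```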

- pp. 98-100: explain difficulty measure D(w)

- p. 109: explain relative entropy error measure
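
  One common form of the relative entropy error for targets t^p and outputs
  y^p in (0,1) (notation assumed, check against p. 109):

```latex
E = \sum_p \left[\, t^p \ln\frac{t^p}{y^p}
  + \bigl(1 - t^p\bigr)\ln\frac{1 - t^p}{1 - y^p} \,\right]
```

  E >= 0 with equality iff y^p = t^p for all p; with a sigmoid output unit its
  gradient reduces to sum_p (y^p - t^p) x^p, dropping the sigmoid-derivative
  factor that slows squared-error learning at saturated outputs.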

- explain the encoder problem, and solutions for the N-2-N case
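
  A hand-built 4-2-4 encoder, sketching one known solution family for N-2-N:
  each of the four one-hot inputs gets a distinct binary code in the two
  hidden units, and each output unit detects its code. This is an illustrative
  construction with hand-chosen weights, not a trained network.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

s = 20.0   # steepness: drives the sigmoids close to 0/1
codes = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)  # hidden code per input

# Hidden layer: one-hot input i produces hidden activations ~ codes[i].
W1 = s * codes.T                  # shape (2, 4)
b1 = -s / 2 * np.ones(2)

# Output layer: output unit i fires only when the hidden pattern is codes[i].
W2 = s * (2 * codes - 1)          # shape (4, 2)
b2 = s * (0.5 - codes.sum(axis=1))

def forward(x):
    h = sigmoid(W1 @ x + b1)
    return sigmoid(W2 @ h + b2)

X = np.eye(4)
Y = np.array([forward(x) for x in X])
# Rounding the outputs recovers the identity map on the four patterns.
assert np.array_equal(Y.round(), np.eye(4))
```

  The point to make in lecture: two hidden units suffice because 2^2 = 4
  distinct hidden codes are available for the four patterns.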

- Paul's videotape

  (a) first sequences are drawn in (x1,x2) input space:
         - four input cases, shown by fixed dots
         - decision boundary is shown by a line that moves

  (b) remaining sequences are in (h1,h2) hidden space:
         - input cases shown by moving dots
           and are color coded (e.g., blue/red)
         - decision boundaries shown by colored lines

