I stated in the slides and the applet page (at
http://www.cs.cmu.edu/~ggordon/SVMs/) that the optimal-margin
hyperplane for a linearly-separable classification problem corresponds
to the weight vector at the ball center of version space. This is a
slight oversimplification, and here is the full explanation.
A word of terminology: the applet lets you look at weight space from
two views, corresponding to weight vectors of the form (v,w,-1) and
(-v,-w,1). (The first is the default setting; placing a weight vector
by control-clicking switches to the opposite setting.) I will call
the default setting "positive weight space" and the other one
"negative weight space."
If version space is bounded, everything's OK. Version space will lie
in either positive weight space or negative weight space but not both.
The ball center of version space (the center of the largest ball
inscribed inside version space) always corresponds to the
optimal-margin hyperplane.
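To make the bounded case concrete, here is a sketch of computing the
ball center of version space as a linear program (the Chebyshev-center
LP). The three-point data set and the use of scipy are my own
illustrative choices; the points sit on the unit circle, as in the
applet, so distances in weight space match geometric margins.

```python
# Sketch: ball (Chebyshev) center of a bounded version space via an LP.
# Weight vectors have the form (v, w, -1), so version space is
# {(v, w) : y_i * (v*x_i1 + w*x_i2 - 1) >= 0 for all i}.
# The data set below is a made-up example, not from the note.
import numpy as np
from scipy.optimize import linprog

X = np.array([[1.0, 0.0],
              [0.5,  np.sqrt(3) / 2],
              [0.5, -np.sqrt(3) / 2]])   # all on the unit circle
y = np.array([1.0, -1.0, -1.0])

# Version space as A @ (v, w) <= b, with A_i = -y_i * x_i, b_i = -y_i.
A = -y[:, None] * X
b = -y
norms = np.linalg.norm(A, axis=1)        # all equal to 1 here (unit circle)

# Chebyshev center: maximize r subject to A @ c + r * ||A_i|| <= b_i.
res = linprog(c=[0, 0, -1],              # minimize -r
              A_ub=np.hstack([A, norms[:, None]]),
              b_ub=b,
              bounds=[(None, None), (None, None), (0, None)])
v, w, r = res.x
center = np.array([v, w])

# The ball center should be the optimal-margin weight vector: every
# training point sits at the same (maximal) geometric margin from it.
margins = y * (X @ center - 1) / np.linalg.norm(center)
print(center, r, margins)
```

For this data the LP puts the ball center at (4/3, 0) with radius 1/3,
and all three training points end up at the same geometric margin 1/4,
which is exactly the optimal-margin property.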
But what if version space is unbounded? Then version space can
intersect both positive and negative weight space. There are 4 cases,
according to whether positive version space and negative version space
include the origin. If both contain the origin, there are no training
examples: the origin of positive weight space classifies every point
as negative, the origin of negative weight space classifies every
point as positive, and any training example would be misclassified by
one of the two. So all weight vectors are optimal.
If just one of positive and negative version space contains the
origin, all training examples are the same class. So, again, all
weight vectors are optimal: we can fix any weight vector and achieve
an arbitrary margin by adjusting the bias. The standard quadratic
program for fitting an SVM will pick the origin as the optimal weight
vector, while the most pleasing definition of "ball center" may put
the ball center of version space at infinity, but the difference makes
no difference.
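As a quick numeric sanity check on the single-class claim (the data
and the weight vector below are arbitrary made-up choices):

```python
# With a single class, any fixed weight vector achieves an arbitrarily
# large margin once the bias is free to grow. Made-up one-class data.
import numpy as np

X = np.array([[1.0, 0.0], [0.0, 1.0], [0.7, 0.7]])   # all labeled +1
w = np.array([0.3, -0.2])                            # any fixed weight vector

def margin(w, b):
    # geometric margin of sign(w . x + b) on the all-positive data
    return np.min(X @ w + b) / np.linalg.norm(w)

print([margin(w, b) for b in (1.0, 10.0, 100.0)])    # grows without bound
```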
If neither positive nor negative version space contains the origin,
that's when things start to get weird. The optimal weight vector will
still be equidistant from some subset of constraints (usually d+1 of
them in d dimensions). But it may be possible to increase the size of
the inscribed ball by moving the weight vector in some direction, and
doing so will *decrease* the margin, so the ball center no longer
coincides with the optimal-margin hyperplane. To visualize this
situation in the applet, give it any training data (on the unit
circle) that can be separated by a line through the origin.
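Here is a small numeric sketch of that situation, using a made-up
three-point data set on the unit circle that is separable by a line
through the origin (the constants, including the hand-computed optimum
near (8.55, 0), are my own illustrative choices):

```python
# The "weird" unbounded case: data on the unit circle, separable by a
# line through the origin, so version space is an unbounded cone.
import numpy as np

t = np.deg2rad(140.0)
X = np.array([[1.0, 0.0],
              [np.cos(t),  np.sin(t)],
              [np.cos(t), -np.sin(t)]])   # all on the unit circle
y = np.array([1.0, -1.0, -1.0])

def ball_radius(c):
    # distance from c to the nearest version-space constraint; the
    # constraints y_i * (c . x_i - 1) >= 0 have unit-norm normals here
    return np.min(y * (X @ c - 1))

def margin(c):
    # geometric margin of the classifier sign(c . x - 1)
    return np.min(y * (X @ c - 1)) / np.linalg.norm(c)

# The optimal-margin weight vector (found by hand for this data) is
# about (8.549, 0): equidistant from all three constraints.
for v in (8.549, 20.0, 100.0):
    c = np.array([v, 0.0])
    print(v, ball_radius(c), margin(c))
# Moving right grows the inscribed ball but *shrinks* the margin.
```

Scanning v upward from the optimum, the inscribed-ball radius keeps
growing without bound, while the margin falls from about 0.88 toward
the smaller through-the-origin margin of about 0.77.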