16-811: Math Fundamentals for Robotics, Fall 2025

Brief Summaries of Recent Lectures




Lecture 01: 26 August

We discussed course mechanics for about half the lecture.

We looked at the Jacobian of a simple 2-link manipulator. The Jacobian is a matrix that relates differential joint motions to differential motions of the arm's end-effector. Using a virtual work argument we showed how the transpose of this matrix relates desired Cartesian forces imparted by the end-effector to the joint torques necessary to generate those forces. A summary of our discussion appears here.
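As a concrete illustration of these two relations, here is a minimal NumPy sketch (with assumed unit link lengths, not the worked example linked above) of the Jacobian of a planar 2-link arm and the virtual-work mapping from a desired end-effector force to joint torques:

    import numpy as np

    def jacobian_2link(theta1, theta2, l1=1.0, l2=1.0):
        """Jacobian of a planar 2-link arm mapping joint velocities
        (dtheta1, dtheta2) to end-effector velocities (dx, dy).
        Link lengths l1, l2 are illustrative defaults."""
        s1, c1 = np.sin(theta1), np.cos(theta1)
        s12, c12 = np.sin(theta1 + theta2), np.cos(theta1 + theta2)
        return np.array([
            [-l1 * s1 - l2 * s12, -l2 * s12],
            [ l1 * c1 + l2 * c12,  l2 * c12],
        ])

    # Joint torques needed to exert a desired Cartesian force at the tip:
    # tau = J^T F  (the virtual-work relation discussed above).
    J = jacobian_2link(np.pi / 4, np.pi / 3)
    F = np.array([1.0, 0.0])       # desired end-effector force
    tau = J.T @ F                  # corresponding joint torques
    print(tau)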

We further considered the case of multiple fingers gripping an object (or multiple legs standing on some surface) and wrote down a linear equation of the form

-F = Wc

to describe the contact forces c needed to oppose a generalized force F (force and torque) acting on some object. Here W is the so-called wrench matrix that models forces and torques induced by contact forces. A summary of that discussion appears here.
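To make the wrench matrix concrete, here is a small sketch for a planar object with point contacts, each able to apply an arbitrary 2D force (friction constraints ignored). The contact positions, the external wrench, and the use of a least-squares solve are illustrative assumptions, not the exact model from lecture:

    import numpy as np

    def wrench_matrix_2d(contact_points):
        """Wrench matrix for point contacts in the plane, each applying an
        arbitrary force (fx, fy); the resulting wrench is (Fx, Fy, Mz)
        about the origin."""
        cols = []
        for (px, py) in contact_points:
            # Force components map directly; torque is p x f = px*fy - py*fx.
            cols.append(np.array([1.0, 0.0, -py]))   # effect of fx at this contact
            cols.append(np.array([0.0, 1.0,  px]))   # effect of fy at this contact
        return np.column_stack(cols)

    # Two contacts gripping an object; find contact forces c with W c = -F.
    W = wrench_matrix_2d([(1.0, 0.0), (-1.0, 0.0)])
    F = np.array([0.0, -9.8, 0.0])                  # external wrench (e.g., gravity)
    c, *_ = np.linalg.lstsq(W, -F, rcond=None)      # least-squares contact forces
    print(c)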

We discussed the importance of bases in representing linear transformations, and in particular, how the matrix used to represent a given transformation changes as one changes bases.

A basis for a vector space is a set of linearly independent vectors that spans the vector space. This means that every vector in the vector space can be written in a unique way as a finite linear combination of basis vectors.

The representation of a linear function by a matrix depends on two bases: one for the input and one for the output. As we will see in future lectures, choosing these bases well allows one to represent the linear function by a diagonal matrix. Diagonal matrices describe decoupled linear equations, which are very easy to solve.

For next time, please go back to your linear algebra course and remind yourself of the method one uses to generate linear coordinate transformations, represented as matrices: The columns of such a matrix are the vectors of one basis expressed as linear combinations of the other basis's vectors. This example illustrates some of the relevant points.
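As a minimal numerical illustration of this change-of-basis recipe (the matrices below are an assumed example, not the one linked above):

    import numpy as np

    # A linear map given in the standard basis of R^2.
    A = np.array([[2.0, 1.0],
                  [1.0, 2.0]])

    # New basis vectors, written in standard coordinates.  The columns of S
    # express the new basis as linear combinations of the old (standard) basis.
    S = np.array([[1.0,  1.0],
                  [1.0, -1.0]])

    # Representation of the same linear map in the new basis:
    A_new = np.linalg.inv(S) @ A @ S
    print(A_new)   # diagonal here, since the columns of S are eigenvectors of A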

Briefly, at the end of the lecture, we reviewed some facts from linear algebra. We defined the column space, row space, and null space of a matrix (and linear functions more generally). As an example, we considered the matrix linked here.
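These subspaces are easy to explore numerically; the matrix below is an assumed rank-deficient example, not the one linked above:

    import numpy as np
    from scipy.linalg import null_space

    A = np.array([[1.0, 2.0, 3.0],
                  [2.0, 4.0, 6.0]])

    print(np.linalg.matrix_rank(A))   # rank = dimension of the column (and row) space; 1 here
    print(null_space(A))              # orthonormal basis for the null space (2 vectors here)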

Lecture 02: 28 August

We started the lecture with a review of some facts from last time. Subsequently, we discussed some conditions under which a square matrix is not invertible.

During much of the lecture we discussed Gaussian Elimination. We computed the PA = LDU decomposition for some matrices. Such a decomposition exists whenever the columns of A are linearly independent (and may exist more generally for other matrices).

For an invertible matrix A, the PA = LDU decomposition makes solving Ax = b simple:

  • First, solve Ly = Pb for y. This is easy because L is lower-triangular, so one can basically just "read off" the components of y using forward substitution.

  • Second, solve Ux = D⁻¹y for x. This is easy because U is upper-triangular, so one can again "read off" the components of x, now using backward substitution. Also note that D⁻¹ denotes the inverse of the matrix D. This matrix is easy to compute: its entries are all 0, except on the diagonal. Where D has entry d on the diagonal, D⁻¹ has entry 1/d. (A small code sketch of this two-step solve appears after this list.)
  • Here are the examples we considered in lecture.
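Here is a minimal sketch of the two-step solve in SciPy, on an assumed example matrix. Note that scipy.linalg.lu returns factors with A = PLU, so the lecture's PA = LDU form corresponds to applying Pᵀ, and D is pulled out of the diagonal of the upper-triangular factor:

    import numpy as np
    from scipy.linalg import lu, solve_triangular

    # Illustrative matrix and right-hand side (assumed, not the lecture example).
    A = np.array([[2.0, 1.0, 1.0],
                  [4.0, 3.0, 3.0],
                  [8.0, 7.0, 9.0]])
    b = np.array([1.0, 2.0, 3.0])

    # scipy.linalg.lu gives A = P @ L @ U0; pull the diagonal D out of U0
    # to obtain the PA = LDU form from lecture.
    P, L, U0 = lu(A)
    D = np.diag(np.diag(U0))            # diagonal of pivots
    U = np.linalg.inv(D) @ U0           # unit upper-triangular factor

    # Step 1: solve L y = P^T b by forward substitution.
    y = solve_triangular(L, P.T @ b, lower=True)

    # Step 2: solve U x = D^{-1} y by backward substitution
    # (inverting D just means dividing by its diagonal entries).
    x = solve_triangular(U, y / np.diag(D), lower=False)

    print(np.allclose(A @ x, b))        # True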

Near the end of lecture, we started our discussion of diagonalization based on eigenvectors. We will work through an example next week. This method serves as a springboard for the Singular Value Decomposition (SVD), which we will also discuss next week.




