Instructors: Chris Atkeson,
cga at cmu
Mike Taylor, mtaylor dot ri at gmail dot com
Time: MW 3-4:20PM
Place: NSH 3002
Articles of Interest
Robot autonomously brings beer from the fridge | NVIDIA Jetson Challenge - YouTube
DIY Text-to-Speech with Raspberry Pi | Hackaday
AI can beat us at games, but sometimes, that's by cheating - MIT Technology Review
A computer was trained to play Qbert and immediately broke the game in a way no human ever has
Learning by playing | DeepMind
If HAL was Alexa
What if Alexa could do more?
Amazon picks Carnegie Mellon Team To Compete in Alexa Challenge.
Who is laughing now?
Google and Nest reunite in push to add AI to every gadget
Disney has begun populating its parks with autonomous, personality-driven robots | TechCrunch
Building a Robotic Colleague With Personality
NYT: Facial Recognition Is Accurate, if You're a White Guy
Why being cute improves a robot's functions
2018: Experimenting with course format.
Goal is for projects to interoperate and persist beyond the end of the course.
A great outcome would be a set of web pages telling others how to do the projects.
Focus on Humanoid Personal Agents. Projects will be components of a
personal agent: conversation, hear, speak, see, gesture, move, think,
interact with web, hardware components, ...
Use some class time for group meeting with instructors.
Let projects drive lectures.
There will be some assignments intermixed with working on the projects.
Assignment 0: Due Jan 20.
Send email to Chris and Mike: What project do you want to do?
Who are you? Done any robotics? Using or creating software agents?
Google and send us some interesting URLs.
Be sure your name is obvious in the email, and you mention the course
name or number in the subject line.
We teach more than one course, and a random email from
firstname.lastname@example.org is hard for us to process.
Assignment 1: Pick an aspect of humanoid personal agents and
implement it. Due Feb 4.
Jan 17: Introduction to the course.
Jan 22: Survey of personal agents. Demo Alexa, Google Home, Jibo.
Survey of humanoid agents.
Jan 24: Continue survey of personal agents.
Jan 29: Continue survey of personal agents.
Jan 31: Case study: Cozmo
Feb 5: How does face recognition work?
Features: pixels to higher-level features. Feature vectors.
Representations: memory-based, hand-designed parametric representations, ...
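As a small taste of the memory-based representations mentioned above, here is a nearest-neighbor sketch: store labeled feature vectors, then classify a new vector by its closest stored neighbor. The gallery names and feature values are made-up examples, not course data.

```python
import math

# Memory-based recognition sketch: store labeled feature vectors and
# classify a query by its nearest (Euclidean) stored neighbor.
# Names and feature values are illustrative assumptions.
gallery = {
    "alice": [0.9, 0.1, 0.4],
    "bob":   [0.2, 0.8, 0.5],
}

def nearest(features):
    """Label of the stored vector closest to `features`."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(gallery, key=lambda name: dist(gallery[name], features))

print(nearest([0.85, 0.2, 0.35]))  # "alice"
```

Real face recognizers use learned feature vectors (e.g. from a neural net), but the lookup step works the same way.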
Feb 7: How does face recognition work?
How do we train representations? (optimization, gradient descent).
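A minimal taste of gradient descent, fitting a 1-D linear model y = w * x by repeatedly stepping down the gradient of the squared error. The data and learning rate below are made up for illustration.

```python
# Gradient descent on a 1-D least-squares fit: y ≈ w * x.
# Data and learning rate are hypothetical.
xs = [0.0, 1.0, 2.0, 3.0]
ys = [0.1, 1.9, 4.2, 5.8]   # roughly y = 2x

w = 0.0      # initial guess
lr = 0.02    # learning rate
for _ in range(500):
    # gradient of sum((w*x - y)^2) with respect to w
    grad = sum(2 * (w * x - y) * x for x, y in zip(xs, ys))
    w -= lr * grad

print(round(w, 2))  # close to the least-squares slope, about 1.98
```

The same loop, with the scalar w replaced by millions of network weights and the gradient computed by backpropagation, is how the representations above are trained.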
What is the best answer? Bayes Theorem.
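Bayes' theorem in three lines of arithmetic, using toy numbers (the detector rates and prior below are invented for illustration):

```python
# Bayes' theorem: P(face | fire) = P(fire | face) * P(face) / P(fire).
# Hypothetical numbers: a face detector fires on 95% of faces,
# on 10% of non-faces, and 20% of patches contain a face.
p_face = 0.20
p_fire_given_face = 0.95
p_fire_given_nonface = 0.10

# Total probability the detector fires (law of total probability).
p_fire = p_fire_given_face * p_face + p_fire_given_nonface * (1 - p_face)

# Posterior: probability the patch really is a face, given a detection.
p_face_given_fire = p_fire_given_face * p_face / p_fire
print(round(p_face_given_fire, 3))  # 0.704
```

Even a fairly accurate detector yields a posterior well below 95% here, because non-faces are common: that is the point of weighting evidence by the prior.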
How does vision work? What about depth cameras?
How do microphones work? Localizing and recognizing sounds.
How can we generate speech?
How does speech recognition work? Markov models and recognition techniques.
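A first-order Markov model in miniature. The two-state "weather" chain below is a stand-in assumption for the phoneme-transition models used in speech recognizers; the transition probabilities are invented.

```python
# First-order Markov chain: the next state depends only on the current one.
# Hypothetical two-state model with made-up transition probabilities.
transitions = {
    "rain": {"rain": 0.7, "sun": 0.3},
    "sun":  {"rain": 0.2, "sun": 0.8},
}

def sequence_prob(states, start_prob=0.5):
    """Probability of a whole state sequence under the chain."""
    p = start_prob
    for prev, cur in zip(states, states[1:]):
        p *= transitions[prev][cur]
    return p

print(sequence_prob(["sun", "sun", "rain"]))  # 0.5 * 0.8 * 0.2
```

Speech recognizers score competing word sequences this way (with hidden states and acoustic likelihoods added), then pick the most probable one.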
Personality and emotion.
Generating facial expressions.
How does emotion recognition work?
Where am I? Kinematics and inverse kinematics. Making gestures. Jacobians.
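A sketch of forward kinematics and Jacobian-based inverse kinematics for a two-link planar arm (link lengths, target, and step size are assumptions; this is the Jacobian-transpose method, one of several IK approaches):

```python
import math

# Two-link planar arm; unit link lengths are an assumption.
L1, L2 = 1.0, 1.0

def fk(t1, t2):
    """Forward kinematics: end-effector (x, y) for joint angles t1, t2."""
    x = L1 * math.cos(t1) + L2 * math.cos(t1 + t2)
    y = L1 * math.sin(t1) + L2 * math.sin(t1 + t2)
    return x, y

def jacobian(t1, t2):
    """2x2 Jacobian d(x, y)/d(t1, t2)."""
    return [[-L1 * math.sin(t1) - L2 * math.sin(t1 + t2), -L2 * math.sin(t1 + t2)],
            [ L1 * math.cos(t1) + L2 * math.cos(t1 + t2),  L2 * math.cos(t1 + t2)]]

# Jacobian-transpose IK: nudge joints in the direction that reduces
# the Cartesian error, dq = alpha * J^T * error.
t1, t2 = 0.3, 0.3
target = (1.2, 0.8)
alpha = 0.05
for _ in range(2000):
    x, y = fk(t1, t2)
    ex, ey = target[0] - x, target[1] - y
    J = jacobian(t1, t2)
    t1 += alpha * (J[0][0] * ex + J[1][0] * ey)
    t2 += alpha * (J[0][1] * ex + J[1][1] * ey)

x, y = fk(t1, t2)
print(round(x, 2), round(y, 2))  # near the target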
Pick and place. Trajectories.
Using function optimization to solve hard problems.
Representing 3D orientation.
Where am I? part2 - Kalman filters and SLAM. Integrating information over
How can a face agent generate expressions and pretend to talk?
Motor skills: dynamics and learning. Acting and learning in simulation.
Model-based learning. Making and learning models.
Learning from practice I - integral control, learn a trajectory.
Standing, walking, and running.
Building a humanoid.
How to use an Arduino.
How to use a Raspberry Pi.
Motors: how can I make my robot move?
Human-robot interaction (HRI).
Skin and tactile sensing. Robot touch and force sensors.
Smell and taste.
Other forms of learning: clustering, learning probabilities. reinforcement learning, learning from demonstration.
Ethics, policy issues
Entertainment robots. Robot playmates.
Safety. Handling errors and robostness.
Things to check out: