This will be the 2nd workshop entitled “Can we build Baymax?”; the 1st workshop was held at Humanoids 2015 in Korea. Baymax is the soft humanoid character in Disney's feature animation "Big Hero 6": a healthcare robot with an inflatable body, capable of walking, bumping into surrounding objects, and physically interacting with people. In the real world, however, such a robot is not easy to build. Realizing it requires awareness of the environment, especially a sense of touch and vision. Covering the robot with soft material to protect both humans and the robot itself is also required, and integrating these features into actual hardware using reasonable fabrication methods is a challenging task. In this workshop, we will discuss topics related to building robots like Baymax, including but not limited to the implementation of skin sensors, methods to protect humans and robots, and the fabrication of soft skin for humanoids.


09:00 - 09:30 Introduction by the Organizers
09:30 - 10:00 Chris Atkeson
Optical Robot Skin and Whole Body Vision
One way to achieve high-resolution tactile sensing is to use imaging to measure skin deformation. If the skin is fully transparent, it is also possible to image nearby objects and provide a "proximity" sense that builds a map of nearby objects and surfaces. This talk describes first attempts at implementing optical skin for soft robots.
10:00 - 10:30 Gordon Cheng
Making Hard Robots Actively Soft
Conventional robots are hard and unsafe for real physical interaction - and not soft. Efforts with joint force/torque control have introduced a way to interact with robots, yielding a level of compliance that makes them a bit softer. In our latest work, we offer yet another level of compliance, providing softness at the surface level - making robots actively soft. In this talk, I will present our work through all the stages of creating this new level of surface softness control.
10:30 - 11:00 Coffee Break
11:00 - 11:30 Hee Sup Shin
Toward A Soft Sensor Skin
In this talk, we introduce two different soft sensors: a tactile sensor and a strain sensor. Both sensors were microfabricated using soft materials, resulting in compact sensors with high dynamic range. The design, fabrication, and characterization of the sensors will be presented in detail. In addition, two different methods to fabricate a soft skin will be introduced: integrating the strain sensors into a skin, and manufacturing large-area skins using computerized numerical control (CNC) milling.
11:30 - 12:00 Rudan János
OptoForce Multi-Axis Force and Torque Sensors for Use in Humanoid Robots
As humanoid robots achieve higher and higher complexity, force and torque sensing will play a crucial role on multiple levels. Multi-axis force sensors integrated into the feet can provide essential feedback about the layout of the ground, the ground-robot contact, and the balance of the robot. Torque sensors in the joints can provide sensory input about the body posture and the execution of planned movements. Small, sensitive force sensors in the hand and fingers can increase the robot's capability to interact with the environment properly. In the talk, we will cover these applications of multi-axis force/torque sensing and show how OptoForce sensors can be applied in each of these setups.
12:00 - 12:30 Akihiko Yamaguchi
Optical Soft Skin For Soft Object Manipulation
Our recent research challenge is robot cooking. Robotic technologies for cooking need to go beyond rigid-object manipulation: cooking involves many different types of manipulation, such as deformable (soft) object handling and chemical and thermal processes. We are taking a skill-library approach in which we have many different strategies (skills) for handling difficult manipulation tasks; for example, tipping, shaking, and squeezing skills for pouring. Selection of skills and adjustment of skill parameters are done with model-based reinforcement learning. One big challenge is increasing the sensing capability: with a lack of sensors, generating behaviors becomes much harder. We propose whole-body vision. In this talk, we focus on a specific version, optical skin for robot fingers. The optical skin is a camera-based tactile sensor that perceives tactile information (a force field) and proximity vision. This multimodality is useful in soft-object manipulation. Related papers and videos are available at akihikoy.net
12:30 - 14:00 Lunch
14:00 - 14:30 Shuuji Kajita
Fall Experiments of a Humanoid Robot HRP-2 Kai with an Airbag
In this talk, we explain some technical details related to "Impact Acceleration of Falling Humanoid Robot with Airbag," which will be presented at Humanoids 2016. Preliminary experiments using commercial shock-absorbing materials, and a comparison with another airbag system, will be discussed.
14:30 - 15:00 Marc Killpack
Soft Robot Variable Stiffness Control, Failure Mitigation, and Tactile Sensor Design
Soft robots have the potential to change the way robots interact with the world and with humans because of their low inertia and potentially low cost. In this workshop talk, I will focus on our recent results in simultaneously controlling the position and stiffness of an antagonistic, pneumatically actuated soft robot joint. I will also present results on how varying stiffness can increase the operational lifetime of a soft pneumatic robot with a leak without directly sacrificing end-effector accuracy. Finally, I will present our basic design for a tactile sensing grid that is easy to manufacture and integrate with soft robots.
15:00 - 15:30 Jinoh Lee
Development of a Modular and Active Impact Protection System for Falling Humanoids
In this talk, we demonstrate a newly developed active-soft impact protection system specialized for humanoids, in which a soft inflating vessel design, suitable for use in the impact-reduction mechanism, is combined with a miniaturized active pressure control unit with on-off solenoid valves. Owing to its modular design strategy, the developed impact protection system can be applied to a variety of practical humanoid robots. As a preliminary case study, the soft inflating vessel is designed to be mounted on the hands of the humanoid robot WALK-MAN. Details of the design, fabrication, experimental verification, and falling experiments will be covered.
15:30 - 16:00 Coffee Break
16:00 - 16:30 Masayuki Inaba
16:30 - 17:00 Joohyung Kim
Towards Huggable Robots
In this talk, we show our efforts to implement huggable robots, including the design and fabrication of our soft skins and our hugging study with children. We will also present a new arm and hand system covered with airbags for soft interaction.
17:00 - 18:00 Closing Discussion


Gordon Cheng

Gordon Cheng holds the Chair of Cognitive Systems, with regular teaching activities and lectures, and is Founder and Director of the Institute for Cognitive Systems, Faculty of Electrical and Computer Engineering, Technical University of Munich, Munich, Germany. He is also the coordinator of the Center of Competence Neuro-Engineering in the Department of Electrical and Computer Engineering.

Formerly, he was Head of the Department of Humanoid Robotics and Computational Neuroscience at ATR Computational Neuroscience Laboratories, Kyoto, Japan, and Group Leader for the newly initiated JST International Cooperative Research Project (ICORP), Computational Brain. He has also served as a Project Leader/Research Expert for the National Institute of Information and Communications Technology (NICT) of Japan, and is involved (as an adviser and as an associated partner) in a number of major European Union projects.

Over the past ten years, Gordon Cheng has been the co-inventor of approximately 20 patents and the author of approximately 250 technical publications, proceedings, editorials, and book chapters.

Masayuki Inaba

Masayuki Inaba is a Professor at the Department of Creative Informatics in the Graduate School of Information Science and Technology, The University of Tokyo. He graduated from the Department of Mechanical Engineering at The University of Tokyo in 1981, and received M.S. and Ph.D. degrees from the Graduate School of Information Engineering at The University of Tokyo in 1983 and 1986, respectively. He was appointed as a lecturer in the Department of Mechanical Engineering at The University of Tokyo in 1986, an associate professor in 1989, a professor in the Department of Mechano-Informatics in 2000, and also a professor in the new Department of Creative Informatics since 2005. He directs the JSK Robotics Lab at The University of Tokyo. His research interests include key technologies of robotic systems and software architectures to advance robotics research. His research projects have included hand-eye coordination in rope handling, a vision-based robotic server system, the remote-brained robot approach, whole-body behaviors in humanoids, a robot sensor suit with electrically conductive fabric, flexible spined humanoids and developmental JSK mother projects with the remote-brained system environment, life-size assistive humanoids, the musculoskeletal spined humanoid series, whole-body soft sensor tissues, IRT home assistance with personal mobility, open-source robotics middlewares, and high-speed, high-power legs for the next generation of humanoids.

Shuuji Kajita

I received M.E. (1985) and Dr.E. (1996) degrees in control engineering from Tokyo Institute of Technology, Japan. In 1985, I joined the Mechanical Engineering Laboratory, Ministry of International Trade and Industry, and from 1996 to 1997 I was a Visiting Researcher at the California Institute of Technology. Currently, I am a senior researcher at the National Institute of Advanced Industrial Science and Technology, Tsukuba, Japan, which was reorganized from AIST-MITI in April 2001. My research interests include robotics and control theory. I am a member of the Society of Instrument and Control Engineers, the Robotics Society of Japan, and IEEE (Robotics and Automation Society).

Marc Killpack

Marc Killpack has been an assistant professor in the Department of Mechanical Engineering at Brigham Young University (BYU) since 2013. His lab was awarded a NASA Early Career Faculty award, which has funded their research on soft robots. His current research interests relate to improving modeling and control for robot manipulation in unstructured and dynamic environments, with applications to space exploration, search and rescue, disaster response, and human-robot interaction. Marc completed his Ph.D. in Robotics in the Healthcare Robotics Lab (HRL) at the Georgia Institute of Technology. Prior to joining HRL, he completed master's degrees in Mechanical Engineering in 2008 from both Georgia Tech and AM ParisTech (formerly ENSAM) in Metz, France. In 2007, he graduated with a Bachelor of Science in Mechanical Engineering from Brigham Young University.

János Rudan

János Rudan is a product consultant at OptoForce. He holds an MSc in robotics and a PhD in computer science and has been with OptoForce since 2014. He is an expert in multi-axis force and force/torque sensing, robotics, and computer-based control.

Hee-Sup Shin

Hee-Sup Shin received the B.S. degree in mechanical engineering from Korea University, Seoul, South Korea, in 2013 and the M.S. degree in mechanical engineering from Carnegie Mellon University, Pittsburgh, PA, USA, in 2015. He is currently pursuing the Ph.D. degree in mechanical engineering at the University of Maryland, College Park, MD, USA. His research interests include smart materials for sensors, development of soft sensors for small-scale robots, and design of sensing skins for aircraft.

Akihiko Yamaguchi

Akihiko Yamaguchi received the BE degree from Kyoto University, Kyoto, Japan, in 2006, and the ME and PhD degrees from the Nara Institute of Science and Technology (NAIST), Nara, Japan, in 2008 and 2011, respectively. From April 2010 to July 2011, he was with NAIST as a Japan Society for the Promotion of Science (JSPS) Research Fellow. From August 2011 to March 2015, he was with NAIST as an Assistant Professor in the Robotics Laboratory of the Graduate School of Information Science. From April 2014 to March 2015, he was a visiting scholar at the Robotics Institute of Carnegie Mellon University, and since April 2015 he has been a postdoctoral fellow at the same institute. His research interests include robot learning, reinforcement learning, artificial intelligence, and manipulation of deformable objects, especially pouring.



Christopher G. Atkeson

I am a Professor in the Robotics Institute and Human-Computer Interaction Institute at CMU. I received the M.S. degree in Applied Mathematics (Computer Science) from Harvard University and the Ph.D. degree in Brain and Cognitive Science from M.I.T. I joined the M.I.T. faculty in 1986, moved to the Georgia Institute of Technology College of Computing in 1994, and moved to CMU in 2000. I have received an NSF Presidential Young Investigator Award, a Sloan Research Fellowship, and a Teaching Award from the MIT Graduate Student Council.


Joohyung Kim

Joohyung Kim is currently an Associate Research Scientist at Disney Research, Pittsburgh. His research interests include the implementation of robots based on animation characters, soft human-robot interaction, balancing and walking control for humanoid robots, and novel mechanisms for legged locomotion. He received BSE and Ph.D. degrees in Electrical Engineering and Computer Science from Seoul National University, Korea, in 2001 and 2012, respectively. Prior to joining Disney Research, he was a postdoctoral fellow at the Robotics Institute of Carnegie Mellon University for the DARPA Robotics Challenge in 2013. From 2009 to 2012, he was a senior engineer at Samsung Electronics, Korea, developing biped walking controllers for humanoid robots.

Jinoh Lee

Jinoh Lee was born in Seoul, South Korea. He received the B.Sc. degree in mechanical engineering from Hanyang University, Seoul, South Korea, in 2003 (awarded Top 2% Academic Excellence) and the M.Sc. and Ph.D. degrees in mechanical engineering from the Korea Advanced Institute of Science and Technology (KAIST), Daejeon, South Korea, in 2012. In 2012, he joined the Department of Advanced Robotics, Istituto Italiano di Tecnologia (IIT), Genoa, Italy, as a postdoctoral researcher, and from 2013 to 2014 he held a competitive grant from the National Research Foundation (NRF) of the Korean Government under the ‘Fostering Next Generation Researchers Program’.

He is currently a Senior Postdoctoral Researcher involved in projects such as Safe and Autonomous Physical Human-Aware Robot Interaction (SAPHARI) and Whole-body Adaptive Locomotion and Manipulation (WALK-MAN), funded under the European Community's 7th Framework Programme. In particular, he was a member of the WALK-MAN team participating in the DARPA Robotics Challenge (DRC) Finals on 5-6 June 2015 in Pomona, USA, where he contributed to developing various manipulation skills on the humanoid. His research concerns whole-body manipulation of humanoid robots, compliant robotic system control for safe human-robot interaction, dual-arm dexterous manipulation, robust control of highly nonlinear systems, and smart and soft actuators. Since 2014, Dr. Lee has been a member of the Technical Committee on Robotics (TC 4.3) of the International Federation of Automatic Control (IFAC). He is also a member of the IEEE Robotics and Automation, Control Systems, and Industrial Electronics Societies, and of the Program Committee of the 2016 IEEE International Conference on Robotics and Biomimetics (ROBIO).

Katsu Yamane

Dr. Katsu Yamane is a Senior Research Scientist at Disney Research. He received his PhD in mechanical engineering from the University of Tokyo in 2002. Prior to joining Disney in 2008, he was a postdoctoral fellow at Carnegie Mellon University and a faculty member at the University of Tokyo. His research interests include humanoid robot control and motion synthesis, physical human-robot interaction, character animation, and human motion simulation.
