I am a Research Scientist at Facebook Reality Labs (formerly Oculus Research) in Redmond. I am currently exploring contextual and adaptive approaches for interactions with future augmented reality (AR) devices. I obtained my PhD in Fall 2018 at the Robotics Institute, Carnegie Mellon University working with Stelian Coros and Jim McCann. My research interests lie in applying optimization and data-driven techniques to solve complex real-world problems.
During my PhD, I built computational design tools that enable casual users to build robots, with the goal of increasing the accessibility of robotics. In the past, I have also had not-so-brief stints in the areas of human motor control and biomechanical simulation. During my Masters, I worked with Hartmut Geyer on understanding human control of leg placement. I have also spent some time trying to understand human balance and motor skill acquisition in people learning bongo boarding with Jessica K. Hodgins as an intern at Disney Research Pittsburgh. In 2017, I spent a wonderful summer with the User Interface Research group at Autodesk Research, Toronto, where I worked with Fraser Anderson, Justin Matejka, and Tovi Grossman on a data-driven design tool for robots.
My PhD research aimed to make robotics more accessible to casual users by reducing the domain knowledge required to design and build robots. Towards this end, I developed several interactive design tools that enable users to specify both the structure and the behavior of diverse robots.
Interactive tool for articulated robot design
This tool allows novices to create custom articulated robots such as manipulators and walking robots. It supports both manual and automatic design, and enables design testing using physics-based simulation.
Assembly-aware design of printable electromechanical devices
This tool enables novices to create smart IoT devices with embedded sensors using digital fabrication. It automatically finds an assembly-aware packing of components within the device and exports the geometries necessary for 3D printing.
Semantic design of expressive robot behaviors
Can we design complex robot behaviors, such as walking gaits, based on the emotions that those behaviors evoke?
Geppetto: Enabling Semantic Design of Expressive Robot Behaviors Ruta Desai, Fraser Anderson, Justin Matejka, Stelian Coros, James McCann, George Fitzmaurice, and Tovi Grossman ACM Conference on Human Factors in Computing Systems (CHI), Glasgow, UK (2019). PDF | Video | Fastforward | Supplementary (PDF)
Assembly-aware Design of Printable Electromechanical Devices Ruta Desai, James McCann, and Stelian Coros ACM User Interface Software and Technology Symposium (UIST), Berlin, Germany (2018). PDF | bib | Video | Fastforward
Skaterbots: Optimization-based Design and Motion Synthesis for Robotic Creatures with Legs and Wheels Moritz Geilinger, Roi Poranne, Ruta Desai, Bernhard Thomaszewski, and Stelian Coros ACM Transactions on Graphics (Proc. ACM SIGGRAPH), Vancouver, Canada (2018).
Interactive Co-Design of Form and Function for Legged Robots using the Adjoint Method Ruta Desai, Beichen Li, Ye Yuan and Stelian Coros International Conference on Climbing and Walking Robots (CLAWAR), Panama City, Panama (2018). Best technical paper (second place)
Computational Abstractions for Interactive Design of Robotic Devices Ruta Desai, Ye Yuan and Stelian Coros IEEE International Conference on Robotics and Automation (ICRA), Singapore (2017). PDF | bib | Slides | Video
Robot models for fabrication: car, walking robot
3D Printing Pneumatic Device Controls with Variable Activation Force Capabilities Marynel Vazquez, Eric Brockmeyer, Ruta Desai, Chris Harrison and Scott E. Hudson ACM Conference on Human Factors in Computing Systems (CHI), Seoul, Korea (2015).
Integration of an Adaptive Swing Control into a Neuromuscular Human Walking Model Seungmoon Song, Ruta Desai and Hartmut Geyer IEEE Engineering in Medicine and Biology Society (EMBC), Osaka, Japan (2013).