I am a Research Scientist and Tech Lead Manager at Meta Reality Labs Research in Redmond. At Reality Labs, I lead an embodied AI team whose vision is to develop contextual Augmented Reality (AR) systems that can extend their users' capabilities.
We adopt an embodied AI framing for contextual AR assistance. In particular, we model the AR system as an intelligent agent that perceives the user and their environment through various sensors and outputs actions in service of the user's goals and in line with their preferences.
Prior to joining Meta, I obtained my PhD at the Robotics Institute, Carnegie Mellon University, where I worked with Stelian Coros and Jim McCann.
For my thesis, I devised human-AI systems that enabled casual users to build and program robots, with the goal of making robotics more accessible. In the past, I have also had not-so-brief stints in human motor control and biomechanical simulation, working with Hartmut Geyer and Jessica K. Hodgins.
Please get in touch if you're interested in applying for an internship at Reality Labs Research.
My current research focuses on leveraging representation learning, planning, and reinforcement learning techniques, combined with an embodied AI framing, to develop AR assistance models. This framing lets us break the problem of computing AR/VR assistance into two main parts: (a) inferring the user's goals and context from a sequence of multi-modal sensor observations, and (b) learning goal-conditioned system-action policies.
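To make this decomposition concrete, here is a minimal, hypothetical Python sketch. The module and field names (Observation, GoalInferenceModel, AssistancePolicy) are illustrative placeholders rather than our actual system, and each learned component is stubbed out so the sketch stays self-contained.

```python
from dataclasses import dataclass
from typing import Any, Dict, Sequence


@dataclass
class Observation:
    """One multi-modal sensor reading (fields are illustrative placeholders)."""
    video_frame: Any  # egocentric RGB frame
    imu: Any          # head-mounted IMU sample


class GoalInferenceModel:
    """(a) Infer the user's goals and context from an observation history."""

    def infer(self, history: Sequence[Observation]) -> Dict[str, str]:
        # In practice this would be a learned multi-modal encoder; here we
        # return a fixed placeholder so the sketch runs on its own.
        return {"goal": "tidy_kitchen", "current_action": "wiping_counter"}


class AssistancePolicy:
    """(b) Goal-conditioned policy mapping (goal, latest observation) to a system action."""

    def act(self, goal: Dict[str, str], obs: Observation) -> str:
        # e.g., surface a reminder, highlight an object, or suggest a next step.
        return f"suggest_next_step:{goal['goal']}"


def assistance_step(history: Sequence[Observation],
                    inference: GoalInferenceModel,
                    policy: AssistancePolicy) -> str:
    """One decision cycle: infer the user's goal/context, then choose an action."""
    goal = inference.infer(history)
    return policy.act(goal, history[-1])
```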
Goal and Context Inference for AR Assistance
We leverage self-supervised learning and multi-modal sensors (egocentric video + IMU) to understand the user's goals and context in AR, such as their current task and actions (see the sketch below).
ICCV EPIC Workshop 2021
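One common self-supervised recipe for pairing these modalities is a contrastive (InfoNCE-style) objective that aligns embeddings of time-synchronized video and IMU clips. The PyTorch sketch below is a generic illustration of that idea, assuming hypothetical video and IMU encoders that each produce (batch, dim) embeddings; it is not necessarily the exact objective used in the paper.

```python
import torch
import torch.nn.functional as F


def video_imu_contrastive_loss(video_emb: torch.Tensor,
                               imu_emb: torch.Tensor,
                               temperature: float = 0.07) -> torch.Tensor:
    """InfoNCE-style loss: clip i of each modality is the positive pair,
    and all other clips in the batch serve as negatives."""
    video_emb = F.normalize(video_emb, dim=-1)   # (batch, dim)
    imu_emb = F.normalize(imu_emb, dim=-1)       # (batch, dim)
    logits = video_emb @ imu_emb.t() / temperature
    targets = torch.arange(logits.size(0), device=logits.device)
    # Symmetric cross-entropy over video->IMU and IMU->video matchings.
    return 0.5 * (F.cross_entropy(logits, targets) +
                  F.cross_entropy(logits.t(), targets))


# Usage with hypothetical encoders (not the actual models from the paper):
# loss = video_imu_contrastive_loss(video_encoder(clips), imu_encoder(signals))
```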
Towards Goal-Conditioned Assistance in VR
We compute assistance policies for a house-cleaning task in VR and evaluate them at scale by deploying these policies in a web-based version of the AI Habitat simulator.
My PhD research aimed to make robotics more accessible to casual users by reducing the domain knowledge required to design and build robots. Towards this end, I developed several interactive human-in-the-loop AI systems that enable users to design the structure and behavior of diverse robots.
Interactive AI System for Articulated Robot Design
This tool allows novices to create custom articulated robots such as manipulators and walking robots. It supports both manual and automatic design, and enables design testing using physics-based simulation.
Interactive AI System for Non-articulated Robot Design
This tool enables novices to create smart IoT devices with embedded sensors using digital fabrication. It automatically finds an assembly-aware packing of components within the device and exports the necessary geometries for 3D printing.
UIST 2018
Semantic Design of Expressive Robot Behaviors
Can we design complex robot behaviors, such as walking, based on the emotions that those behaviors evoke?
Action Dynamics Task Graphs for Learning Plannable Representations of Procedural Tasks Weichao Mao, Ruta Desai, Michael Louis Iuzzolino, and Nitin Kamra AAAI Workshop on User-Centric AI for Assistance in At-Home Tasks, Washington DC, USA (2023).
Coming soon!
Egocentric Scene Context for Human-centric Environment Understanding from Video Tushar Nagarajan, Santhosh Kumar Ramakrishnan, Ruta Desai, James Hillis, and Kristen Grauman Under Review at the Conference on Computer Vision and Pattern Recognition (CVPR) (2023).
Episodic Memory Question Answering Samyak Datta, Sameer Dharur, Vincent Cartillier, Ruta Desai, Mukul Khanna, Dhruv Batra, and Devi Parikh The Conference on Computer Vision and Pattern Recognition (CVPR), New Orleans, USA (2022).
How You Move Your Head Tells What You Do: Self-supervised Video Representation Learning with Egocentric Cameras and IMU Sensors Satoshi Tsutsui, Ruta Desai, and Karl Ridgeway International Conference on Computer Vision (ICCV), EPIC Workshop (2021).
Towards Inferring Cognitive State Changes from Pupil Size Variations in Real World Conditions Naga Venkata Kartheek Medathati, Ruta Desai, and James Hillis ACM Symposium on Eye Tracking Research and Applications (ETRA), Stuttgart, Germany (2020). PDF | bib
Human-AI Collaborative and Mixed-Initiative Systems, Robotics
Effective Baselines for Multiple Object Rearrangement Planning in Partially Observable Mapped Environments Engin Tekin, Elaheh Barati, Nitin Kamra, and Ruta Desai AAAI Workshop on User-Centric AI for Assistance in At-Home Tasks, Washington DC, USA (2023). arXiv | bib
Cross-Domain Imitation Learning via Semantic Skills Karl Pertsch, Ruta Desai, Franziska Meier, Vikash Kumar, Dhruv Batra, and Akshara Rai Conference on Robot Learning (CoRL), Auckland, New Zealand (2022).
Optimizing the Timing of Intelligent Suggestion in Virtual Reality Difeng Yu, Ruta Desai, Ting Zhang, Hrvoje Benko, Tanya R. Jonker, and Aakar Gupta ACM User Interface Software and Technology Symposium (UIST), Bend, USA (2022).
Geppetto: Enabling Semantic Design of Expressive Robot Behaviors Ruta Desai, Fraser Anderson, Justin Matejka, Stelian Coros, James McCann, George Fitzmaurice, and Tovi Grossman ACM Conference on Human Factors in Computing Systems (CHI), Glasgow, UK (2019). Best paper award (top 1%) [Details] PDF | bib | Video | Fastforward | Supplementary (PDF)
Assembly-aware Design of Printable Electromechanical Devices Ruta Desai, James McCann, and Stelian Coros ACM User Interface Software and Technology Symposium (UIST), Berlin, Germany (2018). PDF | bib | Video | Fastforward
Skaterbots: Optimization-based Design and Motion Synthesis for Robotic Creatures with Legs and Wheels Moritz Geilinger, Roi Poranne, Ruta Desai, Bernhard Thomaszewski, and Stelian Coros ACM Transactions on Graphics (Proc. ACM SIGGRAPH), Vancouver, Canada (2018).
Automatic Design of Task-specific Robotic Arms Ruta Desai, Margarita Safonova, Katharina Muelling, and Stelian Coros ICRA Workshop on Autonomous Robot Design, Brisbane, Australia (2018).
Interactive Co-Design of Form and Function for Legged Robots using the Adjoint Method Ruta Desai, Beichen Li, Ye Yuan, and Stelian Coros International Conference on Climbing and Walking Robots (CLAWAR), Panama City, Panama (2018).
Computational Abstractions for Interactive Design of Robotic Devices Ruta Desai, Ye Yuan, and Stelian Coros IEEE International Conference on Robotics and Automation (ICRA), Singapore (2017). PDF | bib | Slides | Video | Code
Robot models for fabrication: car, walking robot
3D Printing Pneumatic Device Controls with Variable Activation Force Capabilities Marynel Vazquez, Eric Brockmeyer, Ruta Desai, Chris Harrison, and Scott E. Hudson ACM Conference on Human Factors in Computing Systems (CHI), Seoul, Korea (2015).
Virtual Model Control for Dynamic Lateral Balance Ruta Desai, Hartmut Geyer, and Jessica K. Hodgins IEEE International Conference on Humanoid Robots (Humanoids), Madrid, Spain (2014).
Integration of an Adaptive Swing Control into a Neuromuscular Human Walking Model Seungmoon Song, Ruta Desai, and Hartmut Geyer IEEE Engineering in Medicine and Biology Society (EMBC), Osaka, Japan (2013).
Muscle-Reflex Control of Robust Swing Leg Placement Ruta Desai and Hartmut Geyer IEEE International Conference on Robotics and Automation (ICRA), Karlsruhe, Germany (2013).
Robust Swing Leg Placement under Large Disturbances Ruta Desai and Hartmut Geyer IEEE International Conference on Robotics and Biomimetics (ROBIO), Guangzhou, China (2012).
Optimal Assistance for Object-Rearrangement Tasks in Augmented Reality Benjamin A. Newman, Kevin T. Carlberg, Ruta Desai, James Hillis US Patent US20220114366 A1 (2022).
I am always looking for ways to support and encourage women in science and technology. In the past:
I have led roadshows and volunteered to teach middle school girls at TechNights at CMU. We have smashed computers, battled each other in code, and even gazed at stars through apps!
I helped organize OurCS 2015, a conference to encourage undergraduate women to pursue research. I was also a part of the Google Anita Borg Scholars Alumni Planning Committee.
Selected Press
TechCrunch, New toolkit makes it easy to drag and drop your own robot, 2017.