MAE Colloquium: Tapomayukh Bhattacharjee (Cornell CS)
Towards Robotic Caregiving: Building robots that work alongside human stakeholders
How do we build robots that can assist people with mobility limitations in activities of daily living? To perform these activities successfully, a robot needs to physically interact with humans and objects in unstructured human environments. In this talk, I will cover projects in my lab that showcase fundamental advances in physical robotic caregiving, a domain involving complex and uncertain physical human-robot interaction. Specifically, I will show how we build caregiving robots for activities of daily living such as feeding, meal preparation, and bed bathing using our newly developed caregiving simulation tools and algorithms that leverage multimodal perception and user feedback, and how we have deployed these systems in the real world with real users.
Bio:
Tapomayukh “Tapo” Bhattacharjee is an assistant professor in the Department of Computer Science at Cornell University, where he directs the EmPRISE Lab. He received his Ph.D. in robotics from the Georgia Institute of Technology and was an NIH Ruth L. Kirschstein NRSA postdoctoral research associate in computer science & engineering at the University of Washington. His goal is to enable robots to assist people with mobility limitations in activities of daily living. His work spans human-robot interaction, haptic perception, and robot manipulation, and addresses the fundamental research question of how to leverage robot-world physical interactions in unstructured human environments to perform relevant activities of daily living. He is the recipient of the TRI Young Faculty Researcher Award and the NSF CAREER Award, and his work has been recognized as a Best Paper Award Finalist at HRI 2024 and with the Best Demo Award at HRI 2024, the Best RoboCup Paper Award at IROS 2022, Best Paper Award Finalist and Best Student Paper Award Finalist at IROS 2022, the Best Technical Advances Paper Award at HRI 2019, and the Best Demonstration Award at NeurIPS 2018. His work has been featured in media outlets including the BBC, Reuters, The New York Times, IEEE Spectrum, and GeekWire, and his robot-assisted feeding work was selected as one of the best interactive designs of 2019 by Fast Company.