Dr Mehmet Dogar

Profile

I am a University Academic Fellow (~Asst. Prof.) at the School of Computing, University of Leeds. I direct the Robot Manipulation Lab in the school. Previously, I was a postdoctoral researcher at CSAIL, MIT. I received my PhD from the Robotics Institute at Carnegie Mellon University. 

My research focuses on autonomous robotic manipulation. I envision a future where robots autonomously perform complex manipulation tasks in human environments, such as grasping an object from the back of a cluttered shelf, or manufacturing and assembling a complex piece of furniture. I am particularly interested in the use of physics-based predictive models in manipulation planning.

I am a co-chair of the IEEE RAS Technical Committee on Mobile Manipulation. I was an Area Chair for the Robotics: Science and Systems conference in 2017 and 2018, and I am an Associate Editor for IEEE Robotics and Automation Letters.

I was a finalist for the Best Paper Award at IEEE ICRA 2018, at IEEE ICRA 2015, and at IEEE IROS 2010. I received the Best Reviewer Award at IEEE ICRA 2014.

Publications

At the very bottom of this page is a list of my publications. Also see my Google Scholar page.

Projects

  • 2017-2019: EPSRC First Grant, PI, "Multi-Robot Manipulation Planning for Forceful Manufacturing Tasks", £101K. More here.
  • 2017-2019: Horizon 2020 Marie-Curie Individual Fellowship, Fellow, "Robotic Manipulation Planning for Human-Robot Collaboration on Forceful Manufacturing Tasks", £146K.
  • 2018-2020: Horizon 2020 Marie-Curie Individual Fellowship, Supervisor, "Integrating robotic control and planning with human activity prediction for efficient human robot collaborative manipulation", £150K.
  • 2018-2020: EPSRC, Co-I, "Human-like physics understanding for autonomous robots", £303K. More here.

News

  • 06/19: We have a paper on learning physics-based manipulation accepted to IROS 2019.
  • 04/19: We organized the 3rd UK Robot Manipulation Workshop in Leeds on 9-10 April 2019!
  • 09/18: We have one paper (here) accepted to WAFR 2018 and three papers (here, here, and here) accepted to IEEE-RAS Humanoids 2018!
  • 06/18: Our paper on manipulation planning for human-robot collaboration is accepted to IEEE/RSJ IROS 2018. You can read it here.
  • 03/18: Our ICRA 2018 paper (lead author: Guy Rosman, MIT) has been nominated for the Best Paper Award!
  • 03/18: We are organizing the Mobile Manipulation Hackathon at IROS 2018.
  • 05/15: Our work on multi-robot manipulation planning was featured as a spotlight by MIT News. You can read it here.
  • 05/15: Our paper was a finalist for the Best Paper Award and the Best Manipulation Paper Award at ICRA 2015! A draft is available here.

Research interests

My research focuses on autonomous robotic manipulation. I envision a future where robots autonomously perform complex manipulation tasks in human environments, such as grasping an object from the back of a cluttered shelf, or manufacturing and assembling a complex piece of furniture. My manipulation planners use physics-based predictive models. This challenges the existing paradigm, which is based on a geometric representation of the world and is limited to pick-and-place actions. The physics-based approach, on the other hand, enables a robot to interact with the environment through a rich set of actions such as pushing, tumbling, and throwing, as well as pick-and-place.
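As a minimal illustration of what "physics-based" means here (a toy sketch, not one of our actual planners), the loop below forward-simulates a set of candidate pushes with a simple predictive model and commits to the push whose predicted outcome is closest to the goal. The predict_push model, the candidate set, and the greedy selection are all simplifying assumptions for illustration.

    import numpy as np

    # Toy quasi-static push model: assumes the object translates with the push.
    # A real planner would query a full physics model (e.g. a rigid-body simulator).
    def predict_push(obj_pose, push_dir, push_dist):
        x, y, theta = obj_pose
        return np.array([x + push_dist * np.cos(push_dir),
                         y + push_dist * np.sin(push_dir),
                         theta])

    def plan_pushes(start_pose, goal_xy, n_dirs=16, step=0.05, max_steps=50, tol=0.02):
        """Greedy physics-based planning: simulate candidate pushes, keep the best."""
        pose, goal = np.asarray(start_pose, float), np.asarray(goal_xy, float)
        plan = []
        for _ in range(max_steps):
            if np.linalg.norm(pose[:2] - goal) < tol:
                return plan                               # goal reached in simulation
            candidates = np.linspace(0.0, 2.0 * np.pi, n_dirs, endpoint=False)
            outcomes = [predict_push(pose, d, step) for d in candidates]
            best = min(range(n_dirs),
                       key=lambda i: np.linalg.norm(outcomes[i][:2] - goal))
            plan.append((candidates[best], step))         # record (direction, distance)
            pose = outcomes[best]
        return plan                                       # best-effort plan

For example, plan_pushes((0.0, 0.0, 0.0), (0.3, 0.1)) returns a sequence of (direction, distance) pushes predicted to move the object to the goal.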

I am also interested in collaborative manipulation planning, where the task is performed through the collaboration of human-robot or robot-robot teams.

Research Videos


In our recent work, we have looked at learning policies for physics-based manipulation in clutter. Arxiv preprints here and here (appeared at Humanoids 2018 and IROS 2019, respectively). Lead author: Wissam Bejjani, in collaboration with Matteo Leonetti.


We have been developing algorithms for reactive re-planning for manipulation in cluttered environments. The planners developed for this problem so far have been open-loop. The video below compares the performance of our closed-loop approach. Appeared at IEEE-RAS Humanoids 2018; arxiv preprint (lead author: Wisdom Agboh) here.
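The core difference between open-loop and closed-loop execution can be summarised in a few lines. The sketch below is a generic re-planning loop under assumed observe/plan/execute interfaces, not the actual controller from the paper: after every executed action the robot re-observes the scene and re-plans from the observed state.

    def closed_loop_execution(observe, plan, execute, at_goal, max_iters=100):
        """Generic closed-loop manipulation loop (assumed interfaces):
        observe() -> state, plan(state) -> list of actions,
        execute(action) runs one action, at_goal(state) -> bool."""
        for _ in range(max_iters):
            state = observe()            # sense the current, possibly disturbed, scene
            if at_goal(state):
                return True
            actions = plan(state)        # re-plan from the observed state
            if not actions:
                return False             # planning failed
            execute(actions[0])          # execute only the first action, then loop
        return False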


Imagine reaching into a fridge shelf that is crowded with fragile objects, such as glass jars and containers. You move slowly and carefully. However, if the fridge shelf is almost empty, with only a few plastic containers that are difficult to break, you move faster with less care. In this work (lead author: Wisdom Agboh) we develop a model predictive control framework for robot pushing that adapts to the required task accuracy, pushing fast when the task allows it and slowly when the robot needs to be more careful. Appeared at WAFR 2018; arxiv preprint here.
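A toy way to express this speed/accuracy trade-off (an illustration only, not the algorithm from the paper) is a one-step model-predictive choice of push speed: try the fastest speed first and accept it only if the model's predicted outcome, inflated by its prediction uncertainty, still lands inside the task tolerance. The predict interface and the two-sigma inflation are assumptions.

    import numpy as np

    def adaptive_push_speed(obj_pos, goal_pos, tolerance, predict,
                            speeds=(0.4, 0.2, 0.1, 0.05)):
        """predict(obj_pos, speed) -> (predicted_pos, predicted_std) is an assumed
        physics model whose prediction uncertainty grows with push speed."""
        obj_pos, goal_pos = np.asarray(obj_pos, float), np.asarray(goal_pos, float)
        for speed in speeds:                               # fastest candidates first
            pred_pos, pred_std = predict(obj_pos, speed)
            worst_case = np.linalg.norm(pred_pos - goal_pos) + 2.0 * pred_std
            if worst_case <= tolerance:                    # fast push is accurate enough
                return speed
        return speeds[-1]                                  # otherwise push slowly and carefully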


Imagine grasping a wooden board while your friend drills holes into it and cuts pieces off of it. You would predict the forces your friend will apply to the board and choose your grasps accordingly; for example, you would rest your palm firmly against the board to hold it stable against the large drilling forces. We developed (lead author: Lipeng Chen, appeared at IROS 2018) a manipulation planner to enable a robot to grasp objects similarly. Arxiv preprint (2018) here.
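A minimal sketch of the underlying idea (with an assumed can_resist stability check, e.g. a force-closure or friction-cone test, rather than the paper's actual planner): predict the task forces first, then keep only the grasps that can resist all of them.

    def select_forceful_grasp(candidate_grasps, predicted_task_forces, can_resist):
        """candidate_grasps: list of grasp descriptions (hand pose, contacts, ...).
        predicted_task_forces: force/torque vectors the task is expected to apply
        (e.g. drilling and cutting forces on the board).
        can_resist(grasp, force) -> bool is an assumed grasp-stability check."""
        for grasp in candidate_grasps:
            if all(can_resist(grasp, f) for f in predicted_task_forces):
                return grasp        # first grasp stable against every predicted force
        return None                 # no single grasp works: regrasping must be planned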


At MIT, we developed a multi-robot system to perform multi-step assembly tasks. The system can perform collaborative operations, e.g. two robots carrying a heavy part together. The system is also able to shift between coarse manipulation operations, such as transport, and fine operations, such as part alignment and fastener insertion. Read more about this study in our ISER 2014 paper. Our paper about multi-robot manipulation planning was a finalist for the Best Paper Award and the Best Manipulation Paper Award at ICRA 2015! A draft is available here.


Tactile sensors provide contact information, essential for making physics-based predictions. In this work we use tactile sensing to localize objects during pushing actions.
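One common way to use such contact information (a generic particle-filter sketch under an assumed contact_likelihood sensor model, not necessarily the estimator used in this work) is to maintain a set of candidate object poses and reweight them whenever a contact is sensed:

    import numpy as np

    def tactile_update(particles, weights, contact_point, contact_likelihood,
                       rng=np.random.default_rng()):
        """particles: (N, 3) array of candidate object poses (x, y, theta).
        contact_likelihood(pose, contact_point) -> float is an assumed sensor model
        scoring how consistent a pose is with the sensed contact location."""
        weights = weights * np.array([contact_likelihood(p, contact_point)
                                      for p in particles])
        weights = weights / weights.sum()                  # renormalize
        # Resample to concentrate particles on poses consistent with the contact.
        idx = rng.choice(len(particles), size=len(particles), p=weights)
        return particles[idx], np.full(len(particles), 1.0 / len(particles))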


Human environments are cluttered and robots regularly need to solve manipulation problems by moving objects out of the way to reach other objects. The video below shows the PR2 using my push-grasp planner to contact multiple objects simultaneously and grasp objects through clutter:


I developed an algorithm to rearrange clutter using a library of actions including pushing. The planner can move objects that are not movable by pick-and-place actions, e.g. large or heavy objects. Here is a video where HERB pushes a large box out of the way: 


You can see HERB executing push-grasps in the following video. In contrast to treating grasping as an instantaneous event, I model grasping as a process where the object is pushed along by the hand before the fingers close on it. Push-grasping is robust to large uncertainty in object pose, because the uncertainty can be funneled into the hand during pushing. My work showed that we can pre-compute these funnels, called capture regions. Read more about this study in our IROS 2010 paper.
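A simplified way to picture how a pre-computed capture region is used at planning time (the membership test over sampled object poses below is an assumption for illustration, not the exact machinery of the paper): accept a push-grasp only if every pose consistent with the current uncertainty lies inside the region.

    def push_grasp_is_safe(capture_region, pose_samples, in_region):
        """pose_samples: object poses sampled from the pose-uncertainty distribution,
        expressed in the hand frame. in_region(capture_region, pose) -> bool is an
        assumed membership test for the pre-computed capture region.
        The push-grasp is accepted only if every sampled pose would be funneled
        into the hand, i.e. the whole uncertainty region is captured."""
        return all(in_region(capture_region, pose) for pose in pose_samples)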


Here is a demo our lab at CMU put together where our robot HERB microwaves a meal: 


 

Student education

I teach introductory Robotics.

Research groups and institutes

  • Artificial Intelligence

Publications:
