
Food Manipulation

Built a custom data collection setup and gathered food manipulation data to train a perception model.

Jun. 2017 - May 2019


 

Publication

[J-5] T. Bhattacharjee, G. Lee, H. Song, S. S. Srinivasa, “Towards Robotic Feeding: Role of Haptics in Fork-Based Food Manipulation,” IEEE Robotics and Automation Letters (RA-L), 2019; presented at the 2019 IEEE International Conference on Robotics and Automation (ICRA). [pdf] [video] [media]

[J-2] T. Bhattacharjee, H. Song, G. Lee, S. S. Srinivasa, “A Dataset of Food Manipulation Strategies,” International Journal of Robotics Research, 2018. [pdf]

[D-1] T. Bhattacharjee, H. Song, G. Lee, and S. S. Srinivasa, “A Dataset of Food Manipulation Strategies,” 2018. [Online]. Available: https://doi.org/10.7910/DVN/8TTXZ7


Media

  1. Mark Wilson, “Robots that feed people who can’t feed themselves are here,” Online article, Fast Company, 13 March 2019. [link]

  2. James Thorne, “‘Feed me, Alexa.’ University of Washington team creates voice-controlled robot to help people eat,” Online article, GeekWire, 9 March 2019. [link]


Motivation

Feeding is a particularly high-impact task: approximately 12.3 million people in the United States need assistance with this activity of daily living.

Purpose

The food manipulation project aims to enable a robot to feed people with upper-extremity impairments.


Approach

 
 
Forque: a fork that senses force and torque

Experiment Setup

 
 
  • Designed experiments in which human subjects fork food items using the forque, which has a force/torque sensor and motion-capture markers embedded, to collect haptic and motion data.

  • With these data, a robot learns to classify and fork food items based on haptic feedback and visual perception (a minimal sketch of such a classifier follows below).
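
As an illustration only, not the project's actual implementation: the sketch below shows one way such a classifier could be built. It assumes each trial is a (T, 6) NumPy array of force/torque samples labeled with a food item, summarizes each trial with simple per-axis statistics, and trains a scikit-learn random forest. The helper names featurize and train_food_classifier are hypothetical.

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    def featurize(trial):
        # Summarize a (T, 6) force/torque time series with per-axis statistics.
        return np.concatenate([
            trial.mean(axis=0),  # average load on each axis
            trial.std(axis=0),   # variability during the skewering motion
            trial.max(axis=0),   # peak values, e.g. at skin penetration
            trial.min(axis=0),
        ])

    def train_food_classifier(trials, labels):
        # trials: list of (T, 6) arrays; labels: one food-item label per trial.
        X = np.stack([featurize(t) for t in trials])
        clf = RandomForestClassifier(n_estimators=200, random_state=0)
        print("5-fold CV accuracy:", cross_val_score(clf, X, labels, cv=5).mean())
        return clf.fit(X, labels)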


Human Subject Experiments 

This video shows three representative strategies that human subjects used during the experiments. The top-left panel shows the experiment video, the top-right panel shows a graphical representation of the forque's movement, and the bottom graph shows the force/torque signals characterizing each strategy.
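
For offline inspection of such signals, a trial's six force/torque channels can be plotted against time. The sketch below is a hedged illustration, assuming a (T, 6) array ordered [Fx, Fy, Fz, Tx, Ty, Tz] and a hypothetical 1 kHz sampling rate.

    import numpy as np
    import matplotlib.pyplot as plt

    def plot_trial(trial, rate_hz=1000.0):
        # trial: (T, 6) array of [Fx, Fy, Fz, Tx, Ty, Tz] samples.
        t = np.arange(len(trial)) / rate_hz
        fig, (ax_f, ax_t) = plt.subplots(2, 1, sharex=True)
        for i, name in enumerate(["Fx", "Fy", "Fz"]):
            ax_f.plot(t, trial[:, i], label=name)      # force channels
        for i, name in enumerate(["Tx", "Ty", "Tz"]):
            ax_t.plot(t, trial[:, i + 3], label=name)  # torque channels
        ax_f.set_ylabel("force (N)")
        ax_t.set_ylabel("torque (N·m)")
        ax_t.set_xlabel("time (s)")
        ax_f.legend(); ax_t.legend()
        plt.show()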


Robot Experiments

This video shows one trial of the robot experiments. Using a 7-DOF back-drivable arm (Fetch), we collected the same quantities captured in the human-subject experiments (force, torque, and pose of the forque) and used them to verify the classifier trained on the human-subject data.
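
Continuing the earlier hypothetical sketch, the verification step could be expressed as applying the human-trained classifier to the robot-collected trials and measuring its accuracy; clf and featurize are the assumed objects from that sketch.

    import numpy as np

    def verify_on_robot(clf, robot_trials, robot_labels):
        # Score the human-trained classifier on robot-collected trials.
        X = np.stack([featurize(t) for t in robot_trials])
        return np.mean(clf.predict(X) == np.asarray(robot_labels))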