Master of Science in Robotics Thesis Talk

  • Remote Access Enabled - Zoom
  • Virtual Presentation

Applications of Deep Learning for Robotic Agriculture

Agricultural automation is a varied and challenging field, with tasks ranging from detection and sizing to manipulation and navigation. These tasks are also precursors to effective plant breeding and management. Making plant measurements by manual scouting is labor-intensive and intractable at large scale. While the cost of gene sequencing has fallen by several orders of magnitude in recent years, measuring the physical traits of plants remains labor-intensive. This calls for robotic solutions that can measure plant traits at high throughput. Building on recent developments in deep learning and computer vision, this thesis proposes applications of these advances to non-contact phenotyping, tactile phenotyping, and vision-based navigation in agricultural fields. For non-contact phenotyping, we propose an architecture that counts plants within 10% of human ground-truth counts and measures stalk widths with a mean absolute error of 2.76 mm. For tactile measurements, we present a deep-learning-based architecture for segmenting and grasping stalks, achieving an average grasping accuracy of 74.13% and a stalk-detection F1 score of 0.90. We then propose a deep-reinforcement-learning-based architecture that learns policies to navigate agricultural fields without human supervision. We test the same algorithm in simulation and in various outdoor scenes. The robot learns to navigate in simulation in as little as 9 minutes of training time. We also report the performance of the same algorithm on an outdoor track, in a vineyard, and in a hops plantation; learning from scratch, the agent learns to follow tracks in as little as 1 hour and 35 minutes on the real robot.

Thesis Committee:
George Kantor (Advisor)
Jeff Schneider
Dinesh Reddy

