Development of an Autonomous Indoor Phenotyping Robot
Agricultural and Biosystems Engineering Conference Proceedings and Presentations
  • Dylan Shah, Iowa State University
  • Lie Tang, Iowa State University
Document Type
Conference Proceeding
Conference
2016 ASABE Annual International Meeting
Publication Version
Published Version
Publication Date
1-1-2016
DOI
10.13031/aim.20162460767
Conference Date
July 17-20, 2016
Geolocation
(28.5383355, -81.3792365)
Abstract

To fully understand genotype × environment interactions and their effects on phenotype, and ultimately to improve crop performance, a large amount of phenotypic data is needed. Studying plants of a given strain under multiple environments can greatly help to reveal these interactions. To collect the labor-intensive phenotypic data such experiments require, an indoor rover has been developed that can accurately and autonomously move between and inside growth chambers. The system uses mecanum wheels, magnetic tape guidance, a Universal Robots UR10 robot manipulator, and a Microsoft Kinect v2 3D sensor to position various sensors in this constrained environment. Integration of the motor controllers, robot arm, and 3D sensor was achieved in a customized C++ program. Detecting and segmenting plants in a multi-plant environment is a challenging task, one that can be aided by incorporating depth data into the segmentation algorithms. Image-processing functions were implemented to filter the depth image, minimizing noise and removing undesired surfaces; this reduced the memory requirement and allowed the plant to be reconstructed at a higher resolution in real time. Three-dimensional meshes representing plants inside the chamber were reconstructed using the Kinect SDK's KinectFusion. After transforming user-selected points from camera coordinates to robot-arm coordinates, the robot arm is used in conjunction with the rover to probe desired leaves, simulating the future use of sensors such as a fluorimeter and a Raman spectrometer. This paper presents the system architecture and preliminary results, as tested using a life-sized growth chamber mock-up. A comparison between using raw camera coordinates and using KinectFusion data is presented. The results suggest that the KinectFusion pose estimation is fairly accurate, reducing accuracy by only a few millimeters at distances of roughly 0.8 m.
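As a rough illustration of the depth-filtering step described in the abstract, the sketch below band-passes a Kinect v2 depth frame (512 x 424 pixels, 16-bit depth in millimeters) and applies a 3x3 median filter to suppress speckle noise. This is a minimal sketch, not the authors' implementation: the flat buffer layout, the bandPassDepth and medianFilter3x3 helpers, and the choice of filter are all assumptions, and the paper's actual filtering pipeline may differ.

    // Hypothetical depth pre-filtering, assuming the Kinect v2 depth frame
    // has already been copied into a flat std::vector<uint16_t> buffer.
    #include <algorithm>
    #include <cstdint>
    #include <vector>

    constexpr int kWidth  = 512;   // Kinect v2 depth resolution
    constexpr int kHeight = 424;

    // Zero out samples outside [nearMm, farMm] (e.g., chamber walls and
    // floor); zeroed pixels are treated as "no data" downstream, which
    // shrinks the reconstruction volume that must be kept in memory.
    void bandPassDepth(std::vector<uint16_t>& depth,
                       uint16_t nearMm, uint16_t farMm)
    {
        for (auto& d : depth) {
            if (d < nearMm || d > farMm) d = 0;
        }
    }

    // 3x3 median filter: suppresses isolated depth speckle while
    // preserving leaf edges better than a box blur would.
    std::vector<uint16_t> medianFilter3x3(const std::vector<uint16_t>& in)
    {
        std::vector<uint16_t> out(in.size(), 0);
        for (int y = 1; y < kHeight - 1; ++y) {
            for (int x = 1; x < kWidth - 1; ++x) {
                uint16_t w[9];
                int n = 0;
                for (int dy = -1; dy <= 1; ++dy)
                    for (int dx = -1; dx <= 1; ++dx)
                        w[n++] = in[(y + dy) * kWidth + (x + dx)];
                std::nth_element(w, w + 4, w + 9);  // median of the window
                out[y * kWidth + x] = w[4];
            }
        }
        return out;
    }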

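The camera-to-robot-arm transformation mentioned in the abstract can be pictured as applying a fixed homogeneous transform to each user-selected point. The cameraToRobot helper below is hypothetical: it assumes a 4x4 matrix T mapping Kinect camera coordinates into UR10 base coordinates, obtained offline by some calibration procedure that the abstract does not detail.

    // Hypothetical sketch: map a user-selected point from camera
    // coordinates to robot-arm base coordinates via a fixed homogeneous
    // transform T (rotation in T[i][0..2], translation in T[i][3]).
    #include <array>

    using Vec3 = std::array<double, 3>;
    using Mat4 = std::array<std::array<double, 4>, 4>;

    Vec3 cameraToRobot(const Mat4& T, const Vec3& pCam)
    {
        Vec3 pRob{};
        for (int i = 0; i < 3; ++i) {
            pRob[i] = T[i][0] * pCam[0] + T[i][1] * pCam[1]
                    + T[i][2] * pCam[2] + T[i][3];  // R * p + t
        }
        return pRob;
    }

The transformed point would then be sent to the UR10 controller as a probing target, with the rover providing coarse positioning between chambers and the arm providing fine positioning at the leaf.
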
Comments

This proceeding is published as Shah, Dylan S., and Lie Tang. "Development of an Autonomous Indoor Phenotyping Robot." ASABE Annual International Meeting, Orlando, FL, July 17-20, 2016. Paper No. 162460767. DOI: 10.13031/aim.20162460767. Posted with permission.

Copyright Owner
American Society of Agricultural and Biological Engineers
Language
en
File Format
application/pdf
Citation Information
Dylan Shah and Lie Tang. "Development of an Autonomous Indoor Phenotyping Robot." 2016 ASABE Annual International Meeting, Orlando, FL, July 17-20, 2016. Paper No. 162460767.
Available at: http://works.bepress.com/lie_tang/32/