
Automated Field Phenotyping

by Siavash Farzan and Nick Smith

 Introduction

   As the name indicates, the goal of Automated Field Phenotyping is to perform high-throughput plant phenotyping in the field, making data collection effective, easy, and less labor-intensive for scientists doing crop phenotyping research. The current system we have been developing is built on the TALON robot platform from Foster-Miller (seen below).

[Photo: the TALON robot platform]

   Although the TALON is not the most suitable robot for this application because of its size, we chose it simply because one was available to us. For now, the robot can therefore only phenotype plants along the outside of the field (i.e., not between rows of crops).

   Using this robot platform, equipped with a differential GPS system and several hardware and software add-ons, our goal is to develop an automated system that drives down each row of crops, taking measurements and snapshots along the way.
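   To give a feel for the navigation computation involved, here is a minimal Python sketch (not the project's actual code) that computes the initial bearing and straight-line distance from the robot's current DGPS fix to the next row waypoint; a steering controller could then turn the platform toward that bearing. The function name and the spherical-Earth model are illustrative assumptions:

    import math

    def bearing_and_distance(lat1, lon1, lat2, lon2):
        """Initial bearing (deg) and great-circle distance (m) between two GPS fixes.

        Illustrative sketch only; assumes a spherical Earth, which is ample
        for steering between waypoints a few meters apart.
        """
        R = 6371000.0  # mean Earth radius in meters
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlmb = math.radians(lon2 - lon1)
        # Haversine formula for the distance
        a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
        dist = 2 * R * math.asin(math.sqrt(a))
        # Initial bearing from fix 1 toward fix 2, normalized to [0, 360)
        y = math.sin(dlmb) * math.cos(phi2)
        x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlmb)
        brg = (math.degrees(math.atan2(y, x)) + 360.0) % 360.0
        return brg, dist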

   We divided the development of this system into three phases.

Preliminary Phase

   In the preliminary stage of development, the system is driven manually by an operator using the TALON's control system (click on the image below to see a video of its operation).

[Image: link to a video of the robot under manual operation]

   The sensory hardware and software log environmental data such as humidity, light intensity, and temperature. This implementation is shown in the block diagram below:

[Block diagram: Preliminary Phase data-logging implementation]
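   As a rough illustration of what this logging loop might look like on the onboard computer, the Python sketch below appends timestamped humidity, light, and temperature samples to a CSV file. The read_sensors() function and the file name are placeholders standing in for the platform's actual sensor drivers, not our real implementation:

    import csv
    import os
    import time
    from datetime import datetime

    def read_sensors():
        """Placeholder for the platform's actual sensor drivers."""
        raise NotImplementedError("replace with real humidity/light/temperature reads")

    def log_environment(path="field_log.csv", period_s=5.0):
        """Append one timestamped sensor sample per period to a CSV log."""
        new_file = not os.path.exists(path)
        with open(path, "a", newline="") as f:
            writer = csv.writer(f)
            if new_file:
                writer.writerow(["timestamp", "humidity_pct", "light_lux", "temp_c"])
            while True:
                humidity, light, temp = read_sensors()
                writer.writerow([datetime.now().isoformat(), humidity, light, temp])
                f.flush()  # keep the log durable if the robot loses power
                time.sleep(period_s)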

Phase Two

   In Phase Two, in addition to the Preliminary Phase functionality, the system also takes pictures of the plants. RFID tags attached to the plants let the robot know where to stop and take measurements. This implementation is shown in the block diagram below:

[Block diagram: Phase Two implementation]
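   The stop-and-capture behavior might look like the following Python sketch; the reader, drive, and camera interfaces are hypothetical stand-ins for the real RFID and motor drivers:

    import time

    def patrol_loop(reader, drive, camera, poll_s=0.1):
        """Drive down the row; stop and photograph whenever an RFID tag is read.

        reader.read_tag(), drive.stop()/resume(), and camera.capture() are
        hypothetical interfaces standing in for the real hardware drivers.
        """
        while True:
            tag_id = reader.read_tag()      # returns a tag ID string, or None
            if tag_id is not None:
                drive.stop()                # halt beside the tagged plant
                camera.capture(f"{tag_id}_left.jpg", side="left")
                camera.capture(f"{tag_id}_right.jpg", side="right")
                drive.resume()              # continue down the row
            time.sleep(poll_s)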

Final Phase

   At this point, the whole system will be automated, and manual operation will no longer be required. In the Final Phase, the pictures taken by the platform will also be used for 3D reconstruction of the plants, driven by algorithms running on the onboard Raspberry Pi. Here are some sample images already collected by the onboard cameras (click on the images to see them at full size):

[Image pair: Unprocessed Left Image / Unprocessed Right Image]
[Image pair: Processed Left Image / Processed Right Image]
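   The reconstruction pipeline is not fixed yet; one common first step is to compute a disparity (depth) map from a rectified stereo pair, for example with OpenCV's block matcher. A minimal sketch, assuming the left/right images above are rectified and saved as left.jpg and right.jpg:

    import cv2

    # Load the rectified left/right images in grayscale
    left = cv2.imread("left.jpg", cv2.IMREAD_GRAYSCALE)
    right = cv2.imread("right.jpg", cv2.IMREAD_GRAYSCALE)

    # Block-matching stereo; numDisparities must be a multiple of 16
    stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)
    disparity = stereo.compute(left, right)  # int16, fixed-point (scaled by 16)

    # Normalize to 0-255 for viewing; larger disparity = closer to the cameras
    disp_vis = cv2.normalize(disparity, None, 0, 255, cv2.NORM_MINMAX).astype("uint8")
    cv2.imwrite("disparity.png", disp_vis)

   Given the calibrated stereo baseline and focal length, such a disparity map can then be reprojected into the 3D point cloud that the reconstruction algorithms would consume.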
