Research Supervisor Connect

Advanced orchard mapping systems using robotics, sensing and perception


The aim of the project is to research and develop intelligent orchard mapping systems that provide timely, high-resolution data to support farm management.


Dr James Underwood

Research location

Aerospace, Mechanical and Mechatronic Engineering

Recent developments in machine learning and computer vision have enabled new approaches to precision agriculture. For orchards, we have developed vision-based systems that scan, detect and map flowers and fruit with unprecedented resolution and accuracy, providing growers with quantified spatial data to support farm management. Beyond detecting flowers and fruit in images, a systematic approach is required at the intersection of robotics, sensing and perception, with exciting opportunities for Masters and PhD level research. Research topics include multi-modal sensing and perception using lidar combined with colour, thermal and hyperspectral cameras; simultaneous localisation and mapping (SLAM); multi-perspective and multi-temporal data fusion; and machine learning. Application areas include spatio-temporal mapping of flowers and fruit and their corresponding stages of development (tracking blooms, stages of fruit ripeness, etc.); measuring and modelling tree canopy geometry and architecture; and combined systems to efficiently scan and digitise the state of an orchard, including whole-farm mapping and digitisation.
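As a toy illustration of the spatial-mapping idea, the sketch below bins geo-referenced fruit detections into a simple grid of per-cell counts. It is a minimal, hypothetical example: the function name, inputs and numbers are invented for illustration and do not describe the project's actual pipeline, which fuses lidar and camera data at far higher fidelity.

```python
from collections import defaultdict

def grid_fruit_map(detections, cell_size=1.0):
    """Aggregate geo-referenced fruit detections into per-cell counts.

    detections: iterable of (x, y) positions in metres, as might come
    from a detection pipeline (hypothetical input format).
    cell_size: grid resolution in metres.
    Returns a dict mapping (col, row) grid cells to fruit counts.
    """
    counts = defaultdict(int)
    for x, y in detections:
        # Floor-divide each coordinate to find its grid cell.
        cell = (int(x // cell_size), int(y // cell_size))
        counts[cell] += 1
    return dict(counts)

# Toy data: two detections near the origin, one a few metres east.
fruit_map = grid_fruit_map([(0.2, 0.3), (0.8, 0.1), (3.4, 0.2)], cell_size=1.0)
# fruit_map == {(0, 0): 2, (3, 0): 1}
```

Running the same aggregation over scans taken weeks apart would give a crude spatio-temporal map, e.g. per-cell counts tracked across bloom and ripening stages.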

Additional information

The Australian Centre for Field Robotics at The University of Sydney is pursuing exciting research and development projects in agricultural robotics, which will have a large and long-term impact in Australia and globally over the next decade. Our PhD program is an opportunity for students to join our world-leading group to develop novel, industry-relevant research that will position you for a career in the future of high-tech robotics and sensing applications.


Opportunity ID

The ID for this research opportunity is 2361.