This work will develop algorithms that enable robots to benefit from the rich information captured by emerging light field video cameras.
Light field cameras capture a 4D superset of what conventional 2D cameras see, simultaneously recording each light ray's position and direction. This promises greater robustness in low light and through rain and fog, as well as native capture of higher-order light transport effects such as specularity and transparency.
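To make the "4D superset" concrete: a common way to represent a light field is the two-plane parameterization, where each ray is indexed by its angular position (u, v) on one plane and spatial position (s, t) on another. The sketch below (illustrative only, not part of this project; the array shapes and the `refocus` helper are assumptions) shows how holding direction information enables synthetic refocusing by shift-and-sum, something a 2D image cannot do after capture.

```python
import numpy as np

# Illustrative light field as a 4D array L[u, v, s, t]:
# (u, v) index the ray's angular sample, (s, t) its spatial pixel.
rng = np.random.default_rng(0)
U, V, S, T = 5, 5, 32, 32
L = rng.random((U, V, S, T))

def refocus(lf, alpha):
    """Synthetic refocus by shift-and-sum: shift each angular view in
    proportion to its offset from the aperture centre, then average.
    alpha sets the refocus depth; alpha = 0 just averages the views."""
    U, V, S, T = lf.shape
    cu, cv = (U - 1) / 2, (V - 1) / 2
    out = np.zeros((S, T))
    for u in range(U):
        for v in range(V):
            ds = int(round(alpha * (u - cu)))
            dt = int(round(alpha * (v - cv)))
            out += np.roll(lf[u, v], shift=(ds, dt), axis=(0, 1))
    return out / (U * V)

img = refocus(L, alpha=1.0)  # a (32, 32) image refocused to one depth
```

Varying `alpha` sweeps the focal plane through the scene from a single captured light field, which hints at why these cameras are attractive for perception through rain, fog, or partial occlusion.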
In this project you will develop the algorithms needed to make sense of these cameras in a robotics context, with an emphasis on real-time performance for light field video. There are opportunities to construct camera prototypes or to work with emerging commercial devices with embedded CPU, GPU, and FPGA, and to apply machine learning or conventional approaches to algorithm development. Applications arise wherever robots face perceptual challenges, including all-weather autonomous driving, drone flight, underwater survey, human-robot interaction, and locomotion over challenging terrain.
Working within the Australian Centre for Field Robotics (ACFR), you will have access to state-of-the-art robots, facilities, dedicated technical staff, and the mentorship available through this world-class research centre. The ACFR undertakes major field robotics programs in autonomous driving, flight, agriculture, and underwater survey, providing rich opportunities to deploy and validate novel perception systems.
The ID for this research opportunity is 2629.