Explore a range of mechatronic engineering research internships to complete as part of your degree during the semester break.
The following internships are scheduled to take place over the Winter break.
Applications open on 1 April and close at midnight on 27 April 2026.
Supervisor: Dr Don Dansereau
Eligibility:
Project Description:
Digital twins are becoming a critical tool for managing physical assets. From solar farms to urban infrastructure, we are working to automate the capture and modelling of how our assets evolve over time. A key challenge in this process is constructing and maintaining these models.
Building on existing work with the ACFR’s small unmanned ground vehicle Wombat, you will advance techniques for repeated, automated inspection and 3D reconstruction of outdoor assets that evolve over time.
Depending on aptitude and interest, this could include development of teach-and-repeat navigation strategies, sensor payloads, 3D reconstruction algorithms, interactive visualisation tools, and/or approaches to change detection and modelling. Deployment using the Wombat platform is a key part of this program.
Requirement to be on campus: Yes *dependent on government’s health advice.
Supervisor: Dr Don Dansereau
Eligibility:
Project Description:
3D modelling of reflective objects like glass buildings is notoriously difficult. From Google Maps to Hollywood visual effects, automated solutions to this problem remain elusive.
We are developing physics-based solutions to this problem, enabling 3D modelling of reflective objects using drone-based imagery.
To validate our approaches, we will leverage visual effects technologies used in the film industry to build an outdoor drone motion tracking system, providing a source of ground truth for the drone's pose.
Your task is to design and deploy the outdoor drone motion capture system. Depending on aptitude and interest, this can involve previsualisation for camera and lens selection, infrastructure and logistics for deployment and data capture, post-capture processing pipeline development and deployment, and/or design of an active drone-mounted marker system.
Requirement to be on campus: Yes *dependent on government’s health advice.
Supervisor: Dr Don Dansereau
Eligibility:
Project Description:
What does your robotic vacuum cleaner see, and who else has access to those images? In homes, hospitals, and secure industrial sites, the uptake of autonomous robots is limited by privacy concerns.
Working with researchers at the Australian Centre for Robotics, this project will develop novel sensing technologies to enable robots to visually understand their environments without capturing privacy-revealing images.
Building on a unique prototype technology, you will advance the design of an opto-mechatronic privacy-preserving robotic vision system.
Depending on interest and ability, there is scope to focus on machine learning for enabling new privacy-preserving capabilities, hardware development in optics and analogue electronics, or deployment on an outdoor ground vehicle. There is also scope to quantify privacy by constructing sophisticated attacks that identify and exploit points of weakness.
Requirement to be on campus: Yes *dependent on government’s health advice.
Supervisor: Dr Don Dansereau
Eligibility:
Project Description:
Long-term autonomy requires robots to improvise solutions to novel, challenging conditions. Resilience to such conditions represents a final frontier in the deployment of truly trustworthy systems.
In this project you will advance a processing architecture inspired by the global workspace model of human cognition. We hypothesise this will enable the sort of metareasoning and problem solving humans use when tackling unprecedented challenges.
Students will have an opportunity to engage with both theoretical and applied aspects of the work, from concept to deployment, including work with a small mobile unmanned ground vehicle.
Requirement to be on campus: Yes *dependent on government’s health advice.
Supervisor: Dr Ahalya Prabhakar
Eligibility: Programming (Python, C++) and software/hardware integration skills required. Prior experience with VR development beneficial.
Project Description:
This project will develop an immersive interface to enable intuitive and effective human–robot shared control for assistive wheelchairs. Students will build a shared-control interface and an interactive VR environment in which users drive a semi-autonomous wheelchair through crowded, cluttered environments. The interface and VR environments will be used to investigate the effects of different shared-control methods across dynamic scenarios and environments. The resulting system will provide a flexible research tool for investigating interaction techniques that reduce cognitive load and improve system performance.
Requirement to be on campus: Yes *dependent on government’s health advice.
Supervisor: Dr Ahalya Prabhakar
Eligibility: Programming (Python, C++) and robotics hardware/simulation skills (ROS 2) required.
Project Description:
This project will develop an interactive interface that enables users to teach robots through preference-based learning. The interface will present candidate behaviours through simulation and visualisation tools, allowing users to observe how a robot performs a task and select the option that best matches their intent. These preferences will be fed to learning algorithms that iteratively refine the robot's policy, enabling the system to improve performance through human feedback. The project will provide a scalable framework for studying interactive robot learning and for developing more transparent, human-centred approaches to training intelligent robotic systems.
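The preference loop described above (present two candidate behaviours, record which one the user picks, use that choice to refine the policy) can be sketched in a few lines. This is only an illustrative toy, not the project's actual code: the feature vectors, the simulated user, and the Bradley–Terry-style update are all assumptions made for the example.

```python
import math
import random

random.seed(0)

def score(w, phi):
    """Linear reward model: score of a behaviour with features phi."""
    return sum(wi * fi for wi, fi in zip(w, phi))

def update(w, preferred, rejected, lr=0.1):
    """One gradient step on the Bradley-Terry log-likelihood:
    P(preferred beats rejected) = sigmoid(score(pref) - score(rej))."""
    d = score(w, preferred) - score(w, rejected)
    p = 1.0 / (1.0 + math.exp(-d))
    grad = 1.0 - p  # derivative of log sigmoid(d) w.r.t. d
    return [wi + lr * grad * (fp - fr)
            for wi, fp, fr in zip(w, preferred, rejected)]

# Hypothetical 2D behaviour features: (path length, obstacle clearance).
# Simulated user secretly prefers shorter paths with larger clearance.
true_w = [-1.0, 2.0]

w = [0.0, 0.0]  # learned reward weights, initially uninformative
for _ in range(200):
    a = [random.random(), random.random()]  # candidate behaviour A
    b = [random.random(), random.random()]  # candidate behaviour B
    # The "user" picks whichever behaviour their hidden preference rates higher.
    pref, rej = (a, b) if score(true_w, a) > score(true_w, b) else (b, a)
    w = update(w, pref, rej)
```

After a couple of hundred pairwise choices, the learned weights recover the sign pattern of the user's hidden preference (negative on path length, positive on clearance), which is the essence of refining a policy purely from "which of these two did you prefer?" feedback.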
Requirement to be on campus: Yes *dependent on government’s health advice.
Supervisor: Dr Ahalya Prabhakar
Eligibility: Programming (Python, C++) and software/hardware integration skills required. Prior experience with VR development beneficial.
Project Description:
This project will develop an immersive virtual reality (VR) interface to enable intuitive and effective human–robot teaming. Students will build a shared-control interface and an interactive VR environment in which users communicate and collaborate with robot teams through natural gestures and spatial manipulation. The interface will support both single robots and multi-robot teams, and the VR environments will be used to investigate effects on human–robot teaming across different tasks and dynamic scenarios. The resulting system will provide a flexible tool for research on deploying human–robot systems in complex environments, investigating interaction techniques that reduce cognitive load and improve situational awareness compared with traditional screen-based interfaces.
Requirement to be on campus: Yes *dependent on government’s health advice.
Supervisor: Prof Stefan Williams
Eligibility: Students should have a background in Mechatronic Engineering, Computer Science, or a related discipline, and an interest in working with robotic systems and/or sensors such as laser or vision.
Project Description:
This project offers hands-on experience contributing to ARIAM Hub research in robotics, sensing, and 3D reconstruction. The student will work with LiDAR and RGBD camera datasets to generate high-quality spatial reconstructions using COLMAP and Emesent Aura, assist with ROS 2 integration and testing, and participate in field deployments with robotic platforms such as our Boston Dynamics Spot.
A major component of the project is practical experience designing sensor payloads, including mechanical design, power and data integration, and software development. The student will also receive CAD mentoring to support the hardware design process.
This project suits enthusiastic students interested in getting hands-on with robotics, sensing systems, payload R&D, and transforming real‑world data into digital twins of physical assets. The work directly contributes to ARIAM’s ongoing research into robot‑captured data, digital twins, and spatial modelling.
Requirement to be on campus: Yes *dependent on government’s health advice.
Last updated: 26 March 2026