
Algorithms

Pushing the limits of computation
We’re exploring the foundational aspects of computer science with an in-depth focus on algorithms, optimisation and data structures.

What is an algorithm?

An algorithm is a finite sequence of instructions, typically used to solve a class of problems or to perform a computation. It is an unambiguous specification for performing calculation, data analysis, automated reasoning or other tasks.
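
As a small illustration of this definition, the sketch below shows binary search, a classic algorithm: a finite, unambiguous sequence of steps that locates a value in a sorted list. It is included purely as an example and is not tied to any project described on this page.

```python
def binary_search(sorted_items, target):
    """Return the index of target in sorted_items, or -1 if absent.

    A classic O(log n) algorithm: each step halves the search range.
    """
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_items[mid] == target:
            return mid
        if sorted_items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1

print(binary_search([2, 3, 5, 7, 11, 13], 7))  # 3
```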

Our research

We’re developing provably correct algorithms for fundamental combinatorial and geometric problems, including scheduling, trajectory and graph analysis.

These problems have applications in many fields, such as mathematics, biology, robotics, operations research and social networks. 

Our research focus is to narrow the gap between theory and practice and to develop practical algorithms with rigorous performance guarantees.

To achieve this aim we use a multitude of frameworks, such as:

  • approximation algorithms (sketched briefly below),
  • randomised algorithms,
  • realistic input models,
  • sublinear algorithms,
  • formal methods, and
  • multivariate complexity.
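
To give a flavour of the approximation-algorithms framework mentioned above, the sketch below is the textbook matching-based algorithm for vertex cover: it repeatedly takes both endpoints of an uncovered edge and is guaranteed to return a cover at most twice the optimal size. It illustrates the kind of performance guarantee involved; it is not a description of our own research results.

```python
def vertex_cover_2approx(edges):
    """Greedy matching-based vertex cover.

    Repeatedly pick an edge not yet covered and add both endpoints.
    The chosen edges form a matching, and any cover must hit each of
    them, so the result is at most twice the size of an optimal cover.
    """
    cover = set()
    for u, v in edges:
        if u not in cover and v not in cover:
            cover.update((u, v))
    return cover

edges = [(1, 2), (1, 3), (2, 4), (3, 4), (4, 5)]
print(vertex_cover_2approx(edges))  # {1, 2, 3, 4}; the optimum {1, 4} has size 2
```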


Our expert: Dr André van Renssen

Routing is the process of exchanging data between nodes and the reason why networks are constructed. We aim to design efficient and practical routing algorithms for highly dynamic graphs.

Nowadays, devices can join and leave a network at any time and must operate in the presence of complex obstacles that disrupt communication, such as electromagnetic fields in wireless networks.

We intend to develop innovative methods that achieve provably efficient routing in obstacle-rich environments.

This will enable faster data gathering in sensor networks used in geographical and geological operations, such as earthquake detection and prediction.
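
A common baseline in this setting is greedy geographic routing: each node forwards the message to the neighbour that is closest to the destination. The sketch below is only an illustration of that baseline (the function name and graph representation are invented for this example); greedy forwarding can get stuck near obstacles, which is exactly what makes the research problem hard, and it is not the group's algorithm.

```python
import math

def greedy_route(graph, positions, source, target):
    """Greedy geographic routing: always forward to the neighbour
    nearest (in Euclidean distance) to the target.

    graph: dict mapping node -> list of neighbouring nodes
    positions: dict mapping node -> (x, y)
    Returns the path taken, or None if forwarding gets stuck in a
    'local minimum', a failure mode that is common near obstacles.
    """
    def dist(a, b):
        (ax, ay), (bx, by) = positions[a], positions[b]
        return math.hypot(ax - bx, ay - by)

    path = [source]
    current = source
    while current != target:
        nxt = min(graph[current], key=lambda n: dist(n, target), default=None)
        if nxt is None or dist(nxt, target) >= dist(current, target):
            return None  # stuck: no neighbour is closer to the target
        path.append(nxt)
        current = nxt
    return path

graph = {'a': ['b'], 'b': ['a', 'c'], 'c': ['b']}
positions = {'a': (0, 0), 'b': (1, 0), 'c': (2, 0)}
print(greedy_route(graph, positions, 'a', 'c'))  # ['a', 'b', 'c']
```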

Our experts: Professor Joachim Gudmundsson, Dr André van Renssen

Our partners: Defence Science and Technology Group, Australian Government

Technological advances in tracking and sensing are producing an increasing number of opportunities to trace the movement of vehicles, humans and animals.

Current tracking technology allows for low cost, almost continuous capture of individual trajectories with sub-second sampling rates.

As a result, a variety of disciplines including ecology, animal-behaviour research, sports, defence, geographic information systems and transport are dependent on movement analysis.

Our goal is to develop fundamental algorithms and data structures with quality guarantees for analysing such movement data.
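
One widely used building block in trajectory analysis is the discrete Fréchet distance, which measures how similar two trajectories are while respecting the order of their points. The sketch below is the standard dynamic-programming formulation, included only to illustrate the kind of primitive involved, not as a statement of our own methods.

```python
import math
from functools import lru_cache

def discrete_frechet(p, q):
    """Discrete Fréchet distance between trajectories p and q, each a
    list of (x, y) points, via the classic O(len(p) * len(q)) dynamic
    programme.
    """
    def d(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    @lru_cache(maxsize=None)
    def c(i, j):
        if i == 0 and j == 0:
            return d(p[0], q[0])
        if i == 0:
            return max(c(0, j - 1), d(p[0], q[j]))
        if j == 0:
            return max(c(i - 1, 0), d(p[i], q[0]))
        return max(min(c(i - 1, j), c(i - 1, j - 1), c(i, j - 1)), d(p[i], q[j]))

    return c(len(p) - 1, len(q) - 1)

# Two parallel trajectories one unit apart have distance 1.0.
print(discrete_frechet([(0, 0), (1, 0), (2, 0)], [(0, 1), (1, 1), (2, 1)]))
```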

Our experts: Dr Julian Mestre, Dr William Umboh

Scheduling jobs on machines is a fundamental area of optimisation. Most of the classical work in this area focuses on the setting in which the scheduler has complete information about the jobs.

However, in modern applications of job scheduling, such as cloud computing and content delivery, the jobs are not known to the scheduler in advance but arrive over time.

We aim to develop algorithmic techniques for scheduling jobs that arrive over time. 
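
A classical starting point for this online setting is Graham's list scheduling rule: assign each job, as it arrives, to the currently least-loaded machine. For makespan minimisation this simple rule is (2 − 1/m)-competitive on m machines. The sketch below illustrates the idea; it is background material, not the new techniques under development.

```python
import heapq

def list_schedule(job_sizes, num_machines):
    """Greedy online list scheduling (Graham): each job, as it arrives,
    goes to the machine with the smallest current load.

    Returns the resulting makespan (maximum machine load).
    This rule is (2 - 1/m)-competitive for makespan minimisation.
    """
    loads = [(0.0, m) for m in range(num_machines)]  # (load, machine id) min-heap
    heapq.heapify(loads)
    for size in job_sizes:                 # jobs are revealed one at a time
        load, machine = heapq.heappop(loads)
        heapq.heappush(loads, (load + size, machine))
    return max(load for load, _ in loads)

print(list_schedule([2, 3, 7, 1, 4], num_machines=2))  # 9.0
```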

Our expert: Lijun Chang

Graphs are ubiquitous in modelling real-world data, capturing relationships between pieces of information. With the rapid development of information technology, we now face graphs with billions of nodes and edges, which traditional algorithms and techniques cannot process efficiently.

Our aim is to design efficient algorithms and techniques to speed up and scale large graph analytics, such as finding dense subgraphs, computing shortest paths and identifying communities.
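
As a small illustration of dense-subgraph analysis, the sketch below peels a graph down to its k-core, the maximal subgraph in which every vertex has at least k neighbours. This is a standard textbook primitive in large-scale graph analytics, shown here only to convey the flavour of the problems involved.

```python
from collections import deque

def k_core(adj, k):
    """Return the vertex set of the k-core of an undirected graph.

    adj: dict mapping vertex -> set of neighbours.
    Repeatedly removes vertices of degree < k; what remains is the
    maximal subgraph in which every vertex has at least k neighbours.
    """
    degree = {v: len(ns) for v, ns in adj.items()}
    removed = set()
    queue = deque(v for v, d in degree.items() if d < k)
    while queue:
        v = queue.popleft()
        if v in removed:
            continue
        removed.add(v)
        for u in adj[v]:
            if u not in removed:
                degree[u] -= 1
                if degree[u] < k:
                    queue.append(u)
    return set(adj) - removed

adj = {1: {2, 3}, 2: {1, 3}, 3: {1, 2, 4}, 4: {3}}
print(k_core(adj, 2))  # {1, 2, 3}: vertex 4 has degree 1 and is peeled away
```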

Our experts: Professor Seokhee Hong, Professor Emeritus Peter Eades

Our partner: Professor Hiroshi Nagamochi (Kyoto University, Japan)

Most real-world data sets are relational and can be modelled as graphs consisting of vertices and edges. Graph drawing algorithms aim to produce human-readable pictures of relational data.

These algorithms are used in visualisation software underlying many data mining tools, in domains such as market surveillance, fraud detection, bioinformatics, software re-engineering, and counter-terrorism.
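
A simple and widely used example of such an algorithm is the force-directed (spring) layout: edges pull adjacent vertices together while all vertex pairs repel each other, and iterating these forces tends to produce a readable picture. The sketch below is a minimal illustrative version with arbitrarily chosen constants, not production visualisation code.

```python
import math
import random

def spring_layout(vertices, edges, iterations=200, spring=0.05, repulsion=0.5):
    """Minimal force-directed layout: adjacent vertices attract, all
    pairs repel. The constants are arbitrary illustrative choices.
    Returns a dict mapping vertex -> (x, y).
    """
    pos = {v: (random.random(), random.random()) for v in vertices}
    for _ in range(iterations):
        force = {v: [0.0, 0.0] for v in vertices}
        for u in vertices:                      # pairwise repulsion
            for v in vertices:
                if u == v:
                    continue
                dx, dy = pos[u][0] - pos[v][0], pos[u][1] - pos[v][1]
                dist = math.hypot(dx, dy) or 1e-9
                force[u][0] += repulsion * dx / dist ** 2
                force[u][1] += repulsion * dy / dist ** 2
        for u, v in edges:                      # attraction along edges
            dx, dy = pos[v][0] - pos[u][0], pos[v][1] - pos[u][1]
            force[u][0] += spring * dx
            force[u][1] += spring * dy
            force[v][0] -= spring * dx
            force[v][1] -= spring * dy
        pos = {v: (pos[v][0] + 0.01 * fx, pos[v][1] + 0.01 * fy)
               for v, (fx, fy) in force.items()}
    return pos

print(spring_layout(['a', 'b', 'c'], [('a', 'b'), ('b', 'c')]))
```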

Planar graphs are fundamental for both graph theory and graph algorithms and are extensively studied. Structural properties and fundamental algorithms for planar graphs have been discovered. However, most real-world graphs, such as social networks and biological networks, are non-planar.

To analyse and visualise such real-world networks, it is necessary to solve fundamental mathematical and algorithmic research questions on sparse non-planar graphs, called beyond planar graphs.

New algorithms for beyond planar graphs will be in high demand by practitioners in various application domains to solve complex visualisation problems.
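
One elementary consequence of Euler's formula is that a simple planar graph on n ≥ 3 vertices has at most 3n − 6 edges, so exceeding this bound already certifies non-planarity (the converse does not hold). The check below illustrates how quickly a real-world network can be flagged as non-planar; it is a standard fact, not part of the project's contributions.

```python
def certainly_non_planar(num_vertices, num_edges):
    """Sufficient (not necessary) test for non-planarity: a simple
    planar graph with n >= 3 vertices has at most 3n - 6 edges.
    """
    return num_vertices >= 3 and num_edges > 3 * num_vertices - 6

# A social-network-sized example: far too many edges to be planar.
print(certainly_non_planar(1_000_000, 20_000_000))  # True
```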


Our experts: Professor Seokhee Hong, Professor Emeritus Peter Eades

Graph drawing is the construction of good geometric representations of graphs in two and three dimensions.

Although graph drawing has been extensively studied due to its wide range of applications, including very large-scale integration (VLSI) design, information systems, sociology, biology, networks and software engineering, the majority of research has been devoted to studying representations of graphs in two dimensions (2D).

We are investigating a new multiplane framework, which draws graphs using a set of 2D planes, nicely arranged in three dimensions, and satisfying new aesthetic criteria derived from topology and graph theory.  

More specifically, we aim to study multiplane embeddings from both mathematical and computational points of view:

  • define new mathematical criteria for multiplane embeddings and establish lower/upper bounds,
  • characterise multiplane graphs, 
  • determine the complexity of computing multiplane embeddings, and
  • design algorithms for constructing multiplane embeddings.
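
As a toy analogue of spreading a graph over several planes, the sketch below greedily distributes edges across 'pages' so that no two edges on the same page cross, assuming a fixed vertex order along a spine (a book-embedding-style heuristic). It is intended only to convey the flavour of the idea and is not the multiplane framework under study.

```python
def assign_pages(edges, order):
    """Greedy page assignment for a book-embedding-style layout.

    order: list of vertices along the spine.
    Two edges cross on the same page iff their endpoints interleave
    along the spine. Each edge goes to the first page where it fits.
    """
    index = {v: i for i, v in enumerate(order)}
    pages = []  # each page is a list of pairwise non-crossing edges

    def crosses(e, f):
        a, b = sorted((index[e[0]], index[e[1]]))
        c, d = sorted((index[f[0]], index[f[1]]))
        return a < c < b < d or c < a < d < b

    for edge in edges:
        for page in pages:
            if not any(crosses(edge, other) for other in page):
                page.append(edge)
                break
        else:
            pages.append([edge])
    return pages

edges = [(1, 3), (2, 4), (1, 4), (2, 3)]
print(assign_pages(edges, order=[1, 2, 3, 4]))
# [[(1, 3), (1, 4), (2, 3)], [(2, 4)]]
```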

Our expert: Dr Sasha Rubin

A Reactive System is a computer system that continuously interacts with an external environment. Today there are many examples in which the environment is the physical world, such as autonomous mobile assistants, robots on factory floors, and traffic control systems. It is important to guarantee they operate safely and intelligently.

Unfortunately, designing reactive systems is challenging because their desired behaviour can refer to an unbounded amount of time, and their actions may depend on their entire history of observations so far (rather than just the current observation).

Reactive Synthesis is a branch of Formal Methods that studies how to specify and automatically construct (=synthesise) reactive systems. Specifications of the desired system behaviour are expressed in rich logic-based languages called Temporal Logics. For instance, the following is naturally expressible in linear-temporal logic: "repeat forever: search for the person in rooms 1, 2 and 5; if you find them, turn on the microphone and wait; if you lose them, turn off the microphone and keep searching".
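
To illustrate the kind of system such a specification describes, the toy controller below cycles through rooms 1, 2 and 5 and toggles a microphone depending on whether the person is observed. The room numbers come from the example specification above; the function name and observation interface are invented for this sketch. A synthesis tool would construct a controller like this automatically from the temporal-logic formula, rather than it being written by hand.

```python
from itertools import cycle

def controller(observations):
    """A hand-written toy reactive controller for the example spec:
    repeatedly search rooms 1, 2, 5; keep the microphone on while the
    person is visible, turn it off when they are lost, keep searching.

    observations: an iterable of booleans, one per time step, saying
    whether the person is currently visible (the 'environment').
    Yields (room, microphone_on) actions, one per time step.
    """
    rooms = cycle([1, 2, 5])             # "repeat forever: search rooms 1, 2 and 5"
    current_room = next(rooms)
    for person_visible in observations:
        if person_visible:
            yield (current_room, True)   # found them: microphone on, wait
        else:
            current_room = next(rooms)   # lost them: microphone off, keep searching
            yield (current_room, False)

# One possible run against a particular sequence of observations:
print(list(controller([False, False, True, True, False])))
# [(2, False), (5, False), (5, True), (5, True), (1, False)]
```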

Reactive Synthesis uses and develops classic and modern algorithmic techniques, including automata theory and reinforcement learning, to devise algorithms for constructing such systems.

Our expert: Dr Clement Canonne

Learning from data and performing computation on it is pervasive in computer science, and in science at large. Many algorithms have been devised over the years for tasks as diverse as deciding structural properties of graphs, learning good classifiers from labelled data, and performing statistical inference on datasets. These algorithms are typically efficient, in that they run in time polynomial, or even linear, in the relevant parameters of the problem.

Yet the sheer volume of data can nowadays make even linear-time algorithms impractical. Can we then still say anything meaningful about the data or the task at hand when looking at only a small fraction of the input? And what happens when time is not the only bottleneck, and memory, bandwidth or data privacy must also be taken into account?
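
A first taste of the sublinear regime: by inspecting only a small random sample of a huge dataset, one can estimate the fraction of items with some property to within an additive error ε with high probability (a standard Hoeffding-style argument gives roughly O(1/ε²) samples). The sketch below illustrates this idea; the dataset and the property are placeholders, and the example is not drawn from any specific project result.

```python
import random

def estimate_fraction(data, has_property, num_samples=10_000):
    """Estimate the fraction of items in `data` satisfying `has_property`
    by inspecting only `num_samples` random items, far fewer than len(data).

    By standard concentration bounds (Hoeffding), roughly O(1/eps^2)
    samples suffice for additive error eps with high probability.
    """
    hits = sum(has_property(random.choice(data)) for _ in range(num_samples))
    return hits / num_samples

# Placeholder example: a million integers, of which the even ones make up 50%.
data = list(range(1_000_000))
print(estimate_fraction(data, lambda x: x % 2 == 0))  # typically close to 0.5
```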