Big Picture - Modern society is increasingly dependent on software products that support day-to-day activities such as shopping, banking, and recreation. These products are built from a large number of interacting components; a typical online shopping system, for example, requires components for payment handling, delivery tracking, online maps, and much more. As software products take on more functionality, their size and complexity grow to the point where no human can comprehend all the details involved in engineering and operating such systems. For the past decade my research has focused on the use of models, that is, mathematical abstractions of software systems, in the design, engineering, maintenance, and operation of large and complex software systems. My current research falls into two streams.
1. Real-time processing of fast and big data: developing new algorithms and methods for the automated detection of problems, including faults, malicious behaviour, and undesirable scenarios, in large systems while they are in operation. I am trying to answer questions such as:
- Real-time stream processing: How can we automatically detect these problems from large, fast streams of data in real time?
I am also interested in partially observable systems: black-box systems in which we have access to only a subset of events, commonly known as observable events.
- Diagnosability: Under what conditions is it possible to automatically detect faults, malicious behaviour, or undesirable scenarios in partially observable systems?
- Diagnosis: If a system is diagnosable, how can we create efficient diagnosers? A diagnoser is a software module that receives a stream of observations from the system and determines in real time whether any fault or malicious behaviour has occurred.
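The idea of a diagnoser can be illustrated with a small sketch. The automaton, event names, and the fault event 'f' below are invented for illustration and are not taken from my published work: the diagnoser tracks the set of states the system could be in, given only the observable events, and reports a fault as soon as every consistent explanation involves the fault.

```python
# A minimal sketch of a diagnoser for a partially observable
# discrete-event system (illustrative example, not the published method).

# Transitions: state -> list of (event, next_state).
# Events 'a' and 'b' are observable; the fault 'f' is not.
TRANSITIONS = {
    0: [('a', 1), ('f', 2)],   # from 0 the system behaves normally or fails silently
    1: [('b', 0)],             # normal behaviour cycles a, b, a, b, ...
    2: [('a', 3)],             # after the fault, 'a' still occurs ...
    3: [('a', 2)],             # ... but 'b' never does
}
OBSERVABLE = {'a', 'b'}

def unobservable_closure(beliefs):
    """Extend a belief set (pairs of state and fault flag) with everything
    reachable via unobservable events, marking the flag when 'f' fires."""
    frontier = list(beliefs)
    closure = set(beliefs)
    while frontier:
        state, faulty = frontier.pop()
        for event, nxt in TRANSITIONS.get(state, []):
            if event in OBSERVABLE:
                continue
            item = (nxt, faulty or event == 'f')
            if item not in closure:
                closure.add(item)
                frontier.append(item)
    return closure

def diagnose(observations, initial_state=0):
    """Consume a stream of observable events and report, after each one,
    'F' (fault certain), 'N' (fault impossible), or '?' (undecided)."""
    beliefs = unobservable_closure({(initial_state, False)})
    verdicts = []
    for obs in observations:
        step = {(nxt, faulty)
                for state, faulty in beliefs
                for event, nxt in TRANSITIONS.get(state, [])
                if event == obs}
        beliefs = unobservable_closure(step)
        flags = {faulty for _, faulty in beliefs}
        verdicts.append('F' if flags == {True} else 'N' if flags == {False} else '?')
    return verdicts

print(diagnose(['a', 'a']))  # → ['?', 'F']: two 'a's in a row only happen after the fault
```

Note that a single 'a' is not conclusive (both the normal and the faulty run can produce it), which is exactly the distinction diagnosability theory makes precise.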
2. Modelling methods in software engineering. I am trying to answer questions such as:
- Modelling: How can we further our understanding of complex systems? How can we use this new understanding to aid in the design of better systems?
- Automated analysis: As models of real-world systems are large, how can we use algorithms to automatically analyse the models to detect design flaws?
- Code generation: How can we use algorithms to automatically implement systems from their models? And how can we automatically compose systems to build larger systems (synthesis)?
Besides these high-level research interests, I am currently working on several more focused research streams.
Real-time and near-real-time stream processing: Failures in the services provided by telecommunication companies can result in financial penalties and unhappy customers. I am interested in developing methods and algorithms for fault detection; see (Aldoib and Bordbar 2012) for a sample of our papers. This research has received considerable attention from BT. In recognition of my key contribution, I was awarded a BT Research Fellowship in 2008, funding me to work directly with the BT research team at Ipswich. The collaboration with BT also resulted in a joint patent (Title: Method and System for Distributed and Collaborative Monitoring, Application Number: US13/184,015). I am currently focusing my research on processing big data in real time via frameworks such as Storm and Spark.
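The essence of this kind of streaming fault detection can be sketched in a few lines. The window size, threshold, and event labels below are illustrative assumptions; in practice, logic of this shape would run inside a distributed framework such as Storm or Spark rather than a single process.

```python
from collections import deque

# Illustrative sketch: raise an alert when the rate of error events in a
# sliding window over the stream exceeds a threshold. Parameters are
# invented for this example, not taken from the research described above.

class SlidingWindowDetector:
    def __init__(self, window=5, max_errors=2):
        self.window = deque(maxlen=window)  # keeps only the last `window` events
        self.max_errors = max_errors

    def observe(self, event):
        """Feed one event ('ok' or 'error'); return True if a fault is suspected."""
        self.window.append(event)
        return self.window.count('error') > self.max_errors

detector = SlidingWindowDetector()
stream = ['ok', 'error', 'ok', 'error', 'error', 'ok']
alerts = [detector.observe(e) for e in stream]
print(alerts)  # → [False, False, False, False, True, True]
```

The appeal of the sliding window is that it bounds both memory and latency per event, which is what makes real-time operation over fast streams feasible.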
Detection of malicious behaviour in G-Cloud: Cloud computing is revolutionising the use of IT in our daily lives. In 2011, funded by the Royal Academy of Engineering, I was seconded to the Cloud and Security Research Lab at HP. The research resulted in a pioneering method for detecting malicious behaviour that complements existing security mechanisms in G-Cloud, the UK government cloud. The outline of the suggested method is presented in (Harrison et al. 2012), which was nominated for the conference's best paper award. Later, I was invited and funded to return to HP Labs to continue my work, and was awarded HP's prestigious international Innovation Research Program award.
Automated analysis of UML models: Creating a good design during the early stages of software development avoids costly modifications later. I introduced a method for automatically analysing models of systems captured in the Unified Modelling Language (UML) to discover faulty designs. UML is a standard language widely used by software engineers for designing systems. Our method pioneers the automatic extraction of logical representations of the system, which are then analysed via software components known as constraint solvers.
To show the viability of the approach, I developed a prototype tool called UML2Alloy; Alloy is an example of a constraint solver. UML2Alloy was downloaded 2,400 times within the first six months of its release. It is multi-purpose and can be applied in different application domains: I was invited and funded by companies such as IBM and BT to use UML2Alloy (and new implementations such as UML2Z3) to solve complex technical problems. Other teams, including international research groups at McGill University, INRIA/IRISA, and Aachen University, have extended and used the tool in numerous projects.
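The flavour of constraint-solver-based model analysis can be conveyed with a toy, Alloy-style bounded check written in plain Python. The class model, multiplicities, and invariant below are invented for illustration and are not taken from UML2Alloy itself: we exhaustively search all model instances within a small scope, exactly as a bounded constraint solver would, looking for an instance that satisfies the declared multiplicities yet violates a claimed invariant.

```python
from itertools import product

# Toy bounded analysis (illustrative, not UML2Alloy's actual encoding).
# Model: each Account has exactly one owning Customer; a Customer owns
# at most MAX_OWNED Accounts. Claimed invariant: there are never more
# accounts than customers.
MAX_OWNED = 2

def find_counterexample(n_customers, n_accounts):
    """Enumerate every assignment of accounts to owners within the given
    scope; return one that satisfies the multiplicities while violating
    the invariant, or None if no such instance exists in scope."""
    customers = range(n_customers)
    for owners in product(customers, repeat=n_accounts):  # exactly one owner each
        multiplicities_ok = all(owners.count(c) <= MAX_OWNED for c in customers)
        invariant_violated = n_accounts > n_customers
        if multiplicities_ok and invariant_violated:
            return owners  # owners[i] is the customer owning account i
    return None

# Within a scope of 2 customers and 3 accounts the invariant breaks:
print(find_counterexample(2, 3))  # → (0, 0, 1): customer 0 owns two accounts
```

Real constraint solvers such as Alloy's SAT backend or Z3 explore such spaces symbolically rather than by brute force, but the output is the same in spirit: a concrete counterexample instance that exposes the design flaw.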
Conflict resolution in clinical pathways: I successfully led a team of researchers from Birmingham Medical School, Heart of England Foundation Trust (HEFT), Birmingham Cross City Clinical Commissioning Group (BCCG), and an industrial partner in applying for an EPSRC grant. Awarded in 2015, the proposal (Title: Automated Conflict Resolution in Clinical Pathways) aims to discover new methods of finding and mitigating conflicts and mismatches in medical guidelines for the treatment of patients with more than one disease, commonly known as multi-morbidity.
- Model Driven Architecture and software tools
- Fault detection in telecom services
- Modelling, design and control of complex systems
- Quality of Service: Modelling, Design and Analysis (Model Checking)
- Hybrid Systems and Control (Agile Manufacturing Systems)
- MSc and PhD (Functional Analysis and Topology)