These are the selected software research challenges for discussion during the Second SWForum.eu workshop.

CHALLENGE 1: An ecosystem for analysis software 


Proponent: Michel Chaudron (Technical University of Eindhoven)
Context: International Research Collaboration on Software Engineering (esp. Reverse Software Architecting)
Challenge: Integration and Sharing of Scientific Workflows on top of Shared Software Data

There are many specialized tools for analysing software, ranging from fact extraction (from various sources: code, design, tests, issues, ...) to tools for combining these facts, analysing the results, and visualising them. The majority of this research exists in islands of individual research groups. The community as a whole could improve its research (in terms of reach, replicability, and transparency) by standing on each other's shoulders, that is, by building on other tools in the ecosystem. However, there is no actor that drives this integration.
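As a rough illustration of the kind of workflow such an infrastructure would make shareable, the sketch below chains three independent steps (fact extraction, combination/analysis, visualisation) over a plain, tool-agnostic data format, so that each step could in principle be swapped for a tool from another research group. All names and the toy data are illustrative assumptions, not an existing infrastructure.

```python
# Minimal sketch of a shareable analysis workflow (illustrative only).
from collections import Counter

def extract_facts(commits):
    """Fact extraction: which files were touched by which commits."""
    return [{"commit": c["id"], "file": f} for c in commits for f in c["files"]]

def combine(facts):
    """Combination/analysis: change frequency per file."""
    return Counter(fact["file"] for fact in facts)

def visualise(frequencies):
    """Visualisation: a minimal textual 'hot spot' report."""
    for path, count in frequencies.most_common():
        print(f"{path:20s} {'#' * count}")

if __name__ == "__main__":
    commits = [  # toy stand-in for data extracted from a repository
        {"id": "a1", "files": ["core/parser.py", "core/model.py"]},
        {"id": "b2", "files": ["core/parser.py"]},
        {"id": "c3", "files": ["ui/view.py", "core/parser.py"]},
    ]
    visualise(combine(extract_facts(commits)))
```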

We want to create a community for building an infrastructure for the Integration and Sharing of Scientific Workflows on top of Shared Software (Engineering) Data.

At the workshop we want to discuss: 

  1. How to build a community that supports this idea?
  2. How to build the actual infrastructure, and which existing infrastructures can we build on top of?
  3. Can we reuse infrastructure from other disciplines (Medicine, Genetics, Biology, ...)?
  4. What would be a sustainable funding model?

CHALLENGE 2: IaC Cost and Performance Optimization 

Proponent: Alfonso de la Fuente (Prodevelop)
Context: H2020 PIACERE Project - Programming trustworthy Infrastructure As Code in a sEcuRE framework. 
Challenge: Multi-Objective optimization of costs, performance, availability, and resiliency for Cloud Applications. 

This matters because scale-in and scale-out operations for elastic infrastructure provisioning are increasingly being automated from a Software Configuration Management (SCM) perspective. Furthermore, many organizations face the challenge of minimizing unnecessary expenditure and extracting the most value from cloud-based commercial applications, by means of Infrastructure as Code (IaC) management, through periods of both predictable and unpredictable demand.
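To make the multi-objective nature of the problem concrete, the sketch below selects the Pareto-optimal candidates among a handful of deployment configurations scored on cost, latency, and availability. The candidate data, attribute names, and figures are illustrative assumptions, not outputs of PIACERE.

```python
# Minimal sketch of Pareto selection over candidate deployments (illustrative).
from dataclasses import dataclass

@dataclass(frozen=True)
class Deployment:
    name: str
    monthly_cost: float      # EUR, lower is better
    p95_latency_ms: float    # lower is better
    availability: float      # fraction, higher is better

def dominates(a: Deployment, b: Deployment) -> bool:
    """True if a is at least as good as b on every objective and strictly better on one."""
    at_least_as_good = (a.monthly_cost <= b.monthly_cost and
                        a.p95_latency_ms <= b.p95_latency_ms and
                        a.availability >= b.availability)
    strictly_better = (a.monthly_cost < b.monthly_cost or
                       a.p95_latency_ms < b.p95_latency_ms or
                       a.availability > b.availability)
    return at_least_as_good and strictly_better

def pareto_front(candidates):
    return [c for c in candidates
            if not any(dominates(other, c) for other in candidates)]

if __name__ == "__main__":
    candidates = [
        Deployment("2x small VMs", 120.0, 180.0, 0.995),
        Deployment("4x small VMs", 240.0, 110.0, 0.999),
        Deployment("2x large VMs", 300.0, 120.0, 0.999),  # dominated by "4x small VMs"
    ]
    for d in pareto_front(candidates):
        print(d)
```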

With this discussion we aim to identify the concepts, criteria, and tools relevant to this problem.

CHALLENGE 3: The role of AI in the edge-to-cloud context

Proponent: Rita Giuffrida (Trust-IT)
Context: Artificial Intelligence is becoming pervasive today, driven by the strong growth potential of AI software platforms, which are expected to approach USD 11.8 billion in worldwide revenue by 2023 at a CAGR of 35.3%. Many of the benefits of this evolution come from using computing resources at the periphery of the network. Moreover, many companies are evaluating the use of edge computing for data collection, processing, and online analytics to reduce application latency and data transfers. A growing number of use cases (e.g. predictive maintenance, machine vision, and healthcare) can benefit from AI applications spanning edge-to-cloud infrastructures and leveraging resources available across the computing continuum. In this context, edge intelligence, i.e. edge-based inferencing, will become the foundation of all industrial AI applications, while most new applications will involve some AI components at various levels of the computing continuum.
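A minimal sketch of the edge-based inferencing pattern referred to above: a lightweight model runs on the edge device and only low-confidence samples are forwarded to a larger cloud-hosted model, reducing data transfers and keeping most raw data local. The threshold, model functions, and toy inputs are illustrative placeholders, not part of any specific platform.

```python
# Edge inference with cloud fallback (illustrative sketch, not a real platform API).
CONFIDENCE_THRESHOLD = 0.8  # assumed cut-off for trusting the edge model

def edge_model(sample):
    """Placeholder for a lightweight model deployed on the edge device."""
    score = min(1.0, sum(sample) / len(sample))
    return ("anomaly" if score > 0.5 else "normal"), score

def cloud_model(sample):
    """Placeholder for a larger, more accurate model hosted in the cloud."""
    return "anomaly" if max(sample) > 0.9 else "normal"

def classify(sample):
    label, confidence = edge_model(sample)
    if confidence >= CONFIDENCE_THRESHOLD:
        return label, "edge"             # raw data never leaves the device
    return cloud_model(sample), "cloud"  # only uncertain samples are transferred

if __name__ == "__main__":
    for sample in ([0.2, 0.1, 0.3], [0.95, 0.9, 0.85]):
        print(sample, "->", classify(sample))
```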
Challenges:

  • How to enhance the flexibility of cloud-based solutions in order to accommodate new AI sensors and data while optimising performance?
  • How to tackle the growing number of sensors and network devices and the unpredictable volume of collected data, thereby increasing computation efficiency and relieving the strain that processing huge amounts of data puts on cloud and edge resources?
  • How to cope with the variety of IoT devices and their trend towards incorporating AI functionality, as new technology will bring even more diverse devices in terms of computing capacity and energy efficiency?
  • How to address security and privacy concerns in model training, model deployment, and local data processing?

This is important because artificial intelligence (AI) is fast becoming a key driver of economic development, playing a major role in shaping global competitiveness and productivity over the coming years. With the ever-increasing development of AI applications such as intelligent personal assistants, video/audio surveillance, smart-city applications, autonomous driving, and Industry 4.0 comes a growing need to optimise the use of computational resources for data collection, processing, and online analytics, while at the same time preserving data privacy and increasing data security. AI-SPRINT will change the perspectives of several stakeholders:

  • AI application developers: AI-SPRINT will help them easily implement new applications;
  • System integrators: AI-SPRINT gives them powerful yet flexible tools to develop multi-cloud systems that include resources across the full computing continuum stack, involving both classic components and AI models;
  • Cloud providers: they can benefit from tools that simplify resource management, including at the edge layer, and that support offering new PaaS AI services thanks to the availability of novel open-source design tools.

With this discussion we aim to find points of collaboration with other projects working in the same field.