This document contains the final release information for the workflow and toolchain.
This document contains the workflow and toolchain information.
The purpose of this document is to define the traceability mechanism from operational data to the development lifecycle using the developed OSLC Bridge.
This document presents the research and experiments done on recovery mechanisms and overall resilience improvement of CPSoS.
This document presents the uncertainty detection methods at runtime, which comprise two parts: (1) anomaly detection with curriculum learning and (2) uncertainty-aware robustness.
This document presents the research and experiments done on the application of test oracles at the operational time of CPSoS.
In recent years, the use of microservice architectures has been spreading in the Cyber-Physical Systems (CPSs) and Internet of Things (IoT) domains. CPSs are systems that integrate digital cyber computations with physical processes. The development of software for CPSs demands constant maintenance to support new requirements, fix bugs, and deal with hardware obsolescence. Code testing is key in this process, all the more so when the code is fragmented during the development of CPSs. It is important to remark that this process is challenging and time-consuming. In this paper, we report on the experience of instantiating a microservice-based architecture for DevOps of CPSs to test elevator dispatching algorithms across different test levels (i.e., SiL, HiL and Operation). Such an architecture allows for continuous deployment, monitoring and validation of CPSs. By applying the approach to a real industrial case study, we demonstrate that it significantly reduces the time needed in the testing process and, consequently, the economic cost of the entire process.
Authors: Iñigo Aldalur, Aitor Arrieta, Aitor Agirre, Goiuria Sagardui and Maite Arratibel.
Title of the source: Software Quality Journal
Relevant pages: –
Jon Ayerdi successfully defended his PhD thesis at MGEP
Jon Ayerdi successfully defended his PhD thesis, “Simulation-based Metamorphic Testing of Cyber-Physical Systems”, at Mondragon Unibertsitatea, with several of the thesis contributions developed as part of the Adeptness project and in collaboration with Orona. The focus of his work is on the application of Metamorphic Testing (MT) techniques to improve the efficiency and quality of the Cyber-Physical System (CPS) development process. On the one hand, he presented a CPS DevOps architecture which will enable the automation of development tasks, focusing on the verification tasks where MT techniques can be implemented. On the other hand, he proposed and evaluated manual and fully-automated approaches for defining MT oracles for CPSs, for which he implemented two prototype tools. Finally, his thesis also presented and evaluated an optimization approach for MT based on test selection.
A problem when testing Cyber-Physical Systems (CPSs) is the difficulty of determining whether a particular system output or behaviour is correct. Metamorphic testing alleviates this problem by reasoning on the relations expected to hold among multiple executions of the system under test, which are known as Metamorphic Relations (MRs). However, the development of effective MRs is often challenging and requires the involvement of domain experts. This paper summarizes our recent publication “Generating Metamorphic Relations for Cyber-Physical Systems with Genetic Programming: An Industrial Case Study”, presented at ESEC/FSE 2021. In that publication we presented GAssertMRs, the first technique to automatically generate MRs for CPSs, leveraging genetic programming (GP) to explore the space of candidate solutions. We evaluated GAssertMRs in an industrial case study, where it outperformed other baselines.
Authors: Ayerdi, Jon and Terragni, Valerio and Arrieta, Aitor and Tonella, Paolo and Sagardui, Goiuria and Arratibel, Maite
Title of the source: Proceedings of the Genetic and Evolutionary Computation Conference Companion
Publisher: Association for Computing Machinery
Relevant pages: 15–16
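The core MT idea described above, checking relations across executions instead of exact outputs, can be sketched briefly. The naive dispatcher and the scaling relation below are invented illustrations under simplifying assumptions, not the GAssertMRs relations from the paper:

```python
def dispatch_cost(calls):
    """Hypothetical system under test: total floors travelled by a naive
    first-come-first-served single-elevator dispatcher starting at floor 0."""
    position, cost = 0, 0
    for floor in calls:
        cost += abs(floor - position)
        position = floor
    return cost

def mr_scaling_holds(calls):
    """Illustrative MR: doubling every requested floor should exactly double
    the travel cost, because distances are linear in floor positions.
    No exact expected output for a single run is needed."""
    return dispatch_cost([2 * c for c in calls]) == 2 * dispatch_cost(calls)
```

A metamorphic test runs the system on an original input and a derived follow-up input, then flags a failure when the relation is violated; this is useful precisely when no oracle for a single output exists.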
Big Data is reforming many industrial domains by providing decision support through the analysis of large data volumes. Big Data testing aims to ensure that Big Data systems run smoothly and error-free while maintaining performance and data quality. However, because of the diversity and complexity of the data, testing Big Data is challenging. Although numerous research efforts deal with Big Data testing, a comprehensive review that addresses its testing techniques and challenges has not been available until now. Therefore, we have systematically reviewed the evidence on Big Data testing techniques published in the period 2010–2021. This paper discusses the testing of data processing by highlighting the techniques used in every processing phase. Furthermore, we discuss the challenges and future directions. Our findings show that diverse functional, non-functional and combined (functional and non-functional) testing techniques have been used to solve specific problems related to Big Data. At the same time, most of the testing challenges have been faced during the MapReduce validation phase. In addition, combinatorial testing is one of the most frequently applied techniques, often in combination with other techniques (i.e., random testing, mutation testing, input space partitioning and equivalence testing), to find various functional faults in Big Data systems.
Authors: Iram Arshad, Saeed Hamood Alsamhi, Wasif Afzal
Title of the source: Computers, Materials & Continua
Publisher: Computers, Materials & Continua
Relevant pages: 2739–2770
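As a hedged illustration of the combinatorial testing technique highlighted in the survey above (the configuration parameters below are invented, not taken from the surveyed studies): a greedy pairwise sketch covers every pair of parameter values with fewer test cases than the full cartesian product.

```python
from itertools import combinations, product

def pairwise_suite(params):
    """Greedy pairwise sketch: walk the full cartesian product and keep a
    row only if it covers at least one not-yet-covered pair of values."""
    keys = list(params)
    # Enumerate every (parameter index, value) pair that must be covered.
    uncovered = set()
    for i, j in combinations(range(len(keys)), 2):
        for a in params[keys[i]]:
            for b in params[keys[j]]:
                uncovered.add((i, a, j, b))
    suite = []
    for row in product(*(params[k] for k in keys)):
        new = {(i, row[i], j, row[j])
               for i, j in combinations(range(len(keys)), 2)} & uncovered
        if new:  # the row contributes at least one fresh pair
            suite.append(dict(zip(keys, row)))
            uncovered -= new
    return suite
```

Because the loop visits the entire cartesian product, every value pair ends up covered, while rows that add no new pair are dropped; real tools (e.g., PICT-style generators) use stronger heuristics to shrink the suite further.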