In recent years, the use of microservice architectures has been spreading in the Cyber-Physical Systems (CPSs) and Internet of Things (IoT) domains. CPSs are systems that integrate digital cyber computations with physical processes. The development of software for CPSs demands constant maintenance to support new requirements, fix bugs, and deal with hardware obsolescence. Code testing is key in this process, all the more so when the code is fragmented during the development of CPSs. It is important to remark that this process is challenging and time-consuming. In this paper, we report on the experience of instantiating a microservice-based architecture for DevOps of CPSs to test elevator dispatching algorithms across different test levels (i.e., SiL, HiL and Operation). Such an architecture allows for continuous deployment, monitoring and validation of CPSs. By applying the approach to a real industrial case study, we demonstrate that it significantly reduces the time needed for testing and, consequently, the economic cost of the entire process.
Authors: Iñigo Aldalur, Aitor Arrieta, Aitor Agirre, Goiuria Sagardui and Maite Arratibel.
Title of the source: Software Quality Journal
Relevant pages: –
A problem when testing Cyber-Physical Systems (CPS) is the difficulty of determining whether a particular system output or behaviour is correct or not. Metamorphic testing alleviates this problem by reasoning on the relations expected to hold among multiple executions of the system under test, which are known as Metamorphic Relations (MRs). However, the development of effective MRs is often challenging and requires the involvement of domain experts. This paper summarizes our recent publication: “Generating Metamorphic Relations for Cyber-Physical Systems with Genetic Programming: An Industrial Case Study”, presented at ESEC/FSE 2021. In that publication we presented GAssertMRs, the first technique to automatically generate MRs for CPS, leveraging Genetic Programming (GP) to explore the space of candidate solutions. We evaluated GAssertMRs in an industrial case study, outperforming other baselines.
Authors: Ayerdi, Jon and Terragni, Valerio and Arrieta, Aitor and Tonella, Paolo and Sagardui, Goiuria and Arratibel, Maite
Title of the source: Proceedings of the Genetic and Evolutionary Computation Conference Companion
Publisher: Association for Computing Machinery
Relevant pages: 15–16
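As an aside on the technique summarized above: a metamorphic relation replaces an explicit output oracle with a property relating two executions of the system under test. A minimal, self-contained sketch, using a generic numeric function rather than the elevator dispatching system of the paper (the function names and the fault injected here are purely illustrative):

```python
import math

def mr_holds(f, x, tol=1e-9):
    """Check a simple metamorphic relation for a sine-like function:
    f(x) should equal f(pi - x). No explicit oracle value for f(x) is
    needed; only the relation between the two executions is checked."""
    return abs(f(x) - f(math.pi - x)) <= tol

# The relation holds for a correct implementation:
print(mr_holds(math.sin, 0.7))   # True

# A (hypothetical) faulty implementation violates the relation,
# exposing the bug without knowing the expected output of sin(0.7):
def buggy_sin(x):
    return math.sin(x) + 0.01 * x  # injected fault

print(mr_holds(buggy_sin, 0.7))  # False
```

GAssertMRs searches for such relations automatically with GP instead of relying on a domain expert to write them.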
Big Data is transforming many industrial domains by providing decision support through analyzing large data volumes. Big Data testing aims to ensure that Big Data systems run smoothly and error-free while maintaining the performance and quality of data. However, because of the diversity and complexity of data, testing Big Data is challenging. Though numerous research efforts deal with Big Data testing, a comprehensive review addressing the testing techniques and challenges of Big Data is not yet available. Therefore, we have systematically reviewed the evidence on Big Data testing techniques published in the period 2010–2021. This paper discusses the testing of data processing by highlighting the techniques used in every processing phase. Furthermore, we discuss the challenges and future directions. Our findings show that diverse functional, non-functional and combined (functional and non-functional) testing techniques have been used to solve specific problems related to Big Data. At the same time, most of the testing challenges have been faced during the MapReduce validation phase. In addition, combinatorial testing is one of the most applied techniques, often in combination with other techniques (i.e., random testing, mutation testing, input space partitioning and equivalence testing), to find various functional faults through Big Data testing.
Authors: Iram Arshad, Saeed Hamood Alsamhi, Wasif Afzal
Title of the source: Computers, Materials & Continua
Publisher: Computers, Materials & Continua
Relevant pages: 2739–2770
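The combinatorial testing highlighted in the survey above can be illustrated with a small sketch: the goal of pairwise (2-way) coverage is that every pair of parameter values appears together in at least one test case. The configuration space below is hypothetical, and real tools compute near-minimal covering arrays rather than the exhaustive suite used here for simplicity:

```python
from itertools import product, combinations

# Hypothetical configuration space of a data-processing job.
params = {
    "format":      ["csv", "json", "avro"],
    "compression": ["none", "gzip"],
    "partitions":  [1, 8],
}

# Exhaustive testing: every full combination (3 * 2 * 2 = 12 cases).
exhaustive = list(product(*params.values()))
print(len(exhaustive))  # 12

def covers_all_pairs(cases, values):
    """Check 2-way coverage: for every pair of parameters, every pair
    of their values must co-occur in at least one selected test case."""
    names = list(values)
    for (i, a), (j, b) in combinations(enumerate(names), 2):
        needed = set(product(values[a], values[b]))
        seen = {(case[i], case[j]) for case in cases}
        if needed - seen:
            return False
    return True

# The exhaustive suite trivially achieves pairwise coverage; a
# covering-array tool would select a much smaller subset with the
# same 2-way coverage guarantee.
print(covers_all_pairs(exhaustive, params))  # True
```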
Quality assuring the quality assurance tool: applying safety-critical concepts to test framework development
The quality of embedded systems is demonstrated by the performed tests. The quality of such tests is often dependent on the quality of one or more testing tools, especially in automated testing. Test automation is also central to the success of agile development. It is thus critical to ensure the quality of testing tools. This work explores how industries with agile processes can learn from safety-critical system development with regard to the quality assurance of test framework development. Safety-critical systems typically need adherence to safety standards that often suggest substantial upfront documentation, plans and a long-term perspective on several development aspects. In contrast, agile approaches focus on quick adaptation, evolving software and incremental deliveries. This article identifies several approaches to the quality assurance of software development tools in functional safety development and agile development. The extracted approaches are further analyzed and processed into candidate solutions, i.e., principles and practices for test framework quality assurance applicable in an industrial context. An industrial focus group with experienced practitioners further validated the candidate solutions through moderated group discussions. The two main contributions of this study are: (i) 48 approaches and 25 derived candidate solutions for test framework quality assurance in four categories (development, analysis, run-time measures, and validation and verification) with related insights, e.g., that a test framework should be perceived as a tool-chain and not a single tool, and (ii) the perceived value of the candidate solutions in industry as collected from the focus group.
Title of the source: PeerJ Computer Science, 2022; 8: e1131
Publisher: PeerJ
Digital Twin-based Anomaly Detection with Curriculum Learning in Cyber-physical Systems
Anomaly detection is critical to ensure the security of cyber-physical systems (CPS). However, due to the increasing complexity of attacks and of CPS themselves, anomaly detection in CPS is becoming more and more challenging. In our previous work, we proposed a digital twin-based anomaly detection method, called ATTAIN, which takes advantage of both historical and real-time data of CPS. However, such data vary significantly in terms of difficulty. Therefore, similar to human learning processes, deep learning models (e.g., ATTAIN) can benefit from an easy-to-difficult curriculum. To this end, in this paper, we present a novel approach, named digitaL twin-based Anomaly deTecTion wIth Curriculum lEarning (LATTICE), which extends ATTAIN by introducing curriculum learning to optimize its learning paradigm. LATTICE attributes each sample with a difficulty score before it is fed into a training scheduler. The training scheduler samples batches of training data based on these difficulty scores such that learning can proceed from easy to difficult data. To evaluate LATTICE, we use five publicly available datasets collected from five real-world CPS testbeds. We compare LATTICE with ATTAIN and two other state-of-the-art anomaly detectors. Evaluation results show that LATTICE outperforms the three baselines by 0.906%–2.367% in terms of the F1 score. LATTICE also, on average, reduces the training time of ATTAIN by 4.2% on the five datasets and is on par with the baselines in terms of detection delay time.
Authors: Qinghua Xu, Shaukat Ali, Tao Yue
Title of the source: ACM Transactions on Software Engineering and Methodology
Relevant pages: – (just accepted; available online but not yet included in a journal issue)
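The easy-to-difficult training scheduler that LATTICE's curriculum learning relies on can be sketched in a few lines. The per-sample difficulty scoring below (a plain error value attached to each toy sample) is a hypothetical stand-in for the scores the paper computes:

```python
def curriculum_batches(samples, difficulty, batch_size):
    """Order training samples from easy to difficult and yield batches,
    mimicking the difficulty-score / training-scheduler idea: each
    sample gets a score, and batches are drawn easiest-first."""
    ranked = sorted(samples, key=difficulty)
    for i in range(0, len(ranked), batch_size):
        yield ranked[i:i + batch_size]

# Toy data: (sample_id, error) pairs, where a higher error stands in
# for a harder-to-learn sample (hypothetical scoring).
data = [("s1", 0.9), ("s2", 0.1), ("s3", 0.5), ("s4", 0.3)]

batches = list(curriculum_batches(data, difficulty=lambda s: s[1],
                                  batch_size=2))
print(batches)
# [[('s2', 0.1), ('s4', 0.3)], [('s3', 0.5), ('s1', 0.9)]]
```

In the actual approach the scores drive a probabilistic sampler over a deep anomaly-detection model's training set rather than a fixed sort, but the easy-before-difficult ordering is the core idea.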
This thesis defines the concept of a CPS/IoT Ecosystem as a hierarchical structure that governs practices and procedures for the modeling, design, development, execution and operation of smart systems. We divide these systems into three loosely dependent scopes of operation: the cloud, the fog, and the swarm. Furthermore, we propose a series of methods and approaches that support the dependable design, execution, and operation of CPS/IoT Ecosystems: methods for ensuring the deterministic execution of tasks in safety-constrained applications, a virtualization of communication channels for many-core architectures, and a secure communication architecture for many-core platforms. A CPS/IoT Ecosystem is a highly heterogeneous environment with hardware and software components that are designed and implemented by multiple organizations. To ensure coherence between different components and to reduce complexity, we propose a continuous integration and deployment (CI/CD) scheme for CPS/IoT Ecosystems. Furthermore, we demonstrate a runtime verification (RV) mechanism that provides a basis for quality-of-service (QoS) orchestration and dynamic reconfiguration of CPS/IoT applications. As a final step in this thesis, we propose methods to achieve energy-sustainable CPS/IoT Ecosystems. In conclusion, this thesis aims to seed methodological guidelines on how to build dependable CPS/IoT Ecosystems for applications with various confidence requirements. We want to understand the upcoming changes, reduce the eventual effects of ad-hoc development, explain physical environments using mathematical models, and learn newly emerging behaviors from this massive influx of new data and insights.
Title of the source: Doctoral dissertation
Publisher: Technische Universität Wien
Relevant pages: 1-155