
PhD6: V&V Methodologies and Tools for large-scale, dynamic, service-oriented architectures

Published: Saturday, 05 April 2014

Title: V&V Methodologies and Tools for large-scale, dynamic, service-oriented architectures

 

Advisors:

 

Abstract:

Large-scale systems based on architectural approaches such as service-oriented (SOA) and component-based architectures are becoming the norm in business-critical and safety-critical enterprise applications; healthcare, banking, e-government, e-commerce, and stock market applications are examples of this trend. Quality attributes such as security, availability, reliability, and resilience are becoming extremely relevant in these scenarios.

 

When moving to dynamic and evolving systems, traditional Verification and Validation (V&V) methodologies and techniques need to be improved, replaced, or integrated with new ones. As requirements and system configurations evolve over time, it becomes extremely difficult to define, before deployment of the product, the risks, failure modes, checklists, and quantitative RAMS (Reliability, Availability, Maintainability, Safety) targets. Even models for the quantitative analysis of resilience cannot account for all possible evolutions of the system and its requirements. Most of the available methods are based on the construction and solution of models representing a static view of the system, with pre-defined requirements and structure, and therefore do not explicitly address the dynamic and evolving nature of such systems and infrastructures. Consequently, common methods and tools for offline V&V need to be improved, adapted, or even replaced with better-fitting ones.

 

The goal of this research is to devise new methodologies and tools supporting V&V activities, focusing specifically on model-based approaches for dependability, resilience, safety, and security evaluation. This will require exploiting the synergies between monitoring and modelling: defining a dynamic model generation (and solution) process capable of producing, at run-time, different models representing the current system state and conditions, and of feeding the models' parameters with values coming from monitoring and experimental evaluation activities. The proposed methodologies and tools will be developed against real usage scenarios in the context of large-scale, dynamic, service-oriented architectures, in order to provide practical evidence of the feasibility and advantages of the offered solutions.
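To make the intended monitoring-to-model loop a little more concrete, the following Python fragment is a minimal, purely illustrative sketch: it rebuilds and re-solves a simple series-parallel availability model each time a new monitoring snapshot of the deployed service instances becomes available. All names and numbers here (ServiceSample, the example snapshots, the closed-form availability formula) are assumptions made for illustration only; the research itself would target richer stochastic models and a real monitoring infrastructure rather than this toy calculation.

```python
# Illustrative sketch of a monitoring-driven model regeneration loop.
# Hypothetical names and data; not part of the proposed tool-set.

from dataclasses import dataclass
from typing import Dict, List


@dataclass
class ServiceSample:
    """Monitored parameters for one service instance."""
    failure_rate: float  # failures per hour, estimated from monitoring
    repair_rate: float   # repairs per hour, estimated from monitoring


def steady_state_availability(s: ServiceSample) -> float:
    """Steady-state availability of a single instance: mu / (lambda + mu)."""
    return s.repair_rate / (s.failure_rate + s.repair_rate)


def system_availability(snapshot: Dict[str, List[ServiceSample]]) -> float:
    """Rebuild and solve a model for the *current* configuration:
    replicas of a service are in parallel, service types are in series."""
    system_av = 1.0
    for service, replicas in snapshot.items():
        # The service is unavailable only if all of its replicas are down.
        unav = 1.0
        for r in replicas:
            unav *= 1.0 - steady_state_availability(r)
        system_av *= 1.0 - unav
    return system_av


if __name__ == "__main__":
    # Two consecutive "monitoring snapshots": both the configuration and
    # the measured rates evolve, so the model is regenerated and re-solved.
    snapshots = [
        {"frontend": [ServiceSample(0.01, 1.0)],
         "payment":  [ServiceSample(0.02, 0.5), ServiceSample(0.02, 0.5)]},
        {"frontend": [ServiceSample(0.03, 1.0), ServiceSample(0.01, 1.0)],
         "payment":  [ServiceSample(0.02, 0.5)]},
    ]
    for i, snap in enumerate(snapshots):
        print(f"snapshot {i}: system availability = {system_availability(snap):.6f}")
```

In the envisaged approach, the place of this closed-form formula would be taken by automatically assembled stochastic models, and the snapshots would come from the monitoring and experimental evaluation activities mentioned above.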

 

Bibliography:

  • Andrea Bondavalli, Paolo Lollini, Istvan Majzik, and Leonardo Montecchi. Modelling and Model-based Assessment. In Resilience Assessment and Evaluation, Lecture Notes in Computer Science, Springer Berlin / Heidelberg, Edited by Katinka Wolter, Alberto Avritzer, Marco Vieira and Aad van Moorsel, ISBN 978-3-642-29031-2, pp. 153-165, 2012. 
  • Paolo Lollini, Andrea Bondavalli and Felicita Di Giandomenico. A decomposition-based modeling framework for complex systems. In IEEE Transactions on Reliability, Volume 58, Issue 1, pp. 20-33, 2009. 
  • Leonardo Montecchi, Paolo Lollini and Andrea Bondavalli. Towards a MDE Transformation Workflow for Dependability Analysis. In Proc. of the 16th IEEE International Conference on Engineering of Complex Computer Systems (ICECCS 2011), pp. 157-166, Las Vegas, USA, 27-29 April, 2011.
  • Leonardo Montecchi, Paolo Lollini and Andrea Bondavalli. A DSL-Supported Workflow for the Automated Assembly of Large Stochastic Models. In Proc. of the 10th European Dependable Computing Conference (EDCC 2014), pp. 82-93, Newcastle upon Tyne, UK, May 13-16, 2014.


This page corresponds to a PhD proposal that will be Co-Advised by experienced researchers of at least two of the partners of the project. If you are interested in pursuing this proposal, please contact us at This email address is being protected from spambots. You need JavaScript enabled to view it.