In MDE, the quality of models is an important issue, as models constitute the central artifacts of the development process. When executable models are employed, their functional correctness can be validated through model testing, where the model under test is executed and properties of the resulting execution are checked. Unfortunately, systematic testing approaches for models are rare.

We developed a testing framework for UML models based on the fUML standard, which provides a virtual machine for executing UML activities. This testing framework comprises a test specification language for expressing assertions on the execution behavior of UML activities, as well as a test interpreter for evaluating these assertions.

More details on our testing framework for UML can be found at http://www.modelexecution.org/?page_id=524.
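
To convey the core idea, the sketch below shows in plain Java how a test of this kind proceeds: the activity under test is executed on a virtual machine, the resulting execution trace is captured, and assertions on the observed behavior are evaluated against it. All types and names (ExecutionTrace, ActivityExecutor, the ATM::withdraw activity) are illustrative placeholders rather than the actual framework API; in the framework itself, such assertions are written in the dedicated test specification language and evaluated by the test interpreter.

```java
import java.util.List;

// Minimal sketch of trace-based model testing (illustrative types only).
public class ActivityTestSketch {

    // A trace records the activity nodes in the order they were executed,
    // together with the output produced by the activity.
    record ExecutionTrace(List<String> executedNodes, Object output) {}

    // Stand-in for the fUML virtual machine executing an activity.
    interface ActivityExecutor {
        ExecutionTrace execute(String activityName, Object... inputs);
    }

    // A test asserts properties of the execution behavior, e.g. the order
    // in which nodes were executed and the produced output value.
    static void testWithdrawActivity(ActivityExecutor vm) {
        ExecutionTrace trace = vm.execute("ATM::withdraw", 100);

        // Assertion 1: "checkBalance" must run before "dispenseCash".
        int check = trace.executedNodes().indexOf("checkBalance");
        int dispense = trace.executedNodes().indexOf("dispenseCash");
        if (check < 0 || dispense <= check) {
            throw new AssertionError("checkBalance must precede dispenseCash");
        }

        // Assertion 2: the activity must report a successful withdrawal.
        if (!Boolean.TRUE.equals(trace.output())) {
            throw new AssertionError("withdraw should succeed");
        }
    }
}
```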

ATM Example – New Version of the Test Specification Language

We introduced several new features into our test specification language, which are illustrated using the example of an ATM system. The example and descriptions of the provided features can be found at http://www.modelexecution.org/?page_id=544.

User Study

To evaluate the ease of use and usefulness of our testing framework, we performed a user study with eleven participants. The results of the user study and other related materials can be found at http://www.modelexecution.org/?page_id=1184.


Model differencing is concerned with identifying differences between models and is an important prerequisite for efficiently carrying out development and change management tasks in model-driven engineering. While most existing model differencing approaches identify differences on the abstract syntax level, we propose to reason about differences on the semantics level. To do so, we utilize the behavioral semantics specification of the respective modeling language, which enables the execution of the compared models and thereby reasoning about their semantic differences.

Further details about our approach, our implementation based on the semantics specification language xMOF, as well as examples can be found at http://www.modelexecution.org/?page_id=1118.
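
The sketch below illustrates the underlying notion in plain Java: two models are compared by executing both and checking whether they exhibit the same observable behavior. The names used here (ExecutableModel, execute) are illustrative assumptions and not the xMOF-based implementation, which derives executability from the language's semantics specification.

```java
import java.util.List;
import java.util.Objects;

// Simplified sketch of semantic differencing: two models are considered
// semantically equivalent (with respect to a set of sample inputs) if
// executing them yields the same observable behavior for every input.
public class SemanticDiffSketch {

    interface ExecutableModel {
        // Executes the model for one input and returns its observable output.
        Object execute(Object input);
    }

    static boolean semanticallyEquivalent(ExecutableModel m1, ExecutableModel m2,
                                          List<Object> inputs) {
        for (Object input : inputs) {
            if (!Objects.equals(m1.execute(input), m2.execute(input))) {
                return false; // behaviors diverge for this input: semantic difference
            }
        }
        return true; // no difference observed on the sampled inputs
    }
}
```

Note that two models differing on the abstract syntax level, for example after a refactoring, may still be semantically equivalent under this notion, which is precisely the kind of information a purely syntactic comparison cannot provide.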


We analyzed a set of 121 open UML models regarding the usage frequency of the sublanguages and modeling concepts provided by UML 2.4.1, as well as of UML profiles.

The analyzed models have been created with the UML modeling tool Enterprise Architect, and have been retrieved from the Web.

The following three research questions have been investigated in this study:

  1. What is the usage frequency of UML’s sublanguages?
  2. What is the usage frequency of UML’s modeling concepts?
  3. What is the usage frequency of UML profiles?

Information about the analyzed data set, the analysis process, and the results of the analysis is provided at: http://www.modelexecution.org/?page_id=982.
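
As a rough illustration of how such usage counts can be obtained, the following sketch counts how often each UML metaclass occurs in a single XMI export by collecting the values of the xmi:type attribute. It is a simplified stand-in for the actual analysis process described at the link above, and it assumes that the exported XMI carries the metaclass in the xmi:type attribute, which may vary with the tool and XMI version.

```java
import java.io.File;
import java.util.Map;
import java.util.TreeMap;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;

// Counts how often each UML metaclass (as given by the xmi:type attribute)
// occurs in an XMI file -- raw counts from which usage frequencies of
// sublanguages and modeling concepts can be derived.
public class UmlUsageCount {

    public static void main(String[] args) throws Exception {
        Map<String, Integer> counts = new TreeMap<>();
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(new File(args[0])); // path to an exported XMI file

        NodeList elements = doc.getElementsByTagName("*");
        for (int i = 0; i < elements.getLength(); i++) {
            String type = ((Element) elements.item(i)).getAttribute("xmi:type");
            if (!type.isEmpty()) {
                counts.merge(type, 1, Integer::sum); // e.g. "uml:Class" -> 42
            }
        }
        counts.forEach((type, n) -> System.out.println(type + ": " + n));
    }
}
```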

The paper titled “On the Usage of UML: Initial Results of Analyzing Open UML Models” was accepted for publication at the Modellierung 2014 conference, which will take place in March in Vienna.


On Thursday, October 3rd 2013, we will give a tool demonstration of xMOF at MoDELS 2013. In this tool demonstration you will learn

  • how you can specify the behavioral semantics of your modeling languages with xMOF and
  • how you can execute your models according to this specification.

In the tool demonstration, we will show you our tool support implemented for the Eclipse Modeling Framework, using a simple Petri Net modeling language as the running example. You can download an Eclipse bundle with xMOF pre-installed, as well as the used example, at http://www.modelexecution.org/media/201309_xmofdemo_eclipsebundle.
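
In xMOF, the behavioral semantics of such a language is specified by attaching fUML activities to the operations of its metamodel; the plain-Java sketch below merely conveys the operational semantics defined by the demo's Petri Net example, namely that a transition is enabled when all of its input places hold a token, and that firing it consumes one token per input place and produces one per output place. Class and member names are illustrative and not taken from the demo material.

```java
import java.util.List;

// Plain-Java sketch of the token-game semantics of the Petri Net example:
// a transition is enabled if every input place holds at least one token;
// firing it moves tokens from input places to output places.
public class PetriNetSketch {

    static class Place {
        int tokens;
        Place(int tokens) { this.tokens = tokens; }
    }

    static class Transition {
        final List<Place> inputs;
        final List<Place> outputs;

        Transition(List<Place> inputs, List<Place> outputs) {
            this.inputs = inputs;
            this.outputs = outputs;
        }

        boolean isEnabled() {
            return inputs.stream().allMatch(p -> p.tokens > 0);
        }

        void fire() {
            if (!isEnabled()) return;
            inputs.forEach(p -> p.tokens--);   // consume one token per input place
            outputs.forEach(p -> p.tokens++);  // produce one token per output place
        }
    }

    public static void main(String[] args) {
        Place in = new Place(1), out = new Place(0);
        Transition t = new Transition(List.of(in), List.of(out));
        t.fire();
        System.out.println("in=" + in.tokens + ", out=" + out.tokens); // prints in=0, out=1
    }
}
```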

The following tutorial video provides a step-by-step guide for specifying the semantics of the Petri Net modeling language and executing a simple Petri Net model: http://www.youtube.com/watch?v=2y1-xpfK-_Q
