Getting Started
This section shows what the Simulation Tool looks like from the user's point of view and provides a quick description of how to use it in combination with GreatSPN.
First, it is worth recalling that the modeling phase is carried out in Papyrus. Since extensive documentation already exists on how to use Papyrus to create profiled UML models, we do not provide details on the usage of this specific tool.
The next figure shows a general view of the Papyrus modeling perspective.
On the left of the figure, the different explorers (Project Explorer, Model Explorer and Outline) are shown.
The rest of the figure shows the Model Editor and the Properties view.
The model itself is depicted in the canvas of the Model Editor.

Profiles, stereotypes and tagged values are defined using the Properties view (Profile tab).
The following images show, in the Properties view, some tagged values applied to model elements. Specifically:
- The first figure shows the `inserDB` element, stereotyped as `GaStep`, and its `hostDemand` tagged value. The latter is defined as `(expr=$timeAdd, unit=ms, statQ=mean, source=est)`, where `$timeAdd` is an input parameter representing a mean time duration in milliseconds (`ms`).

- The second figure shows the selected control flow, stereotyped as `GaStep`, and its `prob` tagged value. The latter is defined as `(expr=1-$probActLoop)`, where `$probActLoop` is an input parameter.

- The third figure shows the `start` element (the initial node), stereotyped as `GaWorkloadEvent`, and its `pattern` tagged value. The latter is defined as `open=(arrivalRate=(expr=$arrRate, unit=Hz, statQ=mean, source=est))`, i.e., an open workload characterized by a mean arrival rate input parameter (`$arrRate`) whose unit is `Hz`.

Tagged values are specified in Papyrus-MARTE using the so-called Value Specification Language (VSL).
As seen in the previous figures, we can specify model input parameters (`$timeAdd`, `$probActLoop` and `$arrRate`) that will be set to actual values in the simulation configuration step (next section).
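For quick reference, the three annotations of the running example are summarized below (syntax copied from the figures above):

```
inserDB (GaStep):        hostDemand = (expr=$timeAdd, unit=ms, statQ=mean, source=est)
control flow (GaStep):   prob       = (expr=1-$probActLoop)
start (GaWorkloadEvent): pattern    = open=(arrivalRate=(expr=$arrRate, unit=Hz, statQ=mean, source=est))
```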
An SSH connection to the simulation server must be configured before launching the simulation (see the Configure GreatSPN SSH Connection section in the First-Steps page). The simulation server hosts the GreatSPN tool, which is in charge of computing the performance metrics selected by the user. The SSH connection is configured in Eclipse - Preferences - Simulation Tools - SSH connection.
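The exact fields of that preference page may vary between versions, but the SSH configuration typically amounts to something like the following (all values here are hypothetical):

```
Host:       simulation.example.org   # machine where GreatSPN is installed (hypothetical)
Port:       22
User:       dice                     # hypothetical account on the simulation server
Credential: password, or a private key accepted by the server
```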
The first step is to set the configuration for the simulation experiments.
To do so, we open the Run Configurations... window either
by selecting the Run as -> Run Configurations... option from the contextual menu associated with the UML model (Project Explorer view)
or by clicking the clock button (marked in red in the figure below).

Here we can create a new DICE Simulation configuration from scratch in the Run Configurations... window:

A DICE Simulation configuration (called Model) is then created and filled with the information
retrieved from the UML model profiled with MARTE:

Note that, depending on the annotations defined in the UML model, the Model configuration can be partially filled or complete.
In the running example it is complete and a simulation experiment can be run without any changes.
The figure above shows the Main tab of the Model configuration, where it is possible to select:
- The Model to Analyse, i.e., a UML model annotated with MARTE, by browsing the workspace.
- The Active scenario, by selecting one of the `GaScenario` elements in the model (the running example includes only one).
- The NFP to calculate, i.e., the type of analysis: Performance or Reliability.
Moreover, the tables shown at the bottom of the Main tab can be used to customize the values assigned to the input parameters
specified in the UML model.
The Filters tab of the Model configuration is shown in the following figure:

It includes two panels:
- The Measure panel, where it is possible to select/deselect the metrics to be estimated during the simulation experiment. Such metrics, like the input parameters, are retrieved from the UML model annotated with MARTE.
- The Sensitivity analysis panel, where it is possible to select/deselect the input parameter configurations. Observe that the tool generates all the possible parameter configurations, i.e., the cross product of the ranges of values assigned to the input parameters in the Main tab (see the sketch below).
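As an illustration of how these configurations come about, the following minimal Python sketch (not the plugin's code) enumerates the cross product of some hypothetical parameter ranges; with the values below it yields the 10 configurations of the running example:

```python
from itertools import product

# Hypothetical ranges for the input parameters of the running example;
# the actual values are whatever was entered in the Main tab.
ranges = {
    "$arrRate":     [0.05, 0.08, 0.11, 0.14, 0.17],  # Hz
    "$probActLoop": [0.2, 0.4],
    "$timeAdd":     [10.0],                           # ms
}

# One configuration per element of the cross product of the ranges.
configurations = [dict(zip(ranges, values)) for values in product(*ranges.values())]
print(len(configurations))  # 5 * 2 * 1 = 10 -> 10 simulation runs
```

Each configuration selected in this panel later becomes one simulation run of the experiment.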
The Parameters tab of the Model configuration is shown in the following figure:

It includes General and Simulation parameters of the GreatSPN simulator.
In particular, the following are relevant for controlling the duration of a simulation run:
- Maximum simulation execution time: a simulation run lasts at most the time value set for this parameter.
- Confidence level: the level of the confidence intervals computed for the metrics of the performance model.
- Accuracy: the accuracy of the estimated metrics of the performance model. It is expressed as a percentage and is an integer number (the lower the value, the higher the accuracy); see the sketch below.
This tab also shows the path of the GreatSPN simulator executable on the simulation server (WNSIM File Path).
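The confidence level and the accuracy work together as a stopping criterion. GreatSPN's exact rule is internal to the simulator, but the general idea can be sketched as follows: the run stops once, at the chosen confidence level, the confidence-interval half-width of every estimated metric is within the accuracy percentage of its mean, or the maximum execution time is reached. A minimal Python illustration (all numbers hypothetical):

```python
import math
from statistics import mean, stdev

# Normal quantiles for common confidence levels (a simplification;
# a t-distribution would be more precise for few samples).
Z = {0.90: 1.645, 0.95: 1.960, 0.99: 2.576}

def accuracy_reached(samples, confidence=0.95, accuracy_pct=10):
    """True if the relative confidence-interval half-width is within accuracy_pct."""
    m = mean(samples)
    half_width = Z[confidence] * stdev(samples) / math.sqrt(len(samples))
    return 100 * half_width / abs(m) <= accuracy_pct

# Hypothetical batch estimates of one metric collected during a run.
batches = [0.131, 0.128, 0.135, 0.130, 0.129, 0.133, 0.127, 0.132]
print(accuracy_reached(batches))  # stop the run if True for every metric
```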
Remember to save the changes made in the tabs of the Model configuration window by clicking the Apply button before launching a simulation
experiment.
A simulation experiment is run by clicking the Run button in the Model configuration window, and it consists of as many simulation runs as the number of configurations selected in the Filters tab.
In the running example, the simulation experiment consists of 10 simulation runs.
The simulation can be monitored using the DICE Simulation perspective, which can be opened by clicking the clock button (marked in red in
the figure below):

The following figure shows the DICE Simulation perspective while the simulation experiment is running.

In the figure, three key views can be identified:
- The Debug view, which shows information about the Simulation process (identifier, state, exit value, etc.).
- The Console view, which shows the messages that the simulation process dumps to the standard output and standard error streams. In the case of GreatSPN, these messages make it possible to monitor the accuracy achieved by the running process and the number of simulation steps performed so far. If an error occurs during the simulation, it is reported in the Console view.
- The Invocation Registry view, which shows the starting/ending times and the status of the simulation runs belonging to the simulation experiment.
In the DICE Simulation perspective it is also possible to stop the simulation process at any moment by using the Stop button of the GUI
(marked in red in the figure).
When the simulation finishes, the user can still access the simulation console and the simulation process information (until the Console view is cleared using the corresponding button).
As the next image shows, all the simulation runs terminated correctly (exit value 0) except the second-to-last one, which terminated with exit value -10, meaning that the run reached the maximum simulation execution time without achieving the requested accuracy for all the estimated metrics of the performance model (remember that the maximum simulation execution time and the accuracy are two parameters set in the Parameters tab of the Model configuration window).
Therefore, the simulation results are not saved for that run.

The results of a simulation experiment are reported in both textual and graphical formats.
From the Invocation Registry view it is possible to see the estimated performance metrics by right-clicking on a particular simulation run
and selecting the Open Simulation Result option from the contextual menu, as shown in the following figure:

A new view, labeled with the id of the simulation run, pops up above the Invocation Registry view as shown in the following figure:

For each performance metric selected in the Filters tab of the Model configuration window, the estimated mean value is shown.
When a simulation experiment consists of a set of simulation runs, it is possible to generate 2D plots showing the trends of the estimated performance metrics against an input parameter, over the range of values set during the configuration step.
To generate the 2D plots, we consider again the Invocation Registry view and right-click on the simulation experiment, as shown
in the following figure:

The contextual menu shows the Plot Results... option, which launches a wizard for the plot generation.
The following three figures show the windows that pop up in the wizard:
When the three steps are completed, the plot file (data.plot) is saved in the project and a new view pops up with the 2D plots.
The figure below shows the 2D plot of the utilization metric vs. the arrival rate.
The system is clearly not stable for arrival rates greater than 0.14 Hz; this explains the possibly long simulation runs for such parameter configurations.
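The 0.14 Hz threshold can be read with the utilization law, a standard queueing argument rather than an output of the tool. For an open workload with arrival rate $\lambda$ and mean service demand $D$ at the bottleneck resource:

$$
U = \lambda \cdot D, \qquad \text{stable} \iff U < 1 \iff \lambda < \frac{1}{D}
$$

A saturation point around $\lambda \approx 0.14$ Hz thus suggests a bottleneck demand of roughly $D \approx 1/0.14 \approx 7$ s (an inference, not a figure reported by the tool); beyond that rate the queues grow without bound, so runs for those configurations may not reach the requested accuracy within the maximum execution time.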
