Model-based development, using models as the primary artifact when creating systems and software, is claimed to improve productivity, but this claim is often hard to justify in practice. Evaluating it in an "academic" way requires substantial resources and time: building the same system twice, running parallel teams with many developers, and covering large numbers of development tasks. We describe, based on successful cases from practice, how an evaluation can be conducted during practical use in a commercial setting. It rests on the same question that underlies daily project business: how much effort was needed to implement an application that met customer requirements? We give examples by describing evaluations done in two different kinds of companies: one developing embedded products for consumer electronics, the other web-based enterprise applications for the cloud. The talk presents evaluation approaches that are realistic yet require only modest investments of time and resources. We detail the evaluation procedures so that participants can repeat them in their own teams and companies, helping them judge whether a particular model-based development approach is suitable for their company.
Measuring Productivity from Model-Based Development
7 December, 2022
Dr. Juha-Pekka Tolvanen
Measuring Productivity from
Model-Based Development:
A Tale of Two Companies
Agenda
◼ Model-based development approaches
◼ Measuring productivity
◼ Evaluations in the industry
– Case 1: Enterprise apps to the cloud
– Case 2: Automation systems
◼ Lessons to be learned
◼ Q&A
public class Alarm extends Thread
{
    String name;
    boolean localTimeAwareness;
    String alarmState;
    int sleepTime;
    AbstractWatchApplication alarmApplication;
    static final int dayMs = 24 * 60 * 60 * 1000; // milliseconds per day
    volatile boolean isLive = false;

    public Alarm(String alarmName, boolean aware,
                 AbstractWatchApplication application) { /* ... */ }
    public void run() { /* ... */ }
    public void stopAlarm() { /* ... */ }
}
What should be measured?
◼ Code-related metrics (e.g., LoC)?
◼ Developer's effort in the UI: the number of keyboard
inputs, mouse clicks, drag-and-drops, etc.?
◼ Opinions on user-friendliness, ease of learning, etc.?
◼ Ability to complete given tasks?
◼ How well the approach prevents errors (bugs)?
◼ Development time?
– Perhaps the most widely applied measure in academia and industry
– Works across different languages
– Acts as a proxy for cost
– Is the basis for budgeting, salaries, project planning, etc.
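As a sketch of how development time turns into a productivity figure (the class name and the hours below are illustrative, not from the talk): divide the baseline effort by the effort of the approach under evaluation.

```java
// Minimal sketch: productivity ratio from measured development times.
// A ratio of 5.0 would be reported as "500%", i.e. five times faster.
public class ProductivityRatio {

    /** Ratio > 1.0 means the evaluated approach is faster than the baseline. */
    static double ratio(double baselineHours, double evaluatedHours) {
        return baselineHours / evaluatedHours;
    }

    public static void main(String[] args) {
        // Hypothetical measurements: 40 h hand-coding vs 8 h with models.
        double r = ratio(40, 8);
        System.out.printf("Productivity ratio: %.0f%%%n", r * 100);
    }
}
```

Measuring time this way needs no instrumentation beyond the effort tracking most projects already do for billing.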
Evaluations in practice?
◼ Companies rarely conduct empirical comparisons
– high costs in time and resources
◼ Many good scientific research methods are simply too
expensive and time-consuming for practical use:
– large numbers of developers
– different groups
– repetition, etc.
◼ Instead, limited studies, collected experiences, and opinions are
more often reported
Case 1: Enterprise apps for Azure
◼ The company had invested in developing its own
modeling language and code generators
Two different development processes
◼ Traditional Programming:
1. Schema definition with SQL Server Management Studio
using SQL
2. Business logic implementation in C# with Visual Studio
3. User interface implementation in HTML and JavaScript
◼ Model-based development:
1. Modeling the application (data, users, user rights, roles,
alerts etc.) with MetaEdit+ and generating the code
2. Adjusting the user interface with SQL
Measured productivity
◼ Pilot by one developer
– Measured time:
>900% faster
◼ Laboratory study, a few hours
– 6 developers
– 2 new at Polar (<1 y)
– Measured times varied:
from 75 min to 125 min, mean 105 min
– Productivity: at least 750% faster
– Opinions asked: results (scale 1–5, 5 best):
Case 2: Automation system
◼ Systems with HW, SW, deployment and maintenance
◼ functionality: lights, feeding, monitoring, etc.
◼ hardware: sensors, actuators, cabling
◼ persistent data, database
◼ UI
◼ communication network
◼ material needs
◼ deployment and installation
Fish farm automation system
◼ Modeling language covers:
– System structure
• network, sensors, controllers etc.
– Control operations
– Data structures
– User interfaces
– Installation
– Deployment
Evaluation results
Defined parts                     Clone&own (h)   Modeling (h)
Software                               16               2
Electrical installation changes         5               –
On-site installation                   80              60
Total                                 101              62

Productivity ratio: 163%
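The 163% ratio follows directly from the table totals. A minimal check (the slide leaves the modeling effort for electrical installation changes blank, so 0 h is assumed here, consistent with the 62 h total):

```java
// Recomputing the Case 2 totals and productivity ratio from the per-part efforts.
public class FishFarmEvaluation {

    static double total(double[] hours) {
        double sum = 0;
        for (double h : hours) sum += h;
        return sum;
    }

    public static void main(String[] args) {
        // Order: software, electrical installation changes, on-site installation
        double[] cloneAndOwn = {16, 5, 80};
        double[] modeling    = {2,  0, 60}; // 0 h assumed for the blank cell
        double ratio = total(cloneAndOwn) / total(modeling); // 101 / 62
        System.out.printf("%.0f h vs %.0f h -> %.0f%%%n",
                total(cloneAndOwn), total(modeling), ratio * 100);
    }
}
```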
Lessons learned
◼ Raising the level of abstraction with models and the use
of generators clearly improves productivity
◼ Outputs other than code matter too!
◼ Aim to generate various outputs from the same model
◼ Access to the metamodel and generators is relevant
Summary
◼ Development time is a commonly used measure of
productivity
– in both academic studies and industry reports
◼ Evaluations can be conducted with a modest investment
– Develop a small but typical system
– Ask additional developers to participate
• if possible and interested
◼ Focus the evaluation on differences between the development
approaches
◼ Consider including maintenance tasks
– if possible
Effort to create domain-specific
languages
"I could define a domain-specific language in
about six hours — design, testing and one
failed trial included"
"It took only one to a few person days for us
to completely create each of the three
metamodels"
"Creation of the modeling language solution
took 7.5 working days, covering the
development of the code generator"
Cost of DSL creation: industry cases
◼ Very few companies (and academic papers) report
figures on the creation effort
[Bar chart: DSL creation effort in days (x-axis 0–15) for industry cases: blood separators, military radio system, high-level synthesis (ESL), heating remote control, voice control, car infotainment, sport watches, touch screen controller]