Modeling and Evaluating Quality in the
Presence of Uncertainty
QUATIC 2019
Ciudad Real, September 13, 2019
Antonio Vallecillo
Universidad de Málaga, Spain
Uncertainty
 It applies to: predictions of future events,
estimations,
physical measurements, or
properties of a system, its elements or its environment
 due to:
 Underspecification of the problem or solution domains
 Lack of knowledge of the system, its environment, or its underlying physics
 Lack of precision in measurements
 Imperfect, incorrect, or missing information
 Numerical approximations
 Indeterminacy of values and parameters
 Different interpretations of the same evidence by separate parties
2
Uncertainty: Quality or state that involves imperfect and/or unknown information
“There is nothing certain, but the uncertain” (proverb)
Uncertainty in Software Engineering
 Ziv’s Uncertainty Principle: “Uncertainty is inherent and inevitable in software
development processes and products” (1996)
 All projects, no matter the domain, processes, or technology, operate in the presence
of uncertainty – reducible (epistemic) and irreducible (aleatory)
 Humphrey’s Requirements Uncertainty Principle: “For a new software system, the
requirements will not be completely known until after the users have used it.”
 The true role of design is thus to create a workable solution to an ill-defined problem.
 Software engineering variables affected by Uncertainty:
 Cost
 Schedule
 Performance
 Capacity for work
 Productivity
 Quality of results
3
Software development methodologies and uncertainty
4
“The Uncertainty Principle OR How to Choose the Right Methodology”. https://kosmothink.wordpress.com/2010/12/31/the-uncertainty-principal-or-how-to-choose-the-right-methodology
Uncertainty in Complex Event Processing (CEP) systems
5
Different kinds of uncertainty in Complex Event Processing (CEP) systems
 Selection phase:
 Uncertain events in the stream: Missing events (false negatives, FN); or wrongly
inserted (false positives, FP).
 Uncertainty in the values of the attributes (including their timestamps!) due to
imprecision of the measuring methods or tools (measurement uncertainty, MU).
 Matching phase:
 Uncertainty of comparison operators (=, <, >, ->,...) between uncertain values.
 Uncertainty of logical composition operators (or, and, not) between uncertain
statements
 Production phase:
 Lack of precision in the values of the attributes of derived events, due to the
propagation of uncertainty in their calculation.
 Lack of confidence in the derived event, due to incomplete or erroneous
assumptions about the environment in which the system operates, which may
influence the rule’s confidence.
6
Nathalie Moreno, Manuel F. Bertoa, Loli Burgueño, Antonio Vallecillo: “Managing Measurement and Occurrence Uncertainty
in Complex Event Processing Systems.” IEEE Access 7: 88026-88048 (2019)
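The production-phase uncertainties above can be illustrated with a minimal sketch (hypothetical function name, not the paper's implementation): assuming the matched input events are independent, the confidence of a derived event can be approximated as the product of the confidences of the matched events and the confidence of the rule itself.

```python
def derived_event_confidence(event_confidences, rule_confidence):
    """Confidence of a derived CEP event, assuming independence:
    the probability that every matched event is genuine AND the
    rule's assumptions about the environment hold."""
    p = rule_confidence
    for c in event_confidences:
        p *= c
    return p

# Two input events detected with 95% and 90% confidence,
# matched by a rule whose assumptions hold with 98% confidence:
conf = derived_event_confidence([0.95, 0.90], 0.98)   # ≈ 0.838
```

Under this independence assumption, confidence degrades multiplicatively with each uncertain input, which is why long derivation chains need explicit uncertainty management.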
Many different formalisms and theories to quantify uncertainty
7
 Bayesian Belief Networks (BBN)
 Monte Carlo simulations
 Decision theory/trees
 Probabilities
 Fuzzy Logic
 …
Current approaches to represent uncertainty in (quality) models
8
Understanding Uncertainty
(It’s not that simple!)
10
A classification of uncertainty (according to its nature)
 Aleatory Uncertainty – A kind of uncertainty that refers to the inherent
uncertainty due to the probabilistic variability or randomness of a
phenomenon
 Examples: measuring the speed of a car, or the duration of a software
development process
 This type of uncertainty is irreducible, in that there will always be variability in
the underlying variables.
 Epistemic Uncertainty – A kind of uncertainty that refers to the lack of
knowledge we may have about the system (modeled or real).
 Examples: Ambiguous or imprecise requirements about the expected system
functionality, its envisioned operating environment, etc.
 This type of uncertainty is reducible, in that additional information or knowledge
may reduce it.
11
A. Der Kiureghian and O. Ditlevsen: "Aleatory or epistemic? Does it matter?" Structural Safety 31(2):105-112, 2009
Reducing the uncertainty
1. Certainty: There is no reducible uncertainty and information is
complete
2. Fully reducible imprecision: There is no full certainty, but
uncertainty can be reduced by collecting additional information
until achieving full certainty (no irreducible uncertainty present)
3. Partially reducible imprecision: There is no full certainty, but
uncertainty can be reduced by collecting additional information.
However, there is still irreducible uncertainty
4. Irreducible imprecision: There is no full certainty, and it cannot
be reduced (only margins can be used)
12
Epistemic
Aleatory
Uncertainty and Knowledge…
13
Borrowed from “Introduction to Uncertainty Modeling”, presentation at the OMG by T. Yue, S. Ali, B. Selic and A. Watson, 2016
Knowledge vs. Belief
14
Each “knowledge” statement here is based on real evidence!
When dealing with
uncertainty, perhaps
it is best to avoid the
notion of “knowledge”
altogether!
Borrowed from “Introduction to Uncertainty Modeling”, presentation at the OMG by T. Yue, S. Ali, B. Selic and A. Watson, 2016
Belief
 Belief: An implicit or explicit opinion or conviction held by a belief agent about a
topic, expressed by one or more belief statements
 Belief agent: An entity (human, institution, even a machine) that holds one or
more beliefs
 Topic: a possible phenomenon or notion belonging to a given subject area.
 Belief Statement: An explicit specification of some belief held by a belief agent.
 It represents a belief, and therefore it is a subjective concept
 It may not always be possible to determine whether or not a belief statement is valid.
 A belief statement may not necessarily correspond to objective reality.
 This means that it could be completely false, or only partially true, or completely true.
 The validity of a statement may only be meaningfully defined within a given context
or purpose.
 Thus, the statement that “the Earth can be represented as a perfect sphere” may be perfectly
valid for some purposes but invalid or only partly valid for others.
15
OMG. “Precise Semantics for Uncertainty Modeling” Request For Proposals. OMG Document: ad/2017-12-01, 2017.
The OMG PSUM initiative (Precise Semantics for Uncertainty Modeling)
16
Related concepts
 Risk – The effect of uncertainty on objectives [ISO/IEC 31000].
 An uncertainty may have an associated risk; when the difficulty or danger
associated with an uncertainty is high, it deserves special attention.
 “Risk does not exist by itself. Risk is created when there is uncertainty.”
 Evidence – Objective information that may be used to justify a belief
 It can be an observation, a record of a real-world event occurrence or,
alternatively, the conclusion of some formalized chain of logical inference that
provides information that can contribute to determining the validity
(truthfulness) of a belief statement
17
OMG. “Precise Semantics for Uncertainty Modeling” Request For Proposals. OMG Document: ad/2017-12-01, 2017.
Types of uncertainty (according to their sources)
 Measurement uncertainty: A kind of aleatory uncertainty that refers to a set of possible
states or outcomes of a measurement, where probabilities are assigned to each possible
state or outcome
 Occurrence uncertainty: a kind of epistemic uncertainty that refers to the degree of belief
that we have in the actual existence of an entity, i.e., the real entity that a model element
represents
 Belief uncertainty: A kind of epistemic uncertainty in which a belief agent is uncertain about
any of the statements made about the system or its environment.
 Design uncertainty: A kind of epistemic uncertainty that refers to a set of possible design
decisions or options, where probabilities are assigned to each decision or option
 Environment uncertainty: lack of certainty about the surroundings, boundaries and usages
of a system and of its elements
 Location uncertainty: lack of certainty about the geographical or physical location of a
system, its elements or its environment
 Time uncertainty: lack of certainty about the time properties expressed in a statement
about the system or its environment
18
Based on M. Zhang, B. Selic, S. Ali, T. Yue, O. Okariz, and R. Norgren, "Understanding Uncertainty in Cyber-Physical Systems: A
Conceptual Model" In Proc. of ECMFA 2016, LNCS vol. 9764, pp. 247-264. Springer, 2016.
19
Measurement uncertainty
 Engineers naturally think about uncertainty
associated with measured values
 Uncertainty is explicitly defined in their models and
considered in model-based simulations
 Precise notations permit representing and
operating with uncertain values and confidences
20
Measurement uncertainty
 Measurement uncertainty: A kind of aleatory uncertainty that refers to a set
of possible states or outcomes of a measurement
 Normally expressed by a parameter, associated with the result of a measurement x,
that characterizes the dispersion of the values that could reasonably be attributed to
the measurand: the standard deviation u of the possible variation of the values of x
 Representation: x ± u or (x, u)
 Examples:
21
JCGM 100:2008. Evaluation of measurement data – Guide to the expression of uncertainty in measurement (GUM).
http://www.bipm.org/utils/common/documents/jcgm/JCGM_100_2008_E.pdf
• Normal distribution: (x, σ) with mean x and standard deviation σ
• Interval [a, b]: a uniform distribution is assumed, giving (x, u)
with x = (a + b)/2 and u = (b − a)/(2√3)
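As a quick check of the GUM conversion rule for intervals, turning an interval into an (x, u) pair is a one-liner (the numeric values below are illustrative):

```python
import math

def interval_to_xu(a, b):
    """GUM: an interval [a, b] with an assumed uniform distribution
    corresponds to mean x = (a + b)/2 and standard uncertainty
    u = (b - a) / (2 * sqrt(3))."""
    return (a + b) / 2, (b - a) / (2 * math.sqrt(3))

x, u = interval_to_xu(17.5, 18.1)   # x = 17.8, u ≈ 0.173
```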
However, the situation is not the same in software models 
22
Useful applications in software simulation
23
Some problems with Measurement Uncertainty
 Computations with uncertain values have to respect the propagation of
uncertainty (uncertainty analysis)
 In general this is a complex problem, which cannot be manually managed
 Comparison of uncertain values is no longer a Boolean property!
 How to compare 17.7 ± 0.2 with 17.8 ± 0.2?
 Other primitive datatypes are also affected by uncertainty
 Strings (OCR)
 Enumerations
 Collections
24
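The comparison question above can be answered probabilistically rather than with a Boolean. A sketch (assuming both measurements are independent and normally distributed; this is an illustration, not the paper's exact operational semantics): "17.7 ± 0.2 < 17.8 ± 0.2" holds only with a certain probability.

```python
import math

def u_lt(x1, u1, x2, u2):
    """P(X < Y) for independent X ~ N(x1, u1) and Y ~ N(x2, u2).
    The difference Y - X is N(x2 - x1, sqrt(u1^2 + u2^2)), so the
    probability is the standard normal CDF at the standardized mean."""
    sigma = math.sqrt(u1 ** 2 + u2 ** 2)
    z = (x2 - x1) / sigma
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

p = u_lt(17.7, 0.2, 17.8, 0.2)   # ≈ 0.64: "probably, but far from certain"
```

So the two values are "less than" each other only with probability ≈ 0.64: a crisp Boolean answer would hide exactly the information that matters.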
Primitive datatypes extended with Uncertainty
 Extended primitive datatypes
 Real -> UReal UReal(17.8,0.2) ≡ 17.8 ± 0.2
 Boolean -> UBoolean UBoolean(true, 0.8)
 String -> UString UString(“Implementaci6n”, 0.93)
 Enum -> UEnum UColor{ (#red,.9), (#orange,0.09), (#purple,0.01) }
 An algebra of operations on uncertain datatypes extending OCL/UML types
 Operations are closed in this algebra and automatically propagate uncertainty
25
M. F. Bertoa, N. Moreno, L. Burgueño, A. Vallecillo. “Incorporating Measurement Uncertainty into OCL/UML Primitive
Datatypes.” Software and Systems Modeling (Sosym), 2019. https://doi.org/10.1007/s10270-019-00741-0
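A minimal Python sketch of the idea behind these extended datatypes (the name mirrors the paper's UReal, but this is an illustrative reimplementation, not the authors' library): operations are closed over uncertain values and propagate uncertainty with first-order, GUM-style rules for independent operands.

```python
import math

class UReal:
    """A real value with an associated standard uncertainty: x ± u."""
    def __init__(self, x, u=0.0):
        self.x, self.u = x, u

    def __add__(self, other):
        # Sum/difference: uncertainties combine in quadrature.
        return UReal(self.x + other.x, math.hypot(self.u, other.u))

    def __sub__(self, other):
        return UReal(self.x - other.x, math.hypot(self.u, other.u))

    def __mul__(self, other):
        # First-order propagation for a product of independent operands.
        return UReal(self.x * other.x,
                     math.hypot(other.x * self.u, self.x * other.u))

a = UReal(17.8, 0.2)
b = UReal(2.0, 0.1)
s = a + b   # UReal(19.8, ≈0.224): uncertainty propagated automatically
```

Because every operation returns another UReal, user code never has to thread the uncertainty bookkeeping by hand, which is the point of closing the algebra.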
26
Occurrence uncertainty
 Occurrence uncertainty: a kind of epistemic uncertainty that refers to the
degree of belief (confidence) that we have in the actual existence of an
entity, i.e., the real entity that a model element represents
 Assigned to individual objects
 Permits dealing with false positives (elements in the model that do not exist in
the real system) and false negatives (elements in the real system not
captured in the model)
 Normally measured by (Bayesian) probabilities
27
L. Burgueño, M. F. Bertoa, N. Moreno, A. Vallecillo: “Expressing Confidence in model and in model transformation
elements.” In Proc of MODELS 2018: 57-66, 2018.
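Since occurrence uncertainty is normally measured by Bayesian probabilities, here is a sketch of how the confidence in an object's existence could be updated after a detector reading, given known false-positive and false-negative rates (the function name and parameter values are illustrative, not from the cited paper):

```python
def update_existence(prior, detected, fp_rate, fn_rate):
    """Bayes update of P(entity exists) given one detector reading.
    fp_rate = P(detected | absent); fn_rate = P(not detected | present)."""
    if detected:
        num = prior * (1.0 - fn_rate)
        den = num + (1.0 - prior) * fp_rate
    else:
        num = prior * fn_rate
        den = num + (1.0 - prior) * (1.0 - fp_rate)
    return num / den

# A sensor with 5% FP and 10% FN rates reports an enemy (prior 0.5):
conf = update_existence(0.5, True, 0.05, 0.10)   # ≈ 0.947
```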
Uncertainty related to OCL invariants (system integrity constraints)
 Degree of fulfilment of an OCL invariant
 Occurrence uncertainty of the elements of the system (confidence)
28
[Image borrowed from Mihai Lica Pura “Ad Hoc Networks and Their Security: A Survey”, 2012]
Constraints
inv EnoughSensors: Sensor.allInstances()->size() >= 3000
inv Sorrounded: Enemy.allInstances()->select(e|e.distanceTo(self))->size() < 50
M. Gogolla, A. Vallecillo: “On Softening OCL invariants” Journal of Object Technology 18(2): 6:1-22 (2019).
“Crisp” Invariants
29
context Battle inv FairBattle: self.enemies->size = self.allies->size
context Battle inv EnoughAllies: self.allies->notEmpty
context Battle inv EnoughEnemies: self.enemies->notEmpty
M. Gogolla, A. Vallecillo: “On Softening OCL invariants” Journal of Object Technology 18(2): 6:1-22 (2019).
Soft Invariants
30
context Battle inv FairBattle: self.enemies->size = self.allies->size
FairBattleRSL : Integer -- Required satisfaction level (user defined)
FairBattleCSL : Integer derive = -- Current satisfaction level
let YesE=battle.enemies->select(e|e.confid>=EnemyConfidTh)->size in -- # real enemies
let YesA=battle.allies->select(a|a.confid>=AllyConfidTh)->size in -- # real allies
1 - (YesA - YesE).abs()/(YesE + YesA)
M. Gogolla, A. Vallecillo: “On Softening OCL invariants” Journal of Object Technology 18(2): 6:1-22 (2019).
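The satisfaction-level computation of the softened invariant above can be sketched in Python (the confidence thresholds and input representation are illustrative; the original is OCL on a USE model):

```python
def fair_battle_csl(enemy_confids, ally_confids,
                    enemy_th=0.7, ally_th=0.7):
    """Current satisfaction level of the softened FairBattle invariant:
    1 minus the normalized imbalance between the numbers of allies and
    enemies whose occurrence confidence exceeds the threshold."""
    yes_e = sum(1 for c in enemy_confids if c >= enemy_th)  # "real" enemies
    yes_a = sum(1 for c in ally_confids if c >= ally_th)    # "real" allies
    return 1.0 - abs(yes_a - yes_e) / (yes_e + yes_a)

# 4 confident enemies vs 6 confident allies: satisfaction level 0.8
csl = fair_battle_csl([0.9, 0.8, 0.75, 0.95], [0.9] * 6)
```

The invariant then holds "softly" whenever this current satisfaction level meets the user-defined required satisfaction level.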
31
Belief uncertainty
 Belief uncertainty: A kind of epistemic uncertainty in which the modeler, or
any other belief agent, is uncertain about any of the statements made about
the system or its environment.
 By nature, it is always subjective
 Belief agent: An entity (human, institution, even a machine) that holds one or
more beliefs
 Belief statement: Statement qualified by a degree of belief
 Degree of belief: Confidence assigned to a statement by a belief agent.
Normally expressed by quantitative or qualitative methods (e.g., a grade or a
probability “credence”)
32
Loli Burgueño, Robert Clarisó, Jordi Cabot, Sébastien Gérard, Antonio Vallecillo. “Belief uncertainty in software models.” Proc.
of MiSE 2019@ICSE, pp. 19-26. ACM, 2019. https://dl.acm.org/citation.cfm?id=3340709
A simple example of a hotel room
33
Temp. sensor Smoke detector
Alarm center
CO detector
Loli Burgueño, Robert Clarisó, Jordi Cabot, Sébastien Gérard, Antonio Vallecillo. “Belief uncertainty in software models.” Proc.
of MiSE 2019@ICSE, pp. 19-26. ACM, 2019. https://dl.acm.org/citation.cfm?id=3340709
A simple example of a hotel room
34
System attributes, operations, and constraints
35
class AlarmCenter
attributes
highTemp : UBoolean derive:
self.room.tempSensor.temperature > 30.0
highCOLevel : UBoolean derive:
self.room.coSensor.coPPM > 20
smoke : UBoolean derive:
self.room.smokeDetector.smoke
fireAlert : UBoolean derive:
self.highTemp and self.highCOLevel and self.smoke
operations
isHot() : UBoolean = self.tempSensor.temperature > 25
isCold() : UBoolean = self.tempSensor.temperature < 18
constraints
inv TempPrecision: self.temperature.uncertainty() <= 0.2
Some Belief Statements about the (model of the) system
 The CO and smoke detectors that we bought have a reliability of 90% (i.e., 10% of
their readings are not meaningful)
 We cannot be completely sure that the precision of the Temperature sensor is ± 0.5o,
as indicated in its datasheet
 We are only 95% confident that the presence of high temperature, high CO level and
smoke really means that there is a fire in the room
 Bob is from the South, so he only assigns a credibility of 50% to the operations that
indicate if the room is hot or cold. In contrast, Mary thinks they are mostly accurate
 Room #3 is close to the kitchen and frequently emits alarms. Everybody thinks that
most of them are false positives
 Joe the modeler doubts that the type of attribute “number” of class “Room” is Integer.
He thinks it may contain characters different from digits.
 Lucy the modeler is unsure if an “AlarmCenter” has to be attached to only one single
Room. She thinks they can also be attached to several.
36
[About the credibility of the values]
[From individual belief agents]
[About individual instances]
[About the model itself: relations]
[About the behavioral rules]
[About the uncertainty of the values]
[About the model itself: types]
>> How to represent these uncertainties in the system specifications?
>> How to incorporate them into the system structural and behavioral models?
UML Profile
37
Operationalization
 A list of pairs (BeliefAgent,credence) for every model statement subject to
Belief Uncertainty
 Operations to add and remove pairs from the list of pairs
 Query operation to know the credence of a statement
38
isHot_Beliefs : Set(Tuple(beliefAgent : BeliefAgent, degreeOfBelief : Real))
isHot_BeliefsAdd(ba : BeliefAgent, d : Real)
post: self.isHot_Beliefs = self.isHot_Beliefs@pre->reject(t|t.beliefAgent=ba)->
including(Tuple{beliefAgent:ba,degreeOfBelief:d})
isHot_credence(a:BeliefAgent): Real =
let baBoD : … = self.isHot_Beliefs->select(t|t.beliefAgent = a) in
let baBoDnull : … = self.isHot_Beliefs->select(t|t.beliefAgent = null) in
if baBoD->isEmpty then -- no explicit credence by “a”
if baBoDnull->notEmpty then -- but if default value exists
baBoDnull->collect(degreeOfBelief)->any(true)
else 1.0 endif
else baBoD->collect(degreeOfBelief)->any(true) endif
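A Python sketch of this operationalization (hypothetical class and method names; the original is OCL on a USE model): a per-statement map from belief agent to degree of belief, where the key None plays the role of the default credence used when an agent has expressed no explicit opinion.

```python
class BeliefStore:
    """Credences attached to one belief statement.
    The key None holds the default degree of belief."""
    def __init__(self):
        self._beliefs = {}

    def add(self, agent, degree):
        # Mirrors isHot_BeliefsAdd: replaces any previous pair for agent.
        self._beliefs[agent] = degree

    def credence(self, agent):
        # Mirrors isHot_credence: the agent's value if present,
        # else the default, else full credence 1.0.
        if agent in self._beliefs:
            return self._beliefs[agent]
        return self._beliefs.get(None, 1.0)

is_hot = BeliefStore()
is_hot.add("Bob", 0.5)
is_hot.add(None, 0.95)
is_hot.credence("Bob")    # 0.5 (Bob's explicit credence)
is_hot.credence("Mary")   # 0.95 (falls back to the default)
```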
Running the system…
39
Hotel> !new BeliefAgent('Bob')
Hotel> !new BeliefAgent('Mary')
Hotel> !r1.isHot_BeliefsAdd(Bob,0.5)
Hotel> !r1.isHot_BeliefsAdd(Mary,0.99)
Hotel> !r1.isHot_BeliefsAdd(null,0.95)
Hotel>
Hotel> ?r1.isHot()
-> UBoolean(true,1.0) : UBoolean
Hotel> ?r1.isHot_credence(Bob)
-> 0.5 : Real
Hotel> ?r1.isHot_credence(Mary)
-> 0.99 : Real
Hotel> ?r1.isHot_credence(null)
-> 0.95 : Real
Credence propagation on dependent belief statements
40
fireAlert_credence(ba:BeliefAgent): Real =
let baBoD : Set(Tuple(beliefAgent:BeliefAgent, degreeOfBelief:Real)) =
self.fireAlert_Beliefs->select(t|t.beliefAgent = ba) in
(if baBoD->isEmpty then …
else baBoD->collect(degreeOfBelief)->any(true)
endif)
* self.fireAlertDeriveExpr_credence(ba)
fireAlertDeriveExpr_credence(ba:BeliefAgent): Real =
let baBoD : Set(Tuple(beliefAgent:BeliefAgent, degreeOfBelief:Real)) =
self.fireAlertDeriveExpr_Beliefs->select(t|t.beliefAgent = ba) in
(if baBoD->isEmpty then …
else baBoD->collect(degreeOfBelief)->any(true)
endif)
* highTemp_credence(ba)
* highCOLevel_credence(ba)
* smoke_credence(ba)
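The propagation rule shown in the OCL above multiplies the credence of a derived statement by the credences of the statements its derivation expression depends on. A sketch, under the same independence assumption (hypothetical function name):

```python
def propagated_credence(own_credence, dependency_credences):
    """Credence of a derived belief statement: its own credence times
    the credences of the statements used in its derivation expression."""
    p = own_credence
    for c in dependency_credences:
        p *= c
    return p

# fireAlert depends on highTemp, highCOLevel and smoke:
c = propagated_credence(0.95, [0.9, 0.9, 0.9])   # ≈ 0.693
```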
41
Design uncertainty
 Design uncertainty: A kind of epistemic uncertainty that refers to a set of
possible design decisions about the system
 It refers to the uncertainty that the developer has about what the system should
be like, rather than about what conditions it may face during its operation
(environment uncertainty).
42
M. Famelis, M. Chechik: “Managing design-time uncertainty.” Software and Systems Modeling 18(2): 1249-1284 (2019)
The Design-Time Uncertainty Management (DeTUM) model
43
M. Famelis, M. Chechik: “Managing design-time uncertainty.” Software and Systems Modeling 18(2): 1249-1284 (2019)
This is similar to the “Cone of Uncertainty” (CoU)
 It represents the best-case uncertainty needed to inform decision makers
of the probability of project success at specific phases of the project
44
Further types of uncertainty: Environment
 Environment uncertainty: lack of certainty about the surroundings,
boundaries and usages of a system and of its elements
 Tackled by approaches such as self-adaptation, probabilistic behavior, or
identifying and explicating operational assumptions.
 “Uncertainty-aware” software
45
Uncertainty-wise testing
46
Further types of uncertainty: Location
 Location uncertainty: lack of certainty about the geographical or physical
location of a system, its elements or its environment
 The submarine may now be anywhere in the Mediterranean Sea
 Cyber-attacks can come from anywhere
47
Further types of uncertainty: Time
 Time uncertainty: lack of certainty about the time properties expressed in a
statement about the system or its environment
 Mañana (i.e., “not today” )
 “We will call you soon”
 “A man with a watch knows
what time it is. A man with two
watches is never sure.”
(Segal's law)
48
49
Incorporating Uncertainty
in Software Quality Evaluation
Quality evaluation – Prediction Models
1. Identify your target entities and your target stakeholders
 Examples of entities: COTS components, Data stored in DBs, Internet Delivery Service.
 Examples of stakeholders: Developers, Advanced Users, Novice Users, …
2. Choose a Quality Model for evaluating your entities
 E.g., ISO/IEC 25010 “Product” QM
3. Customize the Quality Model
 Select the Characteristics and Subcharacteristics relevant to these entities and
stakeholders
4. Select the Measurable Attributes of the entities relevant to the Quality Model
5. Select the appropriate measures for those measurable attributes
6. Run experiments with samples of entities and groups of stakeholders to:
 Empirically evaluate the “perceived” (subjective) quality subcharacteristic of these
entities
 Empirically evaluate the “objective” quality subcharacteristic of these entities
7. Run regression analyses to identify the set of measures that better explain each
quality subcharacteristic, and define appropriate quality indicators
51
Quality evaluation – Prediction Models
1. Identify your target entities and your target stakeholders
 Examples of entities: COTS components, Data stored in DBs, Internet Delivery Service.
 Examples of stakeholders: Developers, Advanced Users, Novice Users, Any kinds of users.
Evaluate the quality of software components that are candidates to
be integrated in a software system
Our target stakeholders are system developers and maintainers, who
need to select the best candidate components to form part of their
systems
52
Manuel F. Bertoa, José M. Troya, Antonio Vallecillo: “Measuring the usability of software components”, Journal of Systems and
Software, 79(3):427-439, March 2006
Quality evaluation – Prediction Models
2. Choose a Quality Model for evaluating your entities
ISO/IEC 9126
53
Quality evaluation – Prediction Models
3. Customize the Quality Model
 Select the relevant characteristics and subcharacteristics
54
Quality evaluation – Prediction Models
4. Select the relevant Measurable Attributes of the entities w.r.t. the Quality
model
55
Quality evaluation – Prediction Models
5. Select the appropriate measures for those measurable attributes
 Measures related to “Quality of Documentation”
56
Quality evaluation – Prediction Models
5. Select the appropriate measures for those measurable attributes
 Measures related to “Design Complexity”
57
Quality evaluation – Prediction Models
6. Run experiments with samples of entities and groups of stakeholders to
empirically evaluate the “perceived” (subjective) and “objective” quality
58
Quality evaluation – Prediction Models
6. Run experiments with samples of entities and groups of stakeholders to
empirically evaluate the “perceived” (subjective) and “objective” quality
59
Quality evaluation – Prediction Models
7. Run regression analyses to identify the set of measures that better explain
each quality subcharacteristic, and define appropriate quality indicators
60
Quality evaluation – Prediction Models
7. Run regression analyses to identify the set of measures that better explain
each quality subcharacteristic, and define appropriate quality indicators
61
Quality evaluation – Prediction Models
7. Run regression analyses to identify the set of measures that better explain
each quality subcharacteristic, and define appropriate quality indicators
62
Maint = α·Und + β·Learn + γ·Oper
Maintainability = high if Maint > 0.8; low if Maint < 0.4; medium otherwise
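The indicator above translates directly into code (a sketch: α, β, γ would come from the regression analysis; the coefficient values here are placeholders):

```python
def maintainability(und, learn, oper, alpha=0.4, beta=0.3, gamma=0.3):
    """Quality indicator: a linear combination of subcharacteristic
    scores (coefficients fitted by regression), thresholded into
    the levels used by the decision model."""
    maint = alpha * und + beta * learn + gamma * oper
    if maint > 0.8:
        return "high"
    if maint < 0.4:
        return "low"
    return "medium"

maintainability(0.9, 0.9, 0.9)   # "high"
```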
Maintainability of models
63
F. Basciani, J. Di Rocco, D. Di Ruscio, L. Iovino, A. Pierantonio: “A tool-supported approach for assessing the quality of modeling
artifacts.” Journal of Computer Languages 51:173-192, April 2019.
M. Genero, M. Piattini. “Empirical validation of measures for class diagram structural complexity through controlled
experiments.” Proc. of QAOOSE WS at ECOOP 2001.
Maintainability of model transformations
64
F. Basciani, J. Di Rocco, D. Di Ruscio, L. Iovino, A. Pierantonio: “A tool-supported approach for assessing the quality of modeling
artifacts.” Journal of Computer Languages 51:173-192, April 2019.
Introducing Uncertainty
The Englishman Who Went up a Hill but
Came down a Mountain (1995)
65
Quality evaluation – Sources of uncertainty
1. Identify your target entities and your target stakeholders
 Examples of entities: COTS components, Data stored in DBs, Internet Delivery Service.
 Examples of stakeholders: Developers, Advanced Users, Novice Users, …
2. Choose a Quality Model for evaluating your entities
 E.g., ISO/IEC 25010 “Product” QM
3. Customize the Quality Model
 Select the Characteristics and Subcharacteristics relevant to these entities and
stakeholders
4. Select the Measurable Attributes of the entities relevant to the Quality Model
5. Select the appropriate measures for those measurable attributes
6. Run experiments with samples of entities and groups of stakeholders to:
 Empirically evaluate the “perceived” (subjective) quality subcharacteristic of these
entities
 Empirically evaluate the “objective” quality subcharacteristic of these entities
7. Run regression analyses to identify the set of measures that better explain each
quality subcharacteristic, and define appropriate quality indicators
66
Sources of uncertainty
 Selection of the subset of the quality model
 Selection of incorrect, inappropriate or missing quality subcharacteristics
 Selection of quality measures
 Selection of incorrect or inappropriate quality measures
 Imprecise measurements of quality measures
 Empirical experiments
 Confidence in the entity samples
 Confidence in the selected groups of stakeholders
 Evaluation of perceived and objective quality
 Incorrect or imprecise experiment results
 Statistical analyses and regression tests
 Confidence of estimation models
 Definition of quality indicators
 Confidence in thresholds
 Propagation of measurement uncertainty in decision models
67
Maintainability of models
68
M. Genero, M. Piattini, E. Manso, G. Cantone. “Building UML Class Diagram Maintainability Prediction Models based on Early
Metrics.” Proc. of IEEE METRICS 2003.
Estimating quality with uncertainty
69
Maintainability = low
Estimating quality with uncertainty
70
Maintainability = { (low, 0.8), (medium, 0.18), (high, 0.02) }
with a credence of (0.95)
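One way to obtain such a distribution over levels (a sketch, assuming the Maint indicator value is normally distributed and reusing the 0.4 and 0.8 thresholds of the crisp indicator; names and values are illustrative):

```python
import math

def phi(z):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def maintainability_levels(maint, u, low_th=0.4, high_th=0.8):
    """Distribution over {low, medium, high} induced by an uncertain
    indicator value maint ± u, under a normal assumption."""
    p_low = phi((low_th - maint) / u)
    p_high = 1.0 - phi((high_th - maint) / u)
    return {"low": p_low, "medium": 1.0 - p_low - p_high, "high": p_high}

levels = maintainability_levels(0.35, 0.06)   # low ≈ 0.80
```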
Estimating quality with uncertainty
71
Use > ?system1.maintainability()
-> ULevel((#low, 0.8), (#medium, 0.18), (#high, 0.02)) : ULevel
Use > ?system1.maintainability_credence(agent1)
-> 0.5 : Real
Use > ?r1. maintainability_credence(agent2)
-> 0.99 : Real
Use > ?r1. maintainability_credence(null)
-> 0.95 : Real
Summary (on Evaluating Quality in the presence of Uncertainty)
 Identify the kinds of uncertainty (and their nature) that affect
 Your entities and their attributes
 The quality characteristics you need to evaluate
 Your target stakeholders’ particular needs and backgrounds
 Your quality (base and derived) measures
 Your quality indicators
 Model uncertainty
 Include uncertainty in your quality models and measures as first-class elements
(measurement uncertainty, degrees of belief, credence, etc.)
 Evaluate uncertainty
 Use tools for quantifying and propagating uncertainty
 Document uncertainty
 Produce estimates of the magnitude and impact of these uncertainties
 Manage your quality considering uncertainty
 Make sure decision processes take into account the estimated uncertainties
72
Uncertainty as a first class concept in quality modeling and evaluation
 From “correctness” to “utility”
 Useful, beneficial and profitable to users, instead of objectively correct
 Utility permits accommodating trade-offs between different dimensions
 From “precise” to “approximate”
 Need to evaluate possible deviations and estimate margins
 “How accurate are my models and estimations, and how confident I am on
them?”
 From “open-loop” to “closed-loop”
 Need to (self-)adapt as new information is available, or conditions change
 “How do I change when the level of uncertainty changes?”
73
David Garlan “Software Engineering in an Uncertain World.” In Proc. of FoSER 2010: 125-128.
Takeaways (on Uncertainty)
 “Uncertainty” is not a single concept, it encompasses many different types of
uncertainties (measurement, belief, environment, …)
 Each type of uncertainty requires its own notations, underlying logics and
propagation mechanisms
 Uncertainty can be aleatory or epistemic (irreducible or reducible)
 Uncertainty does not depend so much on knowledge, but on belief
 It is mainly subjective, and different people may hold different degrees of belief
about the same statement
 Learn to manage in the presence of uncertainty; it cannot be eliminated.
 You can try to reduce it (for epistemic) with testing, verification, validation,
redundancy and other knowledge acquisition processes.
 Aleatory uncertainty and its risks cannot be reduced. It needs to be calculated,
and its values and risks bounded. Margins and bounds can be used to handle it.
74
Open problems for Quatic
From the QUATIC 2019 Call for Papers:
 Quality Aspects in Requirements Engineering
 Quality Aspects in Model-Driven Engineering
 Quality Aspects in DevOps Development
 Quality Aspects in Process Improvement and Assessment
 Quality Aspects in Verification and Validation
 Quality Aspects in Evidence-Based Software Engineering
 Quality Aspects in Security & Privacy
 Quality Aspects in Cloud-based Platforms and Services
 Quality Aspects in Business Processes
 Quality Aspects in Data Science & Artificial Intelligence
 Quality Aspects in Software Maintenance and Comprehension
75
Uncertainty
76
The technical debt associated to uncertainty management…
77
Expressing Confidence in Model and Model Transformation ElementsExpressing Confidence in Model and Model Transformation Elements
Expressing Confidence in Model and Model Transformation Elements
 
Representing and generating uncertainty effectively presentatıon
Representing and generating uncertainty effectively presentatıonRepresenting and generating uncertainty effectively presentatıon
Representing and generating uncertainty effectively presentatıon
 
Belief Uncertainty in Software Models
Belief Uncertainty in Software ModelsBelief Uncertainty in Software Models
Belief Uncertainty in Software Models
 
A rendezvous with the uncertainty monster
A rendezvous with the uncertainty monsterA rendezvous with the uncertainty monster
A rendezvous with the uncertainty monster
 
Programmatic risk management workshop (handbook)
Programmatic risk management workshop (handbook)Programmatic risk management workshop (handbook)
Programmatic risk management workshop (handbook)
 
Bayesian Assurance: Formalizing Sensitivity Analysis For Sample Size
Bayesian Assurance: Formalizing Sensitivity Analysis For Sample SizeBayesian Assurance: Formalizing Sensitivity Analysis For Sample Size
Bayesian Assurance: Formalizing Sensitivity Analysis For Sample Size
 
Detecting Unknown Insider Threat Scenarios
Detecting Unknown Insider Threat Scenarios Detecting Unknown Insider Threat Scenarios
Detecting Unknown Insider Threat Scenarios
 
Managing in the presence of uncertainty
Managing in the presence of uncertaintyManaging in the presence of uncertainty
Managing in the presence of uncertainty
 
Datascience
DatascienceDatascience
Datascience
 
datascience.docx
datascience.docxdatascience.docx
datascience.docx
 
man0 ppt.pptx
man0 ppt.pptxman0 ppt.pptx
man0 ppt.pptx
 
Measurement Uncertainty (1).ppt
Measurement Uncertainty (1).pptMeasurement Uncertainty (1).ppt
Measurement Uncertainty (1).ppt
 
Lime
LimeLime
Lime
 
03 quantitative method
03 quantitative method03 quantitative method
03 quantitative method
 
AI CHAPTER 7.pdf
AI CHAPTER 7.pdfAI CHAPTER 7.pdf
AI CHAPTER 7.pdf
 
Programmatic risk management workshop (handbook)
Programmatic risk management workshop (handbook)Programmatic risk management workshop (handbook)
Programmatic risk management workshop (handbook)
 
hisory of computers in pharmaceutical research presentation.pptx
hisory of computers in pharmaceutical research presentation.pptxhisory of computers in pharmaceutical research presentation.pptx
hisory of computers in pharmaceutical research presentation.pptx
 
Efficient reasoning
Efficient reasoningEfficient reasoning
Efficient reasoning
 
Topic 7 measurement in research
Topic 7   measurement in researchTopic 7   measurement in research
Topic 7 measurement in research
 
Dynamic Rule Base Construction and Maintenance Scheme for Disease Prediction
Dynamic Rule Base Construction and Maintenance Scheme for Disease PredictionDynamic Rule Base Construction and Maintenance Scheme for Disease Prediction
Dynamic Rule Base Construction and Maintenance Scheme for Disease Prediction
 

More from Antonio Vallecillo

Accountable objects: Modeling Liability in Open Distributed Systems
Accountable objects: Modeling Liability in Open Distributed SystemsAccountable objects: Modeling Liability in Open Distributed Systems
Accountable objects: Modeling Liability in Open Distributed Systems
Antonio Vallecillo
 

More from Antonio Vallecillo (19)

Modeling Objects with Uncertain Behaviors
Modeling Objects with Uncertain BehaviorsModeling Objects with Uncertain Behaviors
Modeling Objects with Uncertain Behaviors
 
Introducing Subjective Knowledge Graphs
Introducing Subjective Knowledge GraphsIntroducing Subjective Knowledge Graphs
Introducing Subjective Knowledge Graphs
 
Using UML and OCL Models to realize High-Level Digital Twins
Using UML and OCL Models to realize High-Level Digital TwinsUsing UML and OCL Models to realize High-Level Digital Twins
Using UML and OCL Models to realize High-Level Digital Twins
 
Modeling behavioral deontic constraints using UML and OCL
Modeling behavioral deontic constraints using UML and OCLModeling behavioral deontic constraints using UML and OCL
Modeling behavioral deontic constraints using UML and OCL
 
Research Evaluation - The current situation in Spain
Research Evaluation - The current situation in SpainResearch Evaluation - The current situation in Spain
Research Evaluation - The current situation in Spain
 
Adding Random Operations to OCL
Adding Random Operations to OCLAdding Random Operations to OCL
Adding Random Operations to OCL
 
Extending Complex Event Processing to Graph-structured Information
Extending Complex Event Processing to Graph-structured InformationExtending Complex Event Processing to Graph-structured Information
Extending Complex Event Processing to Graph-structured Information
 
Towards a Body of Knowledge for Model-Based Software Engineering
Towards a Body of Knowledge for Model-Based Software EngineeringTowards a Body of Knowledge for Model-Based Software Engineering
Towards a Body of Knowledge for Model-Based Software Engineering
 
La Ingeniería Informática no es una Ciencia -- Reflexiones sobre la Educación...
La Ingeniería Informática no es una Ciencia -- Reflexiones sobre la Educación...La Ingeniería Informática no es una Ciencia -- Reflexiones sobre la Educación...
La Ingeniería Informática no es una Ciencia -- Reflexiones sobre la Educación...
 
La Ética en la Ingeniería de Software de Pruebas: Necesidad de un Código Ético
La Ética en la Ingeniería de Software de Pruebas: Necesidad de un Código ÉticoLa Ética en la Ingeniería de Software de Pruebas: Necesidad de un Código Ético
La Ética en la Ingeniería de Software de Pruebas: Necesidad de un Código Ético
 
La ingeniería del software en España: retos y oportunidades
La ingeniería del software en España: retos y oportunidadesLa ingeniería del software en España: retos y oportunidades
La ingeniería del software en España: retos y oportunidades
 
Los Estudios de Posgrado de la Universidad de Málaga
Los Estudios de Posgrado de la Universidad de MálagaLos Estudios de Posgrado de la Universidad de Málaga
Los Estudios de Posgrado de la Universidad de Málaga
 
El papel de los MOOCs en la Formación de Posgrado. El reto de la Universidad...
El papel de los MOOCs en la Formación de Posgrado. El reto de la Universidad...El papel de los MOOCs en la Formación de Posgrado. El reto de la Universidad...
El papel de los MOOCs en la Formación de Posgrado. El reto de la Universidad...
 
La enseñanza digital y los MOOC en la UMA. Presentación en el XV encuentro de...
La enseñanza digital y los MOOC en la UMA. Presentación en el XV encuentro de...La enseñanza digital y los MOOC en la UMA. Presentación en el XV encuentro de...
La enseñanza digital y los MOOC en la UMA. Presentación en el XV encuentro de...
 
El doctorado en Informática: ¿Nuevo vino en viejas botellas? (Charla U. Sevil...
El doctorado en Informática: ¿Nuevo vino en viejas botellas? (Charla U. Sevil...El doctorado en Informática: ¿Nuevo vino en viejas botellas? (Charla U. Sevil...
El doctorado en Informática: ¿Nuevo vino en viejas botellas? (Charla U. Sevil...
 
Accountable objects: Modeling Liability in Open Distributed Systems
Accountable objects: Modeling Liability in Open Distributed SystemsAccountable objects: Modeling Liability in Open Distributed Systems
Accountable objects: Modeling Liability in Open Distributed Systems
 
Models And Meanings
Models And MeaningsModels And Meanings
Models And Meanings
 
Improving Naming and Grouping in UML
Improving Naming and Grouping in UMLImproving Naming and Grouping in UML
Improving Naming and Grouping in UML
 
On the Combination of Domain Specific Modeling Languages
On the Combination of Domain Specific Modeling LanguagesOn the Combination of Domain Specific Modeling Languages
On the Combination of Domain Specific Modeling Languages
 

Recently uploaded

Artificial Intelligence: Facts and Myths
Artificial Intelligence: Facts and MythsArtificial Intelligence: Facts and Myths
Artificial Intelligence: Facts and Myths
Joaquim Jorge
 

Recently uploaded (20)

Factors to Consider When Choosing Accounts Payable Services Providers.pptx
Factors to Consider When Choosing Accounts Payable Services Providers.pptxFactors to Consider When Choosing Accounts Payable Services Providers.pptx
Factors to Consider When Choosing Accounts Payable Services Providers.pptx
 
A Call to Action for Generative AI in 2024
A Call to Action for Generative AI in 2024A Call to Action for Generative AI in 2024
A Call to Action for Generative AI in 2024
 
Data Cloud, More than a CDP by Matt Robison
Data Cloud, More than a CDP by Matt RobisonData Cloud, More than a CDP by Matt Robison
Data Cloud, More than a CDP by Matt Robison
 
Breaking the Kubernetes Kill Chain: Host Path Mount
Breaking the Kubernetes Kill Chain: Host Path MountBreaking the Kubernetes Kill Chain: Host Path Mount
Breaking the Kubernetes Kill Chain: Host Path Mount
 
The 7 Things I Know About Cyber Security After 25 Years | April 2024
The 7 Things I Know About Cyber Security After 25 Years | April 2024The 7 Things I Know About Cyber Security After 25 Years | April 2024
The 7 Things I Know About Cyber Security After 25 Years | April 2024
 
From Event to Action: Accelerate Your Decision Making with Real-Time Automation
From Event to Action: Accelerate Your Decision Making with Real-Time AutomationFrom Event to Action: Accelerate Your Decision Making with Real-Time Automation
From Event to Action: Accelerate Your Decision Making with Real-Time Automation
 
Mastering MySQL Database Architecture: Deep Dive into MySQL Shell and MySQL R...
Mastering MySQL Database Architecture: Deep Dive into MySQL Shell and MySQL R...Mastering MySQL Database Architecture: Deep Dive into MySQL Shell and MySQL R...
Mastering MySQL Database Architecture: Deep Dive into MySQL Shell and MySQL R...
 
Understanding Discord NSFW Servers A Guide for Responsible Users.pdf
Understanding Discord NSFW Servers A Guide for Responsible Users.pdfUnderstanding Discord NSFW Servers A Guide for Responsible Users.pdf
Understanding Discord NSFW Servers A Guide for Responsible Users.pdf
 
🐬 The future of MySQL is Postgres 🐘
🐬  The future of MySQL is Postgres   🐘🐬  The future of MySQL is Postgres   🐘
🐬 The future of MySQL is Postgres 🐘
 
2024: Domino Containers - The Next Step. News from the Domino Container commu...
2024: Domino Containers - The Next Step. News from the Domino Container commu...2024: Domino Containers - The Next Step. News from the Domino Container commu...
2024: Domino Containers - The Next Step. News from the Domino Container commu...
 
Presentation on how to chat with PDF using ChatGPT code interpreter
Presentation on how to chat with PDF using ChatGPT code interpreterPresentation on how to chat with PDF using ChatGPT code interpreter
Presentation on how to chat with PDF using ChatGPT code interpreter
 
Axa Assurance Maroc - Insurer Innovation Award 2024
Axa Assurance Maroc - Insurer Innovation Award 2024Axa Assurance Maroc - Insurer Innovation Award 2024
Axa Assurance Maroc - Insurer Innovation Award 2024
 
Raspberry Pi 5: Challenges and Solutions in Bringing up an OpenGL/Vulkan Driv...
Raspberry Pi 5: Challenges and Solutions in Bringing up an OpenGL/Vulkan Driv...Raspberry Pi 5: Challenges and Solutions in Bringing up an OpenGL/Vulkan Driv...
Raspberry Pi 5: Challenges and Solutions in Bringing up an OpenGL/Vulkan Driv...
 
Exploring the Future Potential of AI-Enabled Smartphone Processors
Exploring the Future Potential of AI-Enabled Smartphone ProcessorsExploring the Future Potential of AI-Enabled Smartphone Processors
Exploring the Future Potential of AI-Enabled Smartphone Processors
 
A Year of the Servo Reboot: Where Are We Now?
A Year of the Servo Reboot: Where Are We Now?A Year of the Servo Reboot: Where Are We Now?
A Year of the Servo Reboot: Where Are We Now?
 
[2024]Digital Global Overview Report 2024 Meltwater.pdf
[2024]Digital Global Overview Report 2024 Meltwater.pdf[2024]Digital Global Overview Report 2024 Meltwater.pdf
[2024]Digital Global Overview Report 2024 Meltwater.pdf
 
What Are The Drone Anti-jamming Systems Technology?
What Are The Drone Anti-jamming Systems Technology?What Are The Drone Anti-jamming Systems Technology?
What Are The Drone Anti-jamming Systems Technology?
 
Artificial Intelligence: Facts and Myths
Artificial Intelligence: Facts and MythsArtificial Intelligence: Facts and Myths
Artificial Intelligence: Facts and Myths
 
Advantages of Hiring UIUX Design Service Providers for Your Business
Advantages of Hiring UIUX Design Service Providers for Your BusinessAdvantages of Hiring UIUX Design Service Providers for Your Business
Advantages of Hiring UIUX Design Service Providers for Your Business
 
Scaling API-first – The story of a global engineering organization
Scaling API-first – The story of a global engineering organizationScaling API-first – The story of a global engineering organization
Scaling API-first – The story of a global engineering organization
 

Modeling and Evaluating Quality in the Presence of Uncertainty

  • 1. Modeling and Evaluating Quality in the Presence of Uncertainty QUATIC 2019 Ciudad Real, September 13, 2019 Antonio Vallecillo Universidad de Málaga, Spain
  • 2. Uncertainty  It applies to: predictions of future events, estimations, physical measurements, or properties of a system, its elements or its environment  due to:  Underspecification of the problem or solution domains  Lack of knowledge of the system, its environment, or its underlying physics  Lack of precision in measurements  Imperfect, incorrect, or missing information  Numerical approximations  Values and parameters indeterminacy  Different interpretations of the same evidence by separate parties 2 Uncertainty: Quality or state that involves imperfect and/or unknown information “There is nothing certain, but the uncertain” (proverb)
  • 3. Uncertainty in Software Engineering  Ziv’s Uncertainty Principle: “Uncertainty is inherent and inevitable in software development processes and products” (1996)  All projects, no matter the domain, processes, or technology, operate in the presence of uncertainty – reducible (epistemic) and irreducible (aleatory)  Humphrey’s Requirements Uncertainty Principle: “For a new software system, the requirements will not be completely known until after the users have used it.”  The true role of design is thus to create a workable solution to an ill-defined problem.  Software engineering variables affected by Uncertainty:  Cost  Schedule  Performance  Capacity for work  Productivity  Quality of results 3
  • 4. Software development methodologies and uncertainty 4 “The Uncertainty Principle OR How to Choose the Right Methodology”. https://kosmothink.wordpress.com/2010/12/31/the-uncertainty-principal-or-how-to-choose-the-right-methodology
  • 5. Uncertainty in Complex Event Processing (CEP) systems 5
  • 6. Different kinds of uncertainty in Complex Event Processing (CEP) systems  Selection phase:  Uncertain events in the stream: Missing events (false negatives, FN); or wrongly inserted (false positives, FP).  Uncertainty in the values of the attributes (including their timestamps!) due to imprecision of the measuring methods or tools (measurement uncertainty, MU).  Matching phase:  Uncertainty of comparison operators (=, <, >, ->,...) between uncertain values.  Uncertainty of logical composition operators (or, and, not) between uncertain statements  Production phase:  Lack of precision in the values of the attributes of derived events, due to the propagation of uncertainty in their calculation.  Lack of confidence in the derived event, due to incomplete or erroneous assumptions about the environment in which the system operates, which may influence the rule’s confidence. 6 Nathalie Moreno, Manuel F. Bertoa, Loli Burgueño, Antonio Vallecillo: “Managing Measurement and Occurrence Uncertainty in Complex Event Processing Systems.” IEEE Access 7: 88026-88048 (2019)
  • 7. Many different formalisms and theories to quantify uncertainty 7  Bayesian Belief Networks (BBN)  Monte Carlo simulations  Decision theory/trees  Probabilities  Fuzzy Logic  …
  • 8. Current approaches to represent uncertainty in (quality) models 8
  • 11. A classification of uncertainty (according to its nature)  Aleatory Uncertainty – A kind of uncertainty that refers to the inherent uncertainty due to the probabilistic variability or randomness of a phenomenon  Examples: measuring the speed of a car, or the duration of a software development process  This type of uncertainty is irreducible, in that there will always be variability in the underlying variables.  Epistemic Uncertainty – A kind of uncertainty that refers to the lack of knowledge we may have about the system (modeled or real).  Examples: Ambiguous or imprecise requirements about the expected system functionality, its envisioned operating environment, etc.  This type of uncertainty is reducible, in that additional information or knowledge may reduce it. 11 A. Der Kiureghian and O. Ditlevsen: "Aleatory or epistemic? Does it matter?" Structural Safety 31(2):105-112, 2009
  • 12. Reducing the uncertainty 1. Certainty: There is no reducible uncertainty and information is complete 2. Fully reducible imprecision: There is no full certainty, but uncertainty can be reduced by collecting additional information until achieving full certainty (no irreducible uncertainty present) 3. Partially reducible imprecision: There is no full certainty, but uncertainty can be reduced by collecting additional information. However, there is still irreducible uncertainty 4. Irreducible imprecision: There is no full certainty, and it cannot be reduced (only margins can be used) 12 Epistemic Aleatory
  • 13. Uncertainty and Knowledge… 13 Borrowed from “Introduction to Uncertainty Modeling” presentation at the OMG by T. Yue, S. Ali, B. Selic and A. Watson, 2016
  • 14. Knowledge vs. Belief 14 Each “knowledge” statement here is based on real evidence! When dealing with uncertainty, perhaps it is best to avoid the notion of “knowledge” altogether! Borrowed from “Introduction to Uncertainty Modeling” presentation at the OMG by T. Yue, S. Ali, B. Selic and A. Watson, 2016
  • 15. Belief  Belief: An implicit or explicit opinion or conviction held by a belief agent about a topic, expressed by one or more belief statements  Belief agent: An entity (human, institution, even a machine) that holds one or more beliefs  Topic: a possible phenomenon or notion belonging to a given subject area.  Belief Statement: An explicit specification of some belief held by a belief agent.  It represents a belief, and therefore it is a subjective concept  It may not always be possible to determine whether or not a belief statement is valid.  A belief statement may not necessarily correspond to objective reality.  This means that it could be completely false, or only partially true, or completely true.  The validity of a statement may only be meaningfully defined within a given context or purpose.  Thus, the statement that “the Earth can be represented as a perfect sphere” may be perfectly valid for some purposes but invalid or only partly valid for others. 15OMG. “Precise Semantics for Uncertainty Modeling” Request For Proposals. OMG Document: ad/2017-12-01, 2017.
  • 16. The OMG PSUM initiative (Precise Semantics for Uncertainty Modeling) 16
  • 17. Related concepts  Risk – The effect of uncertainty on objectives [ISO/IEC 31000].  An uncertainty may have an associated risk, and a high risk indicates a difficulty or danger associated with this uncertainty that deserves special attention.  “Risk does not exist by itself. Risk is created when there is uncertainty.”  Evidence – Objective information that may be used to justify a belief  It can be an observation, a record of a real-world event occurrence or, alternatively, the conclusion of some formalized chain of logical inference that provides information that can contribute to determining the validity (truthfulness) of a belief statement 17 OMG. “Precise Semantics for Uncertainty Modeling” Request For Proposals. OMG Document: ad/2017-12-01, 2017.
  • 18. Types of uncertainty (according to their sources)  Measurement uncertainty: A kind of aleatory uncertainty that refers to a set of possible states or outcomes of a measurement, where probabilities are assigned to each possible state or outcome  Occurrence uncertainty: a kind of epistemic uncertainty that refers to the degree of belief that we have on the actual existence of an entity, i.e., the real entity that a model element represents  Belief uncertainty: A kind of epistemic uncertainty in which a belief agent is uncertain about any of the statements made about the system or its environment.  Design uncertainty: A kind of epistemic uncertainty that refers to a set of possible design decisions or options, where probabilities are assigned to each decision or option  Environment uncertainty: lack of certainty about the surroundings, boundaries and usages of a system and of its elements  Location uncertainty: lack of certainty about the geographical or physical location of a system, its elements or its environment  Time uncertainty: lack of certainty about the time properties expressed in a statement about the system or its environment 18 Based on M. Zhang, B. Selic, S. Ali, T. Yue, O. Okariz, and R. Norgren, "Understanding Uncertainty in Cyber-Physical Systems: A Conceptual Model" In Proc. of ECMFA 2016, LNCS vol. 9764, pp. 247-264. Springer, 2016.
  • 20. Measurement uncertainty  Engineers naturally think about uncertainty associated with measured values  Uncertainty is explicitly defined in their models and considered in model-based simulations  Precise notations permit representing and operating with uncertain values and confidences 20
  • 21. Measurement uncertainty  Measurement uncertainty: A kind of aleatory uncertainty that refers to a set of possible states or outcomes of a measurement  Normally expressed by a parameter, associated with the result of a measurement x, that characterizes the dispersion of the values that could reasonably be attributed to the measurand: the standard deviation u of the possible variation of the values of x  Representation: x ± u or (x, u)  Examples: 21 JCGM 100:2008. Evaluation of measurement data – Guide to the expression of uncertainty in measurement (GUM). http://www.bipm.org/utils/common/documents/jcgm/JCGM_100_2008_E.pdf • Normal distribution: (x, σ) with mean x and standard deviation σ • Interval [a, b]: a uniform distribution is assumed, giving (x, u) with x = (a+b)/2 and u = (b−a)/(2√3)
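The interval rule on this slide can be sketched in Python (the function name is illustrative, not from the talk):

```python
import math

def interval_to_uncertain(a, b):
    """Turn an interval [a, b] into an uncertain value (x, u), assuming
    a uniform distribution over the interval (JCGM 100:2008):
    x = (a + b) / 2, u = (b - a) / (2 * sqrt(3))."""
    x = (a + b) / 2
    u = (b - a) / (2 * math.sqrt(3))
    return x, u
```

For example, the interval [17.6, 18.0] yields x = 17.8 with standard uncertainty u ≈ 0.115.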
  • 22. However, the situation is not the same in software models  22
  • 23. Useful applications in software simulation 23
  • 24. Some problems with Measurement Uncertainty  Computations with uncertain values have to respect the propagation of uncertainty (uncertainty analysis)  In general this is a complex problem, which cannot be manually managed  Comparison of uncertain values is no longer a Boolean property!  How to compare 17.7 ± 0.2 with 17.8 ± 0.2?  Other primitive datatypes are also affected by uncertainty  Strings (OCR)  Enumerations  Collections 24
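If both measurements are modeled as independent normal variables, the comparison 17.7 ± 0.2 < 17.8 ± 0.2 yields a probability rather than a Boolean. A minimal sketch (function name illustrative; the cited work defines a full operator algebra):

```python
import math

def prob_lt(x, ux, y, uy):
    """P(X < Y) for independent X ~ N(x, ux) and Y ~ N(y, uy):
    Y - X is normal with mean y - x and std sqrt(ux^2 + uy^2),
    so P(X < Y) = Phi((y - x) / sqrt(ux^2 + uy^2))."""
    z = (y - x) / math.hypot(ux, uy)
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))
```

Here prob_lt(17.7, 0.2, 17.8, 0.2) is only about 0.64: given their uncertainties, the two values are statistically hard to tell apart.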
  • 25. Primitive datatypes extended with Uncertainty  Extended primitive datatypes  Real -> UReal UReal(17.8,0.2) ≡ 17.8 ± 0.2  Boolean -> UBoolean UBoolean(true, 0.8)  String -> UString UString(“Implementaci6n”,0.93)  Enum -> UEnum UColor{ (#red,.9), (#orange,0.09), (#purple,0.01) }  An algebra of operations on uncertain datatypes extending OCL/UML types  Operations are closed in this algebra and automatically propagate uncertainty 25 M. F. Bertoa, N. Moreno, L. Burgueño, A. Vallecillo. “Incorporating Measurement Uncertainty into OCL/UML Primitive Datatypes.” Software and Systems Modeling (Sosym), 2019. https://doi.org/10.1007/s10270-019-00741-0
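A minimal Python sketch of how such an extended datatype can keep operations closed while propagating uncertainty (illustrative only; the actual UReal algebra of Bertoa et al. covers many more operations). Operands are assumed independent, so standard uncertainties combine as a root sum of squares:

```python
import math

class UReal:
    """Uncertain real (x, u): a value plus its standard uncertainty."""
    def __init__(self, x, u=0.0):
        self.x, self.u = x, u

    def add(self, other):
        # u(a+b) = sqrt(u(a)^2 + u(b)^2) for independent operands
        return UReal(self.x + other.x, math.hypot(self.u, other.u))

    def mult(self, other):
        # first-order propagation: u(ab)^2 = (b*u(a))^2 + (a*u(b))^2
        u = math.hypot(other.x * self.u, self.x * other.u)
        return UReal(self.x * other.x, u)
```

For example, UReal(17.8, 0.2).add(UReal(1.5, 0.1)) gives 19.3 ± 0.224, with no manual uncertainty bookkeeping.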
  • 27. Occurrence uncertainty  Occurrence uncertainty: a kind of epistemic uncertainty that refers to the degree of belief (confidence) that we have on the actual existence of an entity, i.e., the real entity that a model element represents  Assigned to individual objects  Permit dealing with false positives (elements in the model that do not exist in the real system) and false negatives (elements in the real system not captured in the model)  Normally measured by (Bayesian) probabilities 27 L. Burgueño, M. F. Bertoa, N. Moreno, A. Vallecillo: “Expressing Confidence in model and in model transformation elements.” In Proc of MODELS 2018: 57-66, 2018.
  • 28. Uncertainty related to OCL invariants (system integrity constraints)  Degree of fulfilment of an OCL invariant  Occurrence uncertainty of the elements of the system (confidence) 28 [Image borrowed from Mihai Lica Pura “Ad Hoc Networks and Their Security: A Survey”, 2012] Constraints inv EnoughSensors: Sensor.allInstances()->size() >= 3000 inv Sorrounded: Enemy.allInstances()->select(e|e.distanceTo(self))->size() < 50 M. Gogolla, A. Vallecillo: “On Softening OCL invariants” Journal of Object Technology 18(2): 6:1-22 (2019).
  • 29. “Crisp” Invariants 29 context Battle inv FairBattle: self.enemies->size = self.allies->size context Battle inv EnoughAllies: self.allies->notEmpty context Battle inv EnoughEnemies: self.enemies->notEmpty M. Gogolla, A. Vallecillo: “On Softening OCL invariants” Journal of Object Technology 18(2): 6:1-22 (2019).
  • 30. Soft Invariants 30 context Battle inv FairBattle: self.enemies->size = self.allies->size FairBattleRSL : Integer -- Required satisfaction level (user defined) FairBattleCSL : Integer derive = -- Current satisfaction level let YesE=battle.enemies->select(e|e.confid>=EnemyConfidTh)->size in -- # real enemies let YesA=battle.allies->select(a|a.confid>=AllyConfidTh)->size in -- # real allies 1 - (YesA - YesE).abs()/(YesE + YesA) M. Gogolla, A. Vallecillo: “On Softening OCL invariants” Journal of Object Technology 18(2): 6:1-22 (2019).
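The FairBattleCSL expression on this slide can be paraphrased in Python (the thresholds are illustrative placeholders for EnemyConfidTh/AllyConfidTh, which are user-defined in the paper):

```python
def fair_battle_csl(enemy_confids, ally_confids, enemy_th=0.9, ally_th=0.9):
    """Current satisfaction level of the FairBattle soft invariant:
    count the objects whose occurrence confidence passes the threshold
    ("real" enemies/allies), then score 1 - |YesA - YesE| / (YesE + YesA)."""
    yes_e = sum(1 for c in enemy_confids if c >= enemy_th)
    yes_a = sum(1 for c in ally_confids if c >= ally_th)
    return 1 - abs(yes_a - yes_e) / (yes_e + yes_a)
```

A perfectly balanced battle scores 1.0; three confident enemies against one confident ally scores 0.5, so the invariant is satisfied to a degree rather than simply violated.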
  • 32. Belief uncertainty  Belief uncertainty: A kind of epistemic uncertainty in which the modeler, or any other belief agent, is uncertain about any of the statements made about the system or its environment.  By nature, it is always subjective  Belief agent: An entity (human, institution, even a machine) that holds one or more beliefs  Belief statement: Statement qualified by a degree of belief  Degree of belief: Confidence assigned to a statement by a belief agent. Normally expressed by quantitative or qualitative methods (e.g., a grade or a probability “credence”) 32 Loli Burgueño, Robert Clarisó, Jordi Cabot, Sébastien Gérard, Antonio Vallecillo. “Belief uncertainty in software models.” Proc. of MiSE 2019@ICSE, pp. 19-26. ACM, 2019. https://dl.acm.org/citation.cfm?id=3340709
  • 33. A simple example of a hotel room 33 Temp. sensor Smoke detector Alarm center CO detector Loli Burgueño, Robert Clarisó, Jordi Cabot, Sébastien Gérard, Antonio Vallecillo. “Belief uncertainty in software models.” Proc. of MiSE 2019@ICSE, pp. 19-26. ACM, 2019. https://dl.acm.org/citation.cfm?id=3340709
  • 34. A simple example of a hotel room 34
  • 35. System attributes, operations, and constraints 35 class AlarmCenter attributes highTemp : UBoolean derive: self.room.tempSensor.temperature > 30.0 highCOLevel : UBoolean derive: self.room.coSensor.coPPM > 20 smoke : UBoolean derive: self.room.smokeDetector.smoke fireAlert : UBoolean derive: self.highTemp and self.highCOLevel and self.smoke operations isHot() : UBoolean = self.tempSensor.temperature > 25 isCold() : UBoolean = self.tempSensor.temperature < 18 constraints inv TempPrecision: self.temperature.uncertainty() <= 0.2
  • 36. Some Belief Statements about the (model of the) system  The CO and smoke detectors that we bought have a reliability of 90% (i.e., 10% of their readings are not meaningful)  We cannot be completely sure that the precision of the Temperature sensor is ±0.5°, as indicated in its datasheet  We are only 95% confident that the presence of high temperature, high CO level and smoke really means that there is a fire in the room  Bob is from the South, so he only assigns a credibility of 50% to the operations that indicate if the room is hot or cold. In contrast, Mary thinks they are mostly accurate  Room #3 is close to the kitchen and frequently emits alarms. Everybody thinks that most of them are false positives  Joe the modeler doubts that the type of attribute “number” of class “Room” is Integer. He thinks it may contain characters different from digits.  Lucy the modeler is unsure if an “AlarmCenter” has to be attached to only one single Room. She thinks they can also be attached to several. 36 [About the credibility of the values] [From individual belief agents] [About individual instances] [About the model itself: relations] [About the behavioral rules] [About the uncertainty of the values] [About the model itself: types] >> How to represent these uncertainties in the system specifications? >> How to incorporate them into the system structural and behavioral models?
  • 38. Operationalization  A list of pairs (BeliefAgent,credence) for every model statement subject to Belief Uncertainty  Operations to add and remove pairs from the list of pairs  Query operation to know the credence of a statement 38 isHot_Beliefs : Set(Tuple(beliefAgent : BeliefAgent, degreeOfBelief : Real)) isHot_BeliefsAdd(ba : BeliefAgent, d : Real) post: self.isHot_Beliefs = self.isHot_Beliefs@pre->reject(t|t.beliefAgent=ba)-> including(Tuple{beliefAgent:ba,degreeOfBelief:d}) isHot_credence(a:BeliefAgent): Real = let baBoD : … = self.isHot_Beliefs->select(t|t.beliefAgent = a) in let baBoDnull : … = self.isHot_Beliefs->select(t|t.beliefAgent = null) in if baBoD->isEmpty then -- no explicit credence by “a” if baBoDnull->notEmpty then -- but if default value exists baBoDnull->collect(degreeOfBelief)->any(true) else 1.0 endif else baBoD->collect(degreeOfBelief)->any(true) endif
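The same operationalization can be sketched outside OCL; here each statement keeps its (belief agent, credence) pairs in a dictionary, with None playing the role of the default (null) belief agent (names are illustrative):

```python
class BeliefStore:
    """Per-statement (belief agent -> credence) pairs, mirroring the
    isHot_Beliefs / isHot_credence scheme above: an agent with no
    explicit credence falls back to the default agent (None) if one
    exists, else to 1.0."""
    def __init__(self):
        self._beliefs = {}  # statement -> {agent: credence}

    def add(self, statement, agent, credence):
        # adding again for the same agent replaces the previous credence
        self._beliefs.setdefault(statement, {})[agent] = credence

    def credence(self, statement, agent):
        pairs = self._beliefs.get(statement, {})
        if agent in pairs:
            return pairs[agent]
        return pairs.get(None, 1.0)
```

With Bob at 0.5, Mary at 0.99 and a default of 0.95, queries behave like the USE session shown on slide 39.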
• 39. Running the system…
39
Hotel> !new BeliefAgent('Bob')
Hotel> !new BeliefAgent('Mary')
Hotel> !r1.isHot_BeliefsAdd(Bob,0.5)
Hotel> !r1.isHot_BeliefsAdd(Mary,0.99)
Hotel> !r1.isHot_BeliefsAdd(null,0.95)
Hotel>
Hotel> ?r1.isHot() -> UBoolean(true,1.0) : UBoolean
Hotel> ?r1.isHot_credence(Bob) -> 0.5 : Real
Hotel> ?r1.isHot_credence(Mary) -> 0.99 : Real
Hotel> ?r1.isHot_credence(null) -> 0.95 : Real
• 40. Credence propagation on dependent belief statements
40

fireAlert_credence(ba : BeliefAgent) : Real =
  let baBoD : Set(Tuple(beliefAgent:BeliefAgent, degreeOfBelief:Real)) =
      self.fireAlert_Beliefs->select(t | t.beliefAgent = ba) in
  (if baBoD->isEmpty then … else baBoD->collect(degreeOfBelief)->any(true) endif)
  * self.fireAlertDeriveExpr_credence(ba)

fireAlertDeriveExpr_credence(ba : BeliefAgent) : Real =
  let baBoD : Set(Tuple(beliefAgent:BeliefAgent, degreeOfBelief:Real)) =
      self.fireAlertDeriveExpr_Beliefs->select(t | t.beliefAgent = ba) in
  (if baBoD->isEmpty then … else baBoD->collect(degreeOfBelief)->any(true) endif)
  * highTemp_credence(ba) * highCOLevel_credence(ba) * smoke_credence(ba)
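The multiplication scheme above can be sketched in a few lines of Python. This is a simplification that assumes the agent's credences in the operand statements simply multiply (as in the OCL above); the figures and names are invented for illustration.

```python
def propagated_credence(local, operand_credences):
    """Credence an agent assigns to a derived statement:
    local: the agent's credence in the derivation rule itself;
    operand_credences: the credences the agent assigns to each operand."""
    result = local
    for c in operand_credences:
        result *= c
    return result


# Hypothetical credences of one agent in the three operand statements:
high_temp, high_co, smoke = 0.95, 0.9, 0.9
derive_rule = 1.0   # full confidence in the derive expression itself

# Credence in fireAlert = rule credence * product of operand credences
fire_alert = propagated_credence(derive_rule, [high_temp, high_co, smoke])
```

With these figures the agent's credence in the fire alert drops to roughly 0.77, even though each individual belief is held quite strongly.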
• 42. Design uncertainty
 Design uncertainty: a kind of epistemic uncertainty that refers to a set of possible design decisions about the system
 It refers to the uncertainty that the developer has about what the system should be like, rather than about what conditions it may face during its operation (environment uncertainty).
42
M. Famelis, M. Chechik: “Managing design-time uncertainty.” Software and Systems Modeling 18(2): 1249-1284 (2019)
• 43. The Design-Time Uncertainty Management (DeTUM) model
43
M. Famelis, M. Chechik: “Managing design-time uncertainty.” Software and Systems Modeling 18(2): 1249-1284 (2019)
• 44. This is similar to the “Cone of Uncertainty” (CoU)
 It represents the best-case uncertainty needed to inform the decision makers of the Probability of Project Success at specific phases of the project
44
• 45. Further types of uncertainty: Environment
 Environment uncertainty: lack of certainty about the surroundings, boundaries and usages of a system and of its elements
 Tackled by approaches such as self-adaptation, probabilistic behavior, or identifying and explicating operational assumptions.
 “Uncertainty-aware” software
45
• 47. Further types of uncertainty: Location
 Location uncertainty: lack of certainty about the geographical or physical location of a system, its elements or its environment
 The submarine can now be somewhere in the Mediterranean Sea
 Cyber-attacks can come from anywhere
47
• 48. Further types of uncertainty: Time
 Time uncertainty: lack of certainty about the time properties expressed in a statement about the system or its environment
 Mañana (i.e., “not today”)
 “We will call you soon”
 “A man with a watch knows what time it is. A man with two watches is never sure.” (Segal's law)
48
• 51. Quality evaluation – Prediction Models
1. Identify your target entities and your target stakeholders
 Examples of entities: COTS components, Data stored in DBs, Internet Delivery Service.
 Examples of stakeholders: Developers, Advanced Users, Novice Users, …
2. Choose a Quality Model for evaluating your entities
 E.g., ISO/IEC 25010 “Product” QM
3. Customize the Quality Model
 Select the Characteristics and Subcharacteristics relevant to these entities and stakeholders
4. Select the Measurable Attributes of the entities relevant to the Quality Model
5. Select the appropriate measures for those measurable attributes
6. Run experiments with samples of entities and groups of stakeholders to:
 Empirically evaluate the “perceived” (subjective) quality subcharacteristic of these entities
 Empirically evaluate the “objective” quality subcharacteristic of these entities
7. Run regression analyses to identify the set of measures that better explain each quality subcharacteristic, and define appropriate quality indicators
51
• 52. Quality evaluation – Prediction Models
1. Identify your target entities and your target stakeholders
 Examples of entities: COTS components, Data stored in DBs, Internet Delivery Service.
 Examples of stakeholders: Developers, Advanced Users, Novice Users, Any kinds of users.
Evaluate the quality of software components that are candidates to be integrated in a software system.
Our target stakeholders are system developers and maintainers, who need to select the best candidate components to form part of their systems.
52
Manuel F. Bertoa, José M. Troya, Antonio Vallecillo: “Measuring the usability of software components”, Journal of Systems and Software, 79(3):427-439, March 2006
  • 53. Quality evaluation – Prediction Models 2. Choose a Quality Model for evaluating your entities ISO/IEC 9126 53
  • 54. Quality evaluation – Prediction Models 3. Customize the Quality Model  Select the relevant characteristics and subcharacteristics 54
  • 55. Quality evaluation – Prediction Models 4. Select the relevant Measurable Attributes of the entities w.r.t. the Quality model 55
  • 56. Quality evaluation – Prediction Models 5. Select the appropriate measures for those measurable attributes  Measures related to “Quality of Documentation” 56
  • 57. Quality evaluation – Prediction Models 5. Select the appropriate measures for those measurable attributes  Measures related to “Design Complexity” 57
  • 58. Quality evaluation – Prediction Models 6. Run experiments with samples of entities and groups of stakeholders to empirically evaluate the “perceived” (subjective) and “objective” quality 58
  • 59. Quality evaluation – Prediction Models 6. Run experiments with samples of entities and groups of stakeholders to empirically evaluate the “perceived” (subjective) and “objective” quality 59
  • 60. Quality evaluation – Prediction Models 7. Run regression analyses to identify the set of measures that better explain each quality subcharacteristic, and define appropriate quality indicators 60
  • 61. Quality evaluation – Prediction Models 7. Run regression analyses to identify the set of measures that better explain each quality subcharacteristic, and define appropriate quality indicators 61
• 62. Quality evaluation – Prediction Models
7. Run regression analyses to identify the set of measures that better explain each quality subcharacteristic, and define appropriate quality indicators
62
Maint = α·Und + β·Learn + γ·Oper
Maintainability = high if Maint > 0.8; low if Maint < 0.4; medium otherwise
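Step 7 can be sketched as follows: fit the coefficients α, β, γ of the linear prediction model by least squares on experiment data, then turn the numeric score into a level with the thresholds above. The data here are synthetic and the variable names are invented; this is only an illustration of the technique, not the actual regression from the cited study.

```python
import numpy as np

# Synthetic experiment data: one row per evaluated entity, with its
# (Understandability, Learnability, Operability) scores in [0, 1].
X = np.array([[0.9, 0.8, 0.7],
              [0.3, 0.4, 0.2],
              [0.6, 0.5, 0.6],
              [0.8, 0.9, 0.9]])
y = np.array([0.85, 0.30, 0.55, 0.90])   # observed maintainability scores

# Least-squares fit of Maint = alpha*Und + beta*Learn + gamma*Oper
coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
alpha, beta, gamma = coeffs

def maintainability_level(maint):
    # The quality indicator from the slide: thresholds map score to level.
    if maint > 0.8:
        return "high"
    if maint < 0.4:
        return "low"
    return "medium"
```

A new entity is then scored as `maintainability_level(coeffs @ [und, learn, oper])`.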
• 63. Maintainability of models
63
F. Basciani, J. Di Rocco, D. Di Ruscio, L. Iovino, A. Pierantonio: “A tool-supported approach for assessing the quality of modeling artifacts.” Journal of Computer Languages 51:173-192, April 2019.
M. Genero, M. Piattini: “Empirical validation of measures for class diagram structural complexity through controlled experiments.” Proc. of QAOOSE WS at ECOOP 2001.
• 64. Maintainability of model transformations
64
F. Basciani, J. Di Rocco, D. Di Ruscio, L. Iovino, A. Pierantonio: “A tool-supported approach for assessing the quality of modeling artifacts.” Journal of Computer Languages 51:173-192, April 2019.
• 65. Introducing Uncertainty
The Englishman Who Went Up a Hill But Came Down a Mountain (1995)
65
• 66. Quality evaluation – Sources of uncertainty
1. Identify your target entities and your target stakeholders
 Examples of entities: COTS components, Data stored in DBs, Internet Delivery Service.
 Examples of stakeholders: Developers, Advanced Users, Novice Users, …
2. Choose a Quality Model for evaluating your entities
 E.g., ISO/IEC 25010 “Product” QM
3. Customize the Quality Model
 Select the Characteristics and Subcharacteristics relevant to these entities and stakeholders
4. Select the Measurable Attributes of the entities relevant to the Quality Model
5. Select the appropriate measures for those measurable attributes
6. Run experiments with samples of entities and groups of stakeholders to:
 Empirically evaluate the “perceived” (subjective) quality subcharacteristic of these entities
 Empirically evaluate the “objective” quality subcharacteristic of these entities
7. Run regression analyses to identify the set of measures that better explain each quality subcharacteristic, and define appropriate quality indicators
66
• 67. Sources of uncertainty
 Selection of the subset of the quality model
   Selection of incorrect, inappropriate or missing quality subcharacteristics
 Selection of quality measures
   Selection of incorrect or inappropriate quality measures
   Imprecise measurements of quality measures
 Empirical experiments
   Confidence in the entity samples
   Confidence in the selected groups of stakeholders
 Evaluation of perceived and objective quality
   Incorrect or imprecise experiment results
 Statistical analyses and regression tests
   Confidence of estimation models
 Definition of quality indicators
   Confidence in thresholds
   Propagation of measurement uncertainty in decision models
67
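The last source listed above, propagating measurement uncertainty into the decision model, can be sketched with a small Monte Carlo simulation. Assuming (purely for illustration) that the Maint score is measured as 0.78 with a Gaussian error of ±0.05, the crisp rule "high if Maint > 0.8" no longer yields a single level but a probability for each one:

```python
import random

random.seed(42)  # reproducible illustration

def level(maint):
    # The crisp threshold indicator from the earlier slide.
    if maint > 0.8:
        return "high"
    if maint < 0.4:
        return "low"
    return "medium"

def level_distribution(mean, std, samples=100_000):
    """Propagate a Gaussian measurement error through the threshold rule
    by sampling, returning the fraction of samples landing in each level."""
    counts = {"low": 0, "medium": 0, "high": 0}
    for _ in range(samples):
        counts[level(random.gauss(mean, std))] += 1
    return {lv: n / samples for lv, n in counts.items()}

dist = level_distribution(0.78, 0.05)
```

With these invented figures, roughly a third of the probability mass falls on "high" even though the point estimate says "medium", which is exactly the kind of information a decision model should see.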
• 68. Maintainability of models
68
M. Genero, M. Piattini, E. Manso, G. Cantone: “Building UML Class Diagram Maintainability Prediction Models based on Early Metrics.” Proc. of IEEE METRICS 2003.
  • 69. Estimating quality with uncertainty 69 Maintainability = low
• 70. Estimating quality with uncertainty
70
Maintainability = { (low, 0.8), (medium, 0.18), (high, 0.02) }, with a credence of 0.95
• 71. Estimating quality with uncertainty
71
Use > ?system1.maintainability() -> ULevel((#low, 0.8), (#medium, 0.18), (#high, 0.02)) : ULevel
Use > ?system1.maintainability_credence(agent1) -> 0.5 : Real
Use > ?r1.maintainability_credence(agent2) -> 0.99 : Real
Use > ?r1.maintainability_credence(null) -> 0.95 : Real
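The ULevel value and the per-agent credence queries above can be sketched in Python. The class and variable names here are invented for illustration; a ULevel is simply a discrete probability distribution over quality levels, and the credence lookup follows the same default-agent scheme as the earlier belief operationalization.

```python
class ULevel:
    """A discrete probability distribution over ordered quality levels."""

    def __init__(self, dist):
        # dist: mapping level -> probability; probabilities must sum to 1.
        assert abs(sum(dist.values()) - 1.0) < 1e-9
        self.dist = dist

    def most_likely(self):
        return max(self.dist, key=self.dist.get)


# The estimate returned by the USE session above:
maintainability = ULevel({"low": 0.8, "medium": 0.18, "high": 0.02})

# Per-agent credences in that estimate; None is the default agent:
credences = {"agent1": 0.5, "agent2": 0.99, None: 0.95}

def maintainability_credence(agent):
    # Explicit credence if the agent has one, else the default, else 1.0.
    return credences.get(agent, credences.get(None, 1.0))
```

Note that the result now carries two separate pieces of uncertainty: the distribution over levels (aleatory, from the measurements) and the credence each agent places in the estimate itself (epistemic belief).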
• 72. Summary (on Evaluating Quality in the presence of Uncertainty)
 Identify the kinds of uncertainty (and their nature) that affect:
   Your entities and their attributes
   The quality characteristics you need to evaluate
   Your target stakeholders’ particular needs and backgrounds
   Your quality (base and derived) measures
   Your quality indicators
 Model uncertainty
   Include uncertainty in your quality models and measures as first-class elements (measurement uncertainty, degrees of belief, credence, etc.)
 Evaluate uncertainty
   Use tools for quantifying and propagating uncertainty
 Document uncertainty
   Produce estimates of the magnitude and impact of these uncertainties
 Manage your quality considering uncertainty
   Make sure decision processes take into account the estimated uncertainties
72
• 73. Uncertainty as a first-class concept in quality modeling and evaluation
 From “correctness” to “utility”
   Useful, beneficial and profitable to users, instead of objectively correct
   Utility permits accommodating trade-offs between different dimensions
 From “precise” to “approximate”
   Need to evaluate possible deviations and estimate margins
   “How accurate are my models and estimations, and how confident am I in them?”
 From “open-loop” to “closed-loop”
   Need to (self-)adapt as new information becomes available, or conditions change
   “How do I change when the level of uncertainty changes?”
73
David Garlan: “Software Engineering in an Uncertain World.” In Proc. of FoSER 2010: 125-128.
• 74. Takeaways (on Uncertainty)
 “Uncertainty” is not a single concept; it encompasses many different types of uncertainty (measurement, belief, environment, …)
   Each type of uncertainty requires its own notations, underlying logics and propagation mechanisms
 Uncertainty can be aleatory or epistemic (irreducible or reducible)
 Uncertainty does not depend so much on knowledge as on belief
   It is mainly subjective, and different people may hold different degrees of belief about the same statement
 Learn to manage in the presence of uncertainty; it cannot be eliminated.
   You can try to reduce it (if epistemic) with testing, verification, validation, redundancy and other knowledge acquisition processes.
   Aleatory uncertainty and its risks cannot be reduced. It needs to be calculated, and its values and risks bounded. Margins and bounds can be used to handle it.
74
• 75. Open problems for QUATIC
From the QUATIC 2019 Call for Papers:
 Quality Aspects in Requirements Engineering
 Quality Aspects in Model-Driven Engineering
 Quality Aspects in DevOps Development
 Quality Aspects in Process Improvement and Assessment
 Quality Aspects in Verification and Validation
 Quality Aspects in Evidence-Based Software Engineering
 Quality Aspects in Security & Privacy
 Quality Aspects in Cloud-based Platforms and Services
 Quality Aspects in Business Processes
 Quality Aspects in Data Science & Artificial Intelligence
 Quality Aspects in Software Maintenance and Comprehension
75
• 77. The technical debt associated with uncertainty management…
77
  • 78. Modeling and Evaluating Quality in the Presence of Uncertainty QUATIC 2019 Ciudad Real, September 13, 2019 Antonio Vallecillo Universidad de Málaga, Spain