Case Study Research In
Software Engineering
Alessio Ferrari, CNR-ISTI, Pisa, Italy

alessio.ferrari@isti.cnr.it
May, 2020
Runeson et al., 2012 http://www.egov.ee/media/1267/case-study-research-in-software-engineering.pdf
Easterbrook and Aranda, 2006 http://www.cs.toronto.edu/~sme/case-studies/case_study_tutorial_slides.pdf
Shull et al., 2008 https://bit.ly/3bT3gXD
Case Study
• “Case study in software engineering is an empirical enquiry that draws on
multiple sources of evidence to investigate one instance (or a small
number of instances) of a contemporary software engineering phenomenon
within its real-life context, especially when the boundary between
phenomenon and context cannot be clearly specified” cf. Runeson et al.,
2012

• A case study is a research-industry collaboration carried out and reported
with an empirical software engineering mindset
• A case study is any research that: 

• uses REAL industrial data sources (developers, documents, software)

• analyses the data considering their specific context (the characteristics of
the company and the people)

• is designed and reported with an effort towards objectivity and
generalisability of the findings
Experiment vs Case Study
Controlled Experiment: simplifies reality; minimises context; requires knowledge of empirical tools
Case Study: considers context; maximises realism; requires knowledge of reality
[Fig. 1 from "The ABC of Software Engineering Research": the ABC framework, eight research strategies as categories of research methods for software engineering, illustrated with metaphors: Jungle, Natural Reserve, Flight Simulator, In Vitro Experiment, Courtroom, Referendum, Mathematical Model, Forecasting System]
When to Use Case Studies?
1. When you can't control the variables

2. When there are many more variables than data points

3. When you cannot separate phenomena from context

• Phenomena that don't occur in a lab setting (e.g. large scale, complex
software projects)

• Effects can be wide-ranging

• Effects can take a long time to appear (weeks, months, years!)

4. When the context is important (e.g. When you need to know how context
affects the phenomena)

5. When you need to know whether your theory applies to a specific real
world setting
Easterbrook and Aranda, 2006 http://www.cs.toronto.edu/~sme/case-studies/case_study_tutorial_slides.pdf
in Theory
When to Use Case Studies?
• When a company wants to try a software engineering support
tool that you developed

• When a company wants to try an existing tool in their process

• When a company wants you to develop some tool or process
to support their software engineering activities

• If you are in a (funded) collaboration project with industry and
you want to publish the results in a software engineering
venue and not just do the work for the company

• If you have well-established contacts with companies and you
want to perform a qualitative study to analyse a general
software engineering problem in practice
in Practice
Typical Examples
• You are in the company 4 days per week; you
experiment with a commercial tool for code
generation using requirements from a previous
project of the company

• You stay at the university, but use requirements
and code from the company to develop a
customised tool for automated trace link detection

• You are at the company and perform a qualitative
study based on observations and/or interviews
(see previous lectures)
What is NOT a Case Study
• Not an exemplar

• Not a report of something interesting that was tried on a toy
problem

• Not an experience report

• Retrospective report on an experience (typically, industrial) with
lessons learned

• Not a quasi-experiment with small n

• Weaker form of experiment with a small sample size; uses a
different logic for designing the study and for generalizing from
results
Easterbrook and Aranda, 2006 http://www.cs.toronto.edu/~sme/case-studies/case_study_tutorial_slides.pdf
Fundamental Characteristics
of Case Studies
• Has research questions set out from the beginning of
the study

• Data is collected in a planned and consistent manner

• Inferences are made from the data to answer the research
questions

• Produces an explanation, description, prediction, or
improvement of a phenomenon

• Threats to validity are addressed in a systematic way
Easterbrook and Aranda, 2006 http://www.cs.toronto.edu/~sme/case-studies/case_study_tutorial_slides.pdf
Types of Case Study
[Diagram: types of case study positioned relative to THEORY along two axes, STAGE and GOAL: Exploratory and Confirmatory (stage); Descriptive, Explanatory, Predictive and Improving (goal), answering WHAT, WHY, HOW and HOW TO questions]
Case Study Research Process
PREPARATION: Theory → Research Questions and Constructs → Research Design (Units, Collection, Analysis, Validity Procedures) → Identify Context (Subjects and Objects)
EXECUTION: Collect Data → Analyse Data
REPORTING: Report Answers → Discuss Reliability → back to Theory
(Validity concerns: Internal, External and Construct Validity are addressed during design and execution)
There is theory before and after!
Case Study Research Process (continued)
Research design normally combines quantitative and qualitative data
ITERATIONS can be performed with different goals and levels (iterative case studies)
ITERATIONS are not always needed
Example: Combining Quantitative and Qualitative Data
• A tool to detect requirements defects is tried by the company on their
requirements (previously analysed for defects)
[Diagram: the Requirements are fed to the Tool, which produces requirements with automatic defect analysis; these are compared against requirements with manual defect analysis (the expected results, aka Ground Truth). The evaluation covers Accuracy and Usability (QUANTITATIVE) and Sources of Inaccuracy and Improvements (QUALITATIVE)]
Example: Combining Quantitative and
Qualitative Data
• A tool to detect defective requirements is tried by the company
on their requirements documents (previously analysed for defects)

• Quantitative: accuracy is evaluated quantitatively with precision and
recall measures

• Qualitative: false positives and false negatives are analysed
qualitatively to understand typical patterns of inaccuracy

• Quantitative: a set of users is selected by the company to try and
evaluate the tool with a usability questionnaire

• Qualitative: a set of users is selected by the company to try and
evaluate the tool and give feedback for improvement
This will be our running example for today
Theory
Research Questions
and Context
Theory (and Motivation)
• Existing theories about requirements defects and other aspects:

• Standards (ISO/IEC/IEEE 29148:2011)

• Handbook of Requirements Ambiguity (from Dan Berry)

• Studies on Usability (e.g., the System Usability Scale - SUS)

• Other tools for defect identification in requirements

• Gaps in current theory (MOTIVATION for your case study):

• Previous tools never tried in the context of a safety-critical
company

• Other tools are not made publicly available

• No usability analysis of these tools in practice
Research Questions (RQs) and Constructs
• Explicit set of questions that you aim to answer with your study

• Goal-question-metric (GQM) approach:

• Goal: improve the detection of requirements defects

• RQ1: how accurate is the tool in detecting defects?

• RQ2: what are the typical sources of inaccuracy?
• RQ3: how usable do the analysts consider the tool?

• RQ4: what improvements are required?
• Metrics (Quantitative) and Evaluation Strategies (Qualitative):
• accurate (quantitative): precision and recall measures

• sources of inaccuracies (qualitative): analysis of the false positive and false negative cases

• usable (quantitative): usability score (SUS) from a questionnaire

• improvements (qualitative): feedback from analysts
See first lecture on how to formulate RQs
Consider each construct
in the RQs and plan its evaluation
cf. Basili et al., https://www.cs.umd.edu/users/mvz/handouts/gqm.pdf
GQM
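The quantitative "accurate" construct above can be made concrete with a small sketch. This example is not from the lecture; the requirement IDs and sets are hypothetical:

```python
# Hypothetical example: precision and recall of a defect-detection tool,
# computed over sets of requirement IDs flagged as defective.

def precision_recall(predicted, ground_truth):
    """Precision = TP / |predicted|; Recall = TP / |ground truth|."""
    true_positives = len(predicted & ground_truth)
    precision = true_positives / len(predicted) if predicted else 0.0
    recall = true_positives / len(ground_truth) if ground_truth else 0.0
    return precision, recall

# Requirements flagged by the tool vs. those marked defective by the analysts
predicted = {"R1", "R4", "R7", "R9"}
ground_truth = {"R1", "R4", "R5"}

precision, recall = precision_recall(predicted, ground_truth)
print(precision, recall)  # 0.5 (2 of 4 flagged are real defects), ~0.67 (2 of 3 defects found)
```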
Context
• It is extremely important to characterise the context of your study, i.e., to describe
it in an accurate way 

• Domain (e.g., railway software company)
• Size (e.g., large, localised company)
• Activity (e.g., requirements reviews)

• Maturity (e.g., years in the field, CMMI — https://en.wikipedia.org/wiki/
Capability_Maturity_Model_Integration)
• Type of Process (agile, V-process, etc.)

• Actors (e.g., requirements analysts)

• Other Peculiarities (anything that specifically characterises your company, e.g.,
culture, mix of ethnicity, backgrounds, gender, type of environment)
TIP: If this was an experiment,
what are the variables that you cannot control?
This is the CONTEXT that you should report
Context: Study Subjects and Objects
• Study subjects and objects belong to the context and need to be
characterised as well

• Subjects:
• 5 developers with 1 to 3 years experience

• 4 analysts with 5 to 10 years experience

• Objects:
• Low level requirements document with 300 Requirements, and 30
marked as defective

• High level requirements document with 100 Requirements, 30 defective
You get what is available! You cannot be picky!
Select what is most representative or relevant for the company
Research Design: Data
Collection, Analysis and
Validity
Research Design: Units of Analysis
can be People, Companies, Activities, Artefacts…
Context: the Company
Case (Unit of Analysis)
Automatic Defect Detection Task
Study Subjects:
analysts who will
evaluate the tool
Study Objects:
documents that
will be used to
evaluate the tool
Define THE CASE that you study in the context
Single-unit, Holistic case study
Case Study Designs
[Diagram: four design types, each shown as a case (or cases) within a context]
Single (e.g., one company) vs. Multiple (e.g., multiple companies)
Holistic (one unit of analysis, e.g., one team) vs. Embedded (multiple embedded units, e.g., different teams)
Remarks on Design: Variant
• The design that you choose depends on how you want to frame your case study
and on what your focus is
Case: Defect detection on High Level
Requirements
Study Objects:
High-level documents
Context: the Company
Case: Defect detection on Low Level
Requirements
Study Subjects:
analysts who will
evaluate the tool
Study Objects:
Low-level documents
If I want to differentiate evaluation by requirements types
Remarks on Design: Variant
• The design that you choose depends on how you want to frame your case study
and on what your focus is
Case: Defect detection by Developers
Study Subjects:
Developers
Context: the Company
Case: Defect detection by Analysts
Study Objects:
Requirements
Study Subjects:
Analysts
If I want to differentiate evaluation by viewpoints
Data Collection Procedures
See Initial Lecture on Data Collection
Documentation Archival
Records
Interviews Observation Systems
Combine Multiple Sources: you will not have much data, and different viewpoints are needed
Create a Case Study Database: collect and organise the data in a structured manner
Maintain a Chain of Evidence: link data to constructs, and to other data
Data Sources
Principles of Data Collection
Example: Data Collection Procedures
[Diagram: data collection methods mapped to constructs]
• Usability (QUANTITATIVE): usability evaluation questionnaire
• Accuracy (QUANTITATIVE): precision & recall measures over the actual output vs. the expected output
• Sources of Inaccuracy (QUALITATIVE): manual comparison of actual and expected output
• Improvements (QUALITATIVE): feedback (focus group)
Data Analysis Procedures
• Quantitative:
• you normally do not have enough data points to perform
hypothesis testing

• you can provide descriptive statistics (like for surveys)

• you can use measures and approaches from other fields (e.g.,
machine learning and information retrieval if you developed an
automated software engineering tool)

• Qualitative:
• apply coding, thematic analysis and grounded theory
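The descriptive statistics mentioned above can be sketched with the standard library; the scores below are invented for illustration:

```python
# Illustrative only: descriptive statistics over a handful of data points,
# e.g. per-analyst usability scores, where hypothesis testing is not viable.
import statistics

sus_scores = [72.5, 80.0, 65.0, 77.5, 70.0]  # hypothetical scores from 5 analysts

print(statistics.mean(sus_scores))             # 73.0
print(statistics.median(sus_scores))           # 72.5
print(round(statistics.stdev(sus_scores), 1))  # 6.0 (sample standard deviation)
```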
Example: Data Analysis Procedures
Usability
Improvements
QUANTITATIVE
QUALITATIVE
usability
evaluation
questionnaire
I think the GUI should distinguish
between types of defect
I think they should use color coding
for different defects
Different defects shall be associated also
to degree of severity
Coding and
Thematic
Analysis
“SUS score above a 68
would be considered
above average
and anything below 68
is below average”*
*cf. https://www.usability.gov/how-to-and-tools/methods/system-usability-scale.html
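The SUS score quoted above is computed from ten standard items; a minimal sketch follows, with the respondent's answers invented for illustration:

```python
# System Usability Scale (SUS): 10 items answered on a 1-5 scale.
# Odd items are positively worded, even items negatively worded.

def sus_score(answers):
    """Odd items contribute (answer - 1), even items (5 - answer); sum * 2.5 -> 0..100."""
    assert len(answers) == 10
    total = sum((a - 1) if i % 2 == 1 else (5 - a)
                for i, a in enumerate(answers, start=1))
    return total * 2.5

# One hypothetical respondent; a score above 68 is considered above average
print(sus_score([4, 2, 4, 1, 5, 2, 4, 2, 4, 3]))  # 77.5
```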
Example: Data Analysis Procedures
• Accuracy (QUANTITATIVE): compute precision and recall for the tool and for a
baseline, and compare them. The baseline can be another tool, or fictional
(e.g., a random predictor).

• Sources of Inaccuracy (QUALITATIVE): manual comparison of the outputs,
followed by coding and thematic analysis.
Validity Procedures: Triangulation
• Data (Source) Triangulation – using more than one data
source or collecting the same data at different occasions.

• Observer Triangulation – using more than one observer in
the study.

• Methodological Triangulation – combining different types
of data collection methods, for example, qualitative and
quantitative methods.

• Theory Triangulation – using alternative theories or
viewpoints.
You do not have a representative sample, or randomised experiment
Triangulation is used to increase OBJECTIVITY
Triangulation Types: Examples
• Data (Source) Triangulation: multiple types of documents with
different degrees of defects, multiple analysts perform defect
analysis on the same document (see next slides: The Ground
Truth)

• Observer Triangulation: more than one interviewer asks users for
their feedback

• Methodological Triangulation: analyse quantitatively and
qualitatively both accuracy and usability (what we actually did…)

• Theory Triangulation – information retrieval theory and usability
theory to evaluate “fitness” of the tool for the company
Example: the Ground Truth
We initially assumed that the ground truth was available;
what if it is NOT? (i.e., there is no defect classification)
[Diagram: two annotators perform an Independent Manual Classification; agreement is computed (e.g., Cohen's k = 0.8); disagreements are discussed; the result is the Ground Truth]
NOTE: You may need to better define your
task if agreement is low!
(i.e., give better instructions)
Example: the Ground Truth
• In many cases, you need to automate some software engineering activity (e.g., defect
detection, tracing)

• In most of the cases, a form of classification is required (e.g., defective vs not-
defective, relevant vs not relevant)

• Comparison between humans and tools is often needed to evaluate accuracy

• A Ground Truth is built: 

• Two or more subjects independently classify the data (triangulation of sources)

• Agreement is evaluated with Fleiss Kappa or Cohen’s Kappa

• If agreement is high, it means that the task is well defined, otherwise you have
to define it better

• Subjects discuss the disagreed cases and define the Ground Truth

• Your tool's output will be compared with the Ground Truth

More complex evaluation approaches may be needed!
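The agreement step above (Cohen's Kappa) can be sketched as follows; the two annotation vectors are invented for illustration:

```python
# Cohen's Kappa for two annotators who independently classified each
# requirement as defective (1) or not defective (0).

def cohens_kappa(ann1, ann2):
    n = len(ann1)
    labels = set(ann1) | set(ann2)
    # Observed agreement: fraction of items labelled identically
    p_o = sum(a == b for a, b in zip(ann1, ann2)) / n
    # Chance agreement, from each annotator's label distribution
    p_e = sum((ann1.count(label) / n) * (ann2.count(label) / n) for label in labels)
    return (p_o - p_e) / (1 - p_e)

ann1 = [1, 1, 0, 0, 1, 0, 1, 0]
ann2 = [1, 1, 0, 0, 0, 0, 1, 1]
print(cohens_kappa(ann1, ann2))  # 0.5: moderate agreement; the task may need better instructions
```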
Threats to Validity
in Case Study Research
Threats to Validity
[The case study process diagram (Preparation, Execution, Reporting) repeated, annotated with the threats addressed at each step: Internal Validity, External Validity, Construct Validity, and Reliability]
Threats to Validity
• Construct Validity: did I operationalise my constructs
correctly? Are subjective measures involved (apply
triangulation)?
• Internal Validity: what uncontrolled variables may have
influenced the outcome of my study?
• External Validity: what features make my
study applicable to other contexts?
• Reliability: how much evidence did I provide, and how
much data did I share? how much can be replicated?
Mix aspects of Quantitative and Qualitative Research
Refer to all the threats to validity that we studied, and check those that apply!
Example: Threats to Validity (and Mitigations)
• Construct: standard evaluation measures were used for usability and
accuracy. Subjectivity in data analysis mitigated through triangulation.
• Internal: Subjects involved in the usability analysis are representative of the
company analysts, and have different degrees of experience, so that their
experience did not influence the outcome. The focus group may not have
allowed everyone to contribute (add possible mitigation).
• External: results applicable to large safety-critical companies, with mostly
local development, and low level requirements analysed from the
perspective of requirements analysts.

• Reliability: we cannot share the data, but we share the tool to enable
replication. The tool was tested for correctness with respect to its
requirements, and was developed following the standards of the company
for internal tools’ development (tool can be qualified)
More threats based on the previous lectures!
List potential threats to validity, and discuss adopted mitigations
Reporting Case Studies
• Can be tricky, especially if the case is complex and iterative, so you must
be CREATIVE
• Thick report of the Context
• Implications for research: in which way does your work advance the
theory and what can other researchers do now?

• e.g., other researchers can improve the tool for inaccuracies, other
researchers can test in other domains, or with different requirements
(extend the scope of validity)
• Implications for practice: in which way does your work help the
company involved in the study and other companies?

• e.g., other companies can use your tool, other companies can
improve their requirements review procedures
Reporting Case Studies
• Abstract: Context & Motivation, Problem, Solution, Results, Contribution (to Theory)

• Introduction

• Background and Motivation (pre-existing theories, related work and the need for the study)

• [Optional: for Improving Case Studies] Adopted Tools/Methods
• Study Design

• [Optional: for iterative case studies or complex ones] Overview
• Context and Units of Analysis (the company, study subjects, objects)

• Research Questions & Constructs

• Data Collection Procedures 

• Data Analysis Procedures

• Validity Procedures

• Execution and Results (in relation to RQs)

• Threats to Validity

• Discussion (in relation to results and theory)

• Implications for Research and Practice

• Conclusion and Future Work
Unexpected Results and Iterations
• In some (MANY!) cases, things do not go as expected and you may need
different iterations

• Make sure to REPORT a clear structure for each iteration, and to make the
iterations compatible in terms of structure

• In other cases, you may want to perform different studies with different stages
(e.g., Exploratory, Confirmatory), and different goals (e.g., Descriptive, Improving)

• 1) First you want to understand which types of errors can be detected by your
tool (Exploratory, Descriptive)

• 2) Then you want to improve the tool for inaccuracies (Exploratory,
Improving)

• 3) Then you want to test the improved tool on a larger dataset
(Confirmatory, Improving)
Example: Unexpected Results and Iterations
Table 5: Outline of the different iterations performed.

ID | Iteration Name   | Nature      | RQs                 | Patterns                              | Dataset
0  | Pilot            | Exploratory | RQ1, RQ2            | Def. Det. Patterns                    | D-Pilot
1  | Large-scale, 1st | Exploratory | RQ1, RQ2            | Def. Det. Patterns                    | D-Large
2  | Large-scale, 2nd | Explanatory | RQ1, RQ2            | Def. Det. Patterns                    | D-Large
3  | Large-scale, 3rd | Improving   | RQ3                 | Def. Det. Patterns + Discard Patterns | D-Large
4  | Large-scale, 4th | Improving   | RQ4.1               | SREE                                  | D-Large
5  | Large-scale, 5th | Explanatory | RQ4.2, RQ4.3, RQ4.4 | SREE-reduced                          | D-Large

Table 6: Tasks performed and subjects involved in each iteration.

ID | Res. Quest. | Pat. Def.  | Data. Sel. | Data. Ann. | Pat. App. | Out. Ann. | Quant. Eval. | Qual. Eval.
0  | VE1, NLP-E  | VE1, NLP-E | VE1        | VE1        | VE1       | -         | VE1          | VE1, NLP-E
1  | VE1, NLP-E  | -          | VE1        | VE3        | VE1/VE2   | -         | VE1/VE2      | VE1, NLP-E
2  | VE1, NLP-E  | -          | -          | -          | -         | VE1, VE2  | VE1/VE2      | VE1/VE2, NLP-E
3  | VE2, NLP-E  | VE2, NLP-E | -          | -          | VE2       | -         | VE2          | -
4  | VE2, NLP-E  | VE2, NLP-E | -          | -          | VE2       | -         | VE2          | VE2, NLP-E
5  | VE2, NLP-E  | VE2, NLP-E | -          | -          | VE2       | VE1, VE2  | VE2          | VE2, NLP-E

[Fig. 1: Template structure adopted in the iterations of the case study. Preparation: Research Questions (NLP-E, VE), Patterns Definition (NLP-E, VE). Data Collection: Dataset Selection (VE), Dataset Annotation (VEs), Patterns Application (VE), Output Annotation (VEs). Data Analysis: Quantitative Evaluation (VE), Qualitative Evaluation (NLP-E, VE)]
Structure for each Iteration
Instantiation of each Iteration
Nature and data of each iteration
Detecting Requirements Defects with NLP Patterns:
an Industrial Experience in the Railway Domain
cf. Ferrari et al., 2018, https://bit.ly/2zZWIZv
Industry-Academia
Collaborations
Where Case Studies Happen!
cf. Lethbridge et al., 2008 https://bit.ly/3bT3gXD
Benefits
The chapter separately enumerates the benefits to the company, to faculty
members and to students involved in the research, summarized in Table 1.

Table 1: Benefits of industry–university research collaborations
(typical amount of benefit = impact * probability of occurrence)

To the company:
• Direct benefits: new or improved technology or product (Medium); data and knowledge useful for decision making (High); patents (Low)
• Indirect benefits: potential employees for the company (Medium); ideas and expertise of researchers (High); public relations (Medium)
• Factors lowering risk of research: graduate students are often top achievers (Medium); researchers have a personal stake in success (Medium); low cost compared to in-house research (High); government matching funds and tax incentives (High)

To researchers:
• Direct benefits: funding (High); interesting and challenging problems and data (High); test-bed for ideas (High)
• Indirect benefits: exposure to the 'real world': provides valid and relevant knowledge, consulting and networking (High)

To the public:
• Indirect benefits: advancement of the state-of-the-art and state-of-the-practice (High)
Drawbacks
Table 2: Drawbacks of industry–university research collaborations

To the company:
• Costs: cash funding; consumption of employee time; office space and equipment
• Risk factors: different definitions of success (bottom line for industry vs. scientific results and publication for researchers); unknown consumption of employee time; inappropriate release of intellectual property

To researchers:
• Costs: constrained research freedom; excess consumption of time (moderate to high, depending on the experience of researchers and the research design)
• Risk factor: company-initiated cancellation (varies from low to high depending on corporate priorities and the rapport between researchers and the company)

To the project as a whole:
• Risk factors: different perceptions of the problem (high if the company has defined the problem for the researchers to solve; otherwise low); failure to staff the project with sufficient numbers of skilled researchers (Medium); unknown skill level of researchers, including their ability to estimate the required effort (varies from low to high depending on the experience of researchers); failure to find or keep adequate numbers of participants (varies from low to high, depending on the effort needed, management support, and other factors); inconclusive or non-useful results (low, but higher when the objective is to validate a hypothesis)

Note that some projects are initiated by researchers while others are initiated by
companies who have an active need to solve a problem. Some risks are consider-
ably higher in the latter case.
Collaboration Checklist
Table 3: Checklist of activities that should be part of the planning and management process of
industry–university collaborations involving empirical studies

• Decision: to use university researchers or in-house employees (refer to Tables 1 and 2 for decision-making information) [Company]
• Attracting companies [Researchers]
• Decision: level and type of commitment (finances, resources, timetable, deliverables) [Negotiated]
• Decision: how will on-going management and risk management be handled? [Negotiated]
• Decision: what is the research focus, what are the goals and what are the research questions? [Negotiated, but may be largely determined by either party]
• Decision: what participants will be available and when? [Negotiated]
• Decision: what information must be confidential? [Negotiated]
• Decision: how will publication of results be handled? [Negotiated]
• Decision: who owns intellectual property? [Negotiated]
• Obtain ethics approval [Researchers]
• Find researcher team members and train them [Researchers]
• Plan the details of work with participants [Researchers]
• Plan for data analysis [Researchers]
• Evaluate the risks and manage changes [Both parties]
Summary
• Case studies are context-dependent research approaches that are very common in SE (Software
Engineering) — as SE itself is context-dependent!

• Knowledge of ALL other empirical approaches is needed for a successful case study

• A case study is a structured and planned experience with industrial data, with RQs, Data
Collection, Data Analysis, Validity, Results and Discussion
• Triangulation is KEY

• Good industrial collaboration is KEY

• Recommendations:
• Do your best to be RIGOROUS: small choices can lead to unpublishable results

• Expect the unexpected (be FLEXIBLE)
• Publish also negative results and reflect on what did not work: claiming success when there
was none is useless

• Share as much data and information as possible: replication is the only way to consolidate
theory and extend the scope of validity

Survey on Software Defect Prediction
 
Prototype model
Prototype modelPrototype model
Prototype model
 
Software documentation
Software documentationSoftware documentation
Software documentation
 
Cost estimation using cocomo model
Cost estimation using cocomo modelCost estimation using cocomo model
Cost estimation using cocomo model
 
functional testing
functional testing functional testing
functional testing
 
software Prototyping model
software Prototyping modelsoftware Prototyping model
software Prototyping model
 
Usecase Presentation
Usecase PresentationUsecase Presentation
Usecase Presentation
 

Similar to Case Study Research in Software Engineering

Case studies in industry - fundamentals and lessons learnt
Case studies in industry - fundamentals and lessons learntCase studies in industry - fundamentals and lessons learnt
Case studies in industry - fundamentals and lessons learntDaniel Mendez
 
Empirical Software Engineering
Empirical Software EngineeringEmpirical Software Engineering
Empirical Software EngineeringRahimLotfi
 
Qualitative Studies in Software Engineering - Interviews, Observation, Ground...
Qualitative Studies in Software Engineering - Interviews, Observation, Ground...Qualitative Studies in Software Engineering - Interviews, Observation, Ground...
Qualitative Studies in Software Engineering - Interviews, Observation, Ground...alessio_ferrari
 
Survey Research In Empirical Software Engineering
Survey Research In Empirical Software EngineeringSurvey Research In Empirical Software Engineering
Survey Research In Empirical Software Engineeringalessio_ferrari
 
[2017/2018] RESEARCH in software engineering
[2017/2018] RESEARCH in software engineering[2017/2018] RESEARCH in software engineering
[2017/2018] RESEARCH in software engineeringIvano Malavolta
 
Software Engineering Research: Leading a Double-Agent Life.
Software Engineering Research: Leading a Double-Agent Life.Software Engineering Research: Leading a Double-Agent Life.
Software Engineering Research: Leading a Double-Agent Life.Lionel Briand
 
Empirical research methods for software engineering
Empirical research methods for software engineeringEmpirical research methods for software engineering
Empirical research methods for software engineeringsarfraznawaz
 
empirical-SLR.pptx
empirical-SLR.pptxempirical-SLR.pptx
empirical-SLR.pptxJitha Kannan
 
Information Systems Action design research method
Information Systems Action design research methodInformation Systems Action design research method
Information Systems Action design research methodRaimo Halinen
 
Planning and Executing Practice-Impactful Research
Planning and Executing Practice-Impactful ResearchPlanning and Executing Practice-Impactful Research
Planning and Executing Practice-Impactful ResearchTao Xie
 
Data_Scientist_Position_Description
Data_Scientist_Position_DescriptionData_Scientist_Position_Description
Data_Scientist_Position_DescriptionSuman Banerjee
 
A Standardized Case Study Framework and Methodology to Identify quot Best Pr...
A Standardized Case Study Framework and Methodology to Identify  quot Best Pr...A Standardized Case Study Framework and Methodology to Identify  quot Best Pr...
A Standardized Case Study Framework and Methodology to Identify quot Best Pr...Carrie Tran
 
requirement analysis characteristics
requirement analysis characteristics requirement analysis characteristics
requirement analysis characteristics Helmy Faisal
 
ISEC'18 Tutorial: Research Methodology on Pursuing Impact-Driven Research
ISEC'18 Tutorial: Research Methodology on Pursuing Impact-Driven ResearchISEC'18 Tutorial: Research Methodology on Pursuing Impact-Driven Research
ISEC'18 Tutorial: Research Methodology on Pursuing Impact-Driven ResearchTao Xie
 
Lionel Briand ICSM 2011 Keynote
Lionel Briand ICSM 2011 KeynoteLionel Briand ICSM 2011 Keynote
Lionel Briand ICSM 2011 KeynoteICSM 2011
 

Similar to Case Study Research in Software Engineering (20)

Case studies in industry - fundamentals and lessons learnt
Case studies in industry - fundamentals and lessons learntCase studies in industry - fundamentals and lessons learnt
Case studies in industry - fundamentals and lessons learnt
 
Systematic Literature Review
Systematic Literature ReviewSystematic Literature Review
Systematic Literature Review
 
Empirical Software Engineering
Empirical Software EngineeringEmpirical Software Engineering
Empirical Software Engineering
 
Qualitative Studies in Software Engineering - Interviews, Observation, Ground...
Qualitative Studies in Software Engineering - Interviews, Observation, Ground...Qualitative Studies in Software Engineering - Interviews, Observation, Ground...
Qualitative Studies in Software Engineering - Interviews, Observation, Ground...
 
Survey Research In Empirical Software Engineering
Survey Research In Empirical Software EngineeringSurvey Research In Empirical Software Engineering
Survey Research In Empirical Software Engineering
 
[2017/2018] RESEARCH in software engineering
[2017/2018] RESEARCH in software engineering[2017/2018] RESEARCH in software engineering
[2017/2018] RESEARCH in software engineering
 
Software Engineering Research: Leading a Double-Agent Life.
Software Engineering Research: Leading a Double-Agent Life.Software Engineering Research: Leading a Double-Agent Life.
Software Engineering Research: Leading a Double-Agent Life.
 
Empirical research methods for software engineering
Empirical research methods for software engineeringEmpirical research methods for software engineering
Empirical research methods for software engineering
 
empirical-SLR.pptx
empirical-SLR.pptxempirical-SLR.pptx
empirical-SLR.pptx
 
Information Systems Action design research method
Information Systems Action design research methodInformation Systems Action design research method
Information Systems Action design research method
 
Planning and Executing Practice-Impactful Research
Planning and Executing Practice-Impactful ResearchPlanning and Executing Practice-Impactful Research
Planning and Executing Practice-Impactful Research
 
QA process Presentation
QA process PresentationQA process Presentation
QA process Presentation
 
Data mining
Data miningData mining
Data mining
 
ES_140_METHODS_OF_RESEARCH.pdf
ES_140_METHODS_OF_RESEARCH.pdfES_140_METHODS_OF_RESEARCH.pdf
ES_140_METHODS_OF_RESEARCH.pdf
 
Data_Scientist_Position_Description
Data_Scientist_Position_DescriptionData_Scientist_Position_Description
Data_Scientist_Position_Description
 
1.1 business research class discussions
1.1 business research class discussions1.1 business research class discussions
1.1 business research class discussions
 
A Standardized Case Study Framework and Methodology to Identify quot Best Pr...
A Standardized Case Study Framework and Methodology to Identify  quot Best Pr...A Standardized Case Study Framework and Methodology to Identify  quot Best Pr...
A Standardized Case Study Framework and Methodology to Identify quot Best Pr...
 
requirement analysis characteristics
requirement analysis characteristics requirement analysis characteristics
requirement analysis characteristics
 
ISEC'18 Tutorial: Research Methodology on Pursuing Impact-Driven Research
ISEC'18 Tutorial: Research Methodology on Pursuing Impact-Driven ResearchISEC'18 Tutorial: Research Methodology on Pursuing Impact-Driven Research
ISEC'18 Tutorial: Research Methodology on Pursuing Impact-Driven Research
 
Lionel Briand ICSM 2011 Keynote
Lionel Briand ICSM 2011 KeynoteLionel Briand ICSM 2011 Keynote
Lionel Briand ICSM 2011 Keynote
 

More from alessio_ferrari

Natural language processing for requirements engineering: ICSE 2021 Technical...
Natural language processing for requirements engineering: ICSE 2021 Technical...Natural language processing for requirements engineering: ICSE 2021 Technical...
Natural language processing for requirements engineering: ICSE 2021 Technical...alessio_ferrari
 
Controlled experiments, Hypothesis Testing, Test Selection, Threats to Validity
Controlled experiments, Hypothesis Testing, Test Selection, Threats to ValidityControlled experiments, Hypothesis Testing, Test Selection, Threats to Validity
Controlled experiments, Hypothesis Testing, Test Selection, Threats to Validityalessio_ferrari
 
Requirements Engineering: focus on Natural Language Processing, Lecture 2
Requirements Engineering: focus on Natural Language Processing, Lecture 2Requirements Engineering: focus on Natural Language Processing, Lecture 2
Requirements Engineering: focus on Natural Language Processing, Lecture 2alessio_ferrari
 
Requirements Engineering: focus on Natural Language Processing, Lecture 1
Requirements Engineering: focus on Natural Language Processing, Lecture 1Requirements Engineering: focus on Natural Language Processing, Lecture 1
Requirements Engineering: focus on Natural Language Processing, Lecture 1alessio_ferrari
 
Ambiguity in Software Engineering
Ambiguity in Software EngineeringAmbiguity in Software Engineering
Ambiguity in Software Engineeringalessio_ferrari
 
Empirical Methods in Software Engineering - an Overview
Empirical Methods in Software Engineering - an OverviewEmpirical Methods in Software Engineering - an Overview
Empirical Methods in Software Engineering - an Overviewalessio_ferrari
 
Natural Language Processing (NLP) for Requirements Engineering (RE): an Overview
Natural Language Processing (NLP) for Requirements Engineering (RE): an OverviewNatural Language Processing (NLP) for Requirements Engineering (RE): an Overview
Natural Language Processing (NLP) for Requirements Engineering (RE): an Overviewalessio_ferrari
 

More from alessio_ferrari (7)

Natural language processing for requirements engineering: ICSE 2021 Technical...
Natural language processing for requirements engineering: ICSE 2021 Technical...Natural language processing for requirements engineering: ICSE 2021 Technical...
Natural language processing for requirements engineering: ICSE 2021 Technical...
 
Controlled experiments, Hypothesis Testing, Test Selection, Threats to Validity
Controlled experiments, Hypothesis Testing, Test Selection, Threats to ValidityControlled experiments, Hypothesis Testing, Test Selection, Threats to Validity
Controlled experiments, Hypothesis Testing, Test Selection, Threats to Validity
 
Requirements Engineering: focus on Natural Language Processing, Lecture 2
Requirements Engineering: focus on Natural Language Processing, Lecture 2Requirements Engineering: focus on Natural Language Processing, Lecture 2
Requirements Engineering: focus on Natural Language Processing, Lecture 2
 
Requirements Engineering: focus on Natural Language Processing, Lecture 1
Requirements Engineering: focus on Natural Language Processing, Lecture 1Requirements Engineering: focus on Natural Language Processing, Lecture 1
Requirements Engineering: focus on Natural Language Processing, Lecture 1
 
Ambiguity in Software Engineering
Ambiguity in Software EngineeringAmbiguity in Software Engineering
Ambiguity in Software Engineering
 
Empirical Methods in Software Engineering - an Overview
Empirical Methods in Software Engineering - an OverviewEmpirical Methods in Software Engineering - an Overview
Empirical Methods in Software Engineering - an Overview
 
Natural Language Processing (NLP) for Requirements Engineering (RE): an Overview
Natural Language Processing (NLP) for Requirements Engineering (RE): an OverviewNatural Language Processing (NLP) for Requirements Engineering (RE): an Overview
Natural Language Processing (NLP) for Requirements Engineering (RE): an Overview
 

Recently uploaded

CARE OF CHILD IN INCUBATOR..........pptx
CARE OF CHILD IN INCUBATOR..........pptxCARE OF CHILD IN INCUBATOR..........pptx
CARE OF CHILD IN INCUBATOR..........pptxGaneshChakor2
 
18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf
18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf
18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdfssuser54595a
 
Contemporary philippine arts from the regions_PPT_Module_12 [Autosaved] (1).pptx
Contemporary philippine arts from the regions_PPT_Module_12 [Autosaved] (1).pptxContemporary philippine arts from the regions_PPT_Module_12 [Autosaved] (1).pptx
Contemporary philippine arts from the regions_PPT_Module_12 [Autosaved] (1).pptxRoyAbrique
 
Science 7 - LAND and SEA BREEZE and its Characteristics
Science 7 - LAND and SEA BREEZE and its CharacteristicsScience 7 - LAND and SEA BREEZE and its Characteristics
Science 7 - LAND and SEA BREEZE and its CharacteristicsKarinaGenton
 
Alper Gobel In Media Res Media Component
Alper Gobel In Media Res Media ComponentAlper Gobel In Media Res Media Component
Alper Gobel In Media Res Media ComponentInMediaRes1
 
URLs and Routing in the Odoo 17 Website App
URLs and Routing in the Odoo 17 Website AppURLs and Routing in the Odoo 17 Website App
URLs and Routing in the Odoo 17 Website AppCeline George
 
POINT- BIOCHEMISTRY SEM 2 ENZYMES UNIT 5.pptx
POINT- BIOCHEMISTRY SEM 2 ENZYMES UNIT 5.pptxPOINT- BIOCHEMISTRY SEM 2 ENZYMES UNIT 5.pptx
POINT- BIOCHEMISTRY SEM 2 ENZYMES UNIT 5.pptxSayali Powar
 
The Most Excellent Way | 1 Corinthians 13
The Most Excellent Way | 1 Corinthians 13The Most Excellent Way | 1 Corinthians 13
The Most Excellent Way | 1 Corinthians 13Steve Thomason
 
Micromeritics - Fundamental and Derived Properties of Powders
Micromeritics - Fundamental and Derived Properties of PowdersMicromeritics - Fundamental and Derived Properties of Powders
Micromeritics - Fundamental and Derived Properties of PowdersChitralekhaTherkar
 
mini mental status format.docx
mini    mental       status     format.docxmini    mental       status     format.docx
mini mental status format.docxPoojaSen20
 
Arihant handbook biology for class 11 .pdf
Arihant handbook biology for class 11 .pdfArihant handbook biology for class 11 .pdf
Arihant handbook biology for class 11 .pdfchloefrazer622
 
Accessible design: Minimum effort, maximum impact
Accessible design: Minimum effort, maximum impactAccessible design: Minimum effort, maximum impact
Accessible design: Minimum effort, maximum impactdawncurless
 
Q4-W6-Restating Informational Text Grade 3
Q4-W6-Restating Informational Text Grade 3Q4-W6-Restating Informational Text Grade 3
Q4-W6-Restating Informational Text Grade 3JemimahLaneBuaron
 
Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...
Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...
Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...EduSkills OECD
 
Separation of Lanthanides/ Lanthanides and Actinides
Separation of Lanthanides/ Lanthanides and ActinidesSeparation of Lanthanides/ Lanthanides and Actinides
Separation of Lanthanides/ Lanthanides and ActinidesFatimaKhan178732
 
APM Welcome, APM North West Network Conference, Synergies Across Sectors
APM Welcome, APM North West Network Conference, Synergies Across SectorsAPM Welcome, APM North West Network Conference, Synergies Across Sectors
APM Welcome, APM North West Network Conference, Synergies Across SectorsAssociation for Project Management
 
Call Girls in Dwarka Mor Delhi Contact Us 9654467111
Call Girls in Dwarka Mor Delhi Contact Us 9654467111Call Girls in Dwarka Mor Delhi Contact Us 9654467111
Call Girls in Dwarka Mor Delhi Contact Us 9654467111Sapana Sha
 

Recently uploaded (20)

CARE OF CHILD IN INCUBATOR..........pptx
CARE OF CHILD IN INCUBATOR..........pptxCARE OF CHILD IN INCUBATOR..........pptx
CARE OF CHILD IN INCUBATOR..........pptx
 
Código Creativo y Arte de Software | Unidad 1
Código Creativo y Arte de Software | Unidad 1Código Creativo y Arte de Software | Unidad 1
Código Creativo y Arte de Software | Unidad 1
 
18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf
18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf
18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf
 
Contemporary philippine arts from the regions_PPT_Module_12 [Autosaved] (1).pptx
Contemporary philippine arts from the regions_PPT_Module_12 [Autosaved] (1).pptxContemporary philippine arts from the regions_PPT_Module_12 [Autosaved] (1).pptx
Contemporary philippine arts from the regions_PPT_Module_12 [Autosaved] (1).pptx
 
Science 7 - LAND and SEA BREEZE and its Characteristics
Science 7 - LAND and SEA BREEZE and its CharacteristicsScience 7 - LAND and SEA BREEZE and its Characteristics
Science 7 - LAND and SEA BREEZE and its Characteristics
 
Alper Gobel In Media Res Media Component
Alper Gobel In Media Res Media ComponentAlper Gobel In Media Res Media Component
Alper Gobel In Media Res Media Component
 
URLs and Routing in the Odoo 17 Website App
URLs and Routing in the Odoo 17 Website AppURLs and Routing in the Odoo 17 Website App
URLs and Routing in the Odoo 17 Website App
 
TataKelola dan KamSiber Kecerdasan Buatan v022.pdf
TataKelola dan KamSiber Kecerdasan Buatan v022.pdfTataKelola dan KamSiber Kecerdasan Buatan v022.pdf
TataKelola dan KamSiber Kecerdasan Buatan v022.pdf
 
POINT- BIOCHEMISTRY SEM 2 ENZYMES UNIT 5.pptx
POINT- BIOCHEMISTRY SEM 2 ENZYMES UNIT 5.pptxPOINT- BIOCHEMISTRY SEM 2 ENZYMES UNIT 5.pptx
POINT- BIOCHEMISTRY SEM 2 ENZYMES UNIT 5.pptx
 
The Most Excellent Way | 1 Corinthians 13
The Most Excellent Way | 1 Corinthians 13The Most Excellent Way | 1 Corinthians 13
The Most Excellent Way | 1 Corinthians 13
 
Micromeritics - Fundamental and Derived Properties of Powders
Micromeritics - Fundamental and Derived Properties of PowdersMicromeritics - Fundamental and Derived Properties of Powders
Micromeritics - Fundamental and Derived Properties of Powders
 
mini mental status format.docx
mini    mental       status     format.docxmini    mental       status     format.docx
mini mental status format.docx
 
Arihant handbook biology for class 11 .pdf
Arihant handbook biology for class 11 .pdfArihant handbook biology for class 11 .pdf
Arihant handbook biology for class 11 .pdf
 
Accessible design: Minimum effort, maximum impact
Accessible design: Minimum effort, maximum impactAccessible design: Minimum effort, maximum impact
Accessible design: Minimum effort, maximum impact
 
Q4-W6-Restating Informational Text Grade 3
Q4-W6-Restating Informational Text Grade 3Q4-W6-Restating Informational Text Grade 3
Q4-W6-Restating Informational Text Grade 3
 
Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...
Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...
Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...
 
Staff of Color (SOC) Retention Efforts DDSD
Staff of Color (SOC) Retention Efforts DDSDStaff of Color (SOC) Retention Efforts DDSD
Staff of Color (SOC) Retention Efforts DDSD
 
Separation of Lanthanides/ Lanthanides and Actinides
Separation of Lanthanides/ Lanthanides and ActinidesSeparation of Lanthanides/ Lanthanides and Actinides
Separation of Lanthanides/ Lanthanides and Actinides
 
APM Welcome, APM North West Network Conference, Synergies Across Sectors
APM Welcome, APM North West Network Conference, Synergies Across SectorsAPM Welcome, APM North West Network Conference, Synergies Across Sectors
APM Welcome, APM North West Network Conference, Synergies Across Sectors
 
Call Girls in Dwarka Mor Delhi Contact Us 9654467111
Call Girls in Dwarka Mor Delhi Contact Us 9654467111Call Girls in Dwarka Mor Delhi Contact Us 9654467111
Call Girls in Dwarka Mor Delhi Contact Us 9654467111
 

Case Study Research in Software Engineering

• Effects can take a long time to appear (weeks, months, years!)

4. When the context is important (e.g. when you need to know how the context affects the phenomenon)

5. When you need to know whether your theory applies to a specific real-world setting

Easterbrook and Aranda, 2006 http://www.cs.toronto.edu/~sme/case-studies/case_study_tutorial_slides.pdf
When to Use Case Studies? In Practice

• When a company wants to try a software engineering support tool that you developed

• When a company wants to try an existing tool in their process

• When a company wants you to develop a tool or process to support their software engineering activities

• When you are in a (funded) collaboration project with industry and want to publish the results in a software engineering venue, rather than just do the work for the company

• When you have well-established contacts with companies and want to perform a qualitative study to analyse a general software engineering problem in practice
Typical Examples

• You are in the company 4 days per week; you experiment with a commercial tool for code generation, using requirements from a previous project of the company

• You stay at the university, but use requirements and code from the company to develop a customised tool for automated trace link detection

• You are at the company and perform a qualitative study based on observations and/or interviews (see previous lectures)
What is NOT a Case Study

• Not an exemplar

• Not a report of something interesting that was tried on a toy problem

• Not an experience report, i.e., a retrospective report on an experience (typically industrial) with lessons learned

• Not a quasi-experiment with small n, i.e., a weaker form of experiment with a small sample size; a case study uses a different logic for designing the study and for generalising from results

Easterbrook and Aranda, 2006 http://www.cs.toronto.edu/~sme/case-studies/case_study_tutorial_slides.pdf
Fundamental Characteristics of Case Studies

• Has research questions set out from the beginning of the study

• Data is collected in a planned and consistent manner

• Inferences are made from the data to answer the research questions

• Produces an explanation, description, prediction, or improvement of a phenomenon

• Threats to validity are addressed in a systematic way

Easterbrook and Aranda, 2006 http://www.cs.toronto.edu/~sme/case-studies/case_study_tutorial_slides.pdf
Types of Case Study

A diagram classifying case studies by stage and goal: Exploratory, Descriptive, Explanatory, Predictive, Confirmatory, and Improving, addressing what, how, why, and how-to questions, and building towards theory.
Case Study Research Process

• Preparation: start from theory, define the research questions and constructs, develop the research design (units, collection, analysis, validity procedures), and identify the context (subjects and objects)

• Execution: collect data and analyse data, addressing internal and construct validity

• Reporting: report answers, addressing internal and external validity, and discuss reliability

• The process feeds back into theory: there is theory before and after!
Case Study Research Process (continued)

• Research design normally combines quantitative and qualitative data

• ITERATIONS can be performed with different goals and levels (iterative case studies)

• ITERATIONS are not always needed
Example: Combining Quantitative and Qualitative Data

• A tool to detect requirements defects is tried by the company on their requirements (previously analysed for defects)

• The requirements with manual defect analysis provide the expected results (aka ground truth); running the tool produces the requirements with automatic defect analysis

• The evaluation covers accuracy and sources of inaccuracy, as well as usability and improvements, combining quantitative and qualitative data
Example: Combining Quantitative and Qualitative Data

• A tool to detect defective requirements is tried by the company on their requirements documents, previously analysed for defects

• Quantitative: accuracy is quantitatively evaluated with precision and recall measures

• Qualitative: false positives and false negatives are analysed in a qualitative way to understand typical patterns of inaccuracy

• Quantitative: a set of users is selected by the company to try the tool and evaluate it with a usability questionnaire

• Qualitative: a set of users is selected by the company to try the tool and give feedback for improvement

This will be our running example for today
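The accuracy evaluation in the running example can be sketched in a few lines of Python. This is a minimal illustration of precision and recall against a manually built ground truth; the requirement IDs and labels are hypothetical, not data from an actual study:

```python
# Hypothetical requirement IDs flagged as defective: by the manual
# analysis (ground truth) and by the tool (automatic analysis).
ground_truth = {"R3", "R7", "R12", "R21"}
tool_output = {"R3", "R7", "R9", "R21"}

true_positives = len(tool_output & ground_truth)   # defects the tool found
false_positives = len(tool_output - ground_truth)  # spurious detections
false_negatives = len(ground_truth - tool_output)  # missed defects

precision = true_positives / (true_positives + false_positives)
recall = true_positives / (true_positives + false_negatives)

print(f"precision = {precision:.2f}, recall = {recall:.2f}")
```

The false positives (here, R9) and false negatives (here, R12) are exactly the cases that feed the qualitative analysis of the sources of inaccuracy.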
Theory (and Motivation)

• Existing theories about requirements defects and other aspects:

• Standards (ISO/IEC/IEEE 29148:2011)

• Handbook of Requirements Ambiguity (from Dan Berry)

• Studies on usability (e.g., the System Usability Scale, SUS)

• Other tools for defect identification in requirements

• Gaps in current theory (MOTIVATION for your case study):

• Previous tools never tried in the context of a safety-critical company

• Other tools are not made publicly available

• No usability analysis of these tools in practice
Research Questions (RQs) and Constructs

• Explicit set of questions that you aim to answer with your study (see the first lecture on how to formulate RQs)

• Goal-question-metric (GQM) approach (cf. Basili et al., https://www.cs.umd.edu/users/mvz/handouts/gqm.pdf):

• Goal: improve the detection of requirements defects

• RQ1: how accurate is the tool in detecting defects?

• RQ2: what are the typical sources of inaccuracy?

• RQ3: how usable do the analysts consider the tool?

• RQ4: what improvements are required?

• Metrics (quantitative) and evaluation strategies (qualitative): consider each construct in the RQs and plan its evaluation

• accuracy (quantitative): precision and recall measures

• sources of inaccuracy (qualitative): analysis of the false positive and false negative cases

• usability (quantitative): usability score (SUS) from a questionnaire

• improvements (qualitative): feedback from analysts
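The SUS questionnaire used for RQ3 has a standard scoring scheme (ten Likert items, alternating positive and negative wording, scaled to 0..100), which can be sketched as follows; the example responses are hypothetical:

```python
# SUS scoring: 10 Likert items (1 = strongly disagree, 5 = strongly agree).
# Odd-numbered items are positively worded, even-numbered negatively worded.
def sus_score(responses):
    assert len(responses) == 10
    total = 0
    for i, r in enumerate(responses, start=1):
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5  # scale the 0..40 sum to 0..100

# Hypothetical responses from one analyst trying the defect-detection tool.
answers = [4, 2, 5, 1, 4, 2, 5, 2, 4, 1]
print(sus_score(answers))  # prints 85.0
```

In a case study, the score of each subject would be collected and reported together with the qualitative feedback, since the sample is typically too small for statistical claims.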
Context

• It is extremely important to characterise the context of your study, i.e., to describe it in an accurate way:

• Domain (e.g., railway software company)

• Size (e.g., large, localised company)

• Activity (e.g., requirements reviews)

• Maturity (e.g., years in the field, CMMI: https://en.wikipedia.org/wiki/Capability_Maturity_Model_Integration)

• Type of process (agile, V-model, etc.)

• Actors (e.g., requirements analysts)

• Other peculiarities (anything that specifically characterises your company, e.g., culture, mix of ethnicities, backgrounds, gender, type of environment)

TIP: if this were an experiment, what are the variables that you cannot control? This is the CONTEXT that you should report
Context: Study Subjects and Objects

• Study subjects and objects belong to the context and need to be characterised as well

• Subjects:

• 5 developers with 1 to 3 years of experience

• 4 analysts with 5 to 10 years of experience

• Objects:

• Low-level requirements document with 300 requirements, 30 marked as defective

• High-level requirements document with 100 requirements, 30 defective

You get what is available! You cannot be picky! Select what is most representative or relevant for the company
Research Design: Data Collection, Analysis and Validity
Research Design: Units of Analysis

• Units of analysis can be people, companies, activities, artefacts…

• Define THE CASE that you study in the context

• Context: the company; case (unit of analysis): the automatic defect detection task

• Study subjects: the analysts who will evaluate the tool; study objects: the documents that will be used to evaluate the tool

• This is a single-unit, holistic case study
  • 24. Case Study Designs [figure: the four case study designs, each a case within its context] • Single (e.g., one company): one case in its context • Multiple (e.g., multiple companies): several cases, each in its own context • Holistic (e.g., one team): the case is studied as a single unit of analysis • Embedded (e.g., different teams): the case contains multiple embedded units of analysis
  • 25. Remarks on Design: Variant • The design that you choose depends on how you want to frame your case study and what your focus is Case: Defect detection on High Level Requirements Study Objects: High-level documents Context: the Company Case: Defect detection on Low Level Requirements Study Subjects: analysts who will evaluate the tool Study Objects: Low-level documents If I want to differentiate the evaluation by requirements type
  • 26. Remarks on Design: Variant • The design that you choose depends on how you want to frame your case study and what your focus is Case: Defect detection by Developers Study Subjects: Developers Context: the Company Case: Defect detection by Analysts Study Objects: Requirements Study Subjects: Analysts If I want to differentiate the evaluation by viewpoint
  • 27. Data Collection Procedures See Initial Lecture on Data Collection Documentation Archival Records Interviews Observation Systems Combine Multiple Sources: you will not have much data, different viewpoints needed Create a Case Study Database: collect and organise the data in a structured manner Maintain a Chain of Evidence: link data to constructs, and to other data Data Sources Principles of Data Collection
  • 33. Example: Data Collection Procedures • QUANTITATIVE: • Usability: usability evaluation questionnaire • Accuracy: actual output vs expected output, precision & recall measures • QUALITATIVE: • Sources of Inaccuracy: manual comparison • Improvements: feedback (focus group)
  • 34. Data Analysis Procedures • Quantitative: • you normally do not have enough data points to perform hypothesis testing • you can provide descriptive statistics (as for surveys) • you can use measures and approaches from other fields (e.g., machine learning and information retrieval if you developed an automated software engineering tool) • Qualitative: • apply coding, thematic analysis and grounded theory
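As a minimal sketch of the descriptive-statistics route (all numbers invented for illustration), Python's standard `statistics` module is enough when you only have a handful of data points:

```python
import statistics

# Hypothetical per-document review times (minutes) from a small case study;
# too few data points for hypothesis testing, so report descriptive statistics only.
review_times = [12.5, 9.0, 15.2, 11.1, 13.8, 10.4, 14.0]

print(f"n      = {len(review_times)}")
print(f"mean   = {statistics.mean(review_times):.1f}")
print(f"median = {statistics.median(review_times):.1f}")
print(f"stdev  = {statistics.stdev(review_times):.1f}")
print(f"range  = {min(review_times)}-{max(review_times)}")
```

Reporting the full distribution (min, max, median) rather than only the mean is especially important with so few observations.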
  • 35. Example: Data Analysis Procedures Usability Improvements QUANTITATIVE QUALITATIVE usability evaluation questionnaire I think the GUI should distinguish between types of defect I think they should use color coding for different defects Different defects shall be associated also to a degree of severity Coding and Thematic Analysis “SUS score above a 68 would be considered above average and anything below 68 is below average”* *cf. https://www.usability.gov/how-to-and-tools/methods/system-usability-scale.html
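The SUS score has a standard computation (odd-numbered items are positively worded, even-numbered items negatively worded); a small sketch, with an entirely made-up set of responses from one analyst:

```python
def sus_score(responses):
    """System Usability Scale score (0-100) from ten Likert responses (1-5).
    Odd items contribute (score - 1), even items (5 - score); sum times 2.5."""
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS needs ten responses in the range 1-5")
    contributions = [
        (r - 1) if i % 2 == 0 else (5 - r)  # i is 0-based, so even i = odd item
        for i, r in enumerate(responses)
    ]
    return sum(contributions) * 2.5

# One (hypothetical) analyst's questionnaire:
print(sus_score([4, 2, 5, 1, 4, 2, 4, 2, 5, 1]))  # 85.0, above the 68-point average
```

With only a few respondents, report the individual scores alongside the mean.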
  • 36. Example: Data Analysis Procedures Accuracy Sources of Inaccuracy QUANTITATIVE QUALITATIVE precision and recall baseline tool precision and recall Compare manual comparison Baseline can be another tool, or fictional (e.g., random predictor) Coding and Thematic Analysis
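Precision and recall against the baseline can be computed directly from the sets of flagged requirements; a sketch with invented requirement IDs, including a fictional random-predictor baseline:

```python
def precision_recall(predicted, gold):
    """Precision and recall of a tool's predicted defect set
    against the ground-truth defect set."""
    predicted, gold = set(predicted), set(gold)
    tp = len(predicted & gold)                          # true positives
    precision = tp / len(predicted) if predicted else 0.0
    recall = tp / len(gold) if gold else 0.0
    return precision, recall

# Hypothetical requirement IDs flagged as defective:
gold = {"R3", "R7", "R12", "R20"}          # ground truth
tool = {"R3", "R7", "R12", "R15", "R31"}   # tool output
rand = {"R1", "R7", "R9"}                  # fictional random-predictor baseline

print(precision_recall(tool, gold))  # (0.6, 0.75)
print(precision_recall(rand, gold))
```

Reporting both figures side by side makes the comparison with the baseline explicit.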
  • 37. Validity Procedures: Triangulation • Data (Source) Triangulation – using more than one data source, or collecting the same data on different occasions. • Observer Triangulation – using more than one observer in the study. • Methodological Triangulation – combining different types of data collection methods, for example, qualitative and quantitative methods. • Theory Triangulation – using alternative theories or viewpoints. You do not have a representative sample or a randomised experiment: triangulation is used to increase OBJECTIVITY
  • 38. Triangulation Types: Examples • Data (Source) Triangulation: multiple types of documents with different degrees of defects; multiple analysts perform defect analysis on the same document (see next slides: The Ground Truth) • Observer Triangulation: more than one interviewer asks users for their feedback • Methodological Triangulation: analyse both accuracy and usability quantitatively and qualitatively (what we actually did…) • Theory Triangulation: information retrieval theory and usability theory to evaluate the “fitness” of the tool for the company
  • 42. Example: the Ground Truth We initially assumed that the ground truth was available; what if it is NOT? (i.e., no defect classification?) [diagram: Independent Manual Classification → Compute Agreement (Cohen's k = 0.8) → Discuss Disagreement → Ground Truth] NOTE: You may need to better define your task if agreement is low! (i.e., give better instructions)
  • 43. Example: the Ground Truth • In many cases, you need to automate some software engineering activity (e.g., defect detection, tracing) • In most cases, a form of classification is required (e.g., defective vs not-defective, relevant vs not relevant) • Comparison between humans and tools is often needed to evaluate accuracy • A Ground Truth is built: • Two or more subjects independently classify the data (triangulation of sources) • Agreement is evaluated with Fleiss' Kappa or Cohen's Kappa • If agreement is high, the task is well defined; otherwise you have to define it better • Subjects discuss the disagreed cases and define the Ground Truth • Your tool's output will be compared with the Ground Truth More complex evaluation approaches may be needed!
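Cohen's Kappa corrects raw agreement for agreement expected by chance, k = (p_o - p_e) / (1 - p_e); a minimal sketch over two annotators' made-up labels:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two annotators labelling the same items:
    k = (p_o - p_e) / (1 - p_e), where p_o is observed agreement
    and p_e is the agreement expected by chance."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n       # observed agreement
    count_a, count_b = Counter(rater_a), Counter(rater_b)
    labels = set(rater_a) | set(rater_b)
    p_e = sum((count_a[l] / n) * (count_b[l] / n) for l in labels)  # chance agreement
    return (p_o - p_e) / (1 - p_e)

# Two analysts independently labelling ten requirements (D = defective, N = not):
a = ["D", "D", "N", "N", "D", "N", "N", "D", "N", "N"]
b = ["D", "D", "N", "N", "N", "N", "N", "D", "N", "N"]
print(round(cohens_kappa(a, b), 2))  # 0.78
```

Values above roughly 0.8, like the k = 0.8 of the previous slide, are conventionally read as strong agreement; with more than two annotators, use Fleiss' Kappa instead.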
  • 44. Threats to Validity in Case Study Research
  • 45. Threats to Validity [diagram: threats mapped onto the case study process] • PREPARATION: Theory → Research Design (Units, Collection, Analysis, Validity Procedures) → Research Questions and Constructs → Identify Context (Subjects and Objects) • EXECUTION: Collect Data → Analyse Data • REPORTING: Report Answers → Discuss Theory • Threats: Construct Validity, Internal Validity, External Validity, Reliability
  • 46. Threats to Validity • Construct Validity: did I operationalise my constructs correctly? Are subjective measures involved (apply triangulation)? • Internal Validity: what uncontrolled variables may have influenced the outcome of my study? • External Validity: what features make my study applicable to other contexts? • Reliability: how much evidence did I provide, and how much data did I share? How much can be replicated? Mix aspects of Quantitative and Qualitative Research Refer to all the threats to validity that we studied, and check those that apply!
  • 47. Example: Threats to Validity (and Mitigations) • Construct: standard evaluation measures were used for usability and accuracy. Subjectivity in data analysis mitigated through triangulation. • Internal: subjects involved in the usability analysis are representative of the company analysts, and have different degrees of experience, so that their experience did not influence the outcome. The focus group may not have allowed everyone to contribute (add possible mitigation). • External: results applicable to large safety-critical companies, with mostly local development, and low-level requirements analysed from the perspective of requirements analysts. • Reliability: we cannot share the data, but we share the tool to enable replication. The tool was tested for correctness with respect to its requirements, and was developed following the company's standards for internal tool development (the tool can be qualified) More threats based on the previous lectures! List potential threats to validity, and discuss the adopted mitigations
  • 48. Reporting Case Studies • Reporting can be tricky, especially if the case is complex and iterative, so you must be CREATIVE • Provide a thick description of the Context • Implications for research: in what way does your work advance the theory, and what can other researchers do now? • e.g., other researchers can improve the tool for inaccuracies, test it in other domains, or with different requirements (extending the scope of validity) • Implications for practice: in what way does your work help the company involved in the study, and other companies? • e.g., other companies can use your tool, or improve their requirements review procedures
  • 49. Reporting Case Studies • Abstract: Context & Motivation, Problem, Solution, Results, Contribution (to Theory) • Introduction • Background and Motivation (pre-existing theories, related work and the need for the study) • [Optional: for Improving Case Studies] Adopted Tools/Methods • Study Design • [Optional: for iterative case studies or complex ones] Overview • Context and Units of Analysis (the company, study subjects, objects) • Research Questions & Constructs • Data Collection Procedures • Data Analysis Procedures • Validity Procedures • Execution and Results (in relation to RQs) • Threats to Validity • Discussion (in relation to results and theory) • Implications for Research and Practice • Conclusion and Future Work
  • 50. Unexpected Results and Iterations • In some (MANY!) cases, things do not go as expected and you may need different iterations • Make sure to REPORT a clear structure for each iteration, and to make the iterations compatible in terms of structure • In other cases, you may want to perform different studies with different stages (e.g., Exploratory, Confirmatory), and different goals (e.g., Descriptive, Improving) • 1) First you want to understand which types of errors can be detected by your tools (Exploratory, Descriptive) • 2) Then you want to improve the tool for inaccuracies (Exploratory, Improving) • 3) Then you want to test the improved tool on a larger dataset (Confirmatory, Improving)
  • 51. Example: Unexpected Results and Iterations [Table 5 of Ferrari et al., 2018: outline of the six iterations, from a Pilot (Exploratory, RQ1–RQ2, defect detection patterns, D-Pilot dataset) through five large-scale iterations (Exploratory, Explanatory and Improving, on the D-Large dataset, using defect detection patterns, discard patterns, SREE and SREE-reduced)] [Table 6: tasks performed and subjects involved (VE1, VE2, VE3, NLP-E) in each iteration] [Fig. 1: template structure adopted in each iteration — Preparation: Research Questions, Patterns Definition; Data Collection: Dataset Selection, Dataset Annotation, Patterns Application, Output Annotation; Data Analysis: Quantitative Evaluation, Qualitative Evaluation] Structure for each Iteration; Instantiation of each Iteration; Nature and data of each iteration. Detecting Requirements Defects with NLP Patterns: an Industrial Experience in the Railway Domain, cf. Ferrari et al., 2018, https://bit.ly/2zZWIZv
  • 52. Industry-Academia Collaborations Where Case Studies Happen! cf. Lethbridge et al., 2008 https://bit.ly/3bT3gXD
  • 53. Benefits In what follows we separately enumerate the benefits to the company, to faculty members and to students involved in the research, summarized in Table 1. While many of these benefits might be self-evident, the parties may not necessarily… [Table 1: Benefits of industry–university research collaborations; typical amount of benefit = impact × probability of occurrence] • To the company: • Direct benefits: new or improved technology or product (Medium); data and knowledge useful for decision making (High); patents (Low) • Indirect benefits: potential employees for the company (Medium); ideas and expertise of researchers (High); public relations (Medium) • Factors lowering risk of research: graduate students are often top achievers (Medium); researchers have a personal stake in success (Medium); low cost compared to in-house research (High); government matching funds and tax incentives (High) • To researchers: • Direct benefits: funding (High); interesting and challenging problems and data (High); test-bed for ideas (High) • Indirect benefits: exposure to the ‘real world’, providing valid and relevant knowledge, consulting and networking (High) • To the public: • Indirect benefits: advancement of state-of-the-art and state-of-the-practice (High)
  • 54. Drawbacks [Table 2: Drawbacks of industry–university research collaborations] • To the company: • Costs: cash funding; consumption of employee time; office space and equipment • Risk factors: different definitions of success (bottom line for industry vs. scientific results and publication for researchers); unknown consumption of employee time; inappropriate release of intellectual property • To researchers: • Costs: constrained research freedom; excess consumption of time (moderate to high, depending on experience of researchers and research design) • Risk factor: company-initiated cancellation (varies from low to high depending on corporate priorities and rapport between researchers and the company) • To the project as a whole: • Risk factors: different perceptions of the problem (high if the company has defined the problem for researchers to solve, otherwise low); failure to staff the project with sufficient numbers of skilled researchers (medium); unknown skill level of researchers, including their ability to estimate the required effort (varies from low to high depending on experience of researchers); failure to find or keep adequate numbers of participants (varies from low to high, depending on effort needed, management support, and other factors); inconclusive or non-useful results (low, but higher when the objective is to validate a hypothesis) Note that some projects are initiated by researchers while others are initiated by companies who have an active need to solve a problem. Some risks are considerably higher in the latter case.
  • 55. Collaboration Checklist [Table 3 of Lethbridge et al.: Checklist of activities that should be part of the planning and management process of industry–university collaborations involving empirical studies; each activity is carried out by, or decided by, the party in parentheses] • Decision: to use university researchers or in-house employees (Company; refer to Tables 1 and 2 for decision-making information) • Attracting companies (Researchers) • Decision: level and type of commitment, i.e., finances, resources, timetable, deliverables (Negotiated) • Decision: how will on-going management and risk management be handled? (Negotiated) • Decision: what is the research focus, what are the goals and what are the research questions? (Negotiated, but may be largely determined by either party) • Decision: what participants will be available and when? (Negotiated) • Decision: what information must be confidential? (Negotiated) • Decision: how will publication of results be handled? (Negotiated) • Decision: who owns intellectual property? (Negotiated) • Obtain ethics approval (Researchers) • Find researcher team members and train them (Researchers) • Plan the details of work with participants (Researchers) • Plan for data analysis (Researchers) • Evaluate the risks and manage changes (Both parties)
  • 56. Summary • Case studies are context-dependent research approaches that are very common in SE (Software Engineering), as SE itself is context-dependent! • Knowledge of ALL the other empirical approaches is needed for a successful case study • A case study is a structured and planned experience with industrial data, with RQs, Data Collection, Data Analysis, Validity, Results and Discussion • Triangulation is KEY • Good industrial collaboration is KEY • Recommendations: • Do your best to be RIGOROUS: little choices can lead to unpublishable results • Expect the unexpected (be FLEXIBLE) • Also publish negative results and reason about what did not work: claiming success when there was none is useless • Share as much data and information as possible: replication is the only way to consolidate theory and extend the scope of validity