Context & key concepts of the new Evaluation Methodology
Erik Arnold & Bea Mahieu - Technopolis Group
www.metodika.reformy-msmt.cz
The EM in its historical context
• Evolution after the reform of 2008:
• Progressive restriction of the scope: research outputs only
• Reduction of the complexity of performance to an overly simple category of outputs
• Lack of consideration for disciplinary differences and for the missions of ROs
• Narrowing of the function of evaluation: only as component of the PRFS
• The concept of evaluation as part of a policy cycle providing strategic information is not recognised; evaluation provides information that is at best of limited relevance
• Increasing breadth of coverage: from funding bodies to individual researchers
• Metodika 2013-2015: an improvement but still focused exclusively on outputs
• Result:
• Evaluation is perceived as counting outputs = points = funding
• It constitutes the key factor for R&D management throughout the entire system
• It is detached from any discourse on policy and strategy related to the national
R&D system
1
The mandate for this study
• Objectives: to develop an evaluation methodology
• Conducted on a national basis at the institutional level
• Evaluation results inform the institutional funding system (PRFS)
• Providing strategic information for the actors at all levels in the R&D system
• The expectations:
• A peer-review evaluation process
• Fulfil formative and summative functions
• Cover outputs, impacts, and institutional projections of research development
• Take into consideration the different missions of research organisations and the
field specifics
• Evaluation processes resistant to clientelism and conflicts of interest
• Take into account ‘gaming’
• Total costs do not exceed 1% of public institutional support for R&D over a five-year period (a rough worked example of this cap follows below)
2
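To make the 1% cost cap concrete, the following is a minimal sketch of the arithmetic; the annual funding figure is a purely illustrative assumption, not a value from the study.

```python
# Illustrative check of the cost ceiling for the evaluation exercise.
# All monetary figures below are assumed placeholders, not real budget data.

ANNUAL_INSTITUTIONAL_SUPPORT_CZK = 8_000_000_000  # assumed annual public institutional R&D support
EVALUATION_PERIOD_YEARS = 5
COST_CAP_SHARE = 0.01  # total evaluation costs must stay below 1% over the period

def evaluation_cost_cap(annual_support_czk: float,
                        years: int = EVALUATION_PERIOD_YEARS,
                        cap_share: float = COST_CAP_SHARE) -> float:
    """Return the maximum admissible total cost of the evaluation exercise."""
    return annual_support_czk * years * cap_share

if __name__ == "__main__":
    cap = evaluation_cost_cap(ANNUAL_INSTITUTIONAL_SUPPORT_CZK)
    print(f"Cost cap over {EVALUATION_PERIOD_YEARS} years: {cap:,.0f} CZK")
```

With these assumed figures the cap would work out at 400 million CZK for the whole five-year exercise.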
The project
3
PHASE 1
WP1 - EvU evaluation structure determination
WP2 - EvU field-specific evaluation methodology (FEM)
WP3 - EvU evaluation process rules
CONCLUSION PHASE 1
PHASE 2
WP4 - Evaluation information support
WP5 - Definition of 'institutional funding' - international practice
WP6 - SWOT of the Czech institutional R&D base
WP7 - Draft models FF & PBC
WP8 - Modelling the expected impacts of the new institutional funding system
CONCLUSION PHASE 2
PHASE 3
WP9 - Small Pilot Evaluation
CONCLUSION PHASE 3 / STUDY
Further steps in the figure: consultations, 2nd pilot, implementation, handover to the IPN team
Evaluation (assessment) in the context of funding
4
[Figure: two parallel development tracks. The evaluation methodology track runs from principles and structure, through a first design and consultations, to the Small Pilot Evaluation (SPE); the institutional funding principles track runs from principles and structure, through a first design and consultations, to an ex-ante assessment. Both converge in the final EM and funding principles.]
Evaluation in the context of a funding mechanism
5
[Figure: schematic of the funding mechanism. Information and peer review feed the assessment of each Research Unit on five criteria: management & potential, membership of the research community, research excellence, research performance, and societal relevance. Each criterion yields a star rating and strategic feedback. A money ‘pot’ per institution type is split across the criteria by percentage weights, and the resulting ratings translate into institutional research funding per evaluated RU. A minimal sketch of this allocation logic follows.]
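The percentages in the slide are placeholders, so this is only a minimal sketch, assuming hypothetical criterion weights and a 1-5 star scale; it illustrates how a pot per institution type could be split across evaluated Research Units, not the funding formula actually adopted.

```python
# Minimal PRFS allocation sketch. Criterion weights, star ratings and the pot
# size are assumed placeholder values, not figures from the Evaluation Methodology.

CRITERIA = ["management_potential", "community_membership",
            "research_excellence", "research_performance", "societal_relevance"]

# Hypothetical percentage weights per criterion for one institution type (sum to 1.0).
CRITERION_WEIGHTS = {"management_potential": 0.10, "community_membership": 0.10,
                     "research_excellence": 0.30, "research_performance": 0.35,
                     "societal_relevance": 0.15}

def allocate_pot(pot: float, star_ratings: dict[str, dict[str, int]]) -> dict[str, float]:
    """Split a money 'pot' for one institution type across Research Units.

    star_ratings maps RU name -> {criterion: rating on a 1-5 scale}.
    Each RU gets a weighted score; the pot is distributed proportionally.
    """
    scores = {
        ru: sum(CRITERION_WEIGHTS[c] * ratings[c] for c in CRITERIA)
        for ru, ratings in star_ratings.items()
    }
    total = sum(scores.values())
    return {ru: pot * score / total for ru, score in scores.items()}

# Example with two hypothetical Research Units.
ratings = {
    "RU-A": {"management_potential": 4, "community_membership": 3,
             "research_excellence": 5, "research_performance": 4, "societal_relevance": 3},
    "RU-B": {"management_potential": 3, "community_membership": 3,
             "research_excellence": 2, "research_performance": 3, "societal_relevance": 4},
}
print(allocate_pot(pot=100_000_000, star_ratings=ratings))
```

In practice the weights per criterion and per institution type would be a policy decision informed by the evaluation, not a fixed formula hard-coded as above.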
Evaluation in general should inform the whole policy cycle
6
[Figure: two versions of the policy cycle. Without evaluation: issue identification → issue framing → policy measure identification → policy measure development and adoption → implementation → feedback. With evaluation embedded: the same cycle, with impact assessment added after policy measure identification, interim evaluation during implementation, and effectiveness and ex-post impact assessment before the feedback stage.]
Evaluation should analyse societal effects of intervention, not just focus on outputs
7
[Figure: standard intervention logic. Needs, problems and issues in society, the economy and the environment shape the objectives of a public intervention; objectives are pursued through inputs, which produce outputs, outcomes and impacts. Evaluation assesses the intervention's relevance, efficiency, effectiveness, and the utility and sustainability of its impacts.]
Evaluation (assessment) for a PRFS is necessarily narrower
8
‘General’ evaluation versus evaluation in a PRFS:
• A ‘general’ evaluation informs policy making on failures in the system and recommends possible policy interventions; a PRFS evaluation is part of a policy intervention: it acts upon previously identified failures and steers research behaviour to tackle them by providing incentives
• A ‘general’ evaluation has no effects directly linked to the evaluation itself; a PRFS evaluation is intended to create effects
• A ‘general’ evaluation carries no consequences of gaming or unintended effects; a PRFS evaluation inevitably invites gaming and may lead to unintended effects
• A ‘general’ evaluation gives information on the positioning of the evaluated objects in the national/international context; a PRFS evaluation sets the evaluated objects in competition with each other
Who does what
9
[Table: who does what, comparing evaluation and PRFS arrangements in 2014 for Australia, Austria, Belgium (FL) BOF, Finland, Italy, the Netherlands, New Zealand, Norway (national evaluations), Norway (PRFS), Sweden and the UK. Rows cover purpose (performance assessment; informing funding), main function (formative; summative), formative functions (national R&D governance; institutional R&D management), summative functions (R&D quality; R&D capacity building; research excellence; societal relevance), and notes (performance contracts or agreements in several countries; one separate, simple PRFS covering 10% of funding; one new PRFS introduced in 2015).]
National and institutional
evaluations tend to be distinct
• National evaluations typically
• Are commissioned by national research funders or their principals
• Compare performance with other countries at the field level
• Collect strategic information to support research policy interventions
• Are summative with respect to the institutional level
• Lean heavily on bibliometric methods – with selective use of peer review
• Institutional evaluations
• Are commissioned by institutions or their principals
• If they benchmark (often they don’t), compare with other institutions
• Try to explain performance at a level of detail that is formative, in order to support
institutional research management
• Tend to use informed peer review (though note that modern research management routinely monitors performance by bibliometric means)
10
It’s unusual to compare different
types of RO
• Types of organisation
• Universities / Higher Education Institutions
• Scientific research institutes
• Research and Technology Organisations (RTOs)
• Government laboratories
• NB that hybrid forms are increasingly appearing
• Different types of RO are rarely benchmarked against each other
• Differences in mission or function
• Different principals and accountability
• Different mix among types of research and between research and non-research
activities
• Legal form is irrelevant – evaluation is based on function
• See, for example, the Norwegian institute PRFS, which separates institutes from
universities but ignores legal form
11
Typical institutional research
evaluation approaches by RO type
• Universities
• Focus on quality, interaction with teaching
• Preference for peer review
• Scientific research institutes
• Ditto, but no emphasis on teaching unless PhD education is provided
• RTOs
• Economic performance, social impacts, customer feedback, research quality
• Predominance of social scientific methods; peers if quality is a major issue
• Government labs
• Mission performance, usefulness to policy formation processes, social impacts,
research quality
• Predominance of social scientific methods; peers if quality is a major issue
12
Why use peers?
• Legitimacy in the scientific community
• Ability to address context, for example at institutional level
• Understanding inter-field differences
• Ability to be formative
• Drawbacks include
• Those evaluated generally have to do work as part of the process
• Need to ensure independence (research funders devote significant effort to this)
• Expensive, especially if site visits are involved
• Shortage of peers
• Occasional misbehaviour at the individual level – typically constrained by the use
of panels
13
Why use bibliometrics?
• Comparatively low cost – and the cost is declining
• Need not involve work by those evaluated
• Can produce fine-grained ratings and rankings – though arguably this is
spurious precision
• However
• Inter-field comparisons are problematic
• Use of indicators is generally unsophisticated and sometimes problematic (for
example, Journal Impact Factors)
• Lack of context indicators and conventions for handling them
• Hence, they provide a much more partial picture than peer review
• Unhelpful for formative evaluation
• Emerging conclusion: use ‘informed’ peer review
14
Some effects of PRFS
• Positive
• Improved research and management attention to productivity and quality
• Performance focus at individual and collective levels
• Restructuring towards high-quality performers
• Greater transparency about performance
• Negative
• Focus on producing indicators rather than science, encouraging ‘gaming’
• Narrowing of the career path, with growing influence of research managers
• Squeezing out the heterodox and interdisciplinarity
• Matthew Effect encourages lock-ins
• Research indicators often used in relation to teaching
15
Unit of analysis
• Depends on the purpose of the evaluation
• Tendency to avoid the individual level – that produces erratic results
and undermines institutional autonomy
• Some systems require tactical decisions about including individual
researchers, with intra-institutional and personal consequences
• Decisions to include organisations are normally policy-based. You can’t
‘volunteer’ (and usually you can’t avoid assessment, either)
16
Pitting discipline against discipline
• Most PRFS reallocate a small proportion of total institutional funding
and are therefore tolerant of some inaccuracy or margin for error
• By and large, disciplines are incommensurable – one of the most
interesting activities in scientometrics is the struggle to overcome this
fact
• The nature of the research enterprise differs (‘What is research?’)
• Differing notions of quality
• Publication behaviours differ significantly
• Different social roles of disciplines
• It is not necessarily desirable to change the proportions of national
effort among disciplines for reasons of performance only – policy needs
to play a part
17
Interdisciplinarity
• All research funders (and therefore evaluators) struggle with this
• Interdisciplinarity is important
• Emergence of new disciplines and approaches
• Ability to do Mode 2 work, addressing industrial and societal challenges
• But it’s hard to measure
• Discipline redefinition implies changes in the criteria relevant for measurement
• Interdisciplinarity requires use of expertise and indicator categories that cross
existing boundaries
• For example, multiple peer panels
• In metrics-based approaches, the interesting things start happening in undefined or
‘other’ categories
• As with any change, there is often professional disagreement about standards,
criteria, approaches
18
Different indicators, different purposes
19
[Table: mapping of indicator types (input criteria, systemic indicators, process indicators, research outputs, impact indicators) to evaluation purposes (research productivity, research quality, relevance of research, efficiency/value for money, quality and sustainability of national research systems); each indicator type informs only some of the purposes.]
Recent Czech focus on outputs is unusual
20
[Figure: positioning of national systems (UK REF, Italy VQR, Belgium/FL IOF and BOF, Norway HEI and PRI, Sweden, Denmark, Finland, Czech Republic) along two dimensions: the evidence used (outputs, systemic & process indicators, impacts) and the orientation of the system (research, innovation, society). The recent Czech system relies on outputs alone.]
Ratings tend to use 4-5 points
21
Example: Netherlands. Each score is defined along three dimensions: research quality, relevance to society and viability.
• Score 1, World leading/excellent: the research unit has been shown to be one of the few most influential research groups in the world in its particular field; it makes an outstanding contribution to society; it is excellently equipped for the future.
• Score 2, Very good: the research unit conducts very good, internationally recognised research; it makes a very good contribution to society; it is very well equipped for the future.
• Score 3, Good: the research unit conducts good research; it makes a good contribution to society; it makes responsible strategic decisions and is therefore well equipped for the future.
• Score 4, Unsatisfactory: the research unit does not achieve satisfactory results in its field; it does not make a satisfactory contribution to society; it is not adequately equipped for the future.
Implementation issues for
peer/panel approaches
• Robustness
• Clear and universal assessment guidelines
• Professional management and support organisation
• Fairness
• Clientelism and nepotism are avoided
• Scholarly bias is tackled
• Appropriate representation
• Interdisciplinarity is addressed
• Consistency of assessments
• Transparency of the process
• Cost
22
The Evaluation Methodology
23
The key principles
• The EM
• Reflects the strategic policy objectives for the R&D system
• Functions = to act as a source of strategic information & to directly inform public institutional funding for research organisations
• The evaluation
• Is at the level of the field-defined Research Unit (RU) within an Evaluated Unit, i.e. a research organisation or, in the case of public HEIs, a faculty
• Is a process of informed peer review: metrics inform but do not substitute for judgment
• Covers all research organisations of a critical size, on a voluntary basis
• Is a fair and egalitarian system: a single framework for assessment while allowing for a reasonable level of field- and RO typology-specific variations
• Is comprehensive: all dimensions of the research activities and their outputs, outcomes and impacts
• Imposes the minimum possible cost and burden to deliver a robust and defensible process
24
The role and function of the EM
• The key function of an evaluation system is to support public R&D
governance in the attainment of its strategic objectives, i.e.
• To strengthen R&D capacity
• To foster excellence in research
• To foster research alignment with societal needs
• To support the growth and competitiveness of the Czech Republic
• By:
• Assessing past research performance
• Supporting future performance improvement
→ Provide strategic information for policy making and R&D management
→ Provide incentives for positive change in the R&D system, reflecting the strategy
25
How does the EM provide incentives for positive change?
• The indicators used and their structuring into a set of assessment criteria are guided by the policy objectives
26
[Figure: the assessment criteria and the aspects feeding them]
• Institutional management & development potential: quality/adequacy of the physical research environment; research strategy & management (including HR management)
• Membership of the (world) research community: (international) research presence & collaboration
• Research performance: research productivity; research capacity building (PhD); overall quality of the research activities
• Scientific research excellence: the quality of the selected ‘best’ outputs
• Societal relevance: importance of (potential) societal impacts
How does the EM act as a source
of strategic information?
• The EM is not an arithmetic exercise → quantitative & qualitative input & evaluation results
• Quality information thanks to a detailed and comprehensive approach
• Focus on the evaluated actors’ role, positioning, and competitive value in the
national R&D and innovation system as well as in the international R&D landscape
• Strengths and weaknesses of the different actors against all dimensions of the
research activities, i.e. the research quality, research strategy, the research
environment (i.e. the institutional conditions), the research outputs, outcomes and
impacts
• It will allow for the identification of the factors upon which action is needed in order
to improve research performance, at the national as well as institutional level.
• Primary unit of analysis = the elements that worldwide constitute the fundamental structure of research, i.e. scientific fields → the Research Unit
27
Matching the scientific with the institutional dimension
28
[Figure: how the scientific dimension maps onto the institutional dimension. Individual researchers form field-defined Research Units (RU); Research Units sit within an Evaluated Unit (EvU), i.e. a research organisation, research institute or university faculty; EvUs and the R&D governance bodies & agencies make up the national R&D system. On the scientific side, scientific fields group into disciplinary areas, reflecting the (international) structure of science. A minimal data-structure sketch follows.]
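To make the mapping concrete, here is a minimal data-structure sketch; the class layout and attribute names are illustrative assumptions, apart from the RU and EvU terms taken from the slides.

```python
# Minimal sketch of the evaluation's unit structure: researchers belong to a
# field-defined Research Unit (RU); RUs sit inside an Evaluated Unit (EvU),
# i.e. a research organisation or, for public HEIs, a faculty.
from dataclasses import dataclass, field

@dataclass
class Researcher:
    name: str
    fte: float  # full-time-equivalent share devoted to research

@dataclass
class ResearchUnit:
    scientific_field: str            # one of the 36 OECD FOS-based fields
    disciplinary_area: str           # one of the 6 disciplinary areas
    researchers: list[Researcher] = field(default_factory=list)

@dataclass
class EvaluatedUnit:
    name: str                        # research organisation or university faculty
    ro_type: str                     # e.g. scientific RO, RTO, public service RO, infrastructure RO
    research_units: list[ResearchUnit] = field(default_factory=list)

evu = EvaluatedUnit(
    name="Example Faculty of Science",
    ro_type="Scientific Research Organisation",
    research_units=[ResearchUnit("Basic medicine", "Medical & Health sciences",
                                 [Researcher("A. Novak", 0.8)])],
)
print(sum(r.fte for ru in evu.research_units for r in ru.researchers))
```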
A process of informed peer review
• Based on a mix of appropriate quantitative and qualitative data supporting the panels' professional judgement
• international bibliometric data, data from the RD&I system, quantitative and
qualitative data provided by the evaluated Research Units through self-reporting
• Expert panels allow for in-depth consideration of differences in scientific
fields and sub-fields
• Type of outputs, channels, productivity, costs, collaboration, internationalisation …
• Strict rules guard against conflicts of interest, nepotism and ‘clientelism’, combined with auditing mechanisms and rules for punishing cases of fraud
• The evaluation will be conducted in full transparency
• Evaluation results: quality scores & explanatory texts, conclusions &
recommendations
29
Covering all research organisations
in the CR of a critical size
• A broad & varied R&D base:
• HEIs, public research institutes, state organisations, private research organisations,
organisational units of the CR, non-profit organisations, and associations of legal persons
• Categorisation based on their mission for society related to research:
• Scientific Research Organisations, including the scientific research institutes, HEIs and
university hospitals
• RTOs, i.e. research organisations whose primary function is to provide knowledge transfer services to industry.
• The Aerospace Research and Test Establishment, Research Institute of Building Materials, etc
• Public Service Research Organisations, i.e. research organisations whose primary function is to provide services to government or society.
• Research Institute for Labour and Social Affairs, the Institute for International Relations, the Czech Metrology Institute, etc.
• Infrastructure Research Organisations, providing infrastructure & information to research
• CESNET, the National Library of the Czech Republic, etc.
• Critical size: 50 research outputs in 5 years – the minimum for robust data analytics
30
Research outputs taken into
account
31
Output types feed three uses: the critical-size threshold, the scientific research excellence assessment, and the other indicators.
• Scholarly outputs (counted for the threshold, scientific research excellence and the other indicators):
• Papers in peer-reviewed journals (J)
• Conference proceedings (D)
• Monographs, books and book chapters (B), provided they are identified with an ISBN number
• Non-traditional scholarly outputs (counted for the threshold and the other indicators):
• Results used by the funding provider, i.e. projected into legislation or norms, or into non-legislative or strategic documents (H)
• Research reports containing classified information (V)
• Certified methodologies, art conservation methodologies, specialised map works (N)
• Patents and other IP (counted for the threshold and the other indicators):
• Patents and patent applications (P)
• Plant/breeders' rights (Zodry & Zplem)
A small data sketch of this classification and the critical-size check follows.
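Purely as an illustration, here is a minimal sketch, assuming a simple record format with a RIV-style type code and a year, that groups outputs into the categories above and checks the 50-outputs-in-5-years critical size; it is not part of the methodology itself.

```python
# Minimal sketch: classify outputs by RIV-style type code and check the
# critical-size threshold (50 outputs in 5 years). The record format is assumed.

from collections import Counter

CATEGORY_BY_CODE = {
    "J": "scholarly", "D": "scholarly", "B": "scholarly",
    "H": "non-traditional", "V": "non-traditional", "N": "non-traditional",
    "P": "patents_and_ip", "Zodry": "patents_and_ip", "Zplem": "patents_and_ip",
}
CRITICAL_SIZE = 50  # minimum outputs over the 5-year evaluation window

def summarise(outputs: list[dict], window: tuple[int, int]) -> dict:
    """outputs: [{'code': 'J', 'year': 2013}, ...]; window: (first_year, last_year)."""
    in_window = [o for o in outputs if window[0] <= o["year"] <= window[1]]
    counts = Counter(CATEGORY_BY_CODE.get(o["code"], "other") for o in in_window)
    total = sum(counts.values())
    return {"per_category": dict(counts),
            "total": total,
            "meets_critical_size": total >= CRITICAL_SIZE}

# Example with a toy record list.
records = [{"code": "J", "year": 2012}, {"code": "P", "year": 2014}, {"code": "B", "year": 2010}]
print(summarise(records, window=(2010, 2014)))
```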
Handling interdisciplinary
research
• Interdisciplinary research between sub-fields: handled by 1 subject panel
• Facilitated by the categorisation of scientific disciplines in broad fields based on
the OECD FOS (internationally recognised): 6 disciplinary areas – 36 fields – 190
sub-fields (e.g. Medical & Health sciences – Basic medicine – Neurosciences)
• Interdisciplinary research within a disciplinary area: cross-referrals among
the subject panels in 1 disciplinary area
• Interdisciplinary research across disciplinary areas: application for
Interdisciplinary Research Unit
• For cross-referrals & Interdisciplinary Research Units:
• At least 30% of research activities take place across disciplinary areas
• “Proven” through research outputs/bibliometrics or the scientific background of the researchers in the unit (a bibliometric sketch of this check follows below)
• Decided upon by main panel & subject panel chairs
32
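As an illustration of how the 30% cross-area share could be checked from outputs, here is a minimal sketch; the field-to-area lookup and the record format are assumptions, not something prescribed by the EM.

```python
# Minimal sketch of the 30% cross-disciplinary-area check for a Research Unit.
# The mapping from OECD FOS fields to disciplinary areas is an assumed stub.

FIELD_TO_AREA = {
    "Basic medicine": "Medical & Health sciences",
    "Neurosciences": "Medical & Health sciences",   # sub-field example from the slide
    "Computer and information sciences": "Natural sciences",
    "Electrical engineering": "Engineering and technology",
}
CROSS_AREA_THRESHOLD = 0.30

def cross_area_share(outputs: list[dict], home_area: str) -> float:
    """outputs: [{'field': 'Computer and information sciences'}, ...]."""
    if not outputs:
        return 0.0
    outside = sum(1 for o in outputs if FIELD_TO_AREA.get(o["field"], home_area) != home_area)
    return outside / len(outputs)

def qualifies_as_interdisciplinary(outputs: list[dict], home_area: str) -> bool:
    """True if at least 30% of outputs fall outside the RU's home disciplinary area."""
    return cross_area_share(outputs, home_area) >= CROSS_AREA_THRESHOLD

outputs = [{"field": "Neurosciences"}, {"field": "Computer and information sciences"},
           {"field": "Computer and information sciences"}]
print(qualifies_as_interdisciplinary(outputs, home_area="Medical & Health sciences"))  # True (2/3 outside)
```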
Using a fair and egalitarian
system
• A single framework for assessment across all disciplines and types of
research organisations
• The same assessment criteria are used for all research organisations & fields
• Allows for full comparability of the evaluation results, independently of the scientific field and of where the research is conducted
• Based on a detailed Evaluation Protocol, setting common procedures and
providing standard definitions
• The main panel chairs act as auditors and guarantors for the comparability
• A first step in the evaluation process is the calibration exercise related to the
assessment criteria
• Field-specific definition of the terms: originality, rigour, significance, and reach
• Weighting the importance of sub-criteria for the field and for the different types of research organisation (a toy weighting sketch follows below)
33
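A minimal sketch of how calibrated sub-criterion weights might feed a criterion score; the sub-criteria, weights and 1-5 scores are hypothetical examples, not values defined in the Evaluation Protocol.

```python
# Toy calibration sketch: a panel agrees field/RO-type-specific weights for the
# sub-criteria of one assessment criterion, then aggregates sub-scores (1-5 scale).
# All weights and scores below are hypothetical.

def criterion_score(sub_scores: dict[str, float], weights: dict[str, float]) -> float:
    """Weighted average of sub-criterion scores; weights are renormalised to sum to 1."""
    total_weight = sum(weights[name] for name in sub_scores)
    return sum(sub_scores[name] * weights[name] for name in sub_scores) / total_weight

# Example: 'societal relevance' weighted differently for an RTO than for a scientific RO.
weights_rto = {"knowledge_transfer": 0.5, "income_from_contract_research": 0.3, "impact_narrative": 0.2}
weights_scientific = {"knowledge_transfer": 0.2, "income_from_contract_research": 0.2, "impact_narrative": 0.6}

sub_scores = {"knowledge_transfer": 4, "income_from_contract_research": 3, "impact_narrative": 5}
print(criterion_score(sub_scores, weights_rto))         # 3.9
print(criterion_score(sub_scores, weights_scientific))  # 4.4
```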
Assessment criteria & indicators
(1)
• Institutional management & development potential
• Quality/adequacy of the research environment
• Research infrastructure, use of national/international RI, shared or collaborative use
• Research capacity: funding profile (institutional, competitive, contract;
national/international), staff profile & size (including HC versus FTE)
• Research strategy & HR management
• Quality of the research strategy: strength of the research plan (long/short term, objectives & activities, adequacy, feasibility)
• Quality of processes for career development (objectives, competency framework, appraisal criteria, frequency of performance reviews, etc), policy & practice concerning PhD students & postdocs (support & supervision, training, etc)
• Membership of the national & global research community
• National & international
• Intensity & quality of collaboration & partnerships, co-publications, international mobility
• Reputation & esteem
34
Assessment criteria & indicators
(2)
• Scientific research excellence: peak quality
• In terms of originality, significance and rigour
• Two-stage assessment process:
• Review results (highly selected scholarly outputs)
• Panel assessment: aggregating review results + citation impacts: the number and share of publications among the top 10% and top 25% most cited publications (worldwide, EU28)
• Research performance: overall quality of the research activities
• Research productivity: outputs per FTE researcher (a toy calculation of this and the citation indicators follows below)
• Contribution to capacity building: PhD students trained & PhD degrees awarded
• Overall quality of the research activities:
• Publishing profile, citation impacts, publication channels, etc (incl. relative data on
positioning in field)
• Overall competitiveness (funding, strategy & alignment trends, infrastructure etc)
• Narrative on main value for research
35
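A toy calculation of the two quantitative ingredients named above, assuming citation-percentile data are available from a bibliometric database; field normalisation and the precise indicator definitions are left aside.

```python
# Toy calculation of two indicators used to inform the panels:
# outputs per FTE researcher, and the share of publications in the world's
# top 10% / top 25% most cited. Percentile data are assumed to come from a
# bibliometric database; the values below are made up.

def outputs_per_fte(n_outputs: int, fte_researchers: float) -> float:
    """Research productivity: eligible outputs divided by FTE researchers."""
    return n_outputs / fte_researchers

def highly_cited_shares(citation_percentiles: list[float]) -> dict[str, float]:
    """Share of publications at or above the 90th / 75th world citation percentile."""
    n = len(citation_percentiles)
    return {
        "top_10_percent_share": sum(p >= 90 for p in citation_percentiles) / n,
        "top_25_percent_share": sum(p >= 75 for p in citation_percentiles) / n,
    }

print(outputs_per_fte(n_outputs=120, fte_researchers=35.5))   # outputs per FTE
print(highly_cited_shares([99, 91, 80, 60, 42, 95, 70, 88]))  # top 10% / top 25% shares
```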
Assessment criteria & indicators
(3)
• Societal relevance
• In terms of reach & significance
• Knowledge & technology transfer activities: intensity & quality
• Income from societally relevant research (competitive & contract research)
• Income from commercialization activities
• National collaborations and partnerships beyond academia
• Reputation and esteem beyond academia
• Non-traditional scholarly research & IPR-related outputs (national/international)
• Societal impact: narratives
36
Limiting the costs
• Sharp focus only on the key information needed
• Maximal use of the information available in the RD&I IS (IS VaVaI)
• Remote review and remote assessments, complemented with panel
meetings
• Limited number of subject panels & panellists
• No site visits
• The higher the level of sophistication, the higher the cost
37
Thank you for your attention!
www.metodika.reformy-msmt.cz

More Related Content

What's hot

Policy Review/Evaluation
Policy Review/EvaluationPolicy Review/Evaluation
Policy Review/Evaluationjoems_angel2000
 
Ppt science muddling_critique(joseph)
Ppt science muddling_critique(joseph)Ppt science muddling_critique(joseph)
Ppt science muddling_critique(joseph)nida19
 
Actors in public policy process
Actors in public policy   processActors in public policy   process
Actors in public policy processfatimazaheer12
 
Week2 rainey chapter_3
Week2 rainey chapter_3Week2 rainey chapter_3
Week2 rainey chapter_3mmzzmartinez
 
Policy report final by Merlinda D Gorriceta, Joy S Sumortin, Derna F Bancien
Policy report final by Merlinda D Gorriceta, Joy S Sumortin, Derna F BancienPolicy report final by Merlinda D Gorriceta, Joy S Sumortin, Derna F Bancien
Policy report final by Merlinda D Gorriceta, Joy S Sumortin, Derna F BancienDer Na Fuente Bella
 
Public organizations micro strategy formation
Public organizations micro strategy formationPublic organizations micro strategy formation
Public organizations micro strategy formationUniversity of Tampere
 
Introduction to Policy Evaluation
Introduction to Policy EvaluationIntroduction to Policy Evaluation
Introduction to Policy EvaluationpasicUganda
 
STAKEHOLDERS : their role and influence in public policy
STAKEHOLDERS : their role and influence in public policySTAKEHOLDERS : their role and influence in public policy
STAKEHOLDERS : their role and influence in public policyDeanne Mitzi Somollo
 
Policy analysis
Policy analysisPolicy analysis
Policy analysisuma107
 
Week 12 - Public Policy
Week 12 - Public PolicyWeek 12 - Public Policy
Week 12 - Public PolicyRyan Lee
 
Complexity and co-ordination: rethinking the policy mix for innovation
Complexity and co-ordination: rethinking the policy mix for innovationComplexity and co-ordination: rethinking the policy mix for innovation
Complexity and co-ordination: rethinking the policy mix for innovationkieronflanagan
 
Explain the six step model for public policy by SYED SALMAN JALAL KAKA KHEL
Explain the six step model for public policy by SYED SALMAN JALAL KAKA KHELExplain the six step model for public policy by SYED SALMAN JALAL KAKA KHEL
Explain the six step model for public policy by SYED SALMAN JALAL KAKA KHELSalman Kaka Khel
 
Policy, plan, and implementation
Policy, plan, and implementationPolicy, plan, and implementation
Policy, plan, and implementationMohammad Moosa
 
Policy Implementation Process
Policy Implementation ProcessPolicy Implementation Process
Policy Implementation ProcessJed Abolencia
 
Policy adoption final
Policy adoption finalPolicy adoption final
Policy adoption finalLENY BARROGA
 
Decision making and models of policy analysis
Decision making and models of policy analysisDecision making and models of policy analysis
Decision making and models of policy analysisIan Necosia
 
Policy Analysis: Evaluating Policy Performance
Policy Analysis: Evaluating Policy PerformancePolicy Analysis: Evaluating Policy Performance
Policy Analysis: Evaluating Policy PerformanceDjadja Sardjana
 

What's hot (20)

Ppa 503 lecture6a
Ppa 503 lecture6aPpa 503 lecture6a
Ppa 503 lecture6a
 
Policy Review/Evaluation
Policy Review/EvaluationPolicy Review/Evaluation
Policy Review/Evaluation
 
Ppt science muddling_critique(joseph)
Ppt science muddling_critique(joseph)Ppt science muddling_critique(joseph)
Ppt science muddling_critique(joseph)
 
Actors in public policy process
Actors in public policy   processActors in public policy   process
Actors in public policy process
 
Week2 rainey chapter_3
Week2 rainey chapter_3Week2 rainey chapter_3
Week2 rainey chapter_3
 
Policy report final by Merlinda D Gorriceta, Joy S Sumortin, Derna F Bancien
Policy report final by Merlinda D Gorriceta, Joy S Sumortin, Derna F BancienPolicy report final by Merlinda D Gorriceta, Joy S Sumortin, Derna F Bancien
Policy report final by Merlinda D Gorriceta, Joy S Sumortin, Derna F Bancien
 
Public organizations micro strategy formation
Public organizations micro strategy formationPublic organizations micro strategy formation
Public organizations micro strategy formation
 
Introduction to Policy Evaluation
Introduction to Policy EvaluationIntroduction to Policy Evaluation
Introduction to Policy Evaluation
 
STAKEHOLDERS : their role and influence in public policy
STAKEHOLDERS : their role and influence in public policySTAKEHOLDERS : their role and influence in public policy
STAKEHOLDERS : their role and influence in public policy
 
Policy analysis
Policy analysisPolicy analysis
Policy analysis
 
Week 12 - Public Policy
Week 12 - Public PolicyWeek 12 - Public Policy
Week 12 - Public Policy
 
Complexity and co-ordination: rethinking the policy mix for innovation
Complexity and co-ordination: rethinking the policy mix for innovationComplexity and co-ordination: rethinking the policy mix for innovation
Complexity and co-ordination: rethinking the policy mix for innovation
 
Chapter 5 policy implementation
Chapter 5 policy implementationChapter 5 policy implementation
Chapter 5 policy implementation
 
Explain the six step model for public policy by SYED SALMAN JALAL KAKA KHEL
Explain the six step model for public policy by SYED SALMAN JALAL KAKA KHELExplain the six step model for public policy by SYED SALMAN JALAL KAKA KHEL
Explain the six step model for public policy by SYED SALMAN JALAL KAKA KHEL
 
Policy formulation
Policy formulationPolicy formulation
Policy formulation
 
Policy, plan, and implementation
Policy, plan, and implementationPolicy, plan, and implementation
Policy, plan, and implementation
 
Policy Implementation Process
Policy Implementation ProcessPolicy Implementation Process
Policy Implementation Process
 
Policy adoption final
Policy adoption finalPolicy adoption final
Policy adoption final
 
Decision making and models of policy analysis
Decision making and models of policy analysisDecision making and models of policy analysis
Decision making and models of policy analysis
 
Policy Analysis: Evaluating Policy Performance
Policy Analysis: Evaluating Policy PerformancePolicy Analysis: Evaluating Policy Performance
Policy Analysis: Evaluating Policy Performance
 

Similar to Context & key concepts of the new Evaluation Methodology

Architecture of the EM and some key experience from abroad
Architecture of the EM and some key experience from abroadArchitecture of the EM and some key experience from abroad
Architecture of the EM and some key experience from abroadMEYS, MŠMT in Czech
 
Governance, assessment and incentives in the research and innovation funding ...
Governance, assessment and incentives in the research and innovation funding ...Governance, assessment and incentives in the research and innovation funding ...
Governance, assessment and incentives in the research and innovation funding ...MEYS, MŠMT in Czech
 
Feedback on the draft summary report
Feedback on the draft summary reportFeedback on the draft summary report
Feedback on the draft summary reportMEYS, MŠMT in Czech
 
Panel evaluation, processes and results
Panel evaluation, processes and resultsPanel evaluation, processes and results
Panel evaluation, processes and resultsMEYS, MŠMT in Czech
 
Livestock and Fish monitoring, evaluation and learning framework
Livestock and Fish monitoring, evaluation and learning frameworkLivestock and Fish monitoring, evaluation and learning framework
Livestock and Fish monitoring, evaluation and learning frameworkILRI
 
Evaluability Assessments and Choice of Evaluation Methods
Evaluability Assessments and Choice of Evaluation MethodsEvaluability Assessments and Choice of Evaluation Methods
Evaluability Assessments and Choice of Evaluation MethodsDebbie_at_IDS
 
Effective PRFS: principles and practice
Effective PRFS: principles and practiceEffective PRFS: principles and practice
Effective PRFS: principles and practiceMEYS, MŠMT in Czech
 
An overview of the REF
An overview of the REFAn overview of the REF
An overview of the REFEmma Gillaspy
 
'Patterns in Business Analysis and Enterprise Modeling: How to evaluate their...
'Patterns in Business Analysis and Enterprise Modeling: How to evaluate their...'Patterns in Business Analysis and Enterprise Modeling: How to evaluate their...
'Patterns in Business Analysis and Enterprise Modeling: How to evaluate their...IIBA_Latvia_Chapter
 
Library Strategy: Models and Measurement
Library Strategy: Models and MeasurementLibrary Strategy: Models and Measurement
Library Strategy: Models and MeasurementStephen Town
 
Valuing organizational vision in the development of performance measurement f...
Valuing organizational vision in the development of performance measurement f...Valuing organizational vision in the development of performance measurement f...
Valuing organizational vision in the development of performance measurement f...Fbertrand
 
Enhancing quality of research 4 development: Initial ideas for design and imp...
Enhancing quality of research 4 development: Initial ideas for design and imp...Enhancing quality of research 4 development: Initial ideas for design and imp...
Enhancing quality of research 4 development: Initial ideas for design and imp...ILRI
 
Komentáře členů hlavních a oborových panelů k metodice hodnocení a pilotnímu...
Komentáře členů hlavních a oborových panelů k metodice hodnocení a pilotnímu...Komentáře členů hlavních a oborových panelů k metodice hodnocení a pilotnímu...
Komentáře členů hlavních a oborových panelů k metodice hodnocení a pilotnímu...MEYS, MŠMT in Czech
 

Similar to Context & key concepts of the new Evaluation Methodology (20)

The new Evaluation Methodology
The new Evaluation MethodologyThe new Evaluation Methodology
The new Evaluation Methodology
 
Architecture of the EM and some key experience from abroad
Architecture of the EM and some key experience from abroadArchitecture of the EM and some key experience from abroad
Architecture of the EM and some key experience from abroad
 
Sunum qqml2014
Sunum qqml2014Sunum qqml2014
Sunum qqml2014
 
Governance, assessment and incentives in the research and innovation funding ...
Governance, assessment and incentives in the research and innovation funding ...Governance, assessment and incentives in the research and innovation funding ...
Governance, assessment and incentives in the research and innovation funding ...
 
Feedback on the draft summary report
Feedback on the draft summary reportFeedback on the draft summary report
Feedback on the draft summary report
 
University Mergers in Europe
University Mergers in Europe University Mergers in Europe
University Mergers in Europe
 
Panel evaluation, processes and results
Panel evaluation, processes and resultsPanel evaluation, processes and results
Panel evaluation, processes and results
 
Livestock and Fish monitoring, evaluation and learning framework
Livestock and Fish monitoring, evaluation and learning frameworkLivestock and Fish monitoring, evaluation and learning framework
Livestock and Fish monitoring, evaluation and learning framework
 
Evaluability Assessments and Choice of Evaluation Methods
Evaluability Assessments and Choice of Evaluation MethodsEvaluability Assessments and Choice of Evaluation Methods
Evaluability Assessments and Choice of Evaluation Methods
 
Evaluating and managing for results - Experiences from Norway
Evaluating and managing for results - Experiences from NorwayEvaluating and managing for results - Experiences from Norway
Evaluating and managing for results - Experiences from Norway
 
Effective PRFS: principles and practice
Effective PRFS: principles and practiceEffective PRFS: principles and practice
Effective PRFS: principles and practice
 
Evidence On Trial: weighing the value of evidence in academic enquiry, policy...
Evidence On Trial: weighing the value of evidence in academic enquiry, policy...Evidence On Trial: weighing the value of evidence in academic enquiry, policy...
Evidence On Trial: weighing the value of evidence in academic enquiry, policy...
 
An overview of the REF
An overview of the REFAn overview of the REF
An overview of the REF
 
'Patterns in Business Analysis and Enterprise Modeling: How to evaluate their...
'Patterns in Business Analysis and Enterprise Modeling: How to evaluate their...'Patterns in Business Analysis and Enterprise Modeling: How to evaluate their...
'Patterns in Business Analysis and Enterprise Modeling: How to evaluate their...
 
Library Strategy: Models and Measurement
Library Strategy: Models and MeasurementLibrary Strategy: Models and Measurement
Library Strategy: Models and Measurement
 
Valuing organizational vision in the development of performance measurement f...
Valuing organizational vision in the development of performance measurement f...Valuing organizational vision in the development of performance measurement f...
Valuing organizational vision in the development of performance measurement f...
 
M & E Presentation DSK.ppt
M & E Presentation DSK.pptM & E Presentation DSK.ppt
M & E Presentation DSK.ppt
 
Enhancing quality of research 4 development: Initial ideas for design and imp...
Enhancing quality of research 4 development: Initial ideas for design and imp...Enhancing quality of research 4 development: Initial ideas for design and imp...
Enhancing quality of research 4 development: Initial ideas for design and imp...
 
Evaluation to Improve Development Results
Evaluation to Improve Development ResultsEvaluation to Improve Development Results
Evaluation to Improve Development Results
 
Komentáře členů hlavních a oborových panelů k metodice hodnocení a pilotnímu...
Komentáře členů hlavních a oborových panelů k metodice hodnocení a pilotnímu...Komentáře členů hlavních a oborových panelů k metodice hodnocení a pilotnímu...
Komentáře členů hlavních a oborových panelů k metodice hodnocení a pilotnímu...
 

More from MEYS, MŠMT in Czech

Pilot test of new evaluation methodology of research organisations
Pilot test of new evaluation methodology of research organisationsPilot test of new evaluation methodology of research organisations
Pilot test of new evaluation methodology of research organisationsMEYS, MŠMT in Czech
 
Průvodce pro hodnocené výzkumné organizace
Průvodce pro hodnocené výzkumné organizacePrůvodce pro hodnocené výzkumné organizace
Průvodce pro hodnocené výzkumné organizaceMEYS, MŠMT in Czech
 
Souhrnné tabulky s údaji o počtu pracovníků a výstupů EvU a jejich RU
Souhrnné tabulky s údaji o počtu pracovníků a výstupů EvU a jejich RUSouhrnné tabulky s údaji o počtu pracovníků a výstupů EvU a jejich RU
Souhrnné tabulky s údaji o počtu pracovníků a výstupů EvU a jejich RUMEYS, MŠMT in Czech
 
Komentáře hodnocených a výzkumných jednotek k metodice hodnocení a pilotní...
Komentáře hodnocených a výzkumných jednotek k metodice hodnocení a pilotní...Komentáře hodnocených a výzkumných jednotek k metodice hodnocení a pilotní...
Komentáře hodnocených a výzkumných jednotek k metodice hodnocení a pilotní...MEYS, MŠMT in Czech
 
Final report 1 / The R&D Evaluation Methodology
Final report 1 / The R&D Evaluation MethodologyFinal report 1 / The R&D Evaluation Methodology
Final report 1 / The R&D Evaluation MethodologyMEYS, MŠMT in Czech
 
Final report 2: The Institutional Funding Principles
Final report 2: The Institutional Funding PrinciplesFinal report 2: The Institutional Funding Principles
Final report 2: The Institutional Funding PrinciplesMEYS, MŠMT in Czech
 
The Small Pilot Evaluation and the Use of the RD&I Information System for Eva...
The Small Pilot Evaluation and the Use of the RD&I Information System for Eva...The Small Pilot Evaluation and the Use of the RD&I Information System for Eva...
The Small Pilot Evaluation and the Use of the RD&I Information System for Eva...MEYS, MŠMT in Czech
 
Summary Report / R&D Evaluation Methodology and Funding Principles
Summary Report / R&D Evaluation Methodology and Funding PrinciplesSummary Report / R&D Evaluation Methodology and Funding Principles
Summary Report / R&D Evaluation Methodology and Funding PrinciplesMEYS, MŠMT in Czech
 
Identifikace vědeckých pracovníků
Identifikace vědeckých pracovníkůIdentifikace vědeckých pracovníků
Identifikace vědeckých pracovníkůMEYS, MŠMT in Czech
 
Doporučené změny vnitřních předpisů VVŠ
Doporučené změny vnitřních předpisů VVŠDoporučené změny vnitřních předpisů VVŠ
Doporučené změny vnitřních předpisů VVŠMEYS, MŠMT in Czech
 
Podklady a doporučení pro zapracování do věcného záměru zákona nahrazujícího ...
Podklady a doporučení pro zapracování do věcného záměru zákona nahrazujícího ...Podklady a doporučení pro zapracování do věcného záměru zákona nahrazujícího ...
Podklady a doporučení pro zapracování do věcného záměru zákona nahrazujícího ...MEYS, MŠMT in Czech
 
Harmonogram postupných kroků realizace návrhů nového hodnocení a financování ...
Harmonogram postupných kroků realizace návrhů nového hodnocení a financování ...Harmonogram postupných kroků realizace návrhů nového hodnocení a financování ...
Harmonogram postupných kroků realizace návrhů nového hodnocení a financování ...MEYS, MŠMT in Czech
 

More from MEYS, MŠMT in Czech (20)

Pilot test of new evaluation methodology of research organisations
Pilot test of new evaluation methodology of research organisationsPilot test of new evaluation methodology of research organisations
Pilot test of new evaluation methodology of research organisations
 
Tabulky nákladového modelu
Tabulky nákladového modeluTabulky nákladového modelu
Tabulky nákladového modelu
 
Organizační schémata
Organizační schémataOrganizační schémata
Organizační schémata
 
Průvodce pro hodnocené výzkumné organizace
Průvodce pro hodnocené výzkumné organizacePrůvodce pro hodnocené výzkumné organizace
Průvodce pro hodnocené výzkumné organizace
 
Šablona sebeevaluační zprávy
Šablona sebeevaluační zprávyŠablona sebeevaluační zprávy
Šablona sebeevaluační zprávy
 
Zápisy z kalibračních schůzek
Zápisy z kalibračních schůzekZápisy z kalibračních schůzek
Zápisy z kalibračních schůzek
 
Úpravy bibliometrické zprávy
Úpravy bibliometrické zprávyÚpravy bibliometrické zprávy
Úpravy bibliometrické zprávy
 
Příklad bibliometrické zprávy
Příklad bibliometrické zprávyPříklad bibliometrické zprávy
Příklad bibliometrické zprávy
 
Průvodce pro členy panelů
Průvodce pro členy panelůPrůvodce pro členy panelů
Průvodce pro členy panelů
 
Souhrnné tabulky s údaji o počtu pracovníků a výstupů EvU a jejich RU
Souhrnné tabulky s údaji o počtu pracovníků a výstupů EvU a jejich RUSouhrnné tabulky s údaji o počtu pracovníků a výstupů EvU a jejich RU
Souhrnné tabulky s údaji o počtu pracovníků a výstupů EvU a jejich RU
 
Komentáře hodnocených a výzkumných jednotek k metodice hodnocení a pilotní...
Komentáře hodnocených a výzkumných jednotek k metodice hodnocení a pilotní...Komentáře hodnocených a výzkumných jednotek k metodice hodnocení a pilotní...
Komentáře hodnocených a výzkumných jednotek k metodice hodnocení a pilotní...
 
Final report 1 / The R&D Evaluation Methodology
Final report 1 / The R&D Evaluation MethodologyFinal report 1 / The R&D Evaluation Methodology
Final report 1 / The R&D Evaluation Methodology
 
Final report 2: The Institutional Funding Principles
Final report 2: The Institutional Funding PrinciplesFinal report 2: The Institutional Funding Principles
Final report 2: The Institutional Funding Principles
 
The Small Pilot Evaluation and the Use of the RD&I Information System for Eva...
The Small Pilot Evaluation and the Use of the RD&I Information System for Eva...The Small Pilot Evaluation and the Use of the RD&I Information System for Eva...
The Small Pilot Evaluation and the Use of the RD&I Information System for Eva...
 
Summary Report / R&D Evaluation Methodology and Funding Principles
Summary Report / R&D Evaluation Methodology and Funding PrinciplesSummary Report / R&D Evaluation Methodology and Funding Principles
Summary Report / R&D Evaluation Methodology and Funding Principles
 
Analýza rizik pro zavedení NERO
Analýza rizik pro zavedení NEROAnalýza rizik pro zavedení NERO
Analýza rizik pro zavedení NERO
 
Identifikace vědeckých pracovníků
Identifikace vědeckých pracovníkůIdentifikace vědeckých pracovníků
Identifikace vědeckých pracovníků
 
Doporučené změny vnitřních předpisů VVŠ
Doporučené změny vnitřních předpisů VVŠDoporučené změny vnitřních předpisů VVŠ
Doporučené změny vnitřních předpisů VVŠ
 
Podklady a doporučení pro zapracování do věcného záměru zákona nahrazujícího ...
Podklady a doporučení pro zapracování do věcného záměru zákona nahrazujícího ...Podklady a doporučení pro zapracování do věcného záměru zákona nahrazujícího ...
Podklady a doporučení pro zapracování do věcného záměru zákona nahrazujícího ...
 
Harmonogram postupných kroků realizace návrhů nového hodnocení a financování ...
Harmonogram postupných kroků realizace návrhů nového hodnocení a financování ...Harmonogram postupných kroků realizace návrhů nového hodnocení a financování ...
Harmonogram postupných kroků realizace návrhů nového hodnocení a financování ...
 

Recently uploaded

ROLES IN A STAGE PRODUCTION in arts.pptx
ROLES IN A STAGE PRODUCTION in arts.pptxROLES IN A STAGE PRODUCTION in arts.pptx
ROLES IN A STAGE PRODUCTION in arts.pptxVanesaIglesias10
 
Active Learning Strategies (in short ALS).pdf
Active Learning Strategies (in short ALS).pdfActive Learning Strategies (in short ALS).pdf
Active Learning Strategies (in short ALS).pdfPatidar M
 
Difference Between Search & Browse Methods in Odoo 17
Difference Between Search & Browse Methods in Odoo 17Difference Between Search & Browse Methods in Odoo 17
Difference Between Search & Browse Methods in Odoo 17Celine George
 
Visit to a blind student's school🧑‍🦯🧑‍🦯(community medicine)
Visit to a blind student's school🧑‍🦯🧑‍🦯(community medicine)Visit to a blind student's school🧑‍🦯🧑‍🦯(community medicine)
Visit to a blind student's school🧑‍🦯🧑‍🦯(community medicine)lakshayb543
 
THEORIES OF ORGANIZATION-PUBLIC ADMINISTRATION
THEORIES OF ORGANIZATION-PUBLIC ADMINISTRATIONTHEORIES OF ORGANIZATION-PUBLIC ADMINISTRATION
THEORIES OF ORGANIZATION-PUBLIC ADMINISTRATIONHumphrey A Beña
 
Virtual-Orientation-on-the-Administration-of-NATG12-NATG6-and-ELLNA.pdf
Virtual-Orientation-on-the-Administration-of-NATG12-NATG6-and-ELLNA.pdfVirtual-Orientation-on-the-Administration-of-NATG12-NATG6-and-ELLNA.pdf
Virtual-Orientation-on-the-Administration-of-NATG12-NATG6-and-ELLNA.pdfErwinPantujan2
 
Student Profile Sample - We help schools to connect the data they have, with ...
Student Profile Sample - We help schools to connect the data they have, with ...Student Profile Sample - We help schools to connect the data they have, with ...
Student Profile Sample - We help schools to connect the data they have, with ...Seán Kennedy
 
Concurrency Control in Database Management system
Concurrency Control in Database Management systemConcurrency Control in Database Management system
Concurrency Control in Database Management systemChristalin Nelson
 
ICS2208 Lecture6 Notes for SL spaces.pdf
ICS2208 Lecture6 Notes for SL spaces.pdfICS2208 Lecture6 Notes for SL spaces.pdf
ICS2208 Lecture6 Notes for SL spaces.pdfVanessa Camilleri
 
ECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptx
ECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptxECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptx
ECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptxiammrhaywood
 
Influencing policy (training slides from Fast Track Impact)
Influencing policy (training slides from Fast Track Impact)Influencing policy (training slides from Fast Track Impact)
Influencing policy (training slides from Fast Track Impact)Mark Reed
 
Incoming and Outgoing Shipments in 3 STEPS Using Odoo 17
Incoming and Outgoing Shipments in 3 STEPS Using Odoo 17Incoming and Outgoing Shipments in 3 STEPS Using Odoo 17
Incoming and Outgoing Shipments in 3 STEPS Using Odoo 17Celine George
 
Keynote by Prof. Wurzer at Nordex about IP-design
Keynote by Prof. Wurzer at Nordex about IP-designKeynote by Prof. Wurzer at Nordex about IP-design
Keynote by Prof. Wurzer at Nordex about IP-designMIPLM
 
Global Lehigh Strategic Initiatives (without descriptions)
Global Lehigh Strategic Initiatives (without descriptions)Global Lehigh Strategic Initiatives (without descriptions)
Global Lehigh Strategic Initiatives (without descriptions)cama23
 
USPS® Forced Meter Migration - How to Know if Your Postage Meter Will Soon be...
USPS® Forced Meter Migration - How to Know if Your Postage Meter Will Soon be...USPS® Forced Meter Migration - How to Know if Your Postage Meter Will Soon be...
USPS® Forced Meter Migration - How to Know if Your Postage Meter Will Soon be...Postal Advocate Inc.
 
AUDIENCE THEORY -CULTIVATION THEORY - GERBNER.pptx
AUDIENCE THEORY -CULTIVATION THEORY -  GERBNER.pptxAUDIENCE THEORY -CULTIVATION THEORY -  GERBNER.pptx
AUDIENCE THEORY -CULTIVATION THEORY - GERBNER.pptxiammrhaywood
 
What is Model Inheritance in Odoo 17 ERP
What is Model Inheritance in Odoo 17 ERPWhat is Model Inheritance in Odoo 17 ERP
What is Model Inheritance in Odoo 17 ERPCeline George
 
Integumentary System SMP B. Pharm Sem I.ppt
Integumentary System SMP B. Pharm Sem I.pptIntegumentary System SMP B. Pharm Sem I.ppt
Integumentary System SMP B. Pharm Sem I.pptshraddhaparab530
 
4.16.24 Poverty and Precarity--Desmond.pptx
4.16.24 Poverty and Precarity--Desmond.pptx4.16.24 Poverty and Precarity--Desmond.pptx
4.16.24 Poverty and Precarity--Desmond.pptxmary850239
 
ENG 5 Q4 WEEk 1 DAY 1 Restate sentences heard in one’s own words. Use appropr...
ENG 5 Q4 WEEk 1 DAY 1 Restate sentences heard in one’s own words. Use appropr...ENG 5 Q4 WEEk 1 DAY 1 Restate sentences heard in one’s own words. Use appropr...
ENG 5 Q4 WEEk 1 DAY 1 Restate sentences heard in one’s own words. Use appropr...JojoEDelaCruz
 

Recently uploaded (20)

ROLES IN A STAGE PRODUCTION in arts.pptx
ROLES IN A STAGE PRODUCTION in arts.pptxROLES IN A STAGE PRODUCTION in arts.pptx
ROLES IN A STAGE PRODUCTION in arts.pptx
 
Active Learning Strategies (in short ALS).pdf
Active Learning Strategies (in short ALS).pdfActive Learning Strategies (in short ALS).pdf
Active Learning Strategies (in short ALS).pdf
 
Difference Between Search & Browse Methods in Odoo 17
Difference Between Search & Browse Methods in Odoo 17Difference Between Search & Browse Methods in Odoo 17
Difference Between Search & Browse Methods in Odoo 17
 
Visit to a blind student's school🧑‍🦯🧑‍🦯(community medicine)
Visit to a blind student's school🧑‍🦯🧑‍🦯(community medicine)Visit to a blind student's school🧑‍🦯🧑‍🦯(community medicine)
Visit to a blind student's school🧑‍🦯🧑‍🦯(community medicine)
 
THEORIES OF ORGANIZATION-PUBLIC ADMINISTRATION
THEORIES OF ORGANIZATION-PUBLIC ADMINISTRATIONTHEORIES OF ORGANIZATION-PUBLIC ADMINISTRATION
THEORIES OF ORGANIZATION-PUBLIC ADMINISTRATION
 
Virtual-Orientation-on-the-Administration-of-NATG12-NATG6-and-ELLNA.pdf
Virtual-Orientation-on-the-Administration-of-NATG12-NATG6-and-ELLNA.pdfVirtual-Orientation-on-the-Administration-of-NATG12-NATG6-and-ELLNA.pdf
Virtual-Orientation-on-the-Administration-of-NATG12-NATG6-and-ELLNA.pdf
 
Student Profile Sample - We help schools to connect the data they have, with ...
Student Profile Sample - We help schools to connect the data they have, with ...Student Profile Sample - We help schools to connect the data they have, with ...
Student Profile Sample - We help schools to connect the data they have, with ...
 
Concurrency Control in Database Management system
Concurrency Control in Database Management systemConcurrency Control in Database Management system
Concurrency Control in Database Management system
 
ICS2208 Lecture6 Notes for SL spaces.pdf
ICS2208 Lecture6 Notes for SL spaces.pdfICS2208 Lecture6 Notes for SL spaces.pdf
ICS2208 Lecture6 Notes for SL spaces.pdf
 
ECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptx
ECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptxECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptx
ECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptx
 
Influencing policy (training slides from Fast Track Impact)
Influencing policy (training slides from Fast Track Impact)Influencing policy (training slides from Fast Track Impact)
Influencing policy (training slides from Fast Track Impact)
 
Incoming and Outgoing Shipments in 3 STEPS Using Odoo 17
Incoming and Outgoing Shipments in 3 STEPS Using Odoo 17Incoming and Outgoing Shipments in 3 STEPS Using Odoo 17
Incoming and Outgoing Shipments in 3 STEPS Using Odoo 17
 
Keynote by Prof. Wurzer at Nordex about IP-design
Keynote by Prof. Wurzer at Nordex about IP-designKeynote by Prof. Wurzer at Nordex about IP-design
Keynote by Prof. Wurzer at Nordex about IP-design
 
Global Lehigh Strategic Initiatives (without descriptions)
Global Lehigh Strategic Initiatives (without descriptions)Global Lehigh Strategic Initiatives (without descriptions)
Global Lehigh Strategic Initiatives (without descriptions)
 
USPS® Forced Meter Migration - How to Know if Your Postage Meter Will Soon be...
USPS® Forced Meter Migration - How to Know if Your Postage Meter Will Soon be...USPS® Forced Meter Migration - How to Know if Your Postage Meter Will Soon be...
USPS® Forced Meter Migration - How to Know if Your Postage Meter Will Soon be...
 
AUDIENCE THEORY -CULTIVATION THEORY - GERBNER.pptx
AUDIENCE THEORY -CULTIVATION THEORY -  GERBNER.pptxAUDIENCE THEORY -CULTIVATION THEORY -  GERBNER.pptx
AUDIENCE THEORY -CULTIVATION THEORY - GERBNER.pptx
 
What is Model Inheritance in Odoo 17 ERP
What is Model Inheritance in Odoo 17 ERPWhat is Model Inheritance in Odoo 17 ERP
What is Model Inheritance in Odoo 17 ERP
 
Integumentary System SMP B. Pharm Sem I.ppt
Integumentary System SMP B. Pharm Sem I.pptIntegumentary System SMP B. Pharm Sem I.ppt
Integumentary System SMP B. Pharm Sem I.ppt
 
4.16.24 Poverty and Precarity--Desmond.pptx
4.16.24 Poverty and Precarity--Desmond.pptx4.16.24 Poverty and Precarity--Desmond.pptx
4.16.24 Poverty and Precarity--Desmond.pptx
 
ENG 5 Q4 WEEk 1 DAY 1 Restate sentences heard in one’s own words. Use appropr...
ENG 5 Q4 WEEk 1 DAY 1 Restate sentences heard in one’s own words. Use appropr...ENG 5 Q4 WEEk 1 DAY 1 Restate sentences heard in one’s own words. Use appropr...
ENG 5 Q4 WEEk 1 DAY 1 Restate sentences heard in one’s own words. Use appropr...
 

Context & key concepts of the new Evaluation Methodology

  • 1. Context & key concepts of the new Evaluation Methodology Erik Arnold & Bea Mahieu - Technopolis Group www.metodika.reformy-msmt.cz
  • 2. The EM in its historical context • Evolution after the reform of 2008: • Progressive restriction of the scope: research outputs only • Reduction of the complexity of performance to an overly simple category of outputs • Lack of consideration for disciplinary differences and for the missions of ROs • Narrowing of the function of evaluation: only as component of the PRFS • Concept of evaluation as part of a policy cycle providing strategic information is not perceived; it provides information that is at the best of limited relevance • Increasing breadth of coverage: from funding bodies to individual researchers • Metodika 2013-2015: an improvement but still focused exclusively on outputs • Result: • Evaluation is perceived as counting outputs = points = funding • It constitutes the key factor for R&D management throughout the entire system • It is detached from any discourse on policy and strategy related to the national R&D system 1
  • 3. The mandate for this study • Objectives: to develop an evaluation methodology • Conducted on a national basis at the institutional level • Evaluation results inform the institutional funding system (PRFS) • Providing strategic information for the actors at all levels in the R&D system • The expectations: • A peer-review evaluation process • Fulfil formative and summative functions • Cover outputs, impacts, and institutional projections of research development • Take into consideration the different missions of research organisations and the field specifics • Evaluation processes resistant to clientelism and conflicts of interests • Take into account ‘gaming’ • Total costs do not exceed 1% of public institutional support for R&D in a five-year time period 2
  • 4. The project 3 PHASE 1 WP1 - EvU evaluation structure determination WP2 - EvU field-specific eval. methodology (FEM) WP3 - EvU evaluation process rules CONCLUSION PHASE 1 PHASE 2 WP4 - Evaluation information support WP5 - Definition 'institutional funding' - intl practice WP6 - SWOT Czech institutional R&D base WP7 - Draft models FF & PBC WP8 - Model exp.impacts new inst.funding system CONCLUSION PHASE 2 PHASE 3 WP9 - Small Pilot Evaluation CONCLUSION PHASE 3 / STUDY Consult- ations 2nd Pilot Implement Handover to IPN team
  • 5. Evaluation (assessment) in the context of funding 4 Principles, Structure First design Consultations SPE Principles, Structure First design Consultations Ex- ante Evaluation methodology Institutional funding principles Final EM and funding principles
  • 6. Evaluation in the context of a funding mechanism 5 • ----------- • ----------- • ----------- • ----------- • ----------- • ----------- • ----------- • ----------- • ----------- • ----------- • ----------- • ----------- Management & potential Membership of research community Research excellence Research performance Information Peer review • Star rating • Strategic feedback • Star rating • Strategic feedback • Star rating • Strategic feedback • Star rating • Strategic feedback • ----------- • ----------- • ----------- Societal relevance • Star rating • Strategic feedback Money ‘pot’ per institution type % % % % % % % % % % % % % % % % % % % % Institutiona l research funding per evaluated RU
• 7. Evaluation in general should inform the whole policy cycle
  [Diagram: the policy cycle – issue identification, issue framing, policy measure identification, policy measure development and adoption, and implementation – with evaluation activities feeding in throughout: impact assessment, interim evaluation, effectiveness and ex-post impact assessment, and feedback into issue identification]
• 8. Evaluation should analyse societal effects of intervention, not just focus on outputs
  [Diagram: the intervention logic – needs, problems and issues in society, the economy and the environment drive a public intervention’s objectives, inputs, outputs, outcomes and impacts; evaluation examines relevance, efficiency, effectiveness, and utility and sustainability]
• 9. Evaluation (assessment) for a PRFS is necessarily more narrow
  • A ‘general’ evaluation informs policy making on failures in the system and recommends possible policy interventions; an evaluation in a PRFS is part of a policy intervention – it acts upon previously identified failures and steers research behaviour to tackle them by providing incentives
  • A ‘general’ evaluation has no effects directly linked to the evaluation itself; an evaluation in a PRFS is intended to create effects
  • A ‘general’ evaluation carries no consequences in terms of gaming or unintended effects; an evaluation in a PRFS inevitably invites gaming and may lead to unintended effects
  • A ‘general’ evaluation gives information on the positioning of the evaluated objects in the national/international context; an evaluation in a PRFS sets the evaluated objects in competition with each other
• 10. Who does what
  [Table: national systems as of 2014 – Australia, Austria, Belgium (FL, BOF), Finland, Italy, the Netherlands, New Zealand, Norway (evaluations and PRFS), Sweden and the UK – compared on purpose (performance assessment; informing funding), main function (formative; summative), formative function (national R&D governance; institutional R&D management), summative function (R&D quality; R&D capacity building; research excellence; societal relevance), and notes such as performance contracts/agreements, a separate simple PRFS allocating 10% of funding, and a new PRFS in 2015]
• 11. National and institutional evaluations tend to be distinct
  • National evaluations typically:
    • Are commissioned by national research funders or their principals
    • Compare performance with other countries at the field level
    • Collect strategic information to support research policy interventions
    • Are summative with respect to the institutional level
    • Lean heavily on bibliometric methods, with selective use of peer review
  • Institutional evaluations:
    • Are commissioned by institutions or their principals
    • If they benchmark (often they don’t), compare with other institutions
    • Try to explain performance at a level of detail that is formative, in order to support institutional research management
    • Tend to use informed peer review (though modern research management routinely monitors performance by bibliometric means)
• 12. It’s unusual to compare different types of RO
  • Types of organisation:
    • Universities / Higher Education Institutions
    • Scientific research institutes
    • Research and Technology Organisations (RTOs)
    • Government laboratories
    • NB that hybrid forms are increasingly appearing
  • Different types of RO are rarely benchmarked against each other:
    • Differences in mission or function
    • Different principals and accountability
    • Different mix among types of research and between research and non-research activities
  • Legal form is irrelevant – evaluation is based on function
    • See, for example, the Norwegian institute PRFS, which separates institutes from universities but ignores legal form
• 13. Typical institutional research evaluation approaches by RO type
  • Universities:
    • Focus on quality, interaction with teaching
    • Preference for peer review
  • Scientific research institutes:
    • Ditto, but no emphasis on teaching unless PhD education is provided
  • RTOs:
    • Economic performance, social impacts, customer feedback, research quality
    • Predominance of social scientific methods; peers if quality is a major issue
  • Government labs:
    • Mission performance, usefulness to policy formation processes, social impacts, research quality
    • Predominance of social scientific methods; peers if quality is a major issue
• 14. Why use peers?
  • Legitimacy in the scientific community
  • Ability to address context, for example at institutional level
  • Understanding of inter-field differences
  • Ability to be formative
  • Drawbacks include:
    • Those evaluated generally have to do work as part of the process
    • The need to ensure independence (research funders devote significant effort to this)
    • Expense, especially if site visits are involved
    • A shortage of peers
    • Occasional misbehaviour at the individual level – typically constrained by the use of panels
• 15. Why use bibliometrics?
  • Comparatively low cost – and the cost is declining
  • Need not involve work by those evaluated
  • Can produce fine-grained ratings and rankings – though arguably this is spurious precision
  • However:
    • Inter-field comparisons are problematic (see the field-normalisation sketch below)
    • Use of indicators is generally unsophisticated and sometimes problematic (for example, Journal Impact Factors)
    • There is a lack of context indicators and of conventions for handling them
    • Hence, bibliometrics provide a much more partial picture than peer review
    • They are unhelpful for formative evaluation
  • Emerging conclusion: use ‘informed’ peer review
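To make the inter-field problem concrete, here is a toy calculation, not part of the EM: it normalises each publication’s citation count by a hypothetical world average for its field and year (an MNCS-style indicator). Fields, years, citation counts and baselines are all invented.

```python
# Illustrative only, not part of the EM: a toy mean-normalised citation score
# (MNCS-style), dividing each publication's citations by a hypothetical world
# average for its field and year. All figures, fields and baselines are invented.

from statistics import mean

FIELD_BASELINES = {                       # hypothetical world averages (citations/paper)
    ("mathematics", 2013): 2.1,
    ("cell biology", 2013): 14.8,
}

publications = [
    {"field": "mathematics", "year": 2013, "citations": 4},
    {"field": "cell biology", "year": 2013, "citations": 4},
]

def normalised_score(pub: dict) -> float:
    """Citations relative to the world average for the same field and year."""
    return pub["citations"] / FIELD_BASELINES[(pub["field"], pub["year"])]

scores = [normalised_score(p) for p in publications]
print([round(s, 2) for s in scores])      # [1.9, 0.27]
print(round(mean(scores), 2))             # 1.09 - unit-level average
```

Four citations place the mathematics paper well above its field baseline and the cell-biology paper well below it – precisely the difference that raw counts or journal-level metrics hide.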
• 16. Some effects of PRFS
  • Positive:
    • Improved research and management attention to productivity and quality
    • Performance focus at individual and collective levels
    • Restructuring towards high-quality performers
    • Greater transparency about performance
  • Negative:
    • Focus on producing indicators rather than science, encouraging ‘gaming’
    • Narrowing of the career path, with growing influence of research managers
    • Squeezing out of heterodox and interdisciplinary work
    • The Matthew Effect encourages lock-ins
    • Research indicators are often used in relation to teaching
• 17. Unit of analysis
  • Depends on the purpose of the evaluation
  • Tendency to avoid the individual level – that produces erratic results and undermines institutional autonomy
  • Some systems require tactical decisions about including individual researchers, with intra-institutional and personal consequences
  • Decisions to include organisations are normally policy-based: you can’t ‘volunteer’ (and usually you can’t avoid assessment, either)
• 18. Pitting discipline against discipline
  • Most PRFS reallocate a small proportion of total institutional funding and are therefore tolerant of some inaccuracy or margin for error
  • By and large, disciplines are incommensurable – one of the most interesting activities in scientometrics is the struggle to overcome this fact:
    • The nature of the research enterprise differs (‘What is research?’)
    • Differing notions of quality
    • Publication behaviours differ significantly
    • Different social roles of disciplines
  • It is not necessarily desirable to change the proportions of national effort among disciplines for reasons of performance only – policy needs to play a part
• 19. Interdisciplinarity
  • All research funders (and therefore evaluators) struggle with this
  • Interdisciplinarity is important:
    • Emergence of new disciplines and approaches
    • Ability to do Mode 2 work, addressing industrial and societal challenges
  • But it’s hard to measure:
    • Discipline redefinition implies changes in the criteria relevant for measurement
    • Interdisciplinarity requires expertise and indicator categories that cross existing boundaries – for example, multiple peer panels
    • In metrics-based approaches, the interesting things start happening in undefined or ‘other’ categories
    • As with any change, there is often professional disagreement about standards, criteria and approaches
• 20. Different indicators, different purposes
  [Table: indicator categories (input criteria, systemic indicators, process indicators, research outputs, impact indicators) mapped against the purposes they can serve – research productivity, research quality, relevance of research, efficiency/value for money, and the quality/sustainability of national research systems]
• 21. Recent Czech focus on outputs is unusual
  [Chart: PRFS coverage in the UK (REF), Italy (VQR), Belgium/FL (IOF and BOF), Norway (HEI and PRI), Sweden, Denmark, Finland and the Czech Republic, compared across outputs, systemic & process indicators, and impacts on research, innovation and society – the Czech system stands out for covering outputs only]
• 22. Ratings tend to use 4-5 points (example: the Netherlands)
  • Score 1 – World leading/excellent: the research unit has been shown to be one of the few most influential research groups in the world in its particular field; it makes an outstanding contribution to society; it is excellently equipped for the future
  • Score 2 – Very good: the research unit conducts very good, internationally recognised research; it makes a very good contribution to society; it is very well equipped for the future
  • Score 3 – Good: the research unit conducts good research; it makes a good contribution to society; it makes responsible strategic decisions and is therefore well equipped for the future
  • Score 4 – Unsatisfactory: the research unit does not achieve satisfactory results in its field; it does not make a satisfactory contribution to society; it is not adequately equipped for the future
  (Each score is applied to three criteria: research quality, relevance to society, and viability)
• 23. Implementation issues for peer/panel approaches
  • Robustness:
    • Clear and universal assessment guidelines
    • Professional management and support organisation
  • Fairness:
    • Clientelism and nepotism are avoided
    • Scholarly bias is tackled
    • Appropriate representation
    • Interdisciplinarity is addressed
  • Consistency of assessments
  • Transparency of the process
  • Cost
• 25. The key principles
  • The EM:
    • Reflects the strategic policy objectives for the R&D system
    • Functions both as a source of strategic information and as a direct input to public institutional funding for research organisations
  • The evaluation:
    • Is at the level of the field-defined Research Unit (RU) within an Evaluated Unit, i.e. a research organisation or, in the case of a public HEI, a faculty
    • Is a process of informed peer review: metrics inform but do not substitute for judgment
    • Covers all research organisations of a critical size, on a voluntary basis
    • Is a fair and egalitarian system: a single framework for assessment, while allowing for a reasonable level of field- and RO-typology-specific variations
    • Is comprehensive: it covers all dimensions of the research activities and their outputs, outcomes and impact
    • Imposes the minimum possible cost and burden needed to deliver a robust and defensible process
• 26. The role and function of the EM
  • The key function of an evaluation system is to support public R&D governance in the attainment of its strategic objectives, i.e.:
    • To strengthen R&D capacity
    • To foster excellence in research
    • To foster the alignment of research with societal needs
    • To support the growth and competitiveness of the Czech Republic
  • It does so by assessing past research performance and supporting future performance improvement, thereby:
    • Providing strategic information for policy making and R&D management
    • Providing incentives for positive change in the R&D system, reflecting the strategy
• 27. How does the EM provide incentives for positive change?
  • The indicators used, and their structuring into a set of assessment criteria, are guided by the policy objectives (a data-structure sketch follows below):
    • Institutional management & development potential: quality/adequacy of the physical research environment; research strategy & management (including HR management)
    • Membership of the (world) research community: (international) research presence & collaboration
    • Research performance: research productivity; research capacity building (PhD); overall quality of the research activities
    • Scientific research excellence: the quality of the selected ‘best’ outputs
    • Societal relevance: importance of (potential) societal impacts
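The same hierarchy can be written out as a plain data structure, which is one way an evaluation information system might hold it. This is only a sketch: the criteria and sub-criteria are those listed above, while the weights are invented placeholders – in the EM, the relative importance of sub-criteria is set per field and RO type in the calibration exercise described later.

```python
# A sketch of the assessment-criteria structure as plain data. Criteria and
# sub-criteria are taken from the slides; the weights are hypothetical
# placeholders, to be replaced in the field/RO-type calibration exercise.

ASSESSMENT_CRITERIA = {
    "Institutional management & development potential": {
        "weight": 0.15,  # hypothetical
        "sub_criteria": [
            "Quality/adequacy of the physical research environment",
            "Research strategy & management (including HR management)",
        ],
    },
    "Membership of the (world) research community": {
        "weight": 0.15,  # hypothetical
        "sub_criteria": ["(International) research presence & collaboration"],
    },
    "Research performance": {
        "weight": 0.30,  # hypothetical
        "sub_criteria": [
            "Research productivity",
            "Research capacity building (PhD)",
            "Overall quality of the research activities",
        ],
    },
    "Scientific research excellence": {
        "weight": 0.20,  # hypothetical
        "sub_criteria": ["Quality of the selected 'best' outputs"],
    },
    "Societal relevance": {
        "weight": 0.20,  # hypothetical
        "sub_criteria": ["Importance of (potential) societal impacts"],
    },
}

# Sanity check: the illustrative weights sum to 1
assert abs(sum(c["weight"] for c in ASSESSMENT_CRITERIA.values()) - 1.0) < 1e-9
```

Keeping the weights as data rather than hard-coding them is what makes a later calibration step cheap to apply.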
• 28. How does the EM act as a source of strategic information?
  • The EM is not an arithmetic exercise: both its inputs and its evaluation results are quantitative and qualitative
  • It yields quality information thanks to a detailed and comprehensive approach:
    • Focus on the evaluated actors’ role, positioning and competitive value in the national R&D and innovation system as well as in the international R&D landscape
    • Strengths and weaknesses of the different actors against all dimensions of the research activities, i.e. research quality, research strategy, the research environment (the institutional conditions), and the research outputs, outcomes and impacts
  • It will allow for the identification of the factors on which action is needed in order to improve research performance, at the national as well as the institutional level
  • The primary unit of analysis is the element that worldwide constitutes the fundamental structure of research, i.e. the scientific field – hence the Research Unit
• 29. Matching the scientific with the institutional dimension
  [Diagram: individual researchers are grouped by scientific field into Research Units (RUs) within an Evaluated Unit (EvU) – a research organisation, i.e. a research institute or a university faculty; the RUs map onto the (international) structure of science (disciplinary areas and fields), while the EvUs sit within the national R&D system under the R&D governance bodies & agencies. A minimal data-model sketch follows below.]
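As a rough illustration of how the two dimensions could be held in the evaluation’s information support, the sketch below models researchers, field-defined Research Units and Evaluated Units as plain records. The class and attribute names are assumptions, not the EM’s actual data model.

```python
# A minimal sketch, not the EM's actual data model: researchers belong to an
# Evaluated Unit (institutional dimension) and are grouped by scientific field
# into Research Units (scientific dimension). Names and types are assumptions.

from dataclasses import dataclass, field
from typing import List

@dataclass
class Researcher:
    name: str
    fte: float
    fos_subfield: str                     # e.g. OECD FOS "3.1 Basic medicine"

@dataclass
class ResearchUnit:                       # RU: field-defined unit of analysis
    fos_field: str
    researchers: List[Researcher] = field(default_factory=list)

@dataclass
class EvaluatedUnit:                      # EvU: research organisation or HEI faculty
    name: str
    ro_type: str                          # "Scientific", "RTO", "Public service", "Infrastructure"
    research_units: List[ResearchUnit] = field(default_factory=list)

# Example use
evu = EvaluatedUnit(name="Example Faculty of Science", ro_type="Scientific")
evu.research_units.append(ResearchUnit(fos_field="1.6 Biological sciences"))
```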
• 30. A process of informed peer review
  • Panels base their professional judgement on a mix of appropriate quantitative and qualitative data:
    • International bibliometric data, data from the RD&I system, and quantitative and qualitative data provided by the evaluated Research Units through self-reporting
  • Expert panels allow for in-depth consideration of differences between scientific fields and sub-fields:
    • Types of outputs, channels, productivity, costs, collaboration, internationalisation, etc.
  • Strict rules guard against conflicts of interest, nepotism and ‘clientelism’, combined with auditing mechanisms and rules for punishing cases of fraud
  • The evaluation will be conducted in full transparency
  • Evaluation results: quality scores and explanatory texts, conclusions and recommendations
• 31. Covering all research organisations in the CR of a critical size
  • A broad and varied R&D base:
    • HEIs, public research institutes, state organisations, private research organisations, organisational units of the CR, non-profit organisations, and associations of legal persons
  • Categorisation based on their mission for society related to research:
    • Scientific Research Organisations, including the scientific research institutes, HEIs and university hospitals
    • RTOs, i.e. research organisations whose primary function is to provide knowledge-transfer services to industry (e.g. the Aerospace Research and Test Establishment, the Research Institute of Building Materials)
    • Public Service Research Organisations, i.e. research organisations whose primary function is to provide services to government or society (e.g. the Research Institute for Labour and Social Affairs, the Institute for International Relations, the Czech Metrology Institute)
    • Infrastructure Research Organisations, providing infrastructure and information to research (e.g. CESNET, the National Library of the Czech Republic)
  • Critical size: 50 research outputs in 5 years – the minimum for robust data analytics (see the check sketched below)
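The critical-size rule is a simple numerical gate, sketched below on invented figures; the 50-output threshold over five years is the one stated above, everything else is hypothetical.

```python
# Illustrative check of the critical-size rule: a unit enters the evaluation
# only if it registered at least 50 eligible research outputs over the 5-year
# evaluation period. The data and function name are hypothetical.

from typing import List

MIN_OUTPUTS = 50
EVALUATION_PERIOD_YEARS = 5

def meets_critical_size(outputs_per_year: List[int]) -> bool:
    """True if total outputs over the evaluation period reach the threshold."""
    assert len(outputs_per_year) == EVALUATION_PERIOD_YEARS
    return sum(outputs_per_year) >= MIN_OUTPUTS

print(meets_critical_size([12, 9, 11, 10, 13]))  # True: 55 outputs in 5 years
print(meets_critical_size([5, 6, 7, 8, 9]))      # False: 35 outputs in 5 years
```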
• 32. Research outputs taken into account
  • Scholarly outputs – counted towards the threshold, eligible for the scientific research excellence assessment, and used in the other indicators:
    • Papers in peer-reviewed journals (J)
    • Conference proceedings (D)
    • Monographs, books and book chapters (B), provided they are identified with an ISBN number
  • Non-traditional scholarly outputs – counted towards the threshold and used in the other indicators:
    • Results used by the funding provider, i.e. projected into legislation or norms, or into non-legislative or strategic documents (H)
    • Research reports containing classified information (V)
    • Certified methodologies, art conservation methodologies, specialised map works (N)
  • Patents and other IP – counted towards the threshold and used in the other indicators:
    • Patents and patent applications (P)
    • Plant/breeders’ rights (Zodry & Zplem)
• 33. Handling interdisciplinary research
  • Interdisciplinary research between sub-fields: handled by one subject panel
    • Facilitated by the categorisation of scientific disciplines into broad fields based on the OECD FOS (internationally recognised): 6 disciplinary areas, 36 fields, 190 sub-fields (e.g. Medical & Health sciences – Basic medicine – Neurosciences)
  • Interdisciplinary research within a disciplinary area: cross-referrals among the subject panels in that disciplinary area
  • Interdisciplinary research across disciplinary areas: application for an Interdisciplinary Research Unit
  • For cross-referrals and Interdisciplinary Research Units:
    • At least 30% of research activities take place across disciplinary areas (illustrated in the sketch below)
    • ‘Proven’ through research outputs/bibliometrics or the scientific background of the researchers in the unit
    • Decided upon by the main panel and subject panel chairs
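The 30% condition can likewise be checked mechanically. The sketch below uses research outputs as the evidence base (one of the two forms of proof mentioned above); the output records and area labels are invented.

```python
# Illustrative only: the 30% rule applied to a unit's output records. Outputs
# are used here as a proxy for 'research activities'; the records are invented.

from typing import Dict, List

INTERDISCIPLINARY_SHARE = 0.30

def share_outside_home_area(outputs: List[Dict[str, str]], home_area: str) -> float:
    """Fraction of outputs assigned to a disciplinary area other than the home area."""
    if not outputs:
        return 0.0
    outside = sum(1 for o in outputs if o["disciplinary_area"] != home_area)
    return outside / len(outputs)

outputs = [
    {"id": "p1", "disciplinary_area": "Medical & Health sciences"},
    {"id": "p2", "disciplinary_area": "Medical & Health sciences"},
    {"id": "p3", "disciplinary_area": "Engineering & Technology"},
    {"id": "p4", "disciplinary_area": "Natural sciences"},
    {"id": "p5", "disciplinary_area": "Medical & Health sciences"},
]

share = share_outside_home_area(outputs, home_area="Medical & Health sciences")
print(round(share, 2), share >= INTERDISCIPLINARY_SHARE)   # 0.4 True
```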
• 34. Using a fair and egalitarian system
  • A single framework for assessment across all disciplines and types of research organisations:
    • The same assessment criteria are used for all research organisations and fields
    • Allows for full comparability of the evaluation results, independently of the scientific field and wherever the research is conducted
    • Based on a detailed Evaluation Protocol, setting common procedures and providing standard definitions
    • The main panel chairs act as auditors and guarantors for the comparability
  • A first step in the evaluation process is the calibration exercise related to the assessment criteria:
    • Field-specific definition of the terms originality, rigour, significance and reach
    • Weighting the importance of sub-criteria for the field and for the different types of research organisation
• 35. Assessment criteria & indicators (1)
  • Institutional management & development potential
    • Quality/adequacy of the research environment:
      • Research infrastructure, use of national/international RI, shared or collaborative use
      • Research capacity: funding profile (institutional, competitive, contract; national/international), staff profile and size (including HC versus FTE)
    • Research strategy & HR management:
      • Quality of the research strategy: strength of the research plan (long/short term, objectives and activities, adequacy, feasibility)
      • Quality of processes for career development (objectives, competency framework, appraisal criteria, frequency of performance reviews, etc.), and policy and practice concerning PhD students and postdocs (support and supervision, training, etc.)
  • Membership of the national & global research community
    • National and international:
      • Intensity and quality of collaboration and partnerships, co-publications, international mobility
      • Reputation and esteem
• 36. Assessment criteria & indicators (2)
  • Scientific research excellence: peak quality
    • In terms of originality, significance and rigour
    • Two-stage assessment process:
      • Review results (highly selected scholarly outputs)
      • Panel assessment: aggregating the review results with citation impacts – the number and % of publications among the top 10% and top 25% most cited (world, EU28); see the sketch below
  • Research performance: overall quality of the research activities
    • Research productivity: outputs per FTE researcher
    • Contribution to capacity building: PhD students (awarded and trained)
    • Overall quality of the research activities:
      • Publishing profile, citation impacts, publication channels, etc. (including relative data on positioning in the field)
      • Overall competitiveness (funding, strategy and alignment trends, infrastructure, etc.)
      • Narrative on the main value for research
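The citation-impact component of the panel assessment reduces to a small calculation, sketched below on invented percentile data; in practice the percentile of each publication would come from an international bibliometric database.

```python
# A minimal sketch, on invented data, of the citation-impact figures aggregated
# at this step: the number and share of a unit's publications among the world's
# top 10% and top 25% most cited.

from typing import Dict, List

def highly_cited_shares(percentiles: List[float]) -> Dict[str, float]:
    """percentiles[i] is the citation percentile of publication i (0 = most cited)."""
    n = len(percentiles)
    top10 = sum(1 for p in percentiles if p <= 10.0)
    top25 = sum(1 for p in percentiles if p <= 25.0)
    return {
        "n_top10": top10, "pct_top10": 100.0 * top10 / n,
        "n_top25": top25, "pct_top25": 100.0 * top25 / n,
    }

# Hypothetical citation percentiles for the eight publications of a Research Unit
print(highly_cited_shares([3.0, 8.5, 12.0, 22.0, 31.0, 47.0, 60.0, 75.0]))
# -> {'n_top10': 2, 'pct_top10': 25.0, 'n_top25': 4, 'pct_top25': 50.0}
```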
• 37. Assessment criteria & indicators (3)
  • Societal relevance
    • In terms of reach and significance
    • Knowledge and technology transfer activities: intensity and quality
    • Income from societally relevant research (competitive and contract research)
    • Income from commercialization activities
    • National collaborations and partnerships beyond academia
    • Reputation and esteem beyond academia
    • Non-traditional scholarly research and IPR-related outputs (national/international)
    • Societal impact: narratives
• 38. Limiting the costs
  • A sharp focus on only the key information needed
  • Maximal use of the information already available in the RD&I IS (IS VaVaI)
  • Remote review and remote assessments, complemented with panel meetings
  • A limited number of subject panels and panellists
  • No site visits
  • The higher the level of sophistication, the higher the cost