CHARACTERISTICS
OF A GOOD TEST
TEST
 a formal and systematic instrument, usually a paper-and-pencil procedure, designed to assess the quality, ability, skill, or knowledge of students by giving a set of questions in a uniform manner
 one of the many types of assessment procedures used to gather information about the performance of students
• VALIDITY
 DEFINITION:
 “Validity is the extent to which a test
measures what it claims to measure. It is
vital for a test to be valid in order for the
results to be accurately applied and
interpreted.”
Other definitions given by experts
Gronlund and Linn (1995): “Validity refers to the appropriateness of the interpretation made from test scores and other evaluation results with regard to a particular use.”
Anne Anastasi (1969): “The validity of a test concerns what the test measures and how well it does so.”
Ebel and Frisbie (1991): “The term validity, when applied to a set of test scores, refers to the consistency (accuracy) with which the scores measure a particular cognitive ability of interest.”
C.V. Good (1973), in the Dictionary of Education, defines validity as the “extent to which a test or other measuring instrument fulfills the purpose for which it is used.”
• TYPES OF VALIDITY
1. Face Validity:
- it is the extent to which the measurement method appears “on its face” to measure the construct of interest.
- it is checked by examining the physical appearance of the instrument to make it readable and understandable.
EXAMPLE:
People might have negative reactions
to an intelligence test that did not
appear to them to be measuring their
intelligence.
 2. Content Validity:
-it is the extent to which the
measurement method covers the entire
range of relevant behaviors, thoughts,
and feelings that define the construct
being measured.
-is done through a careful and critical
examination of the objectives of
assessment to reflect the curricular
objectives.
3. Criterion-related Validity:
- it is the extent to which people’s scores are correlated with other variables, or criteria, that reflect the same construct.
- it is established statistically: a set of scores from the measuring instrument is correlated with the scores obtained on another external predictor or measure.
Example:
 An IQ test should correlate positively with school performance.
 An occupational aptitude test should correlate positively with work performance.
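Because criterion-related validity is established through correlation, it can be illustrated with a short computation. A minimal Python sketch using a hand-rolled Pearson correlation; the aptitude-test scores and job-performance ratings below are invented for illustration:

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two paired score lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical data: aptitude-test scores and later job-performance ratings
test_scores = [55, 62, 70, 48, 66, 74, 58, 80]
performance = [3.1, 3.4, 3.9, 2.8, 3.6, 4.2, 3.2, 4.5]

# A strong positive r supports the criterion validity of the aptitude test
print(round(pearson_r(test_scores, performance), 2))
```

An r near +1 supports the claim that the test reflects the criterion; an r near 0 would undermine it.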
• TYPES OF CRITERION VALIDITY:
 3.1. Predictive Validity:
-describes the future performance of
an individual by correlating the sets of
scores obtained from two measures given
at a longer time interval
-when the criterion is something that
will happen or be assessed in the future,
this is called predictive validity.
 3.2. Concurrent Validity:
-describes the present status of the
individual by correlating the sets of
scores obtained from two measures
given at a close interval
-when the criterion is something that is
happening or being assessed at the
same time as the construct of interest, it
is called concurrent validity.
4. Construct Validity
- is established statistically by examining the psychological traits or factors that theoretically influence scores on a test.
TYPES OF CONSTRUCT VALIDITY:
4.1 Convergent Validity
- is established if the instrument correlates with another measure of a similar trait beyond the one it is intended to measure.
E.g. a Critical Thinking Test may be correlated with a Creative Thinking Test.
4.2 Divergent Validity
- is established if the instrument describes only the intended trait and does not correlate with measures of other traits.
E.g. a Critical Thinking Test should not be correlated with a Reading Comprehension Test.
Nature of Validity
1. Validity refers to the appropriateness of the test results, not to the instrument itself.
2. Validity does not exist on an all-or-none basis; it is a matter of degree.
3. Tests are not valid for all purposes. Validity is always specific to a particular interpretation.
4. Validity is not of different types. It is a unitary concept, based on various types of evidence.
Factors Affecting Validity
1. Factors in the test:
(i) Unclear directions to the students on how to respond to the test.
(ii) Difficult reading vocabulary and sentence structure.
(iii) Too easy or too difficult test items.
(iv) Ambiguous statements in the test items.
(v) Test items inappropriate for measuring a particular outcome.
(vi) Inadequate time provided to take the test.
2. Factors in Test Administration and Scoring
(i) Unfair aid to individual students who ask for help.
(ii) Cheating by the pupils during testing.
(iii) Unreliable scoring of essay-type answers.
(iv) Insufficient time to complete the test.
(v) Adverse physical and psychological conditions at the time of testing.
3. Factors related to the Testee
(i) Test anxiety of the students.
(ii) Physical and psychological state of the pupil.
(iii) Response set: a consistent tendency to follow a certain pattern in responding to the items.
• RELIABILITY
 refers to the consistency of measurement; that is, how consistent test results or other assessment results are from one measurement to another.
Other definitions given by experts
Gronlund and Linn (1995): “Reliability refers to the consistency of measurement; that is, how consistent test scores or other evaluation results are from one measurement to another.”
Ebel and Frisbie (1991): “The term reliability means the consistency with which a set of test scores measure whatever they do measure.”
C.V. Good (1973) has defined reliability as the “worthiness with which a measuring device measures something; the degree to which a test or other instrument of evaluation measures consistently whatever it does in fact measure.”
Davis (1946): “The degree of relative precision of measurement of a set of test scores is defined as reliability.”
Nature of Reliability
1. Reliability refers to the consistency of the results obtained with an instrument, not to the instrument itself.
2. Reliability refers to a particular interpretation of test scores.
3. Reliability is a statistical concept; to determine reliability we administer a test to a group once or more than once.
4. Reliability is a necessary but not a sufficient condition for validity.
Four methods of determining
reliability
(a) Test-Retest method.
(b) Equivalent forms/Parallel forms
method.
(c) Split-half method.
(d) Rational Equivalence/Kuder-
Richardson method.
Test-Retest method:
This is the simplest method of determining test reliability.
The test is given and then repeated on the same group, and the correlation between the first set of scores and the second set of scores is obtained.
Equivalent Forms/Parallel Forms Method:
Two parallel forms of the test are administered to the same group of pupils within a short interval of time, and the scores on the two tests are correlated. This correlation provides the index of equivalence.
Split-Half Method:
A test is administered to a group of pupils in the usual manner. The test is then divided into two equivalent halves, and the correlation between these half-tests is found.
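The split-half procedure can be sketched in code. This minimal Python example splits the items odd/even, correlates the half-test totals, and steps the result up to full test length with the Spearman-Brown prophecy formula (a standard correction, though the text above does not name it). The response matrix is hypothetical:

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two paired score lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def split_half_reliability(item_matrix):
    """Odd-even split-half reliability for a pupils-by-items score matrix.

    The half-test correlation underestimates full-test reliability, so it
    is corrected with the Spearman-Brown prophecy formula.
    """
    odd_totals = [sum(row[0::2]) for row in item_matrix]   # items 1, 3, 5, ...
    even_totals = [sum(row[1::2]) for row in item_matrix]  # items 2, 4, 6, ...
    r_half = pearson_r(odd_totals, even_totals)
    return (2 * r_half) / (1 + r_half)

# Hypothetical 0/1 (wrong/right) responses: one row per pupil, one column per item
responses = [
    [1, 1, 1, 1, 1, 1],
    [1, 1, 1, 1, 0, 0],
    [1, 1, 0, 0, 0, 0],
    [1, 0, 0, 0, 0, 0],
]
print(round(split_half_reliability(responses), 2))
```

The odd-even split is only one of many possible halvings; different splits generally give slightly different coefficients.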
Rational Equivalence/Kuder-Richardson Method:
This method also provides a measure of internal consistency. It requires neither the administration of two equivalent forms of the test nor splitting the test into two equal halves.
The reliability coefficient is determined using Kuder-Richardson formula 20 (KR-20):
KR-20 = (k / (k − 1)) × (1 − Σpq / σ²)
where k is the number of items, p is the proportion of pupils answering an item correctly, q = 1 − p, and σ² is the variance of the total scores.
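Kuder-Richardson formula 20 can be computed directly from a pupils-by-items matrix of 0/1 (wrong/right) scores. A minimal Python sketch with hypothetical responses:

```python
def kr20(item_matrix):
    """Kuder-Richardson formula 20 for dichotomously scored (0/1) items.

    KR-20 = (k / (k - 1)) * (1 - sum(p*q) / variance of total scores),
    where p is the proportion answering an item correctly and q = 1 - p.
    """
    k = len(item_matrix[0])   # number of items
    n = len(item_matrix)      # number of pupils
    totals = [sum(row) for row in item_matrix]
    mean_total = sum(totals) / n
    var_total = sum((t - mean_total) ** 2 for t in totals) / n
    sum_pq = 0.0
    for j in range(k):
        p = sum(row[j] for row in item_matrix) / n   # item difficulty
        sum_pq += p * (1 - p)
    return (k / (k - 1)) * (1 - sum_pq / var_total)

# Hypothetical responses: one row per pupil, one 0/1 column per item
responses = [
    [1, 1, 1],
    [1, 1, 0],
    [1, 0, 0],
    [0, 0, 0],
]
print(kr20(responses))   # → 0.75
```

KR-20 applies only to dichotomously scored items; Cronbach’s alpha generalizes it to items scored on wider scales.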
Factors affecting reliability
1. Factors related to the test:
(i) Length of the test
(ii) Content of the test
(iii) Characteristics of the items
(iv) Spread of the scores
2. Factors related to the testee:
(i) Heterogeneity of the group
(ii) Test-wiseness of the students
(iii) Motivation of the students
3. Factors related to testing procedures:
(i) Time limit of the test
(ii) Opportunities for cheating given to the students
• OBJECTIVITY
 refers to the agreement of two or more
raters or test administrators concerning
the score of the student.
Other definitions given by experts:
C.V. Good (1973) defines objectivity in testing as “the extent to which the instrument is free from personal error (personal bias), that is, subjectivity on the part of the scorer.”
Gronlund and Linn (1995) state: “Objectivity of a test refers to the degree to which equally competent scorers obtain the same results. So a test is considered objective when it eliminates the scorer’s personal opinion and biased judgement. In this context there are two aspects of objectivity which should be kept in mind while constructing a test.”
Two aspects of objectivity which should be kept in mind while constructing a test
1. Objectivity in Scoring
- means that the same person or different persons scoring the test at any time arrive at the same result without any chance error. To be objective, a test must be so worded that only one correct answer can be given to each item.
2. Objectivity of Test Items
- means that each item must call for a definite single answer. Well-constructed test items should lend themselves to one and only one interpretation by students who know the material involved. In other words, the test items should be free from ambiguity.
• FAIRNESS
- means the test items should not contain any biases. They should not be offensive to any examinee subgroup.
- a test can only be good if it is fair to all the examinees.
- a fair assessment provides all students with an equal opportunity to demonstrate achievement.
The keys to fairness are as follows:
Students have knowledge of the learning targets and assessments.
Students are given an equal opportunity to learn.
Students possess the prerequisite knowledge and skills.
Students are free from teacher stereotypes.
Students are free from biased assessment tasks and procedures.
• SCORABILITY
- means that the test should be easy to score: directions for scoring should be clearly stated in the instructions, the students should be provided with an answer sheet, and the answer key should be given to the one who will check the test.
• ADEQUACY
- means that the test should contain a wide sampling of items to determine the educational outcomes or abilities, so that the resulting scores are representative of the total performance in the areas measured.
• ADMINISTRABILITY
- means that the test should be administered uniformly to all students, so that the scores obtained will not vary due to factors other than differences in the students’ knowledge and skills. There should be clear instructions for the students, the proctors, and even the one who will check the test (the test scorer).
• PRACTICALITY AND EFFICIENCY
- refers to the teacher’s familiarity with the methods used, the time required for the assessment, the complexity of the administration, the ease of scoring, the ease of interpreting the test results, and the low cost of the materials used.
• BALANCE
- a balanced assessment sets targets in all domains of learning (cognitive, affective, and psychomotor) or domains of intelligence (verbal-linguistic, logical-mathematical, bodily-kinesthetic, visual-spatial, musical-rhythmic, interpersonal-social, intrapersonal-introspective, naturalist (physical world), existential-spiritual).
- it makes use of both traditional and alternative assessment.
THANK YOU!
More Related Content

What's hot

teacher made test Vs standardized test
 teacher made test Vs standardized test teacher made test Vs standardized test
teacher made test Vs standardized testathiranandan
 
Good test , Reliability and Validity of a good test
Good test , Reliability and Validity of a good testGood test , Reliability and Validity of a good test
Good test , Reliability and Validity of a good testTiru Goel
 
Constructing test Items
Constructing test ItemsConstructing test Items
Constructing test ItemsDEBABRATA GIRI
 
Validity and reliability in assessment.
Validity and reliability in assessment. Validity and reliability in assessment.
Validity and reliability in assessment. Tarek Tawfik Amin
 
Characteristics of a good assessment tool
Characteristics of a good assessment toolCharacteristics of a good assessment tool
Characteristics of a good assessment toolKiranMalik37
 
Reliability (assessment of student learning I)
Reliability (assessment of student learning I)Reliability (assessment of student learning I)
Reliability (assessment of student learning I)Rey-ra Mora
 
TOOLS AND TECHNIQUES FOR CLASSROOM ASSESSMENT
TOOLS AND  TECHNIQUES FOR CLASSROOM ASSESSMENTTOOLS AND  TECHNIQUES FOR CLASSROOM ASSESSMENT
TOOLS AND TECHNIQUES FOR CLASSROOM ASSESSMENTVijayalakshmi Murugesan
 
Meaning and concept of test, testing, measurement, assessment and evaluation
Meaning and concept of test, testing, measurement, assessment and evaluationMeaning and concept of test, testing, measurement, assessment and evaluation
Meaning and concept of test, testing, measurement, assessment and evaluationDr. Amjad Ali Arain
 
Principles of Test Construction 1
Principles of Test Construction 1Principles of Test Construction 1
Principles of Test Construction 1Monica P
 
Formative Assessment vs. Summative Assessment
Formative Assessment vs. Summative AssessmentFormative Assessment vs. Summative Assessment
Formative Assessment vs. Summative Assessmentjcheek2008
 
Assessment of learning
Assessment of learningAssessment of learning
Assessment of learningsuresh kumar
 
Validity of test
Validity of testValidity of test
Validity of testSarat Rout
 
Placement & diagnostic assessment
Placement & diagnostic assessmentPlacement & diagnostic assessment
Placement & diagnostic assessmentHadeeqaTanveer
 

What's hot (20)

teacher made test Vs standardized test
 teacher made test Vs standardized test teacher made test Vs standardized test
teacher made test Vs standardized test
 
Good test , Reliability and Validity of a good test
Good test , Reliability and Validity of a good testGood test , Reliability and Validity of a good test
Good test , Reliability and Validity of a good test
 
Qualities of a Good Test
Qualities of a Good TestQualities of a Good Test
Qualities of a Good Test
 
Constructing test Items
Constructing test ItemsConstructing test Items
Constructing test Items
 
Validity and reliability in assessment.
Validity and reliability in assessment. Validity and reliability in assessment.
Validity and reliability in assessment.
 
Characteristics of a good assessment tool
Characteristics of a good assessment toolCharacteristics of a good assessment tool
Characteristics of a good assessment tool
 
Definition of Assessment,
Definition of Assessment,Definition of Assessment,
Definition of Assessment,
 
Subjective test
Subjective testSubjective test
Subjective test
 
Reliability (assessment of student learning I)
Reliability (assessment of student learning I)Reliability (assessment of student learning I)
Reliability (assessment of student learning I)
 
TOOLS AND TECHNIQUES FOR CLASSROOM ASSESSMENT
TOOLS AND  TECHNIQUES FOR CLASSROOM ASSESSMENTTOOLS AND  TECHNIQUES FOR CLASSROOM ASSESSMENT
TOOLS AND TECHNIQUES FOR CLASSROOM ASSESSMENT
 
Meaning and concept of test, testing, measurement, assessment and evaluation
Meaning and concept of test, testing, measurement, assessment and evaluationMeaning and concept of test, testing, measurement, assessment and evaluation
Meaning and concept of test, testing, measurement, assessment and evaluation
 
Characteristic of good test
Characteristic of good testCharacteristic of good test
Characteristic of good test
 
Principles of Test Construction 1
Principles of Test Construction 1Principles of Test Construction 1
Principles of Test Construction 1
 
Formative Assessment vs. Summative Assessment
Formative Assessment vs. Summative AssessmentFormative Assessment vs. Summative Assessment
Formative Assessment vs. Summative Assessment
 
Types of test items
Types of test itemsTypes of test items
Types of test items
 
Assessment of learning
Assessment of learningAssessment of learning
Assessment of learning
 
Feedback assessment
Feedback assessmentFeedback assessment
Feedback assessment
 
Validity of test
Validity of testValidity of test
Validity of test
 
Subjective and Objective Test
Subjective and Objective TestSubjective and Objective Test
Subjective and Objective Test
 
Placement & diagnostic assessment
Placement & diagnostic assessmentPlacement & diagnostic assessment
Placement & diagnostic assessment
 

Similar to Characteristics of a good test

Standardized and non standardized tests
Standardized and non standardized testsStandardized and non standardized tests
Standardized and non standardized testsshaziazamir1
 
LESSON 6 JBF 361.pptx
LESSON 6 JBF 361.pptxLESSON 6 JBF 361.pptx
LESSON 6 JBF 361.pptxAdnanIssah
 
Principles of assessment
Principles of assessmentPrinciples of assessment
Principles of assessmentmunsif123
 
Standardized and non standardized tests
Standardized and non standardized testsStandardized and non standardized tests
Standardized and non standardized testsvinoli_sg
 
Construction of Tests
Construction of TestsConstruction of Tests
Construction of TestsDakshta1
 
Louzel Report - Reliability & validity
Louzel Report - Reliability & validity Louzel Report - Reliability & validity
Louzel Report - Reliability & validity Louzel Linejan
 
Presentation Validity & Reliability
Presentation Validity & ReliabilityPresentation Validity & Reliability
Presentation Validity & Reliabilitysongoten77
 
constructionoftests-211015110341 (1).pptx
constructionoftests-211015110341 (1).pptxconstructionoftests-211015110341 (1).pptx
constructionoftests-211015110341 (1).pptxGajeSingh9
 
Characteristics of Good Evaluation Instrument
Characteristics of Good Evaluation InstrumentCharacteristics of Good Evaluation Instrument
Characteristics of Good Evaluation InstrumentSuresh Babu
 

Similar to Characteristics of a good test (20)

Validity and Reliability.pdf
Validity and Reliability.pdfValidity and Reliability.pdf
Validity and Reliability.pdf
 
Validity and Reliability.pdf
Validity and Reliability.pdfValidity and Reliability.pdf
Validity and Reliability.pdf
 
Standardized and non standardized tests (1)
Standardized and non standardized tests (1)Standardized and non standardized tests (1)
Standardized and non standardized tests (1)
 
Standardized and non standardized tests
Standardized and non standardized testsStandardized and non standardized tests
Standardized and non standardized tests
 
Chandani
ChandaniChandani
Chandani
 
Qualities of good evaluation tool (1)
Qualities of good evaluation  tool (1)Qualities of good evaluation  tool (1)
Qualities of good evaluation tool (1)
 
Business research methods
Business research methodsBusiness research methods
Business research methods
 
Week 8 & 9 - Validity and Reliability
Week 8 & 9 - Validity and ReliabilityWeek 8 & 9 - Validity and Reliability
Week 8 & 9 - Validity and Reliability
 
LESSON 6 JBF 361.pptx
LESSON 6 JBF 361.pptxLESSON 6 JBF 361.pptx
LESSON 6 JBF 361.pptx
 
Principles of assessment
Principles of assessmentPrinciples of assessment
Principles of assessment
 
Unit 2.pptx
Unit 2.pptxUnit 2.pptx
Unit 2.pptx
 
Standardized and non standardized tests
Standardized and non standardized testsStandardized and non standardized tests
Standardized and non standardized tests
 
Construction of Tests
Construction of TestsConstruction of Tests
Construction of Tests
 
Louzel Report - Reliability & validity
Louzel Report - Reliability & validity Louzel Report - Reliability & validity
Louzel Report - Reliability & validity
 
Presentation Validity & Reliability
Presentation Validity & ReliabilityPresentation Validity & Reliability
Presentation Validity & Reliability
 
constructionoftests-211015110341 (1).pptx
constructionoftests-211015110341 (1).pptxconstructionoftests-211015110341 (1).pptx
constructionoftests-211015110341 (1).pptx
 
Chapter 6: Validity
Chapter 6: ValidityChapter 6: Validity
Chapter 6: Validity
 
Characteristics of Good Evaluation Instrument
Characteristics of Good Evaluation InstrumentCharacteristics of Good Evaluation Instrument
Characteristics of Good Evaluation Instrument
 
Chapter 6: Validity
Chapter 6: ValidityChapter 6: Validity
Chapter 6: Validity
 
Validation
ValidationValidation
Validation
 

Recently uploaded

The basics of sentences session 2pptx copy.pptx
The basics of sentences session 2pptx copy.pptxThe basics of sentences session 2pptx copy.pptx
The basics of sentences session 2pptx copy.pptxheathfieldcps1
 
1029 - Danh muc Sach Giao Khoa 10 . pdf
1029 -  Danh muc Sach Giao Khoa 10 . pdf1029 -  Danh muc Sach Giao Khoa 10 . pdf
1029 - Danh muc Sach Giao Khoa 10 . pdfQucHHunhnh
 
APM Welcome, APM North West Network Conference, Synergies Across Sectors
APM Welcome, APM North West Network Conference, Synergies Across SectorsAPM Welcome, APM North West Network Conference, Synergies Across Sectors
APM Welcome, APM North West Network Conference, Synergies Across SectorsAssociation for Project Management
 
Explore beautiful and ugly buildings. Mathematics helps us create beautiful d...
Explore beautiful and ugly buildings. Mathematics helps us create beautiful d...Explore beautiful and ugly buildings. Mathematics helps us create beautiful d...
Explore beautiful and ugly buildings. Mathematics helps us create beautiful d...christianmathematics
 
Paris 2024 Olympic Geographies - an activity
Paris 2024 Olympic Geographies - an activityParis 2024 Olympic Geographies - an activity
Paris 2024 Olympic Geographies - an activityGeoBlogs
 
The Most Excellent Way | 1 Corinthians 13
The Most Excellent Way | 1 Corinthians 13The Most Excellent Way | 1 Corinthians 13
The Most Excellent Way | 1 Corinthians 13Steve Thomason
 
Q4-W6-Restating Informational Text Grade 3
Q4-W6-Restating Informational Text Grade 3Q4-W6-Restating Informational Text Grade 3
Q4-W6-Restating Informational Text Grade 3JemimahLaneBuaron
 
Software Engineering Methodologies (overview)
Software Engineering Methodologies (overview)Software Engineering Methodologies (overview)
Software Engineering Methodologies (overview)eniolaolutunde
 
A Critique of the Proposed National Education Policy Reform
A Critique of the Proposed National Education Policy ReformA Critique of the Proposed National Education Policy Reform
A Critique of the Proposed National Education Policy ReformChameera Dedduwage
 
Russian Escort Service in Delhi 11k Hotel Foreigner Russian Call Girls in Delhi
Russian Escort Service in Delhi 11k Hotel Foreigner Russian Call Girls in DelhiRussian Escort Service in Delhi 11k Hotel Foreigner Russian Call Girls in Delhi
Russian Escort Service in Delhi 11k Hotel Foreigner Russian Call Girls in Delhikauryashika82
 
Measures of Central Tendency: Mean, Median and Mode
Measures of Central Tendency: Mean, Median and ModeMeasures of Central Tendency: Mean, Median and Mode
Measures of Central Tendency: Mean, Median and ModeThiyagu K
 
Accessible design: Minimum effort, maximum impact
Accessible design: Minimum effort, maximum impactAccessible design: Minimum effort, maximum impact
Accessible design: Minimum effort, maximum impactdawncurless
 
9548086042 for call girls in Indira Nagar with room service
9548086042  for call girls in Indira Nagar  with room service9548086042  for call girls in Indira Nagar  with room service
9548086042 for call girls in Indira Nagar with room servicediscovermytutordmt
 
Beyond the EU: DORA and NIS 2 Directive's Global Impact
Beyond the EU: DORA and NIS 2 Directive's Global ImpactBeyond the EU: DORA and NIS 2 Directive's Global Impact
Beyond the EU: DORA and NIS 2 Directive's Global ImpactPECB
 
BASLIQ CURRENT LOOKBOOK LOOKBOOK(1) (1).pdf
BASLIQ CURRENT LOOKBOOK  LOOKBOOK(1) (1).pdfBASLIQ CURRENT LOOKBOOK  LOOKBOOK(1) (1).pdf
BASLIQ CURRENT LOOKBOOK LOOKBOOK(1) (1).pdfSoniaTolstoy
 
fourth grading exam for kindergarten in writing
fourth grading exam for kindergarten in writingfourth grading exam for kindergarten in writing
fourth grading exam for kindergarten in writingTeacherCyreneCayanan
 

Recently uploaded (20)

The basics of sentences session 2pptx copy.pptx
The basics of sentences session 2pptx copy.pptxThe basics of sentences session 2pptx copy.pptx
The basics of sentences session 2pptx copy.pptx
 
1029 - Danh muc Sach Giao Khoa 10 . pdf
1029 -  Danh muc Sach Giao Khoa 10 . pdf1029 -  Danh muc Sach Giao Khoa 10 . pdf
1029 - Danh muc Sach Giao Khoa 10 . pdf
 
APM Welcome, APM North West Network Conference, Synergies Across Sectors
APM Welcome, APM North West Network Conference, Synergies Across SectorsAPM Welcome, APM North West Network Conference, Synergies Across Sectors
APM Welcome, APM North West Network Conference, Synergies Across Sectors
 
Explore beautiful and ugly buildings. Mathematics helps us create beautiful d...
Explore beautiful and ugly buildings. Mathematics helps us create beautiful d...Explore beautiful and ugly buildings. Mathematics helps us create beautiful d...
Explore beautiful and ugly buildings. Mathematics helps us create beautiful d...
 
Paris 2024 Olympic Geographies - an activity
Paris 2024 Olympic Geographies - an activityParis 2024 Olympic Geographies - an activity
Paris 2024 Olympic Geographies - an activity
 
The Most Excellent Way | 1 Corinthians 13
The Most Excellent Way | 1 Corinthians 13The Most Excellent Way | 1 Corinthians 13
The Most Excellent Way | 1 Corinthians 13
 
Q4-W6-Restating Informational Text Grade 3
Q4-W6-Restating Informational Text Grade 3Q4-W6-Restating Informational Text Grade 3
Q4-W6-Restating Informational Text Grade 3
 
Software Engineering Methodologies (overview)
Software Engineering Methodologies (overview)Software Engineering Methodologies (overview)
Software Engineering Methodologies (overview)
 
Mattingly "AI & Prompt Design: The Basics of Prompt Design"
Mattingly "AI & Prompt Design: The Basics of Prompt Design"Mattingly "AI & Prompt Design: The Basics of Prompt Design"
Mattingly "AI & Prompt Design: The Basics of Prompt Design"
 
Mattingly "AI & Prompt Design: Structured Data, Assistants, & RAG"
Mattingly "AI & Prompt Design: Structured Data, Assistants, & RAG"Mattingly "AI & Prompt Design: Structured Data, Assistants, & RAG"
Mattingly "AI & Prompt Design: Structured Data, Assistants, & RAG"
 
A Critique of the Proposed National Education Policy Reform
A Critique of the Proposed National Education Policy ReformA Critique of the Proposed National Education Policy Reform
A Critique of the Proposed National Education Policy Reform
 
Russian Escort Service in Delhi 11k Hotel Foreigner Russian Call Girls in Delhi
Russian Escort Service in Delhi 11k Hotel Foreigner Russian Call Girls in DelhiRussian Escort Service in Delhi 11k Hotel Foreigner Russian Call Girls in Delhi
Russian Escort Service in Delhi 11k Hotel Foreigner Russian Call Girls in Delhi
 
Measures of Central Tendency: Mean, Median and Mode
Measures of Central Tendency: Mean, Median and ModeMeasures of Central Tendency: Mean, Median and Mode
Measures of Central Tendency: Mean, Median and Mode
 
Accessible design: Minimum effort, maximum impact
Accessible design: Minimum effort, maximum impactAccessible design: Minimum effort, maximum impact
Accessible design: Minimum effort, maximum impact
 
9548086042 for call girls in Indira Nagar with room service
9548086042  for call girls in Indira Nagar  with room service9548086042  for call girls in Indira Nagar  with room service
9548086042 for call girls in Indira Nagar with room service
 
Código Creativo y Arte de Software | Unidad 1
Código Creativo y Arte de Software | Unidad 1Código Creativo y Arte de Software | Unidad 1
Código Creativo y Arte de Software | Unidad 1
 
Beyond the EU: DORA and NIS 2 Directive's Global Impact
Beyond the EU: DORA and NIS 2 Directive's Global ImpactBeyond the EU: DORA and NIS 2 Directive's Global Impact
Beyond the EU: DORA and NIS 2 Directive's Global Impact
 
INDIA QUIZ 2024 RLAC DELHI UNIVERSITY.pptx
INDIA QUIZ 2024 RLAC DELHI UNIVERSITY.pptxINDIA QUIZ 2024 RLAC DELHI UNIVERSITY.pptx
INDIA QUIZ 2024 RLAC DELHI UNIVERSITY.pptx
 
BASLIQ CURRENT LOOKBOOK LOOKBOOK(1) (1).pdf
BASLIQ CURRENT LOOKBOOK  LOOKBOOK(1) (1).pdfBASLIQ CURRENT LOOKBOOK  LOOKBOOK(1) (1).pdf
BASLIQ CURRENT LOOKBOOK LOOKBOOK(1) (1).pdf
 
fourth grading exam for kindergarten in writing
fourth grading exam for kindergarten in writingfourth grading exam for kindergarten in writing
fourth grading exam for kindergarten in writing
 

Characteristics of a good test

  • 2. TEST  a formal and systematic instrument, usually paper and pencil procedure designed to assess the quality, ability, skill or knowledge of the students by giving a set of question in uniform manner one of the many types of assessment procedure used to gather information about the performance of students
  • 4. •VALIDITY  DEFINITION:  “Validity is the extent to which a test measures what it claims to measure. It is vital for a test to be valid in order for the results to be accurately applied and interpreted.”
  • 5. Other definitions given by experts Gronlund and Linn (1995)- “ Validity refers to the appropriateness of the interpretation made from test scores and other evaluation results with regard to a particular use.” Anne Anastasi (1969) writes “ the validity of a test concerns what the test measures and how well it does so.”
  • 6. Ebel and Frisbie (1991)- “ The term validity, when applied to a set of test scores, refers to the consistency (accuracy) with which the scores measure a particular cognitive ability of interest.” C.V. Good (1973)- in the dictionary of education defines validity as the “ extent to which a test or other measuring instrument fulfills the purpose for which it is used.”
  • 7. •TYPES OF VALIDITY 1. Face Validity: - it is the extent to which the measurement method appears “on its face” to measure the construct of interest. -is done by examining the physical appearance of the instrument to make it readable and understandable
  • 8. EXAMPLE: People might have negative reactions to an intelligence test that did not appear to them to be measuring their intelligence.
  • 9.  2. Content Validity: -it is the extent to which the measurement method covers the entire range of relevant behaviors, thoughts, and feelings that define the construct being measured. -is done through a careful and critical examination of the objectives of assessment to reflect the curricular objectives.
  • 10. 3. Criterion-based Validity: -it is the extent to which people’s scores are correlated with other variables or criteria that reflect the same construct. -is established statistically such that a set of scores revealed by the measuring instrument is correlated with the scores obtained in another external predictor or measure
  • 11. Example:  An IQ test should correlate positively with school performance. An occupational aptitude test should correlate positively with work performance
  • 12. •TYPES OF CRITERION VALIDITY:  3.1. Predictive Validity: -describes the future performance of an individual by correlating the sets of scores obtained from two measures given at a longer time interval -when the criterion is something that will happen or be assessed in the future, this is called predictive validity.
  • 13.  3.2. Concurrent Validity: -describes the present status of the individual by correlating the sets of scores obtained from two measures given at a close interval -when the criterion is something that is happening or being assessed at the same time as the construct of interest, it is called concurrent validity.
  • 14. 4. Construct Validity -is established statistically by comparing psychological traits or factors that theoretically influence scores in a test
  • 15. TYPES OF CONSTRUCT VALIDITY: 4.1 Convergent Validity -is established if the instrument defines another similar trait other than what it is intended to measure. E.g. Critical Thinking Test may be correlated with Creative Thinking Test.
  • 16. 4.2 Divergent Validity - is established if an instrument can describe only the intended trait and not the other traits. E.g. Critical Thinking Test may not be correlated with Reading Comprehension Test.
  • 17. Nature of Validity 1. Validity refers to the appropriateness of the test results, not to the instrument itself. 2. Validity does not exist on an all-or-none basis; it is a matter of degree. 3. Tests are not valid for all purposes. Validity is always specific to a particular interpretation. 4. Validity is not of different types. It is a unitary concept based on various types of evidence.
  • 18. Factors Affecting Validity :- 1. Factors in the test: (i) Unclear directions to the students on how to respond to the test. (ii) Difficulty of the reading vocabulary and sentence structure. (iii) Too easy or too difficult test items. (iv) Ambiguous statements in the test items. (v) Inappropriate test items for measuring a particular outcome. (vi) Inadequate time provided to take the test.
  • 19. 2. Factors in Test Administration and Scoring (i) Unfair aid to individual students, who ask for help. (ii) Cheating by the pupils during testing. (iii) Unreliable scoring of essay type answer. (iv) Insufficient time to complete the test. (v) Adverse physical and psychological condition at the time of testing.
  • 20. 3. Factors related to Testee (i) Test anxiety of the students. (ii) Physical and psychological state of the pupil. (iii) Response set– a consistent tendency to follow a certain pattern in responding to the items.
  • 22. • RELIABILITY  refers to the consistency of measurement; that is, how consistent test results or other assessment results are from one measurement to another
  • 23. Other definitions given by experts Gronlund and Linn (1995)-” reliability refers to the consistency of measurement- that is, how consistent test scores or other evaluation results are from one measurement to other”. Ebel and Frisbie (1991)- “ the term reliability means the consistency with which a set of test scores measure whatever they do measure”.
  • 24. C.V. Good (1973)-has defined reliability as the “ worthiness with which a measuring device measures something; the degree to which a test or other instrument of evaluation measures consistently whatever it does in fact measure”. Davis (1946)- “the degree of relative precision of measurement of a set of test scores is defined as reliability”.
  • 25. Nature of Reliability 1. Reliability refers to the consistency of the results obtained with an instrument, not to the instrument itself. 2. Reliability refers to a particular interpretation of test scores. 3. Reliability is a statistical concept; to determine reliability we administer a test to a group once or more than once. 4. Reliability is a necessary but not a sufficient condition for validity.
  • 26. Four methods of determining reliability (a) Test-Retest method. (b) Equivalent forms/Parallel forms method. (c) Split-half method. (d) Rational Equivalence/Kuder- Richardson method.
  • 27. Test-Retest method: This is the simplest method of determining the test reliability. To determine reliability in this method the test is given and repeated on same group. Then the correlation between the first set of scores and second set of scores is obtained.
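Under the test-retest method, the reliability coefficient is simply the correlation between the two administrations. A minimal sketch in Python, using hypothetical scores for ten students (the data are illustrative, not from the text):

```python
# Hypothetical scores for ten students on the same test given twice.
scores_first = [12, 15, 9, 20, 14, 17, 11, 18, 13, 16]
scores_second = [13, 14, 10, 19, 15, 16, 12, 18, 12, 17]

def pearson_r(x, y):
    """Pearson product-moment correlation between two score sets."""
    n = len(x)
    mean_x = sum(x) / n
    mean_y = sum(y) / n
    cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    var_x = sum((a - mean_x) ** 2 for a in x)
    var_y = sum((b - mean_y) ** 2 for b in y)
    return cov / (var_x * var_y) ** 0.5

reliability = pearson_r(scores_first, scores_second)
print(f"Test-retest reliability: {reliability:.2f}")
```

A coefficient near 1.0 indicates that the students' relative standing stayed stable between the two administrations.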
  • 28. Equivalent Forms/Parallel Forms Method: In this process two parallel forms of a test are administered to the same group of pupils within a short interval of time, then the scores of both the tests are correlated. This correlation provides the index of equivalence.
  • 29. Split-Half Method: In this method a test is administered to a group of pupils in the usual manner. The test is then divided into two equivalent halves and the correlation between these half-tests is found. Because this correlation reflects only half the test's length, it is stepped up to full length using the Spearman-Brown formula.
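The split-half procedure above can be sketched in a few lines of Python. The item data below are hypothetical, purely for illustration; the test is split into odd- and even-numbered items and the half-test correlation is stepped up with the Spearman-Brown formula:

```python
# Hypothetical item scores (1 = correct, 0 = wrong) for six students on an 8-item test.
items = [
    [1, 1, 0, 1, 1, 0, 1, 1],
    [1, 0, 1, 1, 0, 1, 1, 0],
    [0, 1, 1, 0, 1, 1, 0, 1],
    [1, 1, 1, 1, 1, 1, 1, 1],
    [0, 0, 1, 0, 0, 1, 0, 0],
    [1, 1, 0, 1, 1, 1, 1, 1],
]

def pearson_r(x, y):
    """Pearson product-moment correlation between two score sets."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

# Total each student's score on the odd-numbered and even-numbered halves.
odd_half = [sum(row[0::2]) for row in items]
even_half = [sum(row[1::2]) for row in items]

half_r = pearson_r(odd_half, even_half)
# Spearman-Brown prophecy formula: estimate full-length reliability from a half-test.
full_r = 2 * half_r / (1 + half_r)
print(f"Half-test r = {half_r:.2f}, full-test reliability = {full_r:.2f}")
```

The Spearman-Brown step-up is needed because a half-length test is, other things being equal, less reliable than the full test.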
  • 30. Rational Equivalence/Kuder-Richardson Method: This method also provides a measure of internal consistency. It requires neither the administration of two equivalent forms of the test nor splitting the test into two equal halves. The reliability coefficient is determined using Kuder-Richardson formula 20 (KR-20): r = (k / (k − 1)) × (1 − Σpq / σ²), where k is the number of items, p is the proportion of examinees answering an item correctly, q = 1 − p, and σ² is the variance of the total scores.
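KR-20 can be computed directly from an item-response matrix. A minimal sketch assuming dichotomously scored items (1 = correct, 0 = wrong); the data are made up for illustration:

```python
# Hypothetical item scores (1 = correct, 0 = wrong) for six students on a 5-item test.
items = [
    [1, 1, 0, 1, 1],
    [1, 0, 1, 1, 0],
    [0, 1, 1, 0, 1],
    [1, 1, 1, 1, 1],
    [0, 0, 0, 0, 1],
    [1, 1, 1, 1, 1],
]

k = len(items[0])   # number of items
n = len(items)      # number of examinees

# p = proportion of examinees answering each item correctly; q = 1 - p.
p = [sum(row[j] for row in items) / n for j in range(k)]
sum_pq = sum(pi * (1 - pi) for pi in p)

# Variance of the total scores (population variance).
totals = [sum(row) for row in items]
mean_t = sum(totals) / n
var_t = sum((t - mean_t) ** 2 for t in totals) / n

# Kuder-Richardson formula 20.
kr20 = (k / (k - 1)) * (1 - sum_pq / var_t)
print(f"KR-20 reliability = {kr20:.2f}")
```

Because KR-20 uses every item's difficulty, it avoids the arbitrariness of choosing one particular split of the test.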
  • 31. Factors affecting reliability:- 1. Factors related to test: (i) length of the test (ii) content of the test (iii) characteristics of items (iv) spread of scores
  • 32. 2. Factors related to testee: (i) Heterogeneity of the group (ii) Test wiseness of the students (iii) Motivation of the students 3. Factors related to testing procedures: (i) Time limit of test (ii) Cheating opportunity given to the students
  • 34. • OBJECTIVITY  refers to the agreement of two or more raters or test administrators concerning the score of the student.
  • 35. Other definitions given by experts: C.V. Good (1973) defines objectivity in testing as “the extent to which the instrument is free from personal error (personal bias), that is subjectivity on the part of the scorer”.
  • 36. Gronlund and Linn (1995) state “Objectivity of a test refers to the degree to which equally competent scorers obtain the same results. So a test is considered objective when it makes for the elimination of the scorer’s personal opinion and biased judgement. In this context there are two aspects of objectivity which should be kept in mind while constructing a test.”
  • 37. Two aspects of objectivity which should be kept in mind while constructing a test 1. Objectivity in Scoring -means the same person or different persons scoring the test at any time arrive at the same result without any chance error. A test, to be objective, must necessarily be so worded that only the correct answer can be given to it.
  • 38. 2. Objectivity of Test Items -means that the item must call for a definite single answer. Well-constructed test items should lend themselves to one and only one interpretation by students who know the material involved. It means the test items should be free from ambiguity.
  • 40. • FAIRNESS - means the test item should not have any biases. It should not be offensive to any examinee subgroup. -a test can only be good if it is fair to all the examinees -a fair assessment provides all students with an equal opportunity to demonstrate achievement
  • 41. The keys to fairness are as follows: Students have knowledge of the learning targets and assessment. Students are given equal opportunity to learn. Students possess the pre-requisite knowledge and skills. Students are free from teacher stereotypes. Students are free from biased assessment tasks and procedures.
  • 43. • SCORABILITY - means that the test should be easy to score; directions for scoring should be clearly stated in the instructions. Provide the students with an answer sheet, and provide the answer key to the one who will check the test.
  • 45. • ADEQUACY -means that the test should contain a wide sampling of items to determine the educational outcomes or abilities, so that the resulting scores are representative of the total performance in the areas measured.
  • 47. • ADMINISTRABILITY - means that the test should be administered uniformly to all students so that the scores obtained will not vary due to factors other than differences in the students’ knowledge and skills. There should be clear instructions for the students, the proctors, and even the one who will check the test or the test scorer.
  • 49. • PRACTICALITY AND EFFICIENCY -refers to the teacher’s familiarity with the methods used, time required for the assessment, complexity of the administration, ease of scoring, ease of interpretation of the test results and the materials used must be at the lowest cost.
  • 51. • BALANCE -a balanced assessment sets targets in all domains of learning (cognitive, affective and psychomotor) or domains of intelligence (verbal-linguistic, logical-mathematical, bodily-kinesthetic, visual-spatial, musical-rhythmic, interpersonal-social, intrapersonal-introspective, naturalist-physical world, existential-spiritual) -makes use of both traditional and alternative assessment