Certified Tester
Foundation Level Syllabus

Released Version 2011

International Software Testing Qualifications Board

Copyright Notice
This document may be copied in its entirety, or extracts made, if the source is acknowledged.
Copyright Notice © International Software Testing Qualifications Board (hereinafter called ISTQB®)
ISTQB is a registered trademark of the International Software Testing Qualifications Board.
Copyright © 2011 the authors for the update 2011 (Thomas Müller (chair), Debra Friedenberg, and the ISTQB WG Foundation Level)
Copyright © 2010 the authors for the update 2010 (Thomas Müller (chair), Armin Beer, Martin Klonk, Rahul Verma)
Copyright © 2007 the authors for the update 2007 (Thomas Müller (chair), Dorothy Graham, Debra Friedenberg and Erik van Veenendaal)
Copyright © 2005, the authors (Thomas Müller (chair), Rex Black, Sigrid Eldh, Dorothy Graham, Klaus Olsen, Maaret Pyhäjärvi, Geoff Thompson and Erik van Veenendaal).
All rights reserved.

The authors hereby transfer the copyright to the International Software Testing Qualifications Board (ISTQB). The authors (as current copyright holders) and ISTQB (as the future copyright holder) have agreed to the following conditions of use:
1) Any individual or training company may use this syllabus as the basis for a training course if the authors and the ISTQB are acknowledged as the source and copyright owners of the syllabus and provided that any advertisement of such a training course may mention the syllabus only after submission for official accreditation of the training materials to an ISTQB recognized National Board.
2) Any individual or group of individuals may use this syllabus as the basis for articles, books, or other derivative writings if the authors and the ISTQB are acknowledged as the source and copyright owners of the syllabus.
3) Any ISTQB-recognized National Board may translate this syllabus and license the syllabus (or its translation) to other parties.


Revision History

Version    | Date                  | Remarks
ISTQB 2011 | Effective 1-Apr-2011  | Certified Tester Foundation Level Syllabus Maintenance Release – see Appendix E – Release Notes
ISTQB 2010 | Effective 30-Mar-2010 | Certified Tester Foundation Level Syllabus Maintenance Release – see Appendix E – Release Notes
ISTQB 2007 | 01-May-2007           | Certified Tester Foundation Level Syllabus Maintenance Release
ISTQB 2005 | 01-July-2005          | Certified Tester Foundation Level Syllabus
ASQF V2.2  | July-2003             | ASQF Syllabus Foundation Level Version 2.2 “Lehrplan Grundlagen des Software-testens“
ISEB V2.0  | 25-Feb-1999           | ISEB Software Testing Foundation Syllabus V2.0, 25 February 1999


Table of Contents

Acknowledgements
Introduction to this Syllabus
    Purpose of this Document
    The Certified Tester Foundation Level in Software Testing
    Learning Objectives/Cognitive Level of Knowledge
    The Examination
    Accreditation
    Level of Detail
    How this Syllabus is Organized
1. Fundamentals of Testing (K2)
    1.1 Why is Testing Necessary (K2)
        1.1.1 Software Systems Context (K1)
        1.1.2 Causes of Software Defects (K2)
        1.1.3 Role of Testing in Software Development, Maintenance and Operations (K2)
        1.1.4 Testing and Quality (K2)
        1.1.5 How Much Testing is Enough? (K2)
    1.2 What is Testing? (K2)
    1.3 Seven Testing Principles (K2)
    1.4 Fundamental Test Process (K1)
        1.4.1 Test Planning and Control (K1)
        1.4.2 Test Analysis and Design (K1)
        1.4.3 Test Implementation and Execution (K1)
        1.4.4 Evaluating Exit Criteria and Reporting (K1)
        1.4.5 Test Closure Activities (K1)
    1.5 The Psychology of Testing (K2)
    1.6 Code of Ethics
2. Testing Throughout the Software Life Cycle (K2)
    2.1 Software Development Models (K2)
        2.1.1 V-model (Sequential Development Model) (K2)
        2.1.2 Iterative-incremental Development Models (K2)
        2.1.3 Testing within a Life Cycle Model (K2)
    2.2 Test Levels (K2)
        2.2.1 Component Testing (K2)
        2.2.2 Integration Testing (K2)
        2.2.3 System Testing (K2)
        2.2.4 Acceptance Testing (K2)
    2.3 Test Types (K2)
        2.3.1 Testing of Function (Functional Testing) (K2)
        2.3.2 Testing of Non-functional Software Characteristics (Non-functional Testing) (K2)
        2.3.3 Testing of Software Structure/Architecture (Structural Testing) (K2)
        2.3.4 Testing Related to Changes: Re-testing and Regression Testing (K2)
    2.4 Maintenance Testing (K2)
3. Static Techniques (K2)
    3.1 Static Techniques and the Test Process (K2)
    3.2 Review Process (K2)
        3.2.1 Activities of a Formal Review (K1)
        3.2.2 Roles and Responsibilities (K1)
        3.2.3 Types of Reviews (K2)
        3.2.4 Success Factors for Reviews (K2)
    3.3 Static Analysis by Tools (K2)
4. Test Design Techniques (K4)
    4.1 The Test Development Process (K3)
    4.2 Categories of Test Design Techniques (K2)
    4.3 Specification-based or Black-box Techniques (K3)
        4.3.1 Equivalence Partitioning (K3)
        4.3.2 Boundary Value Analysis (K3)
        4.3.3 Decision Table Testing (K3)
        4.3.4 State Transition Testing (K3)
        4.3.5 Use Case Testing (K2)
    4.4 Structure-based or White-box Techniques (K4)
        4.4.1 Statement Testing and Coverage (K4)
        4.4.2 Decision Testing and Coverage (K4)
        4.4.3 Other Structure-based Techniques (K1)
    4.5 Experience-based Techniques (K2)
    4.6 Choosing Test Techniques (K2)
5. Test Management (K3)
    5.1 Test Organization (K2)
        5.1.1 Test Organization and Independence (K2)
        5.1.2 Tasks of the Test Leader and Tester (K1)
    5.2 Test Planning and Estimation (K3)
        5.2.1 Test Planning (K2)
        5.2.2 Test Planning Activities (K3)
        5.2.3 Entry Criteria (K2)
        5.2.4 Exit Criteria (K2)
        5.2.5 Test Estimation (K2)
        5.2.6 Test Strategy, Test Approach (K2)
    5.3 Test Progress Monitoring and Control (K2)
        5.3.1 Test Progress Monitoring (K1)
        5.3.2 Test Reporting (K2)
        5.3.3 Test Control (K2)
    5.4 Configuration Management (K2)
    5.5 Risk and Testing (K2)
        5.5.1 Project Risks (K2)
        5.5.2 Product Risks (K2)
    5.6 Incident Management (K3)
6. Tool Support for Testing (K2)
    6.1 Types of Test Tools (K2)
        6.1.1 Tool Support for Testing (K2)
        6.1.2 Test Tool Classification (K2)
        6.1.3 Tool Support for Management of Testing and Tests (K1)
        6.1.4 Tool Support for Static Testing (K1)
        6.1.5 Tool Support for Test Specification (K1)
        6.1.6 Tool Support for Test Execution and Logging (K1)
        6.1.7 Tool Support for Performance and Monitoring (K1)
        6.1.8 Tool Support for Specific Testing Needs (K1)
    6.2 Effective Use of Tools: Potential Benefits and Risks (K2)
        6.2.1 Potential Benefits and Risks of Tool Support for Testing (for all tools) (K2)
        6.2.2 Special Considerations for Some Types of Tools (K1)
    6.3 Introducing a Tool into an Organization (K1)
7. References
    Standards
    Books
8. Appendix A – Syllabus Background
    History of this Document
    Objectives of the Foundation Certificate Qualification
    Objectives of the International Qualification (adapted from ISTQB meeting at Sollentuna, November 2001)
    Entry Requirements for this Qualification
    Background and History of the Foundation Certificate in Software Testing
9. Appendix B – Learning Objectives/Cognitive Level of Knowledge
    Level 1: Remember (K1)
    Level 2: Understand (K2)
    Level 3: Apply (K3)
    Level 4: Analyze (K4)
10. Appendix C – Rules Applied to the ISTQB Foundation Syllabus
    10.1.1 General Rules
    10.1.2 Current Content
    10.1.3 Learning Objectives
    10.1.4 Overall Structure
11. Appendix D – Notice to Training Providers
12. Appendix E – Release Notes
    Release 2010
    Release 2011
13. Index


Acknowledgements

International Software Testing Qualifications Board Working Group Foundation Level (Edition 2011): Thomas Müller (chair), Debra Friedenberg. The core team thanks the review team (Dan Almog, Armin Beer, Rex Black, Julie Gardiner, Judy McKay, Tuula Pääkkönen, Eric Riou du Cosquier, Hans Schaefer, Stephanie Ulrich, Erik van Veenendaal) and all National Boards for the suggestions for the current version of the syllabus.

International Software Testing Qualifications Board Working Group Foundation Level (Edition 2010): Thomas Müller (chair), Rahul Verma, Martin Klonk and Armin Beer. The core team thanks the review team (Rex Black, Mette Bruhn-Pedersen, Debra Friedenberg, Klaus Olsen, Judy McKay, Tuula Pääkkönen, Meile Posthuma, Hans Schaefer, Stephanie Ulrich, Pete Williams, Erik van Veenendaal) and all National Boards for their suggestions.

International Software Testing Qualifications Board Working Group Foundation Level (Edition 2007): Thomas Müller (chair), Dorothy Graham, Debra Friedenberg, and Erik van Veenendaal. The core team thanks the review team (Hans Schaefer, Stephanie Ulrich, Meile Posthuma, Anders Pettersson, and Wonil Kwon) and all the National Boards for their suggestions.

International Software Testing Qualifications Board Working Group Foundation Level (Edition 2005): Thomas Müller (chair), Rex Black, Sigrid Eldh, Dorothy Graham, Klaus Olsen, Maaret Pyhäjärvi, Geoff Thompson and Erik van Veenendaal and the review team and all National Boards for their suggestions.


Introduction to this Syllabus

Purpose of this Document
This syllabus forms the basis for the International Software Testing Qualification at the Foundation Level. The International Software Testing Qualifications Board (ISTQB) provides it to the National Boards for them to accredit the training providers and to derive examination questions in their local language. Training providers will determine appropriate teaching methods and produce courseware for accreditation. The syllabus will help candidates in their preparation for the examination.
Information on the history and background of the syllabus can be found in Appendix A.

The Certified Tester Foundation Level in Software Testing
The Foundation Level qualification is aimed at anyone involved in software testing. This includes people in roles such as testers, test analysts, test engineers, test consultants, test managers, user acceptance testers and software developers. This Foundation Level qualification is also appropriate for anyone who wants a basic understanding of software testing, such as project managers, quality managers, software development managers, business analysts, IT directors and management consultants. Holders of the Foundation Certificate will be able to go on to a higher-level software testing qualification.

Learning Objectives/Cognitive Level of Knowledge
Learning objectives are indicated for each section in this syllabus and classified as follows:
o K1: remember
o K2: understand
o K3: apply
o K4: analyze
Further details and examples of learning objectives are given in Appendix B.
All terms listed under “Terms” just below chapter headings shall be remembered (K1), even if not explicitly mentioned in the learning objectives.

The Examination
The Foundation Level Certificate examination will be based on this syllabus. Answers to examination questions may require the use of material based on more than one section of this syllabus. All sections of the syllabus are examinable.
The format of the examination is multiple choice.
Exams may be taken as part of an accredited training course or taken independently (e.g., at an examination center or in a public exam). Completion of an accredited training course is not a prerequisite for the exam.

Accreditation
An ISTQB National Board may accredit training providers whose course material follows this syllabus. Training providers should obtain accreditation guidelines from the board or body that performs the accreditation. An accredited course is recognized as conforming to this syllabus, and is allowed to have an ISTQB examination as part of the course.
Further guidance for training providers is given in Appendix D.


Level of Detail
The level of detail in this syllabus allows internationally consistent teaching and examination. In order to achieve this goal, the syllabus consists of:
o General instructional objectives describing the intention of the Foundation Level
o A list of information to teach, including a description, and references to additional sources if required
o Learning objectives for each knowledge area, describing the cognitive learning outcome and mindset to be achieved
o A list of terms that students must be able to recall and understand
o A description of the key concepts to teach, including sources such as accepted literature or standards
The syllabus content is not a description of the entire knowledge area of software testing; it reflects the level of detail to be covered in Foundation Level training courses.

How this Syllabus is Organized
There are six major chapters. The top-level heading for each chapter shows the highest level of learning objectives that is covered within the chapter and specifies the time for the chapter. For example:

    2. Testing Throughout the Software Life Cycle (K2)    115 minutes

This heading shows that Chapter 2 has learning objectives of K1 (assumed when a higher level is shown) and K2 (but not K3), and it is intended to take 115 minutes to teach the material in the chapter. Within each chapter there are a number of sections. Each section also has the learning objectives and the amount of time required. Subsections that do not have a time given are included within the time for the section.


1. Fundamentals of Testing (K2)    155 minutes

Learning Objectives for Fundamentals of Testing
The objectives identify what you will be able to do following the completion of each module.

1.1 Why is Testing Necessary? (K2)
LO-1.1.1 Describe, with examples, the way in which a defect in software can cause harm to a person, to the environment or to a company (K2)
LO-1.1.2 Distinguish between the root cause of a defect and its effects (K2)
LO-1.1.3 Give reasons why testing is necessary by giving examples (K2)
LO-1.1.4 Describe why testing is part of quality assurance and give examples of how testing contributes to higher quality (K2)
LO-1.1.5 Explain and compare the terms error, defect, fault, failure, and the corresponding terms mistake and bug, using examples (K2)

1.2 What is Testing? (K2)
LO-1.2.1 Recall the common objectives of testing (K1)
LO-1.2.2 Provide examples for the objectives of testing in different phases of the software life cycle (K2)
LO-1.2.3 Differentiate testing from debugging (K2)

1.3 Seven Testing Principles (K2)
LO-1.3.1 Explain the seven principles in testing (K2)

1.4 Fundamental Test Process (K1)
LO-1.4.1 Recall the five fundamental test activities and respective tasks from planning to closure (K1)

1.5 The Psychology of Testing (K2)
LO-1.5.1 Recall the psychological factors that influence the success of testing (K1)
LO-1.5.2 Contrast the mindset of a tester and of a developer (K2)


1.1 Why is Testing Necessary (K2)    20 minutes

Terms
Bug, defect, error, failure, fault, mistake, quality, risk

1.1.1 Software Systems Context (K1)
Software systems are an integral part of life, from business applications (e.g., banking) to consumer products (e.g., cars). Most people have had an experience with software that did not work as expected. Software that does not work correctly can lead to many problems, including loss of money, time or business reputation, and could even cause injury or death.

1.1.2 Causes of Software Defects (K2)
A human being can make an error (mistake), which produces a defect (fault, bug) in the program code, or in a document. If a defect in code is executed, the system may fail to do what it should do (or do something it shouldn’t), causing a failure. Defects in software, systems or documents may result in failures, but not all defects do so.
Defects occur because human beings are fallible and because there is time pressure, complex code, complexity of infrastructure, changing technologies, and/or many system interactions.
Failures can be caused by environmental conditions as well. For example, radiation, magnetism, electronic fields, and pollution can cause faults in firmware or influence the execution of software by changing the hardware conditions.
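
To make the chain from error to defect to failure concrete, here is a minimal sketch (not from the syllabus; the function and values are invented for illustration) in which a programmer's mistake introduces a defect that causes a failure only for some inputs:

    # Hypothetical example: the programmer's error was dividing by a
    # hard-coded 2 instead of len(values); that line is the defect in the code.
    def average(values):
        return sum(values) / 2

    print(average([4, 6]))     # 5.0 - correct by coincidence, no failure observed
    print(average([4, 6, 8]))  # 9.0 - failure: the expected result is 6.0

Executing the defective code with the second input exposes the failure; with the first input the defect stays hidden, which is why not every defect leads to a failure.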

1.1.3 Role of Testing in Software Development, Maintenance and Operations (K2)
Rigorous testing of systems and documentation can help to reduce the risk of problems occurring during operation and contribute to the quality of the software system, if the defects found are corrected before the system is released for operational use.
Software testing may also be required to meet contractual or legal requirements, or industry-specific standards.

1.1.4 Testing and Quality (K2)
With the help of testing, it is possible to measure the quality of software in terms of defects found, for both functional and non-functional software requirements and characteristics (e.g., reliability, usability, efficiency, maintainability and portability). For more information on non-functional testing see Chapter 2; for more information on software characteristics see ‘Software Engineering – Software Product Quality’ (ISO 9126).
Testing can give confidence in the quality of the software if it finds few or no defects. A properly designed test that passes reduces the overall level of risk in a system. When testing does find defects, the quality of the software system increases when those defects are fixed.
Lessons should be learned from previous projects. By understanding the root causes of defects found in other projects, processes can be improved, which in turn should prevent those defects from reoccurring and, as a consequence, improve the quality of future systems. This is an aspect of quality assurance.
Testing should be integrated as one of the quality assurance activities (i.e., alongside development standards, training and defect analysis).

1.1.5 How Much Testing is Enough? (K2)
Deciding how much testing is enough should take account of the level of risk, including technical, safety, and business risks, and project constraints such as time and budget. Risk is discussed further in Chapter 5.
Testing should provide sufficient information to stakeholders to make informed decisions about the release of the software or system being tested, for the next development step or handover to customers.


1.2 What is Testing? (K2)    30 minutes

Terms
Debugging, requirement, review, test case, testing, test objective

Background
A common perception of testing is that it only consists of running tests, i.e., executing the software. This is part of testing, but not all of the testing activities.
Test activities exist before and after test execution. These activities include planning and control, choosing test conditions, designing and executing test cases, checking results, evaluating exit criteria, reporting on the testing process and system under test, and finalizing or completing closure activities after a test phase has been completed. Testing also includes reviewing documents (including source code) and conducting static analysis.
Both dynamic testing and static testing can be used as a means for achieving similar objectives, and will provide information that can be used to improve both the system being tested and the development and testing processes.
Testing can have the following objectives:
o Finding defects
o Gaining confidence about the level of quality
o Providing information for decision-making
o Preventing defects
The thought process and activities involved in designing tests early in the life cycle (verifying the test basis via test design) can help to prevent defects from being introduced into code. Reviews of documents (e.g., requirements) and the identification and resolution of issues also help to prevent defects appearing in the code.
Different viewpoints in testing take different objectives into account. For example, in development testing (e.g., component, integration and system testing), the main objective may be to cause as many failures as possible so that defects in the software are identified and can be fixed. In acceptance testing, the main objective may be to confirm that the system works as expected, to gain confidence that it has met the requirements. In some cases the main objective of testing may be to assess the quality of the software (with no intention of fixing defects), to give information to stakeholders of the risk of releasing the system at a given time. Maintenance testing often includes testing that no new defects have been introduced during development of the changes. During operational testing, the main objective may be to assess system characteristics such as reliability or availability.
Debugging and testing are different. Dynamic testing can show failures that are caused by defects. Debugging is the development activity that finds, analyzes and removes the cause of the failure. Subsequent re-testing by a tester ensures that the fix does indeed resolve the failure. The responsibility for these activities is usually testers test and developers debug.
The process of testing and the testing activities are explained in Section 1.4.


1.3 Seven Testing Principles (K2)    35 minutes

Terms
Exhaustive testing

Principles
A number of testing principles have been suggested over the past 40 years and offer general guidelines common for all testing.

Principle 1 – Testing shows presence of defects
Testing can show that defects are present, but cannot prove that there are no defects. Testing reduces the probability of undiscovered defects remaining in the software but, even if no defects are found, it is not a proof of correctness.

Principle 2 – Exhaustive testing is impossible
Testing everything (all combinations of inputs and preconditions) is not feasible except for trivial cases. Instead of exhaustive testing, risk analysis and priorities should be used to focus testing efforts.
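
As a rough illustration of the combinatorial growth behind this principle (the form and its numbers below are invented, not taken from the syllabus), even a small input space is already too large to test exhaustively:

    # Hypothetical input form: 5 independent fields, 20 valid values each.
    fields = 5
    values_per_field = 20
    combinations = values_per_field ** fields
    print(combinations)  # 3200000 combinations of valid values alone

Invalid values, field interactions and differing preconditions multiply this figure further, which is why risk analysis and priorities are used to decide where to focus the testing effort.
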
Principle 3 – Early testing
To find defects early, testing activities shall be started as early as possible in the software or system development life cycle, and shall be focused on defined objectives.

Principle 4 – Defect clustering
Testing effort shall be focused proportionally to the expected and later observed defect density of modules. A small number of modules usually contains most of the defects discovered during pre-release testing, or is responsible for most of the operational failures.

Principle 5 – Pesticide paradox
If the same tests are repeated over and over again, eventually the same set of test cases will no longer find any new defects. To overcome this “pesticide paradox”, test cases need to be regularly reviewed and revised, and new and different tests need to be written to exercise different parts of the software or system to find potentially more defects.

Principle 6 – Testing is context dependent
Testing is done differently in different contexts. For example, safety-critical software is tested differently from an e-commerce site.

Principle 7 – Absence-of-errors fallacy
Finding and fixing defects does not help if the system built is unusable and does not fulfill the users’ needs and expectations.


1.4 Fundamental Test Process (K1)    35 minutes

Terms
Confirmation testing, re-testing, exit criteria, incident, regression testing, test basis, test condition, test coverage, test data, test execution, test log, test plan, test procedure, test policy, test suite, test summary report, testware

Background
The most visible part of testing is test execution. But to be effective and efficient, test plans should also include time to be spent on planning the tests, designing test cases, preparing for execution and evaluating results.
The fundamental test process consists of the following main activities:
o Test planning and control
o Test analysis and design
o Test implementation and execution
o Evaluating exit criteria and reporting
o Test closure activities
Although logically sequential, the activities in the process may overlap or take place concurrently. Tailoring these main activities within the context of the system and the project is usually required.

1.4.1 Test Planning and Control (K1)
Test planning is the activity of defining the objectives of testing and the specification of test activities in order to meet the objectives and mission.
Test control is the ongoing activity of comparing actual progress against the plan, and reporting the status, including deviations from the plan. It involves taking actions necessary to meet the mission and objectives of the project. In order to control testing, the testing activities should be monitored throughout the project. Test planning takes into account the feedback from monitoring and control activities.
Test planning and control tasks are defined in Chapter 5 of this syllabus.

1.4.2 Test Analysis and Design (K1)
Test analysis and design is the activity during which general testing objectives are transformed into tangible test conditions and test cases.
The test analysis and design activity has the following major tasks:
o Reviewing the test basis (such as requirements, software integrity level [1] (risk level), risk analysis reports, architecture, design, interface specifications)
o Evaluating testability of the test basis and test objects
o Identifying and prioritizing test conditions based on analysis of test items, the specification, behavior and structure of the software
o Designing and prioritizing high level test cases
o Identifying necessary test data to support the test conditions and test cases
o Designing the test environment setup and identifying any required infrastructure and tools
o Creating bi-directional traceability between test basis and test cases (a minimal sketch follows below)

[1] The degree to which software complies or must comply with a set of stakeholder-selected software and/or software-based system characteristics (e.g., software complexity, risk assessment, safety level, security level, desired performance, reliability, or cost) which are defined to reflect the importance of the software to its stakeholders.
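
The bi-directional traceability named in the last task can be kept as a simple two-way mapping between test basis items and test cases. The sketch below is only an illustration; the requirement and test case identifiers are invented:

    # Forward traceability: which test cases cover each requirement?
    requirement_to_tests = {
        "REQ-01": ["TC-001", "TC-002"],
        "REQ-02": ["TC-003"],
    }

    # Backward traceability: which requirement does each test case come from?
    test_to_requirements = {}
    for req, tests in requirement_to_tests.items():
        for tc in tests:
            test_to_requirements.setdefault(tc, []).append(req)

    print(test_to_requirements["TC-003"])  # ['REQ-02']

With both directions available, an uncovered requirement or a test case without a source in the test basis is easy to spot.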


1.4.3 Test Implementation and Execution (K1)
Test implementation and execution is the activity where test procedures or scripts are specified by combining the test cases in a particular order and including any other information needed for test execution, the environment is set up and the tests are run.
Test implementation and execution has the following major tasks:
o Finalizing, implementing and prioritizing test cases (including the identification of test data)
o Developing and prioritizing test procedures, creating test data and, optionally, preparing test harnesses and writing automated test scripts
o Creating test suites from the test procedures for efficient test execution
o Verifying that the test environment has been set up correctly
o Verifying and updating bi-directional traceability between the test basis and test cases
o Executing test procedures either manually or by using test execution tools, according to the planned sequence
o Logging the outcome of test execution and recording the identities and versions of the software under test, test tools and testware
o Comparing actual results with expected results (a minimal sketch follows this list)
o Reporting discrepancies as incidents and analyzing them in order to establish their cause (e.g., a defect in the code, in specified test data, in the test document, or a mistake in the way the test was executed)
o Repeating test activities as a result of action taken for each discrepancy, for example, re-execution of a test that previously failed in order to confirm a fix (confirmation testing), execution of a corrected test and/or execution of tests in order to ensure that defects have not been introduced in unchanged areas of the software or that defect fixing did not uncover other defects (regression testing)
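
Several of these tasks (executing test procedures, comparing actual with expected results, logging the outcome and reporting discrepancies) are often automated. The following sketch is only illustrative; the test identifiers and the function under test are invented:

    import logging

    logging.basicConfig(level=logging.INFO)

    def run_test_case(test_id, test_input, expected, function_under_test):
        actual = function_under_test(*test_input)   # execute the test
        passed = (actual == expected)                # compare actual vs. expected
        logging.info("%s: input=%r expected=%r actual=%r verdict=%s",
                     test_id, test_input, expected, actual,
                     "PASS" if passed else "FAIL")   # log the outcome
        if not passed:
            # In practice the discrepancy would be reported as an incident
            # and analyzed to establish its cause.
            logging.warning("%s: discrepancy to be reported as an incident", test_id)
        return passed

    def add(a, b):   # trivial function under test, invented for the example
        return a + b

    run_test_case("TC-001", (2, 3), 5, add)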

1.4.4 Evaluating Exit Criteria and Reporting (K1)
Evaluating exit criteria is the activity where test execution is assessed against the defined objectives. This should be done for each test level (see Section 2.2).
Evaluating exit criteria has the following major tasks:
o Checking test logs against the exit criteria specified in test planning (see the sketch below)
o Assessing if more tests are needed or if the exit criteria specified should be changed
o Writing a test summary report for stakeholders
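
As a simple illustration of checking test logs against exit criteria, the thresholds and figures below are invented; real values come from the test plan:

    # Hypothetical exit criteria and test log figures.
    exit_criteria = {"min_pass_rate": 0.95, "max_open_critical_defects": 0}
    test_log = {"executed": 200, "passed": 188, "open_critical_defects": 1}

    pass_rate = test_log["passed"] / test_log["executed"]
    criteria_met = (pass_rate >= exit_criteria["min_pass_rate"] and
                    test_log["open_critical_defects"] <= exit_criteria["max_open_critical_defects"])

    print(f"pass rate {pass_rate:.0%}, exit criteria met: {criteria_met}")
    # If the criteria are not met, either more tests are needed or the
    # criteria themselves have to be reconsidered, as described above.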

1.4.5 Test Closure Activities (K1)
Test closure activities collect data from completed test activities to consolidate experience, testware, facts and numbers. Test closure activities occur at project milestones such as when a software system is released, a test project is completed (or cancelled), a milestone has been achieved, or a maintenance release has been completed.


Test closure activities include the following major tasks:
o Checking which planned deliverables have been delivered
o Closing incident reports or raising change records for any that remain open
o Documenting the acceptance of the system
o Finalizing and archiving testware, the test environment and the test infrastructure for later reuse
o Handing over the testware to the maintenance organization
o Analyzing lessons learned to determine changes needed for future releases and projects
o Using the information gathered to improve test maturity


1.5 The Psychology of Testing (K2)    25 minutes

Terms
Error guessing, independence

Background
The mindset to be used while testing and reviewing is different from that used while developing software. With the right mindset developers are able to test their own code, but separation of this responsibility to a tester is typically done to help focus effort and provide additional benefits, such as an independent view by trained and professional testing resources. Independent testing may be carried out at any level of testing.
A certain degree of independence (avoiding the author bias) often makes the tester more effective at finding defects and failures. Independence is not, however, a replacement for familiarity, and developers can efficiently find many defects in their own code. Several levels of independence can be defined as shown here from low to high:
o Tests designed by the person(s) who wrote the software under test (low level of independence)
o Tests designed by another person(s) (e.g., from the development team)
o Tests designed by a person(s) from a different organizational group (e.g., an independent test team) or test specialists (e.g., usability or performance test specialists)
o Tests designed by a person(s) from a different organization or company (i.e., outsourcing or certification by an external body)
People and projects are driven by objectives. People tend to align their plans with the objectives set by management and other stakeholders, for example, to find defects or to confirm that software meets its objectives. Therefore, it is important to clearly state the objectives of testing.
Identifying failures during testing may be perceived as criticism against the product and against the author. As a result, testing is often seen as a destructive activity, even though it is very constructive in the management of product risks. Looking for failures in a system requires curiosity, professional pessimism, a critical eye, attention to detail, good communication with development peers, and experience on which to base error guessing.
If errors, defects or failures are communicated in a constructive way, bad feelings between the testers and the analysts, designers and developers can be avoided. This applies to defects found during reviews as well as in testing.
The tester and test leader need good interpersonal skills to communicate factual information about defects, progress and risks in a constructive way. For the author of the software or document, defect information can help them improve their skills. Defects found and fixed during testing will save time and money later, and reduce risks.
Communication problems may occur, particularly if testers are seen only as messengers of unwanted news about defects. However, there are several ways to improve communication and relationships between testers and others:


o Start with collaboration rather than battles – remind everyone of the common goal of better quality systems
o Communicate findings on the product in a neutral, fact-focused way without criticizing the person who created it, for example, write objective and factual incident reports and review findings
o Try to understand how the other person feels and why they react as they do
o Confirm that the other person has understood what you have said and vice versa


1.6 Code of Ethics    10 minutes

Involvement in software testing enables individuals to learn confidential and privileged information. A code of ethics is necessary, among other reasons to ensure that the information is not put to inappropriate use. Recognizing the ACM and IEEE code of ethics for engineers, the ISTQB states the following code of ethics:

PUBLIC - Certified software testers shall act consistently with the public interest
CLIENT AND EMPLOYER - Certified software testers shall act in a manner that is in the best interests of their client and employer, consistent with the public interest
PRODUCT - Certified software testers shall ensure that the deliverables they provide (on the products and systems they test) meet the highest professional standards possible
JUDGMENT - Certified software testers shall maintain integrity and independence in their professional judgment
MANAGEMENT - Certified software test managers and leaders shall subscribe to and promote an ethical approach to the management of software testing
PROFESSION - Certified software testers shall advance the integrity and reputation of the profession consistent with the public interest
COLLEAGUES - Certified software testers shall be fair to and supportive of their colleagues, and promote cooperation with software developers
SELF - Certified software testers shall participate in lifelong learning regarding the practice of their profession and shall promote an ethical approach to the practice of the profession

References
1.1.5 Black, 2001, Kaner, 2002
1.2 Beizer, 1990, Black, 2001, Myers, 1979
1.3 Beizer, 1990, Hetzel, 1988, Myers, 1979
1.4 Hetzel, 1988
1.4.5 Black, 2001, Craig, 2002
1.5 Black, 2001, Hetzel, 1988


2. Testing Throughout the Software Life Cycle (K2)    115 minutes

Learning Objectives for Testing Throughout the Software Life Cycle
The objectives identify what you will be able to do following the completion of each module.

2.1 Software Development Models (K2)
LO-2.1.1 Explain the relationship between development, test activities and work products in the development life cycle, by giving examples using project and product types (K2)
LO-2.1.2 Recognize the fact that software development models must be adapted to the context of project and product characteristics (K1)
LO-2.1.3 Recall characteristics of good testing that are applicable to any life cycle model (K1)

2.2 Test Levels (K2)
LO-2.2.1 Compare the different levels of testing: major objectives, typical objects of testing, typical targets of testing (e.g., functional or structural) and related work products, people who test, types of defects and failures to be identified (K2)

2.3 Test Types (K2)
LO-2.3.1 Compare four software test types (functional, non-functional, structural and change-related) by example (K2)
LO-2.3.2 Recognize that functional and structural tests occur at any test level (K1)
LO-2.3.3 Identify and describe non-functional test types based on non-functional requirements (K2)
LO-2.3.4 Identify and describe test types based on the analysis of a software system’s structure or architecture (K2)
LO-2.3.5 Describe the purpose of confirmation testing and regression testing (K2)

2.4 Maintenance Testing (K2)
LO-2.4.1 Compare maintenance testing (testing an existing system) to testing a new application with respect to test types, triggers for testing and amount of testing (K2)
LO-2.4.2 Recognize indicators for maintenance testing (modification, migration and retirement) (K1)
LO-2.4.3 Describe the role of regression testing and impact analysis in maintenance (K2)


2.1 Software Development Models (K2)    20 minutes

Terms
Commercial Off-The-Shelf (COTS), iterative-incremental development model, validation, verification, V-model

Backgr
round
Testing d
does not exis in isolation test activiti are relate to softwar developme activities.
st
n;
ies
ed
re
ent
Different developmen life cycle m
t
nt
models need different approaches to testing.
d

2.1.1

V-model (Sequential Develo
opment Mo
odel) (K2)

Although variants of the V-model exist, a com
h
l
mmon type of V-model us four test levels,
f
ses
correspo
onding to the four develop
e
pment levels
s.
r
bus
The four levels used in this syllab are:
o Com
mponent (unit testing
t)
o Integ
gration testin
ng
o Syst
tem testing
o Acce
eptance testi
ing
In practic a V-mode may have more, fewer or different levels of dev
ce,
el
r
velopment an testing,
nd
dependin on the pro
ng
oject and the software pr
e
roduct. For example, ther may be co
re
omponent
integratio testing aft compone testing, an system in
on
ter
ent
nd
ntegration tes
sting after sy
ystem testing
g.
Software work produ
e
ucts (such as business sc
s
cenarios or use cases, re
u
equirements s
specification
ns,
design d
documents an code) pro
nd
oduced during developme are often the basis of testing in on or
ent
f
ne
more tes levels. Ref
st
ferences for g
generic work products include Capab
k
bility Maturity Model Integ
y
gration
(CMMI) o ‘Software life cycle pro
or
ocesses’ (IEE
EE/IEC 1220 Verification and valid
07).
dation (and early
e
test design) can be c
carried out du
uring the dev
velopment of the software work produ
f
e
ucts.

2.1.2 Iterative-incremental Development Models (K2)

Iterative-incremental development is the process of establishing requirements, designing, building and testing a system in a series of short development cycles. Examples are: prototyping, Rapid Application Development (RAD), Rational Unified Process (RUP) and agile development models. A system that is produced using these models may be tested at several test levels during each iteration. An increment, added to others developed previously, forms a growing partial system, which should also be tested. Regression testing is increasingly important on all iterations after the first one. Verification and validation can be carried out on each increment.

2.1.3 Testing within a Life Cycle Model (K2)

In any life cycle model, there are several characteristics of good testing:
o For every development activity there is a corresponding testing activity
o Each test level has test objectives specific to that level
o The analysis and design of tests for a given test level should begin during the corresponding development activity
o Testers should be involved in reviewing documents as soon as drafts are available in the development life cycle
Test levels can be combined or reorganized depending on the nature of the project or the system architecture. For example, for the integration of a Commercial Off-The-Shelf (COTS) software product into a system, the purchaser may perform integration testing at the system level (e.g., integration to the infrastructure and other systems, or system deployment) and acceptance testing (functional and/or non-functional, and user and/or operational testing).


2.2 Test Levels (K2)

40 minutes

Terms
Alpha testing, beta testing, component testing, driver, field testing, functional requirement, integration, integration testing, non-functional requirement, robustness testing, stub, system testing, test environment, test level, test-driven development, user acceptance testing

Background
For each of the test levels, the following can be identified: the generic objectives, the work product(s) being referenced for deriving test cases (i.e., the test basis), the test object (i.e., what is being tested), typical defects and failures to be found, test harness requirements and tool support, and specific approaches and responsibilities.
Testing of a system's configuration data shall be considered during test planning.

2.2.1 Component Testing (K2)

Test basis:
o Component requirements
o Detailed design
o Code
Typical test objects:
o Components
o Programs
o Data conversion / migration programs
o Database modules
Component testing (also known as unit, module or program testing) searches for defects in, and verifies the functioning of, software modules, programs, objects, classes, etc., that are separately testable. It may be done in isolation from the rest of the system, depending on the context of the development life cycle and the system. Stubs, drivers and simulators may be used.
Component testing may include testing of functionality and specific non-functional characteristics, such as resource-behavior (e.g., searching for memory leaks) or robustness testing, as well as structural testing (e.g., decision coverage). Test cases are derived from work products such as a specification of the component, the software design or the data model.
Typically, component testing occurs with access to the code being tested and with the support of a development environment, such as a unit test framework or debugging tool. In practice, component testing usually involves the programmer who wrote the code. Defects are typically fixed as soon as they are found, without formally managing these defects.
One approach to component testing is to prepare and automate test cases before coding. This is called a test-first approach or test-driven development. This approach is highly iterative and is based on cycles of developing test cases, then building and integrating small pieces of code, and executing the component tests, correcting any issues and iterating until they pass.
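As an illustration of the test-first cycle described above (an editorial sketch, not part of the syllabus), the following minimal Python example uses the standard unittest framework; the parse_price component is hypothetical, and its tests would be written and automated before the code they exercise:

```python
import unittest

# Hypothetical component under test: a small price parser, written only after
# the tests below were specified (test-first / test-driven development).
def parse_price(text: str) -> float:
    """Convert a price string such as '12.50 EUR' into a float."""
    value = text.strip().split()[0]
    return float(value)

class ParsePriceComponentTest(unittest.TestCase):
    """Component (unit) tests prepared and automated before coding."""

    def test_plain_number(self):
        self.assertEqual(parse_price("12.50 EUR"), 12.50)

    def test_leading_whitespace(self):
        self.assertEqual(parse_price("  7 EUR"), 7.0)

    def test_invalid_input_raises(self):
        with self.assertRaises(ValueError):
            parse_price("free")

if __name__ == "__main__":
    unittest.main()
```

Running the suite, letting it fail, writing just enough code to make it pass, and then iterating is one concrete way to realize the cycle described in the paragraph above.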


2.2.2 Integration Testing (K2)

Test basis:
o Software and system design
o Architecture
o Workflows
o Use cases
Typical test objects:
o Subsystems
o Database implementation
o Infrastructure
o Interfaces
o System configuration and configuration data
Integration testing tests interfaces between components, interactions with different parts of a system, such as the operating system, file system and hardware, and interfaces between systems.
There may be more than one level of integration testing and it may be carried out on test objects of varying size as follows:
1. Component integration testing tests the interactions between software components and is done after component testing
2. System integration testing tests the interactions between different systems or between hardware and software and may be done after system testing. In this case, the developing organization may control only one side of the interface. This might be considered as a risk. Business processes implemented as workflows may involve a series of systems. Cross-platform issues may be significant.
The greater the scope of integration, the more difficult it becomes to isolate defects to a specific component or system, which may lead to increased risk and additional time for troubleshooting.
Systematic integration strategies may be based on the system architecture (such as top-down and bottom-up), functional tasks, transaction processing sequences, or some other aspect of the system or components. In order to ease fault isolation and detect defects early, integration should normally be incremental rather than "big bang".
Testing of specific non-functional characteristics (e.g., performance) may be included in integration testing as well as functional testing.
At each stage of integration, testers concentrate solely on the integration itself. For example, if they are integrating module A with module B they are interested in testing the communication between the modules, not the functionality of the individual modules as that was done during component testing. Both functional and structural approaches may be used.
Ideally, testers should understand the architecture and influence integration planning. If integration tests are planned before components or systems are built, those components can be built in the order required for most efficient testing.
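A minimal sketch (an editorial illustration, assuming two hypothetical components: an invoice calculator and a tax service) of a component integration test in which a stub stands in for the other side of the interface, so that only the communication between the modules is exercised:

```python
import unittest
from unittest import mock

# Hypothetical component A: computes an invoice total by calling a separate
# tax-service component (component B) through a narrow interface.
def invoice_total(net_amount: float, tax_service) -> float:
    rate = tax_service.rate_for("standard")
    return round(net_amount * (1 + rate), 2)

class ComponentIntegrationTest(unittest.TestCase):
    """Focuses on the communication between A and B, not on B's internals.
    A stub (here a mock object) replaces component B so the interface can be
    tested before the real tax service is integrated."""

    def test_total_uses_rate_from_tax_service(self):
        tax_stub = mock.Mock()
        tax_stub.rate_for.return_value = 0.20   # stubbed answer from B

        self.assertEqual(invoice_total(100.0, tax_stub), 120.0)
        tax_stub.rate_for.assert_called_once_with("standard")

if __name__ == "__main__":
    unittest.main()
```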


2.2.3 System Testing (K2)

Test basis:
o System and software requirement specification
o Use cases
o Functional specification
o Risk analysis reports
Typical test objects:
o System, user and operation manuals
o System configuration and configuration data
System testing is concerned with the behavior of a whole system/product. The testing scope shall be clearly addressed in the Master and/or Level Test Plan for that test level.
In system testing, the test environment should correspond to the final target or production environment as much as possible in order to minimize the risk of environment-specific failures not being found in testing.
System testing may include tests based on risks and/or on requirements specifications, business processes, use cases, or other high level text descriptions or models of system behavior, interactions with the operating system, and system resources.
System testing should investigate functional and non-functional requirements of the system, and data quality characteristics. Testers also need to deal with incomplete or undocumented requirements. System testing of functional requirements starts by using the most appropriate specification-based (black-box) techniques for the aspect of the system to be tested. For example, a decision table may be created for combinations of effects described in business rules. Structure-based techniques (white-box) may then be used to assess the thoroughness of the testing with respect to a structural element, such as menu structure or web page navigation (see Chapter 4).
An independent test team often carries out system testing.
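As an editorial illustration of deriving test cases from business rules via a decision table (the discount rule and the discount_percent function below are hypothetical, used only as an example), each column of the table becomes one test case:

```python
import unittest

# Hypothetical business rule: a 10% discount is granted only to loyalty
# members whose order value is at least 100.
def discount_percent(is_member: bool, order_value: float) -> int:
    if is_member and order_value >= 100:
        return 10
    return 0

class DiscountDecisionTableTest(unittest.TestCase):
    """One check per column of the decision table (member? / value >= 100?)."""

    def test_decision_table_columns(self):
        table = [
            # (is_member, order_value, expected_discount)
            (True,  150.0, 10),   # member, high value   -> discount
            (True,   50.0,  0),   # member, low value    -> no discount
            (False, 150.0,  0),   # non-member, high     -> no discount
            (False,  50.0,  0),   # non-member, low      -> no discount
        ]
        for is_member, value, expected in table:
            self.assertEqual(discount_percent(is_member, value), expected)

if __name__ == "__main__":
    unittest.main()
```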

2.2.4 Acceptance Testing (K2)

Test basis:
o User requirements
o System requirements
o Use cases
o Business processes
o Risk analysis reports
Typical test objects:
o Business processes on fully integrated system
o Operational and maintenance processes
o User procedures
o Forms
o Reports
o Configuration data
Acceptance testing is often the responsibility of the customers or users of a system; other stakeholders may be involved as well.
The goal in acceptance testing is to establish confidence in the system, parts of the system or specific non-functional characteristics of the system. Finding defects is not the main focus in acceptance testing. Acceptance testing may assess the system's readiness for deployment and use, although it is not necessarily the final level of testing. For example, a large-scale system integration test may come after the acceptance test for a system.
Acceptance testing may occur at various times in the life cycle, for example:
o A COTS software product may be acceptance tested when it is installed or integrated
o Acceptance testing of the usability of a component may be done during component testing
o Acceptance testing of a new functional enhancement may come before system testing
Typical forms of acceptance testing include the following:
User acceptance testing
Typically verifies the fitness for use of the system by business users.
Operational (acceptance) testing
The acceptance of the system by the system administrators, including:
o Testing of backup/restore
o Disaster recovery
o User management
o Maintenance tasks
o Data load and migration tasks
o Periodic checks of security vulnerabilities
Contract and regulation acceptance testing
Contract acceptance testing is performed against a contract's acceptance criteria for producing custom-developed software. Acceptance criteria should be defined when the parties agree to the contract. Regulation acceptance testing is performed against any regulations that must be adhered to, such as government, legal or safety regulations.
Alpha and beta (or field) testing
Developers of market, or COTS, software often want to get feedback from potential or existing customers in their market before the software product is put up for sale commercially. Alpha testing is performed at the developing organization's site but not by the developing team. Beta testing, or field-testing, is performed by customers or potential customers at their own locations.
Organizations may use other terms as well, such as factory acceptance testing and site acceptance testing for systems that are tested before and after being moved to a customer's site.


2.3 Test Types (K2)

40 minutes

Terms
Black-box testing, code coverage, functional testing, interoperability testing, load testing, maintainability testing, performance testing, portability testing, reliability testing, security testing, stress testing, structural testing, usability testing, white-box testing

Background
A group of test activities can be aimed at verifying the software system (or a part of a system) based on a specific reason or target for testing.
A test type is focused on a particular test objective, which could be any of the following:
o A function to be performed by the software
o A non-functional quality characteristic, such as reliability or usability
o The structure or architecture of the software or system
o Change related, i.e., confirming that defects have been fixed (confirmation testing) and looking for unintended changes (regression testing)
A model of the software may be developed and/or used in structural testing (e.g., a control flow model or menu structure model), non-functional testing (e.g., performance model, usability model, security threat modeling), and functional testing (e.g., a process flow model, a state transition model or a plain language specification).

2.3.1 Testing of Function (Functional Testing) (K2)

The functions that a system, subsystem or component are to perform may be described in work products such as a requirements specification, use cases, or a functional specification, or they may be undocumented. The functions are "what" the system does.
Functional tests are based on functions and features (described in documents or understood by the testers) and their interoperability with specific systems, and may be performed at all test levels (e.g., tests for components may be based on a component specification).
Specification-based techniques may be used to derive test conditions and test cases from the functionality of the software or system (see Chapter 4). Functional testing considers the external behavior of the software (black-box testing).
A type of functional testing, security testing, investigates the functions (e.g., a firewall) relating to detection of threats, such as viruses, from malicious outsiders. Another type of functional testing, interoperability testing, evaluates the capability of the software product to interact with one or more specified components or systems.
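A minimal sketch of a functional (black-box) test derived from a specified behavior rather than from the code; the account-lockout rule and the Account class below are hypothetical and serve only as an illustration:

```python
import unittest

# Hypothetical specification statement used for illustration:
# "An account shall be locked after three consecutive failed login attempts."
class Account:
    def __init__(self):
        self.failed_attempts = 0
        self.locked = False

    def login(self, password_ok: bool) -> bool:
        if self.locked:
            return False
        if password_ok:
            self.failed_attempts = 0
            return True
        self.failed_attempts += 1
        if self.failed_attempts >= 3:
            self.locked = True
        return False

class LoginFunctionalTest(unittest.TestCase):
    """Black-box test derived from the specified behavior, not from the code."""

    def test_account_locks_after_three_failures(self):
        account = Account()
        for _ in range(3):
            self.assertFalse(account.login(password_ok=False))
        # Even a correct password is rejected once the account is locked.
        self.assertFalse(account.login(password_ok=True))

if __name__ == "__main__":
    unittest.main()
```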

2.3.2 Testing of Non-functional Software Characteristics (Non-functional Testing) (K2)

Non-functional testing includes, but is not limited to, performance testing, load testing, stress testing, usability testing, maintainability testing, reliability testing and portability testing. It is the testing of "how" the system works.
Non-functional testing may be performed at all test levels. The term non-functional testing describes the tests required to measure characteristics of systems and software that can be quantified on a varying scale, such as response times for performance testing. These tests can be referenced to a quality model such as the one defined in 'Software Engineering – Software Product Quality' (ISO 9126). Non-functional testing considers the external behavior of the software and in most cases uses black-box test design techniques to accomplish that.
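A minimal sketch of a non-functional check against a quantified characteristic (response time); the 200 ms threshold and the lookup_customer operation are assumptions made only for this illustration:

```python
import time
import unittest

# Hypothetical operation whose response time is a quantified non-functional
# requirement (assumed here: "shall respond within 200 ms").
def lookup_customer(customer_id: int) -> dict:
    time.sleep(0.01)  # stands in for real work such as a database query
    return {"id": customer_id, "name": "example"}

class ResponseTimeTest(unittest.TestCase):
    """A minimal performance check against a measurable scale (milliseconds)."""

    def test_lookup_responds_within_200_ms(self):
        start = time.perf_counter()
        lookup_customer(42)
        elapsed_ms = (time.perf_counter() - start) * 1000
        self.assertLess(elapsed_ms, 200, f"took {elapsed_ms:.1f} ms")

if __name__ == "__main__":
    unittest.main()
```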

2.3.3 Testing of Software Structure/Architecture (Structural Testing) (K2)

Structural (white-box) testing may be performed at all test levels. Structural techniques are best used after specification-based techniques, in order to help measure the thoroughness of testing through assessment of coverage of a type of structure.
Coverage is the extent that a structure has been exercised by a test suite, expressed as a percentage of the items being covered. If coverage is not 100%, then more tests may be designed to test those items that were missed to increase coverage. Coverage techniques are covered in Chapter 4.
At all test levels, but especially in component testing and component integration testing, tools can be used to measure the code coverage of elements, such as statements or decisions. Structural testing may be based on the architecture of the system, such as a calling hierarchy.
Structural testing approaches can also be applied at system, system integration or acceptance testing levels (e.g., to business models or menu structures).
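As an editorial illustration, the sketch below shows a function with a single decision and two tests that together exercise both outcomes, i.e., 100% decision coverage; a coverage tool (for example coverage.py run with branch measurement) could be used to confirm the figure. The shipping_cost function is hypothetical:

```python
import unittest

# Illustrative function with a single decision (amount >= 1000).
def shipping_cost(amount: float) -> float:
    if amount >= 1000:      # decision with two outcomes: True and False
        return 0.0          # free shipping for large orders
    return 4.95

class ShippingCostStructuralTest(unittest.TestCase):
    """Two tests suffice for 100% decision coverage of shipping_cost:
    one exercises the True outcome, one the False outcome."""

    def test_true_outcome_free_shipping(self):
        self.assertEqual(shipping_cost(1500.0), 0.0)

    def test_false_outcome_flat_fee(self):
        self.assertEqual(shipping_cost(200.0), 4.95)

if __name__ == "__main__":
    unittest.main()
```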

2.3.4 Testing Related to Changes: Re-testing and Regression Testing (K2)

After a defect is detected and fixed, the software should be re-tested to confirm that the original defect has been successfully removed. This is called confirmation. Debugging (locating and fixing a defect) is a development activity, not a testing activity.
Regression testing is the repeated testing of an already tested program, after modification, to discover any defects introduced or uncovered as a result of the change(s). These defects may be either in the software being tested, or in another related or unrelated software component. It is performed when the software, or its environment, is changed. The extent of regression testing is based on the risk of not finding defects in software that was working previously.
Tests should be repeatable if they are to be used for confirmation testing and to assist regression testing.
Regression testing may be performed at all test levels, and includes functional, non-functional and structural testing. Regression test suites are run many times and generally evolve slowly, so regression testing is a strong candidate for automation.
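A minimal sketch of automating a regression suite so that it can be re-run after every change (for example from a continuous integration job); the tests/regression directory is a hypothetical location for the suite:

```python
import unittest

# Discover and run all regression tests below a (hypothetical) directory.
def run_regression_suite() -> bool:
    suite = unittest.defaultTestLoader.discover("tests/regression")
    result = unittest.TextTestRunner(verbosity=2).run(suite)
    return result.wasSuccessful()

if __name__ == "__main__":
    # Non-zero exit code signals a regression to the calling build job.
    raise SystemExit(0 if run_regression_suite() else 1)
```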


2.4 Maintenance Testing (K2)

15 minutes

Terms
Impact analysis, maintenance testing

Background
Once deployed, a software system is often in service for years or decades. During this time the system, its configuration data, or its environment are often corrected, changed or extended. The planning of releases in advance is crucial for successful maintenance testing. A distinction has to be made between planned releases and hot fixes. Maintenance testing is done on an existing operational system, and is triggered by modifications, migration, or retirement of the software or system.
Modifications include planned enhancement changes (e.g., release-based), corrective and emergency changes, and changes of environment, such as planned operating system or database upgrades, planned upgrade of Commercial-Off-The-Shelf software, or patches to correct newly exposed or discovered vulnerabilities of the operating system.
Maintenance testing for migration (e.g., from one platform to another) should include operational tests of the new environment as well as of the changed software. Migration testing (conversion testing) is also needed when data from another application will be migrated into the system being maintained.
Maintenance testing for the retirement of a system may include the testing of data migration or archiving if long data-retention periods are required.
In addition to testing what has been changed, maintenance testing includes regression testing to parts of the system that have not been changed. The scope of maintenance testing is related to the risk of the change, the size of the existing system and to the size of the change. Depending on the changes, maintenance testing may be done at any or all test levels and for any or all test types.
Determining how the existing system may be affected by changes is called impact analysis, and is used to help decide how much regression testing to do. The impact analysis may be used to determine the regression test suite.
Maintenance testing can be difficult if specifications are out of date or missing, or testers with domain knowledge are not available.

References
2.1.3 CMMI, Craig, 2002, Hetzel, 1988, IEEE 12207
2.2 Hetzel, 1988
2.2.4 Copeland, 2004, Myers, 1979
2.3.1 Beizer, 1990, Black, 2001, Copeland, 2004
2.3.2 Black, 2001, ISO 9126
2.3.3 Beizer, 1990, Copeland, 2004, Hetzel, 1988
2.3.4 Hetzel, 1988, IEEE STD 829-1998
2.4 Black, 2001, Craig, 2002, Hetzel, 1988, IEEE STD 829-1998


3. Static Techniques (K2)

60 minutes

Learning Objectives for Static Techniques
The objectives identify what you will be able to do following the completion of each module.

3.1 Static Techniques and the Test Process (K2)
LO-3.1.1 Recognize software work products that can be examined by the different static techniques (K1)
LO-3.1.2 Describe the importance and value of considering static techniques for the assessment of software work products (K2)
LO-3.1.3 Explain the difference between static and dynamic techniques, considering objectives, types of defects to be identified, and the role of these techniques within the software life cycle (K2)

3.2 Review Process (K2)
LO-3.2.1 Recall the activities, roles and responsibilities of a typical formal review (K1)
LO-3.2.2 Explain the differences between different types of reviews: informal review, technical review, walkthrough and inspection (K2)
LO-3.2.3 Explain the factors for successful performance of reviews (K2)

3.3 Static Analysis by Tools (K2)
LO-3.3.1 Recall typical defects and errors identified by static analysis and compare them to reviews and dynamic testing (K1)
LO-3.3.2 Describe, using examples, the typical benefits of static analysis (K2)
LO-3.3.3 List typical code and design defects that may be identified by static analysis tools (K1)


3.1 Static Techniques and the Test Process (K2)

15 minutes

Terms
Dynamic testing, static testing

Background
Unlike dynamic testing, which requires the execution of software, static testing techniques rely on the manual examination (reviews) and automated analysis (static analysis) of the code or other project documentation without the execution of the code.
Reviews are a way of testing software work products (including code) and can be performed well before dynamic test execution. Defects detected during reviews early in the life cycle (e.g., defects found in requirements) are often much cheaper to remove than those detected by running tests on the executing code.
A review could be done entirely as a manual activity, but there is also tool support. The main manual activity is to examine a work product and make comments about it. Any software work product can be reviewed, including requirements specifications, design specifications, code, test plans, test specifications, test cases, test scripts, user guides or web pages.
Benefits of reviews include early defect detection and correction, development productivity improvements, reduced development timescales, reduced testing cost and time, lifetime cost reductions, fewer defects and improved communication. Reviews can find omissions, for example, in requirements, which are unlikely to be found in dynamic testing.
Reviews, static analysis and dynamic testing have the same objective – identifying defects. They are complementary; the different techniques can find different types of defects effectively and efficiently. Compared to dynamic testing, static techniques find causes of failures (defects) rather than the failures themselves.
Typical defects that are easier to find in reviews than in dynamic testing include: deviations from standards, requirement defects, design defects, insufficient maintainability and incorrect interface specifications.


3.2 Review Process (K2)

25 minutes

Terms
Entry criteria, formal review, informal review, inspection, metric, moderator, peer review, reviewer, scribe, technical review, walkthrough

Background
The different types of reviews vary from informal, characterized by no written instructions for reviewers, to systematic, characterized by team participation, documented results of the review, and documented procedures for conducting the review. The formality of a review process is related to factors such as the maturity of the development process, any legal or regulatory requirements or the need for an audit trail.
The way a review is carried out depends on the agreed objectives of the review (e.g., find defects, gain understanding, educate testers and new team members, or discussion and decision by consensus).

3.2.1 Activities of a Formal Review (K1)

A typical formal review has the following main activities:
1. Planning
• Defining the review criteria
• Selecting the personnel
• Allocating roles
• Defining the entry and exit criteria for more formal review types (e.g., inspections)
• Selecting which parts of documents to review
• Checking entry criteria (for more formal review types)
2. Kick-off
• Distributing documents
• Explaining the objectives, process and documents to the participants
3. Individual preparation
• Preparing for the review meeting by reviewing the document(s)
• Noting potential defects, questions and comments
4. Examination/evaluation/recording of results (review meeting)
• Discussing or logging, with documented results or minutes (for more formal review types)
• Noting defects, making recommendations regarding handling the defects, making decisions about the defects
• Examining/evaluating and recording issues during any physical meetings or tracking any group electronic communications
5. Rework
• Fixing defects found (typically done by the author)
• Recording updated status of defects (in formal reviews)
6. Follow-up
• Checking that defects have been addressed
• Gathering metrics
• Checking on exit criteria (for more formal review types)

3.2.2 Roles and Responsibilities (K1)

A typical formal review will include the roles below:
o Manager: decides on the execution of reviews, allocates time in project schedules and determines if the review objectives have been met.
o Moderator: the person who leads the review of the document or set of documents, including planning the review, running the meeting, and following-up after the meeting. If necessary, the moderator may mediate between the various points of view and is often the person upon whom the success of the review rests.
o Author: the writer or person with chief responsibility for the document(s) to be reviewed.
o Reviewers: individuals with a specific technical or business background (also called checkers or inspectors) who, after the necessary preparation, identify and describe findings (e.g., defects) in the product under review. Reviewers should be chosen to represent different perspectives and roles in the review process, and should take part in any review meetings.
o Scribe (or recorder): documents all the issues, problems and open points that were identified during the meeting.
Looking at software products or related work products from different perspectives and using checklists can make reviews more effective and efficient. For example, a checklist based on various perspectives such as user, maintainer, tester or operations, or a checklist of typical requirements problems may help to uncover previously undetected issues.

3.2.3 Types of Reviews (K2)

A single software product or related work product may be the subject of more than one review. If more than one type of review is used, the order may vary. For example, an informal review may be carried out before a technical review, or an inspection may be carried out on a requirements specification before a walkthrough with customers. The main characteristics, options and purposes of common review types are:
Informal Review
o No formal process
o May take the form of pair programming or a technical lead reviewing designs and code
o Results may be documented
o Varies in usefulness depending on the reviewers
o Main purpose: inexpensive way to get some benefit
Walkthrough
o Meeting led by author
o May take the form of scenarios, dry runs, peer group participation
o Open-ended sessions
• Optional pre-meeting preparation of reviewers
• Optional preparation of a review report including list of findings
o Optional scribe (who is not the author)
o May vary in practice from quite informal to very formal
o Main purposes: learning, gaining understanding, finding defects
Technical Review
o Documented, defined defect-detection process that includes peers and technical experts with optional management participation
o May be performed as a peer review without management participation
o Ideally led by trained moderator (not the author)
o Pre-meeting preparation by reviewers
o Optional use of checklists
o Preparation of a review report which includes the list of findings, the verdict whether the software product meets its requirements and, where appropriate, recommendations related to findings
o May vary in practice from quite informal to very formal
o Main purposes: discussing, making decisions, evaluating alternatives, finding defects, solving technical problems and checking conformance to specifications, plans, regulations, and standards
Inspection
o Led by trained moderator (not the author)
o Usually conducted as a peer examination
o Defined roles
o Includes metrics gathering
o Formal process based on rules and checklists
o Specified entry and exit criteria for acceptance of the software product
o Pre-meeting preparation
o Inspection report including list of findings
o Formal follow-up process (with optional process improvement components)
o Optional reader
o Main purpose: finding defects
Walkthroughs, technical reviews and inspections can be performed within a peer group, i.e., colleagues at the same organizational level. This type of review is called a "peer review".

3.2.4 Success Factors for Reviews (K2)

Success factors for reviews include:
o Each review has clear predefined objectives
o The right people for the review objectives are involved
o Testers are valued reviewers who contribute to the review and also learn about the product, which enables them to prepare tests earlier
o Defects found are welcomed and expressed objectively
o People issues and psychological aspects are dealt with (e.g., making it a positive experience for the author)
o The review is conducted in an atmosphere of trust; the outcome will not be used for the evaluation of the participants
o Review techniques are applied that are suitable to achieve the objectives and to the type and level of software work products and reviewers
o Checklists or roles are used if appropriate to increase effectiveness of defect identification
o Training is given in review techniques, especially the more formal techniques such as inspection
o Management supports a good review process (e.g., by incorporating adequate time for review activities in project schedules)
o There is an emphasis on learning and process improvement


3.3 Static Analysis by Tools (K2)

20 minutes

Terms
Compiler, complexity, control flow, data flow, static analysis

Background
The objective of static analysis is to find defects in software source code and software models. Static analysis is performed without actually executing the software being examined by the tool; dynamic testing does execute the software code. Static analysis can locate defects that are hard to find in dynamic testing. As with reviews, static analysis finds defects rather than failures. Static analysis tools analyze program code (e.g., control flow and data flow), as well as generated output such as HTML and XML.
The value of static analysis is:
o Early detection of defects prior to test execution
o Early warning about suspicious aspects of the code or design by the calculation of metrics, such as a high complexity measure
o Identification of defects not easily found by dynamic testing
o Detecting dependencies and inconsistencies in software models such as links
o Improved maintainability of code and design
o Prevention of defects, if lessons are learned in development
Typical defects discovered by static analysis tools include:
o Referencing a variable with an undefined value
o Inconsistent interfaces between modules and components
o Variables that are not used or are improperly declared
o Unreachable (dead) code
o Missing and erroneous logic (potentially infinite loops)
o Overly complicated constructs
o Programming standards violations
o Security vulnerabilities
o Syntax violations of code and software models
Static analysis tools are typically used by developers (checking against predefined rules or programming standards) before and during component and integration testing or when checking-in code to configuration management tools, and by designers during software modeling. Static analysis tools may produce a large number of warning messages, which need to be well-managed to allow the most effective use of the tool.
Compilers may offer some support for static analysis, including the calculation of metrics.
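For illustration (an editorial addition, not part of the syllabus), the short hypothetical Python fragment below contains several of the defect types listed above, many of which a typical static analyzer (e.g., pylint or flake8) can report without ever executing the code:

```python
# Illustrative, deliberately defective code for static analysis to examine.

def total_with_tax(prices):
    discount = 0.1            # variable that is never used
    total = sum(prices)
    return total * 1.2
    total = total - rebate    # unreachable (dead) code; 'rebate' is undefined

def risky_loop(items):
    while True:               # potentially infinite loop: no break or return
        print(items)
```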

References
3.2 IEEE 1028
3.2.2 Gilb, 1993, van Veenendaal, 2004
3.2.4 Gilb, 1993, IEEE 1028
3.3 van Veenendaal, 2004


4. Test Design Techniques (K4)

285 minutes

Learning Objectives for Test Design Techniques
The objectives identify what you will be able to do following the completion of each module.

4.1 The Test Development Process (K3)
LO-4.1.1 Differentiate between a test design specification, test case specification and test procedure specification (K2)
LO-4.1.2 Compare the terms test condition, test case and test procedure (K2)
LO-4.1.3 Evaluate the quality of test cases in terms of clear traceability to the requirements and expected results (K2)
LO-4.1.4 Translate test cases into a well-structured test procedure specification at a level of detail relevant to the knowledge of the testers (K3)

4.2 Categories of Test Design Techniques (K2)
LO-4.2.1 Recall reasons that both specification-based (black-box) and structure-based (white-box) test design techniques are useful and list the common techniques for each (K1)
LO-4.2.2 Explain the characteristics, commonalities, and differences between specification-based testing, structure-based testing and experience-based testing (K2)

4.3 Specification-based or Black-box Techniques (K3)
LO-4.3.1 Write test cases from given software models using equivalence partitioning, boundary value analysis, decision tables and state transition diagrams/tables (K3)
LO-4.3.2 Explain the main purpose of each of the four testing techniques, what level and type of testing could use the technique, and how coverage may be measured (K2)
LO-4.3.3 Explain the concept of use case testing and its benefits (K2)

4.4 Structure-based or White-box Techniques (K4)
LO-4.4.1 Describe the concept and value of code coverage (K2)
LO-4.4.2 Explain the concepts of statement and decision coverage, and give reasons why these concepts can also be used at test levels other than component testing (e.g., on business procedures at system level) (K2)
LO-4.4.3 Write test cases from given control flows using statement and decision test design techniques (K3)
LO-4.4.4 Assess statement and decision coverage for completeness with respect to defined exit criteria (K4)

4.5 Experience-based Techniques (K2)
LO-4.5.1 Recall reasons for writing test cases based on intuition, experience and knowledge about common defects (K1)
LO-4.5.2 Compare experience-based techniques with specification-based testing techniques (K2)

4.6 Choosing Test Techniques (K2)
LO-4.6.1 Classify test design techniques according to their fitness to a given context, for the test basis, respective models and software characteristics (K2)

ISTQB - Foundation Level Syllabus 2011
ISTQB - Foundation Level Syllabus 2011
ISTQB - Foundation Level Syllabus 2011
ISTQB - Foundation Level Syllabus 2011
ISTQB - Foundation Level Syllabus 2011
ISTQB - Foundation Level Syllabus 2011
ISTQB - Foundation Level Syllabus 2011
ISTQB - Foundation Level Syllabus 2011
ISTQB - Foundation Level Syllabus 2011
ISTQB - Foundation Level Syllabus 2011
ISTQB - Foundation Level Syllabus 2011
ISTQB - Foundation Level Syllabus 2011
ISTQB - Foundation Level Syllabus 2011
ISTQB - Foundation Level Syllabus 2011
ISTQB - Foundation Level Syllabus 2011
ISTQB - Foundation Level Syllabus 2011
ISTQB - Foundation Level Syllabus 2011
ISTQB - Foundation Level Syllabus 2011
ISTQB - Foundation Level Syllabus 2011
ISTQB - Foundation Level Syllabus 2011
ISTQB - Foundation Level Syllabus 2011
ISTQB - Foundation Level Syllabus 2011
ISTQB - Foundation Level Syllabus 2011
ISTQB - Foundation Level Syllabus 2011
ISTQB - Foundation Level Syllabus 2011
ISTQB - Foundation Level Syllabus 2011
ISTQB - Foundation Level Syllabus 2011
ISTQB - Foundation Level Syllabus 2011
ISTQB - Foundation Level Syllabus 2011
ISTQB - Foundation Level Syllabus 2011
ISTQB - Foundation Level Syllabus 2011
ISTQB - Foundation Level Syllabus 2011
ISTQB - Foundation Level Syllabus 2011
ISTQB - Foundation Level Syllabus 2011
ISTQB - Foundation Level Syllabus 2011
ISTQB - Foundation Level Syllabus 2011
ISTQB - Foundation Level Syllabus 2011
ISTQB - Foundation Level Syllabus 2011
ISTQB - Foundation Level Syllabus 2011
ISTQB - Foundation Level Syllabus 2011
ISTQB - Foundation Level Syllabus 2011

More Related Content

What's hot

Cigniti Independent Software Testing Services
Cigniti Independent Software Testing ServicesCigniti Independent Software Testing Services
Cigniti Independent Software Testing ServicesCigniti Technologies Ltd
 
Automated Testing vs Manual Testing
Automated Testing vs Manual TestingAutomated Testing vs Manual Testing
Automated Testing vs Manual Testingdidev
 
Test Environment Management
Test Environment ManagementTest Environment Management
Test Environment ManagementKanoah
 
What is Software Testing | Edureka
What is Software Testing | EdurekaWhat is Software Testing | Edureka
What is Software Testing | EdurekaEdureka!
 
Web Test Automation with Selenium
Web Test Automation with SeleniumWeb Test Automation with Selenium
Web Test Automation with Seleniumvivek_prahlad
 
Hybrid framework for test automation
Hybrid framework for test automationHybrid framework for test automation
Hybrid framework for test automationsrivinayak
 
Build a Quality Engineering and Automation Framework
Build a Quality Engineering and Automation FrameworkBuild a Quality Engineering and Automation Framework
Build a Quality Engineering and Automation FrameworkJosiah Renaudin
 
Manual Testing Notes
Manual Testing NotesManual Testing Notes
Manual Testing Notesguest208aa1
 
Managed Test Services - Maveric Systems
Managed Test Services - Maveric SystemsManaged Test Services - Maveric Systems
Managed Test Services - Maveric SystemsMaveric Systems
 
Top 50 Software Testing Interview Questions & Answers | Edureka
Top 50 Software Testing Interview Questions & Answers | EdurekaTop 50 Software Testing Interview Questions & Answers | Edureka
Top 50 Software Testing Interview Questions & Answers | EdurekaEdureka!
 
Test Automation Tool comparison – HP UFT/QTP vs. Selenium
Test Automation Tool comparison –  HP UFT/QTP vs. SeleniumTest Automation Tool comparison –  HP UFT/QTP vs. Selenium
Test Automation Tool comparison – HP UFT/QTP vs. SeleniumAspire Systems
 
52892006 manual-testing-real-time
52892006 manual-testing-real-time52892006 manual-testing-real-time
52892006 manual-testing-real-timeSunil Pandey
 
Chapter 6 - Tool Support for Testing
Chapter 6 - Tool Support for TestingChapter 6 - Tool Support for Testing
Chapter 6 - Tool Support for TestingNeeraj Kumar Singh
 
An Introduction to Performance Testing
An Introduction to Performance TestingAn Introduction to Performance Testing
An Introduction to Performance TestingDavid Tzemach
 
Quality assurance k.meenakshi
Quality assurance   k.meenakshiQuality assurance   k.meenakshi
Quality assurance k.meenakshiMeenakshiK19
 
Managing Test Environments
Managing Test EnvironmentsManaging Test Environments
Managing Test EnvironmentsKevin Harvey
 
Environment Delivery Management Services
Environment Delivery Management  ServicesEnvironment Delivery Management  Services
Environment Delivery Management Servicesdrummondrj
 

What's hot (20)

Cigniti Independent Software Testing Services
Cigniti Independent Software Testing ServicesCigniti Independent Software Testing Services
Cigniti Independent Software Testing Services
 
Automated Testing vs Manual Testing
Automated Testing vs Manual TestingAutomated Testing vs Manual Testing
Automated Testing vs Manual Testing
 
Test Environment Management
Test Environment ManagementTest Environment Management
Test Environment Management
 
What is Software Testing | Edureka
What is Software Testing | EdurekaWhat is Software Testing | Edureka
What is Software Testing | Edureka
 
Web Test Automation with Selenium
Web Test Automation with SeleniumWeb Test Automation with Selenium
Web Test Automation with Selenium
 
Hybrid framework for test automation
Hybrid framework for test automationHybrid framework for test automation
Hybrid framework for test automation
 
Chapter 3 - Test Automation
Chapter 3 - Test AutomationChapter 3 - Test Automation
Chapter 3 - Test Automation
 
Getting Ready for UAT
Getting Ready for UATGetting Ready for UAT
Getting Ready for UAT
 
Chapter 2 - Test Management
Chapter 2 - Test ManagementChapter 2 - Test Management
Chapter 2 - Test Management
 
Build a Quality Engineering and Automation Framework
Build a Quality Engineering and Automation FrameworkBuild a Quality Engineering and Automation Framework
Build a Quality Engineering and Automation Framework
 
Manual Testing Notes
Manual Testing NotesManual Testing Notes
Manual Testing Notes
 
Managed Test Services - Maveric Systems
Managed Test Services - Maveric SystemsManaged Test Services - Maveric Systems
Managed Test Services - Maveric Systems
 
Top 50 Software Testing Interview Questions & Answers | Edureka
Top 50 Software Testing Interview Questions & Answers | EdurekaTop 50 Software Testing Interview Questions & Answers | Edureka
Top 50 Software Testing Interview Questions & Answers | Edureka
 
Test Automation Tool comparison – HP UFT/QTP vs. Selenium
Test Automation Tool comparison –  HP UFT/QTP vs. SeleniumTest Automation Tool comparison –  HP UFT/QTP vs. Selenium
Test Automation Tool comparison – HP UFT/QTP vs. Selenium
 
52892006 manual-testing-real-time
52892006 manual-testing-real-time52892006 manual-testing-real-time
52892006 manual-testing-real-time
 
Chapter 6 - Tool Support for Testing
Chapter 6 - Tool Support for TestingChapter 6 - Tool Support for Testing
Chapter 6 - Tool Support for Testing
 
An Introduction to Performance Testing
An Introduction to Performance TestingAn Introduction to Performance Testing
An Introduction to Performance Testing
 
Quality assurance k.meenakshi
Quality assurance   k.meenakshiQuality assurance   k.meenakshi
Quality assurance k.meenakshi
 
Managing Test Environments
Managing Test EnvironmentsManaging Test Environments
Managing Test Environments
 
Environment Delivery Management Services
Environment Delivery Management  ServicesEnvironment Delivery Management  Services
Environment Delivery Management Services
 

Similar to ISTQB - Foundation Level Syllabus 2011

Bộ tài liệu học Tester - học Kiểm thử phần mềm tại NIIT-ICT Hà Nội
Bộ tài liệu học Tester - học Kiểm thử phần mềm tại NIIT-ICT Hà NộiBộ tài liệu học Tester - học Kiểm thử phần mềm tại NIIT-ICT Hà Nội
Bộ tài liệu học Tester - học Kiểm thử phần mềm tại NIIT-ICT Hà NộiNiit Nguyễn Tuân
 
1 istqb foundation-level_sydddddddddddllabus_2011
1 istqb foundation-level_sydddddddddddllabus_20111 istqb foundation-level_sydddddddddddllabus_2011
1 istqb foundation-level_sydddddddddddllabus_2011Pranav KS
 
1 istqb foundation-level_syllabus_2011
1 istqb foundation-level_syllabus_20111 istqb foundation-level_syllabus_2011
1 istqb foundation-level_syllabus_2011Pranav Chaudhary
 
Istqb ctfl syll_2011-
Istqb ctfl syll_2011-Istqb ctfl syll_2011-
Istqb ctfl syll_2011-akash181992
 
Istqb foundation level syllabus 2010
Istqb   foundation level syllabus 2010Istqb   foundation level syllabus 2010
Istqb foundation level syllabus 2010Professional Testing
 
ISTQB Syllabus Foundation
ISTQB Syllabus FoundationISTQB Syllabus Foundation
ISTQB Syllabus FoundationNitin Mhaskar
 
ISTQB-CTFL_Syllabus_2018_v3.1.1 (1).pdf
ISTQB-CTFL_Syllabus_2018_v3.1.1 (1).pdfISTQB-CTFL_Syllabus_2018_v3.1.1 (1).pdf
ISTQB-CTFL_Syllabus_2018_v3.1.1 (1).pdfD19CQVT01NTATHIMAILI
 
ISTQB - CTFL Syllabus 2018 v3.1.1.pdf
ISTQB - CTFL Syllabus 2018 v3.1.1.pdfISTQB - CTFL Syllabus 2018 v3.1.1.pdf
ISTQB - CTFL Syllabus 2018 v3.1.1.pdfnhung875961
 
ISTQB-CTFL_Syllabus_2018_v3.1.1.pdf
ISTQB-CTFL_Syllabus_2018_v3.1.1.pdfISTQB-CTFL_Syllabus_2018_v3.1.1.pdf
ISTQB-CTFL_Syllabus_2018_v3.1.1.pdfssuser407aa7
 
ISTQB-CTFL_Syllabus_2018_v3.1.1.pdf
ISTQB-CTFL_Syllabus_2018_v3.1.1.pdfISTQB-CTFL_Syllabus_2018_v3.1.1.pdf
ISTQB-CTFL_Syllabus_2018_v3.1.1.pdfJohnsonstephen Jsstc
 
ISEB Certified Tester Foundation Level Tester
ISEB Certified Tester Foundation Level TesterISEB Certified Tester Foundation Level Tester
ISEB Certified Tester Foundation Level Testerguest670113
 
Syllabus foundation
Syllabus foundationSyllabus foundation
Syllabus foundationJenny Nguyen
 
ISTQB_CTAL-TA_Syllabus_v3.1.2.pdf
ISTQB_CTAL-TA_Syllabus_v3.1.2.pdfISTQB_CTAL-TA_Syllabus_v3.1.2.pdf
ISTQB_CTAL-TA_Syllabus_v3.1.2.pdfssuser407aa7
 
ISTQB - Foundation level testing topics
ISTQB - Foundation level testing topicsISTQB - Foundation level testing topics
ISTQB - Foundation level testing topicsShan Kings
 
Expert tm syllabus beta version 041511
Expert tm syllabus beta version 041511Expert tm syllabus beta version 041511
Expert tm syllabus beta version 041511jicheng687
 
ISTQB Advance Material
ISTQB Advance MaterialISTQB Advance Material
ISTQB Advance MaterialMandar Kharkar
 
Advanced security tester syllabus ga 2016
Advanced security tester syllabus   ga 2016Advanced security tester syllabus   ga 2016
Advanced security tester syllabus ga 2016Yasir Ali
 
Istqb exam guidelines
Istqb exam guidelinesIstqb exam guidelines
Istqb exam guidelinesJenny Nguyen
 

ISTQB - Foundation Level Syllabus 2011

Table of Contents

Acknowledgements
Introduction to this Syllabus
  Purpose of this Document
  The Certified Tester Foundation Level in Software Testing
  Learning Objectives/Cognitive Level of Knowledge
  The Examination
  Accreditation
  Level of Detail
  How this Syllabus is Organized
1. Fundamentals of Testing (K2)
  1.1 Why is Testing Necessary (K2)
    1.1.1 Software Systems Context (K1)
    1.1.2 Causes of Software Defects (K2)
    1.1.3 Role of Testing in Software Development, Maintenance and Operations (K2)
    1.1.4 Testing and Quality (K2)
    1.1.5 How Much Testing is Enough? (K2)
  1.2 What is Testing? (K2)
  1.3 Seven Testing Principles (K2)
  1.4 Fundamental Test Process (K1)
    1.4.1 Test Planning and Control (K1)
    1.4.2 Test Analysis and Design (K1)
    1.4.3 Test Implementation and Execution (K1)
    1.4.4 Evaluating Exit Criteria and Reporting (K1)
    1.4.5 Test Closure Activities (K1)
  1.5 The Psychology of Testing (K2)
  1.6 Code of Ethics
2. Testing Throughout the Software Life Cycle (K2)
  2.1 Software Development Models (K2)
    2.1.1 V-model (Sequential Development Model) (K2)
    2.1.2 Iterative-incremental Development Models (K2)
    2.1.3 Testing within a Life Cycle Model (K2)
  2.2 Test Levels (K2)
    2.2.1 Component Testing (K2)
    2.2.2 Integration Testing (K2)
    2.2.3 System Testing (K2)
    2.2.4 Acceptance Testing (K2)
  2.3 Test Types (K2)
    2.3.1 Testing of Function (Functional Testing) (K2)
    2.3.2 Testing of Non-functional Software Characteristics (Non-functional Testing) (K2)
    2.3.3 Testing of Software Structure/Architecture (Structural Testing) (K2)
    2.3.4 Testing Related to Changes: Re-testing and Regression Testing (K2)
  2.4 Maintenance Testing (K2)
3. Static Techniques (K2)
  3.1 Static Techniques and the Test Process (K2)
  3.2 Review Process (K2)
    3.2.1 Activities of a Formal Review (K1)
    3.2.2 Roles and Responsibilities (K1)
    3.2.3 Types of Reviews (K2)
    3.2.4 Success Factors for Reviews (K2)
  3.3 Static Analysis by Tools (K2)
4. Test Design Techniques (K4)
  4.1 The Test Development Process (K3)
  4.2 Categories of Test Design Techniques (K2)
  4.3 Specification-based or Black-box Techniques (K3)
    4.3.1 Equivalence Partitioning (K3)
    4.3.2 Boundary Value Analysis (K3)
    4.3.3 Decision Table Testing (K3)
    4.3.4 State Transition Testing (K3)
    4.3.5 Use Case Testing (K2)
  4.4 Structure-based or White-box Techniques (K4)
    4.4.1 Statement Testing and Coverage (K4)
    4.4.2 Decision Testing and Coverage (K4)
    4.4.3 Other Structure-based Techniques (K1)
  4.5 Experience-based Techniques (K2)
  4.6 Choosing Test Techniques (K2)
5. Test Management (K3)
  5.1 Test Organization (K2)
    5.1.1 Test Organization and Independence (K2)
    5.1.2 Tasks of the Test Leader and Tester (K1)
  5.2 Test Planning and Estimation (K3)
    5.2.1 Test Planning (K2)
    5.2.2 Test Planning Activities (K3)
    5.2.3 Entry Criteria (K2)
    5.2.4 Exit Criteria (K2)
    5.2.5 Test Estimation (K2)
    5.2.6 Test Strategy, Test Approach (K2)
  5.3 Test Progress Monitoring and Control (K2)
    5.3.1 Test Progress Monitoring (K1)
    5.3.2 Test Reporting (K2)
    5.3.3 Test Control (K2)
  5.4 Configuration Management (K2)
  5.5 Risk and Testing (K2)
    5.5.1 Project Risks (K2)
    5.5.2 Product Risks (K2)
  5.6 Incident Management (K3)
6. Tool Support for Testing (K2)
  6.1 Types of Test Tools (K2)
    6.1.1 Tool Support for Testing (K2)
    6.1.2 Test Tool Classification (K2)
    6.1.3 Tool Support for Management of Testing and Tests (K1)
    6.1.4 Tool Support for Static Testing (K1)
    6.1.5 Tool Support for Test Specification (K1)
    6.1.6 Tool Support for Test Execution and Logging (K1)
    6.1.7 Tool Support for Performance and Monitoring (K1)
    6.1.8 Tool Support for Specific Testing Needs (K1)
  6.2 Effective Use of Tools: Potential Benefits and Risks (K2)
    6.2.1 Potential Benefits and Risks of Tool Support for Testing (for all tools) (K2)
    6.2.2 Special Considerations for Some Types of Tools (K1)
  6.3 Introducing a Tool into an Organization (K1)
7. References
  Standards
  Books
8. Appendix A – Syllabus Background
  History of this Document
  Objectives of the Foundation Certificate Qualification
  Objectives of the International Qualification (adapted from ISTQB meeting at Sollentuna, November 2001)
  Entry Requirements for this Qualification
  Background and History of the Foundation Certificate in Software Testing
9. Appendix B – Learning Objectives/Cognitive Level of Knowledge
  Level 1: Remember (K1)
  Level 2: Understand (K2)
  Level 3: Apply (K3)
  Level 4: Analyze (K4)
10. Appendix C – Rules Applied to the ISTQB Foundation Syllabus
  10.1.1 General Rules
  10.1.2 Current Content
  10.1.3 Learning Objectives
  10.1.4 Overall Structure
11. Appendix D – Notice to Training Providers
12. Appendix E – Release Notes
  Release 2010
  Release 2011
13. Index
Acknowledgements

International Software Testing Qualifications Board Working Group Foundation Level (Edition 2011): Thomas Müller (chair), Debra Friedenberg. The core team thanks the review team (Dan Almog, Armin Beer, Rex Black, Julie Gardiner, Judy McKay, Tuula Pääkkönen, Eric Riou du Cosquier, Hans Schaefer, Stephanie Ulrich, Erik van Veenendaal) and all National Boards for the suggestions for the current version of the syllabus.

International Software Testing Qualifications Board Working Group Foundation Level (Edition 2010): Thomas Müller (chair), Rahul Verma, Martin Klonk and Armin Beer. The core team thanks the review team (Rex Black, Mette Bruhn-Pedersen, Debra Friedenberg, Klaus Olsen, Judy McKay, Tuula Pääkkönen, Meile Posthuma, Hans Schaefer, Stephanie Ulrich, Pete Williams, Erik van Veenendaal) and all National Boards for their suggestions.

International Software Testing Qualifications Board Working Group Foundation Level (Edition 2007): Thomas Müller (chair), Dorothy Graham, Debra Friedenberg, and Erik van Veenendaal. The core team thanks the review team (Hans Schaefer, Stephanie Ulrich, Meile Posthuma, Anders Pettersson, and Wonil Kwon) and all the National Boards for their suggestions.

International Software Testing Qualifications Board Working Group Foundation Level (Edition 2005): Thomas Müller (chair), Rex Black, Sigrid Eldh, Dorothy Graham, Klaus Olsen, Maaret Pyhäjärvi, Geoff Thompson and Erik van Veenendaal, and the review team and all National Boards for their suggestions.
Introduction to this Syllabus

Purpose of this Document
This syllabus forms the basis for the International Software Testing Qualification at the Foundation Level. The International Software Testing Qualifications Board (ISTQB) provides it to the National Boards for them to accredit the training providers and to derive examination questions in their local language. Training providers will determine appropriate teaching methods and produce courseware for accreditation. The syllabus will help candidates in their preparation for the examination. Information on the history and background of the syllabus can be found in Appendix A.

The Certified Tester Foundation Level in Software Testing
The Foundation Level qualification is aimed at anyone involved in software testing. This includes people in roles such as testers, test analysts, test engineers, test consultants, test managers, user acceptance testers and software developers. This Foundation Level qualification is also appropriate for anyone who wants a basic understanding of software testing, such as project managers, quality managers, software development managers, business analysts, IT directors and management consultants. Holders of the Foundation Certificate will be able to go on to a higher-level software testing qualification.

Learning Objectives/Cognitive Level of Knowledge
Learning objectives are indicated for each section in this syllabus and classified as follows:
o K1: remember
o K2: understand
o K3: apply
o K4: analyze
Further details and examples of learning objectives are given in Appendix B.
All terms listed under “Terms” just below chapter headings shall be remembered (K1), even if not explicitly mentioned in the learning objectives.

The Examination
The Foundation Level Certificate examination will be based on this syllabus. Answers to examination questions may require the use of material based on more than one section of this syllabus. All sections of the syllabus are examinable. The format of the examination is multiple choice. Exams may be taken as part of an accredited training course or taken independently (e.g., at an examination center or in a public exam). Completion of an accredited training course is not a prerequisite for the exam.

Accreditation
An ISTQB National Board may accredit training providers whose course material follows this syllabus. Training providers should obtain accreditation guidelines from the board or body that performs the accreditation. An accredited course is recognized as conforming to this syllabus, and is allowed to have an ISTQB examination as part of the course. Further guidance for training providers is given in Appendix D.
Level of Detail
The level of detail in this syllabus allows internationally consistent teaching and examination. In order to achieve this goal, the syllabus consists of:
o General instructional objectives describing the intention of the Foundation Level
o A list of information to teach, including a description, and references to additional sources if required
o Learning objectives for each knowledge area, describing the cognitive learning outcome and mindset to be achieved
o A list of terms that students must be able to recall and understand
o A description of the key concepts to teach, including sources such as accepted literature or standards
The syllabus content is not a description of the entire knowledge area of software testing; it reflects the level of detail to be covered in Foundation Level training courses.

How this Syllabus is Organized
There are six major chapters. The top-level heading for each chapter shows the highest level of learning objectives that is covered within the chapter and specifies the time for the chapter. For example:

2. Testing Throughout the Software Life Cycle (K2)    115 minutes

This heading shows that Chapter 2 has learning objectives of K1 (assumed when a higher level is shown) and K2 (but not K3), and it is intended to take 115 minutes to teach the material in the chapter. Within each chapter there are a number of sections. Each section also has the learning objectives and the amount of time required. Subsections that do not have a time given are included within the time for the section.
1. Fundamentals of Testing (K2)    155 minutes

Learning Objectives for Fundamentals of Testing
The objectives identify what you will be able to do following the completion of each module.

1.1 Why is Testing Necessary? (K2)
LO-1.1.1 Describe, with examples, the way in which a defect in software can cause harm to a person, to the environment or to a company (K2)
LO-1.1.2 Distinguish between the root cause of a defect and its effects (K2)
LO-1.1.3 Give reasons why testing is necessary by giving examples (K2)
LO-1.1.4 Describe why testing is part of quality assurance and give examples of how testing contributes to higher quality (K2)
LO-1.1.5 Explain and compare the terms error, defect, fault, failure and the corresponding terms mistake and bug, using examples (K2)

1.2 What is Testing? (K2)
LO-1.2.1 Recall the common objectives of testing (K1)
LO-1.2.2 Provide examples for the objectives of testing in different phases of the software life cycle (K2)
LO-1.2.3 Differentiate testing from debugging (K2)

1.3 Seven Testing Principles (K2)
LO-1.3.1 Explain the seven principles in testing (K2)

1.4 Fundamental Test Process (K1)
LO-1.4.1 Recall the five fundamental test activities and respective tasks from planning to closure (K1)

1.5 The Psychology of Testing (K2)
LO-1.5.1 Recall the psychological factors that influence the success of testing (K1)
LO-1.5.2 Contrast the mindset of a tester and of a developer (K2)
1.1 Why is Testing Necessary (K2)    20 minutes

Terms
Bug, defect, error, failure, fault, mistake, quality, risk

1.1.1 Software Systems Context (K1)
Software systems are an integral part of life, from business applications (e.g., banking) to consumer products (e.g., cars). Most people have had an experience with software that did not work as expected. Software that does not work correctly can lead to many problems, including loss of money, time or business reputation, and could even cause injury or death.

1.1.2 Causes of Software Defects (K2)
A human being can make an error (mistake), which produces a defect (fault, bug) in the program code, or in a document. If a defect in code is executed, the system may fail to do what it should do (or do something it shouldn’t), causing a failure. Defects in software, systems or documents may result in failures, but not all defects do so.
Defects occur because human beings are fallible and because there is time pressure, complex code, complexity of infrastructure, changing technologies, and/or many system interactions.
Failures can be caused by environmental conditions as well. For example, radiation, magnetism, electronic fields, and pollution can cause faults in firmware or influence the execution of software by changing the hardware conditions.

1.1.3 Role of Testing in Software Development, Maintenance and Operations (K2)
Rigorous testing of systems and documentation can help to reduce the risk of problems occurring during operation and contribute to the quality of the software system, if the defects found are corrected before the system is released for operational use.
Software testing may also be required to meet contractual or legal requirements, or industry-specific standards.

1.1.4 Testing and Quality (K2)
With the help of testing, it is possible to measure the quality of software in terms of defects found, for both functional and non-functional software requirements and characteristics (e.g., reliability, usability, efficiency, maintainability and portability). For more information on non-functional testing see Chapter 2; for more information on software characteristics see ‘Software Engineering – Software Product Quality’ (ISO 9126).
Testing can give confidence in the quality of the software if it finds few or no defects. A properly designed test that passes reduces the overall level of risk in a system. When testing does find defects, the quality of the software system increases when those defects are fixed.
Lessons should be learned from previous projects. By understanding the root causes of defects found in other projects, processes can be improved, which in turn should prevent those defects from reoccurring and, as a consequence, improve the quality of future systems. This is an aspect of quality assurance.
Testing should be integrated as one of the quality assurance activities (i.e., alongside development standards, training and defect analysis).
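Editor's note: the following minimal Python sketch is not part of the official syllabus; it is a hypothetical illustration of the error/defect/failure terminology from Section 1.1.2. A programmer's error introduces a defect into the code, and the defect only becomes a visible failure when the code is executed with inputs that reach it.

```python
def average(values):
    """Return the arithmetic mean of a list of numbers.

    The programmer's error: dividing by a hard-coded 2 instead of
    len(values). This defect (fault, bug) makes the code wrong for
    any list whose length is not 2.
    """
    return sum(values) / 2  # defect introduced by a human error


# The defect stays dormant here: with exactly two values the result happens to be correct.
assert average([4, 6]) == 5

# Executing the defective code with three values causes a failure:
# the expected result is 4, but the observed result is 6.
print(average([2, 4, 6]))  # -> 6.0, a failure caused by the defect
```

This also shows why not all defects result in failures (Section 1.1.2): the defect exists in both runs, but only the second run produces a deviation from the expected result.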
1.1.5 How Much Testing is Enough? (K2)
Deciding how much testing is enough should take account of the level of risk, including technical, safety, and business risks, and project constraints such as time and budget. Risk is discussed further in Chapter 5.
Testing should provide sufficient information to stakeholders to make informed decisions about the release of the software or system being tested, for the next development step or handover to customers.
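Editor's note: risk-based prioritization is covered in Chapter 5, not defined here; purely as a hypothetical illustration of "taking account of the level of risk and project constraints", one common convention is to score each test area by likelihood and impact and spend more of a fixed test budget on higher-risk areas. The area names, scales and budget below are invented for the example.

```python
# Hypothetical risk scores: likelihood and impact on a 1 (low) to 5 (high) scale.
areas = {
    "payment processing": {"likelihood": 4, "impact": 5},
    "report layout":      {"likelihood": 2, "impact": 2},
    "user login":         {"likelihood": 3, "impact": 4},
}

budget_hours = 80  # total testing time available (project constraint)

total_risk = sum(a["likelihood"] * a["impact"] for a in areas.values())
for name, a in sorted(areas.items(),
                      key=lambda kv: kv[1]["likelihood"] * kv[1]["impact"],
                      reverse=True):
    score = a["likelihood"] * a["impact"]
    hours = budget_hours * score / total_risk
    print(f"{name:20s} risk={score:2d}  planned effort={hours:4.1f} h")
```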
1.2 What is Testing? (K2)    30 minutes

Terms
Debugging, requirement, review, test case, testing, test objective

Background
A common perception of testing is that it only consists of running tests, i.e., executing the software. This is part of testing, but not all of the testing activities.
Test activities exist before and after test execution. These activities include planning and control, choosing test conditions, designing and executing test cases, checking results, evaluating exit criteria, reporting on the testing process and system under test, and finalizing or completing closure activities after a test phase has been completed. Testing also includes reviewing documents (including source code) and conducting static analysis.
Both dynamic testing and static testing can be used as a means for achieving similar objectives, and will provide information that can be used to improve both the system being tested and the development and testing processes.
Testing can have the following objectives:
o Finding defects
o Gaining confidence about the level of quality
o Providing information for decision-making
o Preventing defects
The thought process and activities involved in designing tests early in the life cycle (verifying the test basis via test design) can help to prevent defects from being introduced into code. Reviews of documents (e.g., requirements) and the identification and resolution of issues also help to prevent defects appearing in the code.
Different viewpoints in testing take different objectives into account. For example, in development testing (e.g., component, integration and system testing), the main objective may be to cause as many failures as possible so that defects in the software are identified and can be fixed. In acceptance testing, the main objective may be to confirm that the system works as expected, to gain confidence that it has met the requirements. In some cases the main objective of testing may be to assess the quality of the software (with no intention of fixing defects), to give information to stakeholders about the risk of releasing the system at a given time. Maintenance testing often includes testing that no new defects have been introduced during development of the changes. During operational testing, the main objective may be to assess system characteristics such as reliability or availability.
Debugging and testing are different. Dynamic testing can show failures that are caused by defects. Debugging is the development activity that finds, analyzes and removes the cause of the failure. Subsequent re-testing by a tester ensures that the fix does indeed resolve the failure. The usual division of responsibility is that testers test and developers debug.
The process of testing and the testing activities are explained in Section 1.4.
1.3 Seven Testing Principles (K2)    35 minutes

Terms
Exhaustive testing

Principles
A number of testing principles have been suggested over the past 40 years and offer general guidelines common for all testing.

Principle 1 – Testing shows presence of defects
Testing can show that defects are present, but cannot prove that there are no defects. Testing reduces the probability of undiscovered defects remaining in the software but, even if no defects are found, it is not a proof of correctness.

Principle 2 – Exhaustive testing is impossible
Testing everything (all combinations of inputs and preconditions) is not feasible except for trivial cases. Instead of exhaustive testing, risk analysis and priorities should be used to focus testing efforts.

Principle 3 – Early testing
To find defects early, testing activities shall be started as early as possible in the software or system development life cycle, and shall be focused on defined objectives.

Principle 4 – Defect clustering
Testing effort shall be focused proportionally to the expected and later observed defect density of modules. A small number of modules usually contains most of the defects discovered during pre-release testing, or is responsible for most of the operational failures.

Principle 5 – Pesticide paradox
If the same tests are repeated over and over again, eventually the same set of test cases will no longer find any new defects. To overcome this “pesticide paradox”, test cases need to be regularly reviewed and revised, and new and different tests need to be written to exercise different parts of the software or system to find potentially more defects.

Principle 6 – Testing is context dependent
Testing is done differently in different contexts. For example, safety-critical software is tested differently from an e-commerce site.

Principle 7 – Absence-of-errors fallacy
Finding and fixing defects does not help if the system built is unusable and does not fulfill the users’ needs and expectations.
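Editor's note: the following rough, hypothetical Python sketch (not from the syllabus) illustrates Principle 2 above. Even a small input form has far too many input combinations to test exhaustively, which is why techniques such as equivalence partitioning (Section 4.3.1) cover representative values per partition instead. The field names and value counts are invented.

```python
# Hypothetical input form: number of distinct values each field can take.
field_values = {
    "age":            150,        # 0..149
    "country code":   250,
    "quantity":       1_000_000,
    "free-text name": 26 ** 20,   # crude lower bound for 20 letters
}

# Exhaustive testing: every combination of inputs.
exhaustive = 1
for count in field_values.values():
    exhaustive *= count
print(f"Exhaustive combinations: {exhaustive:.3e}")  # astronomically large

# Equivalence partitioning (Section 4.3.1): a handful of partitions per field,
# e.g. valid / too low / too high, each covered by one representative value.
partitions_per_field = 3
print("Combinations of partitions:", partitions_per_field ** len(field_values))
# In practice each partition only needs to be covered at least once,
# so the number of test cases is smaller still.
```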
1.4 Fundamental Test Process (K1)    35 minutes

Terms
Confirmation testing, re-testing, exit criteria, incident, regression testing, test basis, test condition, test coverage, test data, test execution, test log, test plan, test procedure, test policy, test suite, test summary report, testware

Background
The most visible part of testing is test execution. But to be effective and efficient, test plans should also include time to be spent on planning the tests, designing test cases, preparing for execution and evaluating results.
The fundamental test process consists of the following main activities:
o Test planning and control
o Test analysis and design
o Test implementation and execution
o Evaluating exit criteria and reporting
o Test closure activities
Although logically sequential, the activities in the process may overlap or take place concurrently. Tailoring these main activities within the context of the system and the project is usually required.

1.4.1 Test Planning and Control (K1)
Test planning is the activity of defining the objectives of testing and the specification of test activities in order to meet the objectives and mission.
Test control is the ongoing activity of comparing actual progress against the plan, and reporting the status, including deviations from the plan. It involves taking actions necessary to meet the mission and objectives of the project. In order to control testing, the testing activities should be monitored throughout the project. Test planning takes into account the feedback from monitoring and control activities.
Test planning and control tasks are defined in Chapter 5 of this syllabus.

1.4.2 Test Analysis and Design (K1)
Test analysis and design is the activity during which general testing objectives are transformed into tangible test conditions and test cases.
The test analysis and design activity has the following major tasks:
o Reviewing the test basis (such as requirements, software integrity level¹ (risk level), risk analysis reports, architecture, design, interface specifications)
o Evaluating testability of the test basis and test objects
o Identifying and prioritizing test conditions based on analysis of test items, the specification, behavior and structure of the software
o Designing and prioritizing high level test cases
o Identifying necessary test data to support the test conditions and test cases
o Designing the test environment setup and identifying any required infrastructure and tools
o Creating bi-directional traceability between test basis and test cases

¹ The degree to which software complies or must comply with a set of stakeholder-selected software and/or software-based system characteristics (e.g., software complexity, risk assessment, safety level, security level, desired performance, reliability, or cost) which are defined to reflect the importance of the software to its stakeholders.
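Editor's note: the bi-directional traceability task in Section 1.4.2 can be kept in something as simple as a mapping; the sketch below is a hypothetical illustration (requirement and test case identifiers are invented), and real projects typically maintain this in a test management tool rather than in code.

```python
# Hypothetical requirement IDs (test basis) mapped to the test cases that cover them.
requirement_to_tests = {
    "REQ-001": ["TC-101", "TC-102"],
    "REQ-002": ["TC-103"],
    "REQ-003": [],                    # not yet covered by any test case
}

# Forward traceability: which requirements still lack test coverage?
uncovered = [req for req, tcs in requirement_to_tests.items() if not tcs]
print("Requirements without test cases:", uncovered)

# Backward traceability: which requirement(s) does a given test case verify?
test_to_requirements = {}
for req, tcs in requirement_to_tests.items():
    for tc in tcs:
        test_to_requirements.setdefault(tc, []).append(req)
print("TC-102 traces back to:", test_to_requirements.get("TC-102", []))
```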
1.4.3 Test Implementation and Execution (K1)
Test implementation and execution is the activity where test procedures or scripts are specified by combining the test cases in a particular order and including any other information needed for test execution, the environment is set up and the tests are run.
Test implementation and execution has the following major tasks:
o Finalizing, implementing and prioritizing test cases (including the identification of test data)
o Developing and prioritizing test procedures, creating test data and, optionally, preparing test harnesses and writing automated test scripts
o Creating test suites from the test procedures for efficient test execution
o Verifying that the test environment has been set up correctly
o Verifying and updating bi-directional traceability between the test basis and test cases
o Executing test procedures either manually or by using test execution tools, according to the planned sequence
o Logging the outcome of test execution and recording the identities and versions of the software under test, test tools and testware
o Comparing actual results with expected results
o Reporting discrepancies as incidents and analyzing them in order to establish their cause (e.g., a defect in the code, in specified test data, in the test document, or a mistake in the way the test was executed)
o Repeating test activities as a result of action taken for each discrepancy, for example, re-execution of a test that previously failed in order to confirm a fix (confirmation testing), execution of a corrected test and/or execution of tests in order to ensure that defects have not been introduced in unchanged areas of the software or that defect fixing did not uncover other defects (regression testing)

1.4.4 Evaluating Exit Criteria and Reporting (K1)
Evaluating exit criteria is the activity where test execution is assessed against the defined objectives. This should be done for each test level (see Section 2.2).
Evaluating exit criteria has the following major tasks:
o Checking test logs against the exit criteria specified in test planning
o Assessing if more tests are needed or if the exit criteria specified should be changed
o Writing a test summary report for stakeholders
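Editor's note: a minimal, hypothetical Python sketch of the execution and evaluation tasks above. Each test case compares an actual result with an expected result, the outcome is logged, and the same script can be re-run unchanged for confirmation and regression testing. The function under test and the pass-rate exit criterion are illustrative assumptions, not part of the syllabus.

```python
def discount(order_total):
    """Hypothetical function under test: 10% discount on orders of 100 or more."""
    return order_total * 0.9 if order_total >= 100 else order_total


# Test cases: (identifier, input, expected result).
test_cases = [
    ("TC-101", 50, 50),
    ("TC-102", 100, 90),
    ("TC-103", 200, 180),
]

test_log = []
for tc_id, test_input, expected in test_cases:
    actual = discount(test_input)                       # execute the test
    verdict = "PASS" if actual == expected else "FAIL"  # compare actual with expected
    test_log.append((tc_id, test_input, expected, actual, verdict))  # log the outcome
    print(f"{tc_id}: input={test_input} expected={expected} actual={actual} -> {verdict}")

# Checking the test log against an (illustrative) exit criterion: all tests must pass.
passed = sum(1 for entry in test_log if entry[-1] == "PASS")
print(f"{passed}/{len(test_log)} passed; exit criterion met: {passed == len(test_log)}")
```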
Test closure activities include the following major tasks:
o Checking which planned deliverables have been delivered
o Closing incident reports or raising change records for any that remain open
o Documenting the acceptance of the system
o Finalizing and archiving testware, the test environment and the test infrastructure for later reuse
o Handing over the testware to the maintenance organization
o Analyzing lessons learned to determine changes needed for future releases and projects
o Using the information gathered to improve test maturity
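The sketch below is a minimal, invented illustration of the test implementation and execution tasks of Section 1.4.3; it is not part of the syllabus. It executes a simple automated test case, compares the actual result with the expected result, and logs the outcome together with the version of the software under test. The add function and the version string are assumptions used only for the example.

import logging
import unittest

logging.basicConfig(filename="test_log.txt", level=logging.INFO)

# Identity and version of the (hypothetical) software under test,
# recorded as part of the test log.
SUT_VERSION = "1.2.3"

def add(a, b):
    # Stand-in for the real component under test.
    return a + b

class AddTests(unittest.TestCase):
    def test_add_positive_numbers(self):
        expected = 5
        actual = add(2, 3)                     # execute the test procedure step
        logging.info("version=%s test=add_positive expected=%s actual=%s",
                     SUT_VERSION, expected, actual)
        self.assertEqual(expected, actual)     # compare actual with expected results

if __name__ == "__main__":
    unittest.main()

In a real project, a discrepancy between actual and expected results would be reported as an incident and analyzed to establish its cause, as described in Section 1.4.3.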
1.5 The Psychology of Testing (K2)    25 minutes

Terms
Error guessing, independence

Background
The mindset to be used while testing and reviewing is different from that used while developing software. With the right mindset developers are able to test their own code, but separation of this responsibility to a tester is typically done to help focus effort and provide additional benefits, such as an independent view by trained and professional testing resources. Independent testing may be carried out at any level of testing.

A certain degree of independence (avoiding the author bias) often makes the tester more effective at finding defects and failures. Independence is not, however, a replacement for familiarity, and developers can efficiently find many defects in their own code. Several levels of independence can be defined, as shown here from low to high:
o Tests designed by the person(s) who wrote the software under test (low level of independence)
o Tests designed by another person(s) (e.g., from the development team)
o Tests designed by a person(s) from a different organizational group (e.g., an independent test team) or test specialists (e.g., usability or performance test specialists)
o Tests designed by a person(s) from a different organization or company (i.e., outsourcing or certification by an external body)

People and projects are driven by objectives. People tend to align their plans with the objectives set by management and other stakeholders, for example, to find defects or to confirm that software meets its objectives. Therefore, it is important to clearly state the objectives of testing.

Identifying failures during testing may be perceived as criticism against the product and against the author. As a result, testing is often seen as a destructive activity, even though it is very constructive in the management of product risks. Looking for failures in a system requires curiosity, professional pessimism, a critical eye, attention to detail, good communication with development peers, and experience on which to base error guessing.

If errors, defects or failures are communicated in a constructive way, bad feelings between the testers and the analysts, designers and developers can be avoided. This applies to defects found during reviews as well as in testing.

The tester and test leader need good interpersonal skills to communicate factual information about defects, progress and risks in a constructive way. For the author of the software or document, defect information can help them improve their skills. Defects found and fixed during testing will save time and money later, and reduce risks.

Communication problems may occur, particularly if testers are seen only as messengers of unwanted news about defects. However, there are several ways to improve communication and relationships between testers and others:
o Start with collaboration rather than battles – remind everyone of the common goal of better quality systems
o Communicate findings on the product in a neutral, fact-focused way without criticizing the person who created it, for example, write objective and factual incident reports and review findings
o Try to understand how the other person feels and why they react as they do
o Confirm that the other person has understood what you have said and vice versa
1.6 Code of Ethics    10 minutes

Involvement in software testing enables individuals to learn confidential and privileged information. A code of ethics is necessary, among other reasons, to ensure that the information is not put to inappropriate use. Recognizing the ACM and IEEE code of ethics for engineers, the ISTQB states the following code of ethics:

PUBLIC - Certified software testers shall act consistently with the public interest
CLIENT AND EMPLOYER - Certified software testers shall act in a manner that is in the best interests of their client and employer, consistent with the public interest
PRODUCT - Certified software testers shall ensure that the deliverables they provide (on the products and systems they test) meet the highest professional standards possible
JUDGMENT - Certified software testers shall maintain integrity and independence in their professional judgment
MANAGEMENT - Certified software test managers and leaders shall subscribe to and promote an ethical approach to the management of software testing
PROFESSION - Certified software testers shall advance the integrity and reputation of the profession consistent with the public interest
COLLEAGUES - Certified software testers shall be fair to and supportive of their colleagues, and promote cooperation with software developers
SELF - Certified software testers shall participate in lifelong learning regarding the practice of their profession and shall promote an ethical approach to the practice of the profession

References
1.1.5 Black, 2001, Kaner, 2002
1.2 Beizer, 1990, Black, 2001, Myers, 1979
1.3 Beizer, 1990, Hetzel, 1988, Myers, 1979
1.4 Hetzel, 1988
1.4.5 Black, 2001, Craig, 2002
1.5 Black, 2001, Hetzel, 1988
2. Testing Throughout the Software Life Cycle (K2)    115 minutes

Learning Objectives for Testing Throughout the Software Life Cycle
The objectives identify what you will be able to do following the completion of each module.

2.1 Software Development Models (K2)
LO-2.1.1 Explain the relationship between development, test activities and work products in the development life cycle, by giving examples using project and product types (K2)
LO-2.1.2 Recognize the fact that software development models must be adapted to the context of project and product characteristics (K1)
LO-2.1.3 Recall characteristics of good testing that are applicable to any life cycle model (K1)

2.2 Test Levels (K2)
LO-2.2.1 Compare the different levels of testing: major objectives, typical objects of testing, typical targets of testing (e.g., functional or structural) and related work products, people who test, types of defects and failures to be identified (K2)

2.3 Test Types (K2)
LO-2.3.1 Compare four software test types (functional, non-functional, structural and change-related) by example (K2)
LO-2.3.2 Recognize that functional and structural tests occur at any test level (K1)
LO-2.3.3 Identify and describe non-functional test types based on non-functional requirements (K2)
LO-2.3.4 Identify and describe test types based on the analysis of a software system's structure or architecture (K2)
LO-2.3.5 Describe the purpose of confirmation testing and regression testing (K2)

2.4 Maintenance Testing (K2)
LO-2.4.1 Compare maintenance testing (testing an existing system) to testing a new application with respect to test types, triggers for testing and amount of testing (K2)
LO-2.4.2 Recognize indicators for maintenance testing (modification, migration and retirement) (K1)
LO-2.4.3 Describe the role of regression testing and impact analysis in maintenance (K2)
2.1 Software Development Models (K2)    20 minutes

Terms
Commercial Off-The-Shelf (COTS), iterative-incremental development model, validation, verification, V-model

Background
Testing does not exist in isolation; test activities are related to software development activities. Different development life cycle models need different approaches to testing.

2.1.1 V-model (Sequential Development Model) (K2)
Although variants of the V-model exist, a common type of V-model uses four test levels, corresponding to the four development levels.

The four levels used in this syllabus are:
o Component (unit) testing
o Integration testing
o System testing
o Acceptance testing

In practice, a V-model may have more, fewer or different levels of development and testing, depending on the project and the software product. For example, there may be component integration testing after component testing, and system integration testing after system testing.

Software work products (such as business scenarios or use cases, requirements specifications, design documents and code) produced during development are often the basis of testing in one or more test levels. References for generic work products include Capability Maturity Model Integration (CMMI) or 'Software life cycle processes' (IEEE/IEC 12207). Verification and validation (and early test design) can be carried out during the development of the software work products.

2.1.2 Iterative-incremental Development Models (K2)
Iterative-incremental development is the process of establishing requirements, designing, building and testing a system in a series of short development cycles. Examples are: prototyping, Rapid Application Development (RAD), Rational Unified Process (RUP) and agile development models. A system that is produced using these models may be tested at several test levels during each iteration. An increment, added to others developed previously, forms a growing partial system, which should also be tested. Regression testing is increasingly important on all iterations after the first one. Verification and validation can be carried out on each increment.

2.1.3 Testing within a Life Cycle Model (K2)
In any life cycle model, there are several characteristics of good testing:
o For every development activity there is a corresponding testing activity
o Each test level has test objectives specific to that level
o The analysis and design of tests for a given test level should begin during the corresponding development activity
o Testers should be involved in reviewing documents as soon as drafts are available in the development life cycle

Test levels can be combined or reorganized depending on the nature of the project or the system architecture. For example, for the integration of a Commercial Off-The-Shelf (COTS) software product into a system, the purchaser may perform integration testing at the system level (e.g.,
integration to the infrastructure and other systems, or system deployment) and acceptance testing (functional and/or non-functional, and user and/or operational testing).
2.2 Test Levels (K2)    40 minutes

Terms
Alpha testing, beta testing, component testing, driver, field testing, functional requirement, integration, integration testing, non-functional requirement, robustness testing, stub, system testing, test environment, test level, test-driven development, user acceptance testing

Background
For each of the test levels, the following can be identified: the generic objectives, the work product(s) being referenced for deriving test cases (i.e., the test basis), the test object (i.e., what is being tested), typical defects and failures to be found, test harness requirements and tool support, and specific approaches and responsibilities. Testing a system's configuration data shall be considered during test planning.

2.2.1 Component Testing (K2)
Test basis:
o Component requirements
o Detailed design
o Code

Typical test objects:
o Components
o Programs
o Data conversion / migration programs
o Database modules

Component testing (also known as unit, module or program testing) searches for defects in, and verifies the functioning of, software modules, programs, objects, classes, etc., that are separately testable. It may be done in isolation from the rest of the system, depending on the context of the development life cycle and the system. Stubs, drivers and simulators may be used.

Component testing may include testing of functionality and specific non-functional characteristics, such as resource-behavior (e.g., searching for memory leaks) or robustness testing, as well as structural testing (e.g., decision coverage). Test cases are derived from work products such as a specification of the component, the software design or the data model.

Typically, component testing occurs with access to the code being tested and with the support of a development environment, such as a unit test framework or debugging tool. In practice, component testing usually involves the programmer who wrote the code. Defects are typically fixed as soon as they are found, without formally managing these defects.

One approach to component testing is to prepare and automate test cases before coding. This is called a test-first approach or test-driven development. This approach is highly iterative and is based on cycles of developing test cases, then building and integrating small pieces of code, and executing the component tests, correcting any issues and iterating until they pass.
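As a hedged, minimal sketch of component testing in isolation (not syllabus content; PriceCalculator, the rate-provider interface and the values are invented), the example below replaces a dependency with a stub so the component can be tested separately using a unit test framework. In a test-first approach, this test would be written and automated before the component itself is coded.

import unittest

# Hypothetical component under test: it depends on an exchange-rate provider.
class PriceCalculator:
    def __init__(self, rate_provider):
        self.rate_provider = rate_provider

    def price_in_eur(self, usd_amount):
        return round(usd_amount * self.rate_provider.usd_to_eur(), 2)

# Stub: replaces the real exchange-rate service so the component
# can be tested in isolation from the rest of the system.
class RateProviderStub:
    def usd_to_eur(self):
        return 0.9   # fixed, predictable value for the test

class PriceCalculatorTest(unittest.TestCase):
    def test_converts_usd_to_eur_using_current_rate(self):
        calc = PriceCalculator(RateProviderStub())
        self.assertEqual(calc.price_in_eur(10.0), 9.0)

if __name__ == "__main__":
    unittest.main()

Because the stub returns a known value, any failure points to the component itself rather than to its environment, which is the point of testing components separately.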
2.2.2 Integration Testing (K2)
Test basis:
o Software and system design
o Architecture
o Workflows
o Use cases

Typical test objects:
o Subsystems
o Database implementation
o Infrastructure
o Interfaces
o System configuration and configuration data

Integration testing tests interfaces between components, interactions with different parts of a system, such as the operating system, file system and hardware, and interfaces between systems.

There may be more than one level of integration testing and it may be carried out on test objects of varying size as follows:
1. Component integration testing tests the interactions between software components and is done after component testing
2. System integration testing tests the interactions between different systems or between hardware and software and may be done after system testing. In this case, the developing organization may control only one side of the interface. This might be considered as a risk. Business processes implemented as workflows may involve a series of systems. Cross-platform issues may be significant.

The greater the scope of integration, the more difficult it becomes to isolate defects to a specific component or system, which may lead to increased risk and additional time for troubleshooting.

Systematic integration strategies may be based on the system architecture (such as top-down and bottom-up), functional tasks, transaction processing sequences, or some other aspect of the system or components. In order to ease fault isolation and detect defects early, integration should normally be incremental rather than "big bang".

Testing of specific non-functional characteristics (e.g., performance) may be included in integration testing as well as functional testing.

At each stage of integration, testers concentrate solely on the integration itself. For example, if they are integrating module A with module B they are interested in testing the communication between the modules, not the functionality of the individual modules, as that was done during component testing. Both functional and structural approaches may be used.

Ideally, testers should understand the architecture and influence integration planning. If integration tests are planned before components or systems are built, those components can be built in the order required for most efficient testing.
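The following is a small, invented sketch of component integration testing (not from the syllabus; OrderService, InventoryRepository and their behavior are assumptions). The real components are wired together and the test checks only the communication across their interface, since the functionality of each individual module is assumed to have been covered during component testing.

import unittest

# Two hypothetical components, each assumed to be individually component-tested.
class InventoryRepository:
    def __init__(self):
        self._stock = {"book": 3}

    def reserve(self, item, quantity):
        if self._stock.get(item, 0) < quantity:
            raise ValueError("insufficient stock")
        self._stock[item] -= quantity
        return True

class OrderService:
    def __init__(self, repository):
        self.repository = repository

    def place_order(self, item, quantity):
        # The integration point under test: OrderService -> InventoryRepository
        return "ACCEPTED" if self.repository.reserve(item, quantity) else "REJECTED"

class OrderInventoryIntegrationTest(unittest.TestCase):
    def test_order_reserves_stock_through_repository(self):
        repository = InventoryRepository()
        service = OrderService(repository)        # real components, no stubs
        self.assertEqual(service.place_order("book", 2), "ACCEPTED")
        self.assertEqual(repository._stock["book"], 1)  # the interaction took effect

if __name__ == "__main__":
    unittest.main()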
2.2.3 System Testing (K2)
Test basis:
o System and software requirement specification
o Use cases
o Functional specification
o Risk analysis reports

Typical test objects:
o System, user and operation manuals
o System configuration and configuration data

System testing is concerned with the behavior of a whole system/product. The testing scope shall be clearly addressed in the Master and/or Level Test Plan for that test level.

In system testing, the test environment should correspond to the final target or production environment as much as possible in order to minimize the risk of environment-specific failures not being found in testing.

System testing may include tests based on risks and/or on requirements specifications, business processes, use cases, or other high level text descriptions or models of system behavior, interactions with the operating system, and system resources.

System testing should investigate functional and non-functional requirements of the system, and data quality characteristics. Testers also need to deal with incomplete or undocumented requirements. System testing of functional requirements starts by using the most appropriate specification-based (black-box) techniques for the aspect of the system to be tested. For example, a decision table may be created for combinations of effects described in business rules (a small illustrative sketch is given at the end of Section 2.2). Structure-based techniques (white-box) may then be used to assess the thoroughness of the testing with respect to a structural element, such as menu structure or web page navigation (see Chapter 4).

An independent test team often carries out system testing.

2.2.4 Acceptance Testing (K2)
Test basis:
o User requirements
o System requirements
o Use cases
o Business processes
o Risk analysis reports

Typical test objects:
o Business processes on fully integrated system
o Operational and maintenance processes
o User procedures
o Forms
o Reports
o Configuration data

Acceptance testing is often the responsibility of the customers or users of a system; other stakeholders may be involved as well.

The goal in acceptance testing is to establish confidence in the system, parts of the system or specific non-functional characteristics of the system. Finding defects is not the main focus in acceptance testing. Acceptance testing may assess the system's readiness for deployment and
use, although it is not necessarily the final level of testing. For example, a large-scale system integration test may come after the acceptance test for a system.

Acceptance testing may occur at various times in the life cycle, for example:
o A COTS software product may be acceptance tested when it is installed or integrated
o Acceptance testing of the usability of a component may be done during component testing
o Acceptance testing of a new functional enhancement may come before system testing

Typical forms of acceptance testing include the following:

User acceptance testing
Typically verifies the fitness for use of the system by business users.

Operational (acceptance) testing
The acceptance of the system by the system administrators, including:
o Testing of backup/restore
o Disaster recovery
o User management
o Maintenance tasks
o Data load and migration tasks
o Periodic checks of security vulnerabilities

Contract and regulation acceptance testing
Contract acceptance testing is performed against a contract's acceptance criteria for producing custom-developed software. Acceptance criteria should be defined when the parties agree to the contract. Regulation acceptance testing is performed against any regulations that must be adhered to, such as government, legal or safety regulations.

Alpha and beta (or field) testing
Developers of market, or COTS, software often want to get feedback from potential or existing customers in their market before the software product is put up for sale commercially. Alpha testing is performed at the developing organization's site but not by the developing team. Beta testing, or field-testing, is performed by customers or potential customers at their own locations.

Organizations may use other terms as well, such as factory acceptance testing and site acceptance testing for systems that are tested before and after being moved to a customer's site.
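Section 2.2.3 above notes that a decision table may be created for combinations of effects described in business rules. As a hedged, minimal sketch (the discount rule, names and values are invented, not syllabus content), such a table can drive automated test cases, one per combination of conditions:

import unittest

# Hypothetical business rule under test: a discount is granted only to
# members whose order total is at least 100.
def discount_granted(is_member, order_total):
    return is_member and order_total >= 100

# Decision table: each row is one combination of conditions plus the expected outcome.
DECISION_TABLE = [
    # (is_member, order_total, expected_discount)
    (True,  150, True),    # member, large order  -> discount
    (True,   50, False),   # member, small order  -> no discount
    (False, 150, False),   # non-member           -> no discount
    (False,  50, False),   # non-member, small    -> no discount
]

class DiscountDecisionTableTest(unittest.TestCase):
    def test_all_rule_combinations(self):
        for is_member, total, expected in DECISION_TABLE:
            with self.subTest(is_member=is_member, total=total):
                self.assertEqual(discount_granted(is_member, total), expected)

if __name__ == "__main__":
    unittest.main()

Decision table testing itself is described as a specification-based technique in Chapter 4.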
2.3 Test Types (K2)    40 minutes

Terms
Black-box testing, code coverage, functional testing, interoperability testing, load testing, maintainability testing, performance testing, portability testing, reliability testing, security testing, stress testing, structural testing, usability testing, white-box testing

Background
A group of test activities can be aimed at verifying the software system (or a part of a system) based on a specific reason or target for testing. A test type is focused on a particular test objective, which could be any of the following:
o A function to be performed by the software
o A non-functional quality characteristic, such as reliability or usability
o The structure or architecture of the software or system
o Change related, i.e., confirming that defects have been fixed (confirmation testing) and looking for unintended changes (regression testing)

A model of the software may be developed and/or used in structural testing (e.g., a control flow model or menu structure model), non-functional testing (e.g., performance model, usability model, security threat modeling), and functional testing (e.g., a process flow model, a state transition model or a plain language specification).

2.3.1 Testing of Functions (Functional Testing) (K2)
The functions that a system, subsystem or component are to perform may be described in work products such as a requirements specification, use cases, or a functional specification, or they may be undocumented. The functions are "what" the system does.

Functional tests are based on functions and features (described in documents or understood by the testers) and their interoperability with specific systems, and may be performed at all test levels (e.g., tests for components may be based on a component specification).

Specification-based techniques may be used to derive test conditions and test cases from the functionality of the software or system (see Chapter 4). Functional testing considers the external behavior of the software (black-box testing).

A type of functional testing, security testing, investigates the functions (e.g., a firewall) relating to detection of threats, such as viruses, from malicious outsiders. Another type of functional testing, interoperability testing, evaluates the capability of the software product to interact with one or more specified components or systems.

2.3.2 Testing of Non-functional Software Characteristics (Non-functional Testing) (K2)
Non-functional testing includes, but is not limited to, performance testing, load testing, stress testing, usability testing, maintainability testing, reliability testing and portability testing. It is the testing of "how" the system works.

Non-functional testing may be performed at all test levels. The term non-functional testing describes the tests required to measure characteristics of systems and software that can be quantified on a varying scale, such as response times for performance testing.
These tests can be referenced to a quality model such as the one defined in 'Software Engineering – Software Product Quality' (ISO 9126).
Non-functional testing considers the external behavior of the software and in most cases uses black-box test design techniques to accomplish that.

2.3.3 Testing of Software Structure/Architecture (Structural Testing) (K2)
Structural (white-box) testing may be performed at all test levels. Structural techniques are best used after specification-based techniques, in order to help measure the thoroughness of testing through assessment of coverage of a type of structure.

Coverage is the extent that a structure has been exercised by a test suite, expressed as a percentage of the items being covered. If coverage is not 100%, then more tests may be designed to test those items that were missed to increase coverage. Coverage techniques are covered in Chapter 4.

At all test levels, but especially in component testing and component integration testing, tools can be used to measure the code coverage of elements, such as statements or decisions. Structural testing may be based on the architecture of the system, such as a calling hierarchy. Structural testing approaches can also be applied at system, system integration or acceptance testing levels (e.g., to business models or menu structures).

2.3.4 Testing Related to Changes: Re-testing and Regression Testing (K2)
After a defect is detected and fixed, the software should be re-tested to confirm that the original defect has been successfully removed. This is called confirmation. Debugging (locating and fixing a defect) is a development activity, not a testing activity.

Regression testing is the repeated testing of an already tested program, after modification, to discover any defects introduced or uncovered as a result of the change(s). These defects may be either in the software being tested, or in another related or unrelated software component. It is performed when the software, or its environment, is changed. The extent of regression testing is based on the risk of not finding defects in software that was working previously.

Tests should be repeatable if they are to be used for confirmation testing and to assist regression testing.

Regression testing may be performed at all test levels, and includes functional, non-functional and structural testing. Regression test suites are run many times and generally evolve slowly, so regression testing is a strong candidate for automation.
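To illustrate the coverage concepts of Section 2.3.3 with a hedged, invented example (the apply_fee function and the amounts are assumptions, not syllabus content): in the fragment below, the first test alone executes every statement (100% statement coverage) but exercises only the True outcome of the single decision (50% decision coverage); the second test adds the False outcome.

import unittest

def apply_fee(amount):
    fee = 0
    if amount < 100:      # the only decision in this function
        fee = 5
    return amount + fee

class CoverageIllustration(unittest.TestCase):
    def test_small_amount(self):
        # Executes every statement (100% statement coverage) but only the
        # True outcome of the decision (50% decision coverage).
        self.assertEqual(apply_fee(50), 55)

    def test_large_amount(self):
        # Adds the False outcome, bringing decision coverage to 100%.
        self.assertEqual(apply_fee(200), 200)

if __name__ == "__main__":
    unittest.main()

A code coverage tool would report such percentages automatically during component or component integration testing.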
2.4 Maintenance Testing (K2)    15 minutes

Terms
Impact analysis, maintenance testing

Background
Once deployed, a software system is often in service for years or decades. During this time the system, its configuration data, or its environment are often corrected, changed or extended. The planning of releases in advance is crucial for successful maintenance testing. A distinction has to be made between planned releases and hot fixes. Maintenance testing is done on an existing operational system, and is triggered by modifications, migration, or retirement of the software or system.

Modifications include planned enhancement changes (e.g., release-based), corrective and emergency changes, and changes of environment, such as planned operating system or database upgrades, planned upgrade of Commercial Off-The-Shelf software, or patches to correct newly exposed or discovered vulnerabilities of the operating system.

Maintenance testing for migration (e.g., from one platform to another) should include operational tests of the new environment as well as of the changed software. Migration testing (conversion testing) is also needed when data from another application will be migrated into the system being maintained.

Maintenance testing for the retirement of a system may include the testing of data migration or archiving if long data-retention periods are required.

In addition to testing what has been changed, maintenance testing includes regression testing to parts of the system that have not been changed. The scope of maintenance testing is related to the risk of the change, the size of the existing system and to the size of the change. Depending on the changes, maintenance testing may be done at any or all test levels and for any or all test types. Determining how the existing system may be affected by changes is called impact analysis, and is used to help decide how much regression testing to do. The impact analysis may be used to determine the regression test suite.

Maintenance testing can be difficult if specifications are out of date or missing, or testers with domain knowledge are not available.

References
2.1.3 CMMI, Craig, 2002, Hetzel, 1988, IEEE 12207
2.2 Hetzel, 1988
2.2.4 Copeland, 2004, Myers, 1979
2.3.1 Beizer, 1990, Black, 2001, Copeland, 2004
2.3.2 Black, 2001, ISO 9126
2.3.3 Beizer, 1990, Copeland, 2004, Hetzel, 1988
2.3.4 Hetzel, 1988, IEEE STD 829-1998
2.4 Black, 2001, Craig, 2002, Hetzel, 1988, IEEE STD 829-1998
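As a hedged, minimal sketch of how impact analysis (Section 2.4) can be used to determine a regression test suite (the module names, test names and mapping are invented, not syllabus content):

# Hypothetical mapping from modules of the existing system to the
# regression tests that exercise them (maintained as part of the testware).
TESTS_BY_MODULE = {
    "billing":   ["test_invoice_totals", "test_vat_rounding"],
    "reporting": ["test_monthly_report"],
    "login":     ["test_password_reset", "test_lockout"],
}

def select_regression_tests(changed_modules):
    """Impact analysis sketch: pick the regression tests affected by a change."""
    selected = set()
    for module in changed_modules:
        selected.update(TESTS_BY_MODULE.get(module, []))
    return sorted(selected)

# Example: a hot fix touched only the billing module.
print(select_regression_tests(["billing"]))
# ['test_invoice_totals', 'test_vat_rounding']

In practice the scope selected this way would still be weighed against the risk of the change and the size of the existing system, as described above.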
3. Static Techniques (K2)    60 minutes

Learning Objectives for Static Techniques
The objectives identify what you will be able to do following the completion of each module.

3.1 Static Techniques and the Test Process (K2)
LO-3.1.1 Recognize software work products that can be examined by the different static techniques (K1)
LO-3.1.2 Describe the importance and value of considering static techniques for the assessment of software work products (K2)
LO-3.1.3 Explain the difference between static and dynamic techniques, considering objectives, types of defects to be identified, and the role of these techniques within the software life cycle (K2)

3.2 Review Process (K2)
LO-3.2.1 Recall the activities, roles and responsibilities of a typical formal review (K1)
LO-3.2.2 Explain the differences between different types of reviews: informal review, technical review, walkthrough and inspection (K2)
LO-3.2.3 Explain the factors for successful performance of reviews (K2)

3.3 Static Analysis by Tools (K2)
LO-3.3.1 Recall typical defects and errors identified by static analysis and compare them to reviews and dynamic testing (K1)
LO-3.3.2 Describe, using examples, the typical benefits of static analysis (K2)
LO-3.3.3 List typical code and design defects that may be identified by static analysis tools (K1)
3.1 Static Techniques and the Test Process (K2)    15 minutes

Terms
Dynamic testing, static testing

Background
Unlike dynamic testing, which requires the execution of software, static testing techniques rely on the manual examination (reviews) and automated analysis (static analysis) of the code or other project documentation without the execution of the code.

Reviews are a way of testing software work products (including code) and can be performed well before dynamic test execution. Defects detected during reviews early in the life cycle (e.g., defects found in requirements) are often much cheaper to remove than those detected by running tests on the executing code.

A review could be done entirely as a manual activity, but there is also tool support. The main manual activity is to examine a work product and make comments about it. Any software work product can be reviewed, including requirements specifications, design specifications, code, test plans, test specifications, test cases, test scripts, user guides or web pages.

Benefits of reviews include early defect detection and correction, development productivity improvements, reduced development timescales, reduced testing cost and time, lifetime cost reduction, fewer defects and improved communication. Reviews can find omissions, for example, in requirements, which are unlikely to be found in dynamic testing.

Reviews, static analysis and dynamic testing have the same objective – identifying defects. They are complementary; the different techniques can find different types of defects effectively and efficiently. Compared to dynamic testing, static techniques find causes of failures (defects) rather than the failures themselves.

Typical defects that are easier to find in reviews than in dynamic testing include: deviations from standards, requirement defects, design defects, insufficient maintainability and incorrect interface specifications.
3.2 Review Process (K2)    25 minutes

Terms
Entry criteria, formal review, informal review, inspection, metric, moderator, peer review, reviewer, scribe, technical review, walkthrough

Background
The different types of reviews vary from informal, characterized by no written instructions for reviewers, to systematic, characterized by team participation, documented results of the review, and documented procedures for conducting the review. The formality of a review process is related to factors such as the maturity of the development process, any legal or regulatory requirements or the need for an audit trail.

The way a review is carried out depends on the agreed objectives of the review (e.g., find defects, gain understanding, educate testers and new team members, or discussion and decision by consensus).

3.2.1 Activities of a Formal Review (K1)
A typical formal review has the following main activities:
1. Planning
• Defining the review criteria
• Selecting the personnel
• Allocating roles
• Defining the entry and exit criteria for more formal review types (e.g., inspections)
• Selecting which parts of documents to review
• Checking entry criteria (for more formal review types)
2. Kick-off
• Distributing documents
• Explaining the objectives, process and documents to the participants
3. Individual preparation
• Preparing for the review meeting by reviewing the document(s)
• Noting potential defects, questions and comments
4. Examination/evaluation/recording of results (review meeting)
• Discussing or logging, with documented results or minutes (for more formal review types)
• Noting defects, making recommendations regarding handling the defects, making decisions about the defects
• Examining/evaluating and recording issues during any physical meetings or tracking any group electronic communications
5. Rework
• Fixing defects found (typically done by the author)
• Recording updated status of defects (in formal reviews)
6. Follow-up
• Checking that defects have been addressed
• Gathering metrics
• Checking on exit criteria (for more formal review types)

3.2.2 Roles and Responsibilities (K1)
A typical formal review will include the roles below:
o Manager: decides on the execution of reviews, allocates time in project schedules and determines if the review objectives have been met.
o Moderator: the person who leads the review of the document or set of documents, including planning the review, running the meeting, and following-up after the meeting. If necessary, the moderator may mediate between the various points of view and is often the person upon whom the success of the review rests.
o Author: the writer or person with chief responsibility for the document(s) to be reviewed.
o Reviewers: individuals with a specific technical or business background (also called checkers or inspectors) who, after the necessary preparation, identify and describe findings (e.g., defects) in the product under review. Reviewers should be chosen to represent different perspectives and roles in the review process, and should take part in any review meetings.
o Scribe (or recorder): documents all the issues, problems and open points that were identified during the meeting.

Looking at software products or related work products from different perspectives and using checklists can make reviews more effective and efficient. For example, a checklist based on various perspectives such as user, maintainer, tester or operations, or a checklist of typical requirements problems may help to uncover previously undetected issues.

3.2.3 Types of Reviews (K2)
A single software product or related work product may be the subject of more than one review. If more than one type of review is used, the order may vary. For example, an informal review may be carried out before a technical review, or an inspection may be carried out on a requirements specification before a walkthrough with customers.
The main characteristics, options and purposes of common review types are:

Informal Review
o No formal process
o May take the form of pair programming or a technical lead reviewing designs and code
o Results may be documented
o Varies in usefulness depending on the reviewers
o Main purpose: inexpensive way to get some benefit

Walkthrough
o Meeting led by author
o May take the form of scenarios, dry runs, peer group participation
o Open-ended sessions
• Optional pre-meeting preparation of reviewers
• Optional preparation of a review report including list of findings
o Optional scribe (who is not the author)
o May vary in practice from quite informal to very formal
o Main purposes: learning, gaining understanding, finding defects

Technical Review
o Documented, defined defect-detection process that includes peers and technical experts with optional management participation
o May be performed as a peer review without management participation
o Ideally led by trained moderator (not the author)
o Pre-meeting preparation by reviewers
o Optional use of checklists
o Preparation of a review report which includes the list of findings, the verdict whether the software product meets its requirements and, where appropriate, recommendations related to findings
o May vary in practice from quite informal to very formal
o Main purposes: discussing, making decisions, evaluating alternatives, finding defects, solving technical problems and checking conformance to specifications, plans, regulations, and standards
Inspection
o Led by trained moderator (not the author)
o Usually conducted as a peer examination
o Defined roles
o Includes metrics gathering
o Formal process based on rules and checklists
o Specified entry and exit criteria for acceptance of the software product
o Pre-meeting preparation
o Inspection report including list of findings
o Formal follow-up process (with optional process improvement components)
o Optional reader
o Main purpose: finding defects

Walkthroughs, technical reviews and inspections can be performed within a peer group, i.e., colleagues at the same organizational level. This type of review is called a "peer review".

3.2.4 Success Factors for Reviews (K2)
Success factors for reviews include:
o Each review has clear predefined objectives
o The right people for the review objectives are involved
o Testers are valued reviewers who contribute to the review and also learn about the product, which enables them to prepare tests earlier
o Defects found are welcomed and expressed objectively
o People issues and psychological aspects are dealt with (e.g., making it a positive experience for the author)
o The review is conducted in an atmosphere of trust; the outcome will not be used for the evaluation of the participants
o Review techniques are applied that are suitable to achieve the objectives and to the type and level of software work products and reviewers
o Checklists or roles are used if appropriate to increase effectiveness of defect identification
o Training is given in review techniques, especially the more formal techniques such as inspection
o Management supports a good review process (e.g., by incorporating adequate time for review activities in project schedules)
o There is an emphasis on learning and process improvement
3.3 Static Analysis by Tools (K2)    20 minutes

Terms
Compiler, complexity, control flow, data flow, static analysis

Background
The objective of static analysis is to find defects in software source code and software models. Static analysis is performed without actually executing the software being examined by the tool; dynamic testing does execute the software code. Static analysis can locate defects that are hard to find in dynamic testing. As with reviews, static analysis finds defects rather than failures. Static analysis tools analyze program code (e.g., control flow and data flow), as well as generated output such as HTML and XML.

The value of static analysis is:
o Early detection of defects prior to test execution
o Early warning about suspicious aspects of the code or design by the calculation of metrics, such as a high complexity measure
o Identification of defects not easily found by dynamic testing
o Detecting dependencies and inconsistencies in software models such as links
o Improved maintainability of code and design
o Prevention of defects, if lessons are learned in development

Typical defects discovered by static analysis tools include:
o Referencing a variable with an undefined value
o Inconsistent interfaces between modules and components
o Variables that are not used or are improperly declared
o Unreachable (dead) code
o Missing and erroneous logic (potentially infinite loops)
o Overly complicated constructs
o Programming standards violations
o Security vulnerabilities
o Syntax violations of code and software models

Static analysis tools are typically used by developers (checking against predefined rules or programming standards) before and during component and integration testing or when checking-in code to configuration management tools, and by designers during software modelling. Static analysis tools may produce a large number of warning messages, which need to be well-managed to allow the most effective use of the tool.

Compilers may offer some support for static analysis, including the calculation of metrics.

References
3.2 IEEE 1028
3.2.2 Gilb, 1993, van Veenendaal, 2004
3.2.4 Gilb, 1993, IEEE 1028
3.3 van Veenendaal, 2004
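The short fragment below is a hedged, invented illustration (not syllabus content; the function and variable names are assumptions) containing three of the typical defects listed in Section 3.3, deliberately left in so that a static analysis tool could flag them without executing the code:

def total_price(items):
    unused_discount = 0.1          # variable that is declared but never used
    total = 0
    for item in items:
        total += item["price"]
    return total
    print("done")                  # unreachable (dead) code after the return

def apply_rate(amount):
    if amount > 0:
        rate = 0.2
    return amount * rate           # possible reference to a variable with an undefined value

A reviewer might spot these too, but a static analysis tool can detect them systematically across a whole code base before any test is executed.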
4. Test Design Techniques (K4)    285 minutes

Learning Objectives for Test Design Techniques
The objectives identify what you will be able to do following the completion of each module.

4.1 The Test Development Process (K3)
LO-4.1.1 Differentiate between a test design specification, test case specification and test procedure specification (K2)
LO-4.1.2 Compare the terms test condition, test case and test procedure (K2)
LO-4.1.3 Evaluate the quality of test cases in terms of clear traceability to the requirements and expected results (K2)
LO-4.1.4 Translate test cases into a well-structured test procedure specification at a level of detail relevant to the knowledge of the testers (K3)

4.2 Categories of Test Design Techniques (K2)
LO-4.2.1 Recall reasons that both specification-based (black-box) and structure-based (white-box) test design techniques are useful and list the common techniques for each (K1)
LO-4.2.2 Explain the characteristics, commonalities, and differences between specification-based testing, structure-based testing and experience-based testing (K2)

4.3 Specification-based or Black-box Techniques (K3)
LO-4.3.1 Write test cases from given software models using equivalence partitioning, boundary value analysis, decision tables and state transition diagrams/tables (K3)
LO-4.3.2 Explain the main purpose of each of the four testing techniques, what level and type of testing could use the technique, and how coverage may be measured (K2)
LO-4.3.3 Explain the concept of use case testing and its benefits (K2)

4.4 Structure-based or White-box Techniques (K4)
LO-4.4.1 Describe the concept and value of code coverage (K2)
LO-4.4.2 Explain the concepts of statement and decision coverage, and give reasons why these concepts can also be used at test levels other than component testing (e.g., on business procedures at system level) (K2)
LO-4.4.3 Write test cases from given control flows using statement and decision test design techniques (K3)
LO-4.4.4 Assess statement and decision coverage for completeness with respect to defined exit criteria (K4)

4.5 Experience-based Techniques (K2)
LO-4.5.1 Recall reasons for writing test cases based on intuition, experience and knowledge about common defects (K1)
LO-4.5.2 Compare experience-based techniques with specification-based testing techniques (K2)

4.6 Choosing Test Techniques (K2)
LO-4.6.1 Classify test design techniques according to their fitness to a given context, for the test basis, respective models and software characteristics (K2)