Riley, Jenn. "Assessment of Metadata Remediation Efforts." Metadata Enhancement and OAI Workshop (MEOW), Robert W. Woodruff Library, Emory University, July 24-25, 2006.
2. General approach to assessment
- Traditional library approach:
  - Measure inter-indexer consistency
  - Count errors
  - Record focus
- For this project, I'll argue for:
  - A user & task focus
  - Defining accuracy in terms of functionality achieved
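As an illustration of the traditional "inter-indexer consistency" measure mentioned above, a minimal sketch of Hooper's consistency (agreements divided by the total distinct terms two indexers assign); the subject terms shown are hypothetical examples, not data from this project:

```python
def hooper_consistency(terms_a, terms_b):
    """Hooper's inter-indexer consistency: number of terms both
    indexers agree on, divided by the total distinct terms assigned."""
    a, b = set(terms_a), set(terms_b)
    union = a | b
    if not union:
        return 1.0  # both assigned nothing: trivially consistent
    return len(a & b) / len(union)

# Two indexers assign subject terms to the same record
print(hooper_consistency({"jazz", "photographs", "Indiana"},
                         {"jazz", "photographs", "music"}))  # 0.5
```

A score of 1.0 means identical term sets; values in practice tend to be well below that, which is part of why "ground truth" for subject terms is elusive.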
3. Why that approach?
- Functionalities we're thinking about aren't supported well by traditional library structures and vocabularies:
  - High-level browsing
  - Genre access
- "Ground truth" doesn't really exist for things like "subject" and "genre"
- We need to figure out where human effort is most useful
4. One size does not fit all for metadata enhancement activities
[Diagram: metadata elements arranged on a spectrum from factual (easy?) to interpretive (hard?)]
- Dates: syntax normalization
- Resource Type: syntax normalization (à la DCMI Type), plus ?
- Names: form? style?
- Subject, Genre: interpretive
- What about geography?
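To make the "factual/easy" end of the spectrum concrete, a minimal sketch of date syntax normalization, mapping a few common variant date strings to W3CDTF-style values; the patterns and examples are illustrative assumptions, not the project's actual normalization rules:

```python
import re

def normalize_date(raw):
    """Best-effort normalization of variant date strings to
    W3CDTF form (YYYY or YYYY-MM-DD); None when nothing matches."""
    raw = raw.strip()
    m = re.fullmatch(r"\d{4}", raw)              # "1918"
    if m:
        return raw
    m = re.fullmatch(r"(\d{1,2})/(\d{1,2})/(\d{4})", raw)  # "7/4/1918"
    if m:
        return f"{m.group(3)}-{int(m.group(1)):02d}-{int(m.group(2)):02d}"
    m = re.fullmatch(r"ca?\.?\s*(\d{4})", raw, re.IGNORECASE)  # "ca. 1920"
    return m.group(1) if m else None

print(normalize_date("7/4/1918"))   # 1918-07-04
print(normalize_date("ca. 1920"))   # 1920
```

Records the function returns None for are candidates for human review — one concrete way to target human effort where it is most useful.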
5. Very preliminary possible metrics
- Syntax normalization: what functionality can we provide on the records after syntax normalization?
- More interpretive tasks: do the resources retrieved in response to a query match user expectations?
- All: how confident are we in the results of the automatic enhancement?
- Need actual numbers at some point
  - To figure out how well services will work
- Think about pre-human involvement vs. post-human involvement
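One way to turn "do retrieved resources match user expectations?" into an actual number is precision at k against human relevance judgments; a minimal sketch (the record IDs and judgments are hypothetical):

```python
def precision_at_k(retrieved, relevant, k=10):
    """Fraction of the top-k retrieved record IDs that a human
    assessor judged relevant to the query."""
    top = retrieved[:k]
    if not top:
        return 0.0
    return sum(1 for rec_id in top if rec_id in relevant) / len(top)

# Ranked results for one query vs. a set of human relevance judgments
print(precision_at_k(["r1", "r2", "r3", "r4"], {"r1", "r3"}, k=4))  # 0.5
```

Comparing this number before and after enhancement (and before vs. after human involvement) gives one crude but concrete measure of how well the resulting services work.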