An introduction to Enterprise Search. A two-hour course introducing Enterprise Search, by Kristian Norling. This is a class I love to teach, so if you are interested in it as an in-house class or at a conference, please contact me.
The course covers:
- Problems for enterprise search to solve.
- Web Search
- How we search and find
- Current state of Enterprise Search, including stats
- Technical concept
- Information quality and metadata
- Feedback cycle
- Five dimensions of Findability
3. • Problem
• Web search
• How we search and find
• Current state of Enterprise Search + stats
• Technical concept
• Information quality
• Feedback cycle
Agenda
10. Finding something I have seen
before, but can’t remember
where.
Four modes of seeking information
11. •The amount of information grows
every day
•What to Search for?
•Where to Search?
•How to Search?
•Search is simple, complex and powerful
•Findability Dimensions
The State of Enterprise Search
21. WHAT ARE THE OBSTACLES
TO FINDING THE RIGHT
INFORMATION?
22. 63.4% POOR SEARCH FUNCTIONALITY
52.1% DON'T KNOW WHERE TO LOOK
51.4% INCONSISTENCY IN HOW WE TAG
CONTENT
50.0% LACK OF ADEQUATE TAGS
33.1% DON’T KNOW WHAT TO LOOK FOR
Globally
23. “Enterprise search is the practice of making content from multiple
enterprise-type sources, such as databases and intranets, searchable to
a defined audience.”
http://en.wikipedia.org/wiki/Enterprise_search
Wikipedia Definition
24. In the field of information retrieval, precision is the
fraction of retrieved documents that are relevant to the
search.
Precision takes all retrieved documents into account,
but it can also be evaluated at a given cut-off rank,
considering only the topmost results returned by the
system. This measure is called precision at n or P@n.
Source: Wikipedia
The Concept of Enterprise
Search: Precision
25. Recall in information retrieval is the fraction of the
documents that are relevant to the query that are
successfully retrieved.
For example, for text search on a set of documents,
recall is the number of correct results divided by the
number of results that should have been returned.
Source: Wikipedia
The Concept of Enterprise
Search: Recall
26. M = number of relevant documents
N = number of retrieved documents
R = number of retrieved documents
that are also relevant
Precision = R / N
Recall = R / M
Precision and Recall
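The quantities on the slide are enough to compute both measures, including P@n from the earlier slide. A minimal Python sketch; the function names and example document ids are illustrative, not from the deck:

```python
def precision_recall(retrieved, relevant):
    """Precision and recall for one query.

    retrieved: ordered list of returned document ids (N = len(retrieved))
    relevant:  set of ids judged relevant to the query (M = len(relevant))
    """
    hits = set(retrieved) & relevant  # R: retrieved AND relevant
    return len(hits) / len(retrieved), len(hits) / len(relevant)

def precision_at_n(retrieved, relevant, n):
    """P@n: precision computed over only the top-n results."""
    return len(set(retrieved[:n]) & relevant) / n

# 4 documents retrieved, 3 of them relevant, 5 relevant in total:
p, r = precision_recall(["d1", "d2", "d3", "d4"],
                        {"d1", "d2", "d4", "d7", "d9"})
# p = 3/4 = 0.75, r = 3/5 = 0.6
```

Note the trade-off the two slides describe: returning more documents can only raise recall, while precision typically falls.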
28. We do not have PageRank...
...but we have social!
Social Reconnects Enterprise Search
Emails, People Catalogues, Connections, Tagging, Sharing etc.
Relevance
30. Examples of implementations:
- People Search
- Product Search
- Document Search
- Intranet and Website Search
- E-commerce
- Dashboard / Search as a Service
Search based Solutions
31. • Good Data/Information hygiene
• Crap in = Crap out
• Metadata is very important!
• Taxonomy and Metadata demystified
• TetraPak example (video)
• SimCorp example
• VGR example (video)
Information / Content
38. Example: Ernst & Young
• Metadata
• Titles
• Content Quality
• Information Life Cycle Management
ESEO: Actionable activities
39. But an average search budget is 100K Euro
• TCO
• ROI
• KPI
Search Analytics is key
Show me the Money
40. Important, delivers actionable to-dos quickly
•0-results
•Top Terms Searched for
Video: Search Analytics in Practice
Search Analytics
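Both report types above can be produced from a raw search log. A hypothetical sketch in Python, assuming the log is available as (query, result_count) pairs; real search platforms expose similar data through their analytics interfaces:

```python
from collections import Counter

def analyze_search_log(log):
    """Summarise a search log into the two quick wins named above:
    zero-result queries and the most frequently searched terms.

    log: list of (query, result_count) tuples -- an assumed format.
    """
    zero_results = Counter(q for q, n in log if n == 0)
    top_terms = Counter(q for q, _ in log)
    return zero_results.most_common(10), top_terms.most_common(10)

# Illustrative log entries:
log = [("vacation policy", 12), ("vpn setup", 0),
       ("vpn setup", 0), ("vacation policy", 8)]
zero, top = analyze_search_log(log)
# zero[0] == ("vpn setup", 2): a query users repeat but never answer
```

Zero-result queries are actionable immediately: each one names content users expect to find but cannot, so it points to either missing content or missing synonyms.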
41. • Feedback form
• KPI from Search Analytics
• Session time × number of sessions = time
spent on search; time spent × hourly price =
cost per “answer”
• Add search refinements + exit page (= is
the right answer)
User Satisfaction
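The cost arithmetic above can be sketched as a small function. The parameter names, the example numbers, and the division by answered sessions (estimated, per the slide, from refinements and exit pages) are my assumptions, not from the deck:

```python
def cost_per_answer(session_minutes, n_sessions, hourly_rate, answered_sessions):
    """Slide formula: session time x number of sessions = time spent on
    search; time spent x hourly price = total cost. Dividing by the
    sessions judged to have ended in an answer gives a cost per "answer".
    """
    total_hours = session_minutes * n_sessions / 60
    total_cost = total_hours * hourly_rate
    return total_cost / answered_sessions

# Illustrative numbers: 3-minute sessions, 1000 sessions,
# 50 EUR/hour, 800 sessions judged answered.
cost = cost_per_answer(3, 1000, 50, 800)  # 2500 EUR total -> 3.125 EUR per answer
```

Fed with real session data from search analytics, this turns user satisfaction into a KPI that can be tracked against the search budget mentioned on the previous slide.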
42. Findability by Findwise
1. BUSINESS
BUILD SOLUTIONS TO SUPPORT YOUR BUSINESS PROCESSES
AND GOALS
2. INFORMATION
PREPARE INFORMATION TO MAKE IT FINDABLE
3. USERS
BUILD USABLE SOLUTIONS BASED ON USER NEEDS
4. ORGANISATION
GOVERN AND IMPROVE YOUR SOLUTION OVER TIME
5. SEARCH TECHNOLOGY
43. • Analyze how your business goals and
strategies can be met by improved
information access
• Set Findability goals. Examples: increase
sales revenue, raise productivity, improve
knowledge sharing, better collaboration
• Specify your requirements
• Define KPIs and measure the success of your
investments
Business
• Clean up and archive or delete outdated/
irrelevant information
• Ensure good quality of information by
adding structured and suitable metadata
• Create and use information models and
taxonomies
• Tagging?
Information
45. • Get to know your users and their needs
• Make sure your solution is easy to use
• Perform continuous usability evaluations,
like usage tests and expert evaluations
• Make sure users find what they are looking
for
• Enable feedback loops for complaints,
feedback and praise
Users
46. • Resources!
• Define processes, roles and routines to
govern the solution
• Perform Search Analytics
• Create easy to use administration
interfaces
• Perform training, technical and editorial
• Help publishers get started with processes
for better findability
Organisation
47. • Select a suitable search platform or make
the most of your current solution
• Design your architecture with search-as-a-
service in mind
• Utilise the full potential of the selected
technology
Search Technology