
1 The SearchMaster's Toolbox ECIR Industry Day 01 Apr 2010 David Hawking

2 [image-only slide]

3 [image-only slide]

4 UK Customers From 2004/5: Staffordshire University, Scottish Care Commission From 2009: The Electoral Commission, Digital UK, Hargreaves Lansdown From 2010: London School of Economics and Political Science, Incisive Media, British Medical Journal, East Ayrshire Council,...

5 Search is life

6 Costs of poor search Butler Group: Up to 10% of salary costs wasted through ineffective search IDC: A company with 1000 information workers can expect to waste more than $5M p.a. due to poor search Accenture: A survey of 1000 middle managers found they spend as long as 2 hrs/day searching for information.

7 Who's the SearchMaster in your organisation?

8 Stakeholders expect every SearchMaster to do her duty! To make external website search work –Sales conversions –Information dissemination –Reduced inquiry handling load To provide effective search of corporate information –Happy, productive employees (plus students and other stakeholders)

9 Give them the tools and they will do the job! [diagram labels: SearchMaster, End-user, Simple, Powerful]

10 1. The basic search tool Should: –Have good performance out of the box, without weeks of implementation. –Be simple to configure –Avoid features which are too complex to use or set up. –Be able to cover your content and scale to the necessary level

11 2. FineTuner Every search deployment is different –Web, database, fileshare, Lotus The weighting of ranking features must adapt to these differences Manual tweaking is fraught with danger –Fix one query, break a dozen Make a test file and use a tuning tool to learn feature weightings

12 Testfile Desiderata Representative of real workload –Need an unbiased sample Many queries (typically >> 100) Multiple weighted answers (where applicable) Allowance for redirects and equivalent answers See es.csiro.au/C-TEST/

13 Academic Research on Evaluation Masses of academic research How does it translate to tuning an enterprise search system? –Setting good defaults –Tuning to specific characteristics in hundreds of customer deployments Note: the system starts with no user interaction data. Creation of testfiles must be affordable.

14 Spreadsheet testfile
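
(A minimal illustration, in Python form, of what rows in such a spreadsheet test file might contain; the queries, URLs and weights below are invented placeholders, not data from the talk.)

    # Hypothetical test-file rows: (query, answer URL, answer weight).
    # Weights allow several answers of differing value for the same query.
    TESTFILE_ROWS = [
        ("library opening hours", "https://www.example.edu/library/hours", 1.0),
        ("library opening hours", "https://www.example.edu/library", 0.5),
        ("exam timetable", "https://www.example.edu/exams/timetable", 1.0),
    ]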

15 LSE Case Study

16 Sources of testfiles at LSE A-Z Sitemap (>500 entries) –Biased toward anchortext Keymatches file (>500 entries) –Pessimistic Click data (>250 queries with > t clicks) –Biased toward clicks – 100% success! Popular/critical queries (134 manually judged) All biased – use a sampling tool!

17 Dimension-at-a-time tuning [figure: tuning steps 1, 2, 3 plotted over a two-dimensional parameter space with axes dim 1 and dim 2]
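
(A minimal sketch of dimension-at-a-time tuning, i.e. coordinate ascent over feature weights; the evaluate callback, grid values and feature names are assumptions for illustration, not Funnelback's actual FineTune interface.)

    # Sketch: improve one ranking-feature weight at a time, holding the rest fixed,
    # scoring each trial setting by running the test-file queries.
    def tune(weights, evaluate, grid=(0.0, 0.25, 0.5, 1.0, 2.0), passes=3):
        best = evaluate(weights)                    # evaluate() runs the testfile and returns a score
        for _ in range(passes):                     # repeat sweeps until roughly stable
            for feature in list(weights):           # one dimension at a time
                for value in grid:                  # candidate settings for this dimension
                    trial = dict(weights, **{feature: value})
                    score = evaluate(trial)
                    if score > best:                # keep the best setting found so far
                        best, weights = score, trial
        return weights, best

    # e.g. tune({"anchortext": 1.0, "title": 1.0, "urllength": 0.5}, my_testfile_evaluator)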

18 Popular/Critical Set

19 Fine Tuning Summary Tuning a large number of dimensions (Funnelback FineTune covers 38) Millions of query executions Achieves substantial gains

20 But why do queries still fail? Misspelled –Europian Conferense on information retreival Query words don't match document –"door" or "MOPEM" vs. "manually operated personnel egress mechanism" There is no answer to that question. –Maybe there should be –Scope issues.

21 Need more tools!

22 3. Spelling suggestion tools Suggestions may be useful even if words are correctly spelled: –Carlton furball club → Carlton football club Suggestions based on whole query, not word-by-word Don't suggest queries which make no sense in the collection being searched Autocompletion: guide users to the best query Context is king
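
(One way to honour the "make sense in the collection" rule: only offer a whole-query rewrite if it would match more documents than the query as typed. A rough sketch; suggest_alternatives() and result_count() are assumed helper functions, not a real API.)

    # Sketch: whole-query, collection-aware spelling suggestion.
    def best_suggestion(query, suggest_alternatives, result_count):
        best_query, best_hits = None, result_count(query)
        for candidate in suggest_alternatives(query):    # whole-query rewrites, not word-by-word
            hits = result_count(candidate)               # how many documents match in THIS collection
            if hits > best_hits:
                best_query, best_hits = candidate, hits
        return best_query                                # None means keep the user's query as typed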

23 4. Query expansion tools Manual rules: –Rego [registration rego] –MOPEM [manually operated personnel egress mechanism door] Related queries (automatic) –Based on co-clicking Contextual navigation (on-the-fly) –Finding superphrases in a deep result set Faceting (semi-automatic)
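
(A bare-bones sketch of manual expansion rules of the kind shown above; the rule table mirrors the slide's rego and MOPEM examples, and the matching is deliberately naive.)

    # Sketch: manual query-expansion rules (term -> expansion terms).
    EXPANSION_RULES = {
        "rego": ["registration", "rego"],
        "mopem": ["manually", "operated", "personnel", "egress", "mechanism", "door"],
    }

    def expand(query):
        expanded = []
        for term in query.lower().split():
            expanded.extend(EXPANSION_RULES.get(term, [term]))   # expand known terms, keep the rest
        return " ".join(expanded)

    # expand("MOPEM maintenance") -> "manually operated personnel egress mechanism door maintenance"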

24 [image-only slide]

25 5. Reporting and alerting tools Reporting on queries which: –Produced no results –Logged behaviour suggestive of unfulfilment Alerting when: –Submissions of a query (or group of related queries) sharply increase in frequency For: –Business intelligence –Triggering creation of, or changes to, content
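
(A rough sketch of query-spike alerting: flag queries whose frequency today is far above their recent daily average. The thresholds and data shapes are illustrative assumptions, not Funnelback's actual alerting logic.)

    # Sketch: query-spike detection over daily query counts.
    def spiking_queries(history, today, min_count=20, factor=5.0):
        # history: {query: [recent daily counts]}, today: {query: count today}
        alerts = []
        for query, count in today.items():
            past = history.get(query, [0])
            baseline = sum(past) / len(past)
            if count >= min_count and count > factor * max(baseline, 1.0):
                alerts.append((query, count, baseline))
        return sorted(alerts, key=lambda a: a[1], reverse=True)   # biggest spikes first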

26 Query Spike Alerting

27 Conclusions Search is important Organisations benefit when someone takes responsibility for effective search – the SearchMaster. Academic research into evaluation needs careful translation for use in enterprise search tuning. Further tools are needed to overcome poor queries and missing content. Thanks to Mike Swanson of Oxfam Australia for the Ned Kelly line.

28 [image-only slide]

