Cross-Language Evaluation Forum (CLEF) IST-2000-31002 Expected Kick-off Date: August 2001 Carol Peters, IEI-CNR, Pisa, Italy

Presentation transcript:

Cross-Language Evaluation Forum (CLEF) IST-2000-31002 Expected Kick-off Date: August 2001 Carol Peters, IEI-CNR, Pisa, Italy
Concertation Event, Vienna, 21 June

Cross-Language Evaluation Forum: Objectives
Promote research in cross-language system development for European languages by providing an appropriate infrastructure for:
- system evaluation, testing and tuning
- comparison and discussion of results between R&D groups working on common problems
- building test-suites for cross-language system developers

Evaluation for Cross-Language Systems: Why Evaluation is Important for CLIR
- CLIR systems are still at an experimental stage of development
- Evaluation activities stimulate progress through objective assessment and also by comparison of systems and approaches

Evaluation for Cross-Language Systems: Creating the Infrastructure for an Evaluation Campaign
- evaluation methodology
- reference multilingual document collection
- statements of information needs (> queries) in multiple languages
- objective assessment of results
- comparative analysis of results
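
The "objective assessment of results" here follows the TREC model of scoring ranked runs against relevance judgments. As an illustration only, the sketch below computes (mean) average precision, a standard TREC-style measure; it is not the campaign's official scoring software, and the run and qrels in the example are hypothetical.

# Minimal sketch: (mean) average precision over ranked runs, assuming
# TREC-style qrels. Illustrative only, not the official scoring tool.
def average_precision(ranked_docs, relevant_docs):
    hits, precision_sum = 0, 0.0
    for rank, doc in enumerate(ranked_docs, start=1):
        if doc in relevant_docs:
            hits += 1
            precision_sum += hits / rank
    return precision_sum / len(relevant_docs) if relevant_docs else 0.0

def mean_average_precision(run, qrels):
    # run: {topic: [docid, ...] in decreasing order of estimated relevance}
    # qrels: {topic: set of relevant docids}
    scores = [average_precision(docs, qrels.get(topic, set()))
              for topic, docs in run.items()]
    return sum(scores) / len(scores) if scores else 0.0

print(mean_average_precision(
    {"41": ["doc3", "doc7", "doc1"]},      # hypothetical run
    {"41": {"doc3", "doc1", "doc9"}}))     # hypothetical qrels -> 0.5556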

Cross-Language Evaluation Forum: Background
- Jan. 2000: CLEF launched as a collaboration between the DELOS NoE, the US National Institute of Standards and Technology (NIST) and the TREC conferences
- The CLEF methodology is an adaptation of the TREC evaluation methodology for the multilingual context
- CLEF 2000 and 2001 organised within DELOS
- From August 2001, CLEF becomes independent

CLEF 2001 Task Description
Four main evaluation tracks in CLEF 2001:
- multilingual information retrieval
- bilingual information retrieval
- monolingual (non-English) information retrieval
- domain-specific IR
plus an experimental track for interactive cross-language systems

CLEF 2001 Multilingual Data Collection
- Multilingual comparable corpus of news agency and newspaper documents in six languages (DE, EN, FR, IT, NL, SP); over 1 million documents
- Common set of 50 topics (from which queries are extracted) created in 9 European languages (DE, EN, FR, IT, NL, SP + FI, RU, SV) and 3 Asian languages (JP, TH, ZH)

CLEF 2001 Multilingual IR
(Diagram) Topics in DE, EN, FR or IT (or in FI, NL, SP, SV) are submitted to the participant's MLIR/CLIR information retrieval system, which searches the English, German, French and Italian documents and returns one result list of DE, EN, FR and IT documents ranked in decreasing order of estimated relevance.
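
In practice such a ranked list is serialised one line per retrieved document. The sketch below assumes the standard TREC run format (topic, the literal 'Q0', document id, rank, score, run tag) that CLEF's TREC-derived methodology suggests; the topic number, document ids, scores and run tag are hypothetical, and this is not the official CLEF submission specification.

# Minimal sketch: serialise a ranked result list in a TREC-style run format.
# Hypothetical ids and scores; not the official submission specification.
ranked = {"41": [("LASTAMPA-1994-001234", 12.7),
                 ("SPIEGEL-1995-000042", 11.9)]}
for topic, docs in ranked.items():
    for rank, (docid, score) in enumerate(docs, start=1):
        print(f"{topic} Q0 {docid} {rank} {score:.4f} mygroup-run1")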

CLEF 2001 Bilingual IR
Task: query language is DE, FR, IT, FI, NL, SP, SV, RU, ZH, JP or TH; the target document collection is English
Goal: retrieve documents from the target collection, listing results in a ranked list
- An easier task for beginners!

CLEF 2001 Monolingual IR
Task: querying document collections in FR, DE, IT, NL or SP
Goal: acquire a better understanding of language-dependent retrieval problems
- different languages present different retrieval problems
- issues include word order, morphology, diacritic characters, language variants
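
As one concrete illustration of these language-dependent issues, an indexer has to decide whether to fold diacritic characters so that accented and unaccented spellings of the same word match. The sketch below is my own illustration, not a normalisation technique prescribed by CLEF.

# Minimal sketch: diacritic folding at indexing time (illustrative only).
import unicodedata

def fold_diacritics(text):
    # Decompose characters (e.g. 'e' + combining accent) and drop the marks.
    decomposed = unicodedata.normalize("NFD", text)
    return "".join(ch for ch in decomposed if not unicodedata.combining(ch))

print(fold_diacritics("pétrole"))   # -> "petrole"
print(fold_diacritics("Müller"))    # -> "Muller"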

CLEF 2001 Domain-Specific IR
Task: querying a structured database from a vertical domain (social sciences) in German
- German/English/Russian thesaurus and English translations of document titles
- Monolingual (DE) or cross-language (DE, EN, RU) task

CLEF 2001 Participation
- 30 groups: 8 North American; 18 European; 4 rest of the world
- Runs submitted for all tasks:
  - Cross-language = 20 groups
    - Multilingual = 8 groups
    - Bilingual -> EN = 18 groups
    - Bilingual -> NL = 3 groups
  - Monolingual = 20 groups
  - Domain-specific = 1 group
- A total of approx. 200 runs were submitted

Approaches to CLIR in CLEF 2000
- commercial MT systems (Systran, Lernout & Hauspie Power Translator)
- bilingual dictionary look-up
- aligned parallel corpora (web-derived)
- similarity thesaurus (using comparable corpora)
Different strategies were tried for query expansion and results merging.
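
To make the bilingual dictionary look-up approach concrete, the sketch below translates a query term by term with a toy German-English dictionary. The dictionary entries and query are hypothetical, and real CLEF systems combined this kind of look-up with translation disambiguation, query expansion and result merging.

# Minimal sketch of dictionary-based query translation (toy dictionary,
# hypothetical entries; real CLIR systems add disambiguation on top).
toy_dict = {
    "erdöl": ["petroleum", "oil"],
    "preis": ["price", "prize"],
    "anstieg": ["rise", "increase"],
}

def translate_query(query, bilingual_dict):
    translated = []
    for term in query.lower().split():
        # Keep all candidate translations; untranslatable terms pass through.
        translated.extend(bilingual_dict.get(term, [term]))
    return translated

print(translate_query("Erdöl Preis Anstieg", toy_dict))
# -> ['petroleum', 'oil', 'price', 'prize', 'rise', 'increase']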

Evaluation: Summing Up
- system evaluation is not a competition to find the best
- evaluation provides an opportunity to test, tune and compare approaches in order to improve system performance
- an evaluation campaign creates a community interested in examining the same issues and comparing ideas and experiences

Cross-Language Evaluation Forum: Intentions for CLEF 2002/2003
- study evaluation methodologies with respect to user needs
- addition of more languages
- addition of new tasks (e.g. interactive CLEF)
- cross-language evaluation for other document types (e.g. speech)
- produce CLIR system test-suites for the R&D community

Cross-Language Evaluation Forum
- For more information: … or …