EUROCAST’01. Marta E. Zorrilla, José L. Crespo and Eduardo Mora, Department of Applied Mathematics and Computer Science, University of Cantabria.


An Online Information Retrieval System by means of Artificial Neural Networks

Introduction I

What is an ‘Information Retrieval System’?
- structured field search
- full-text search

General process

[Diagram: documents are indexed and stored into indexes; a search interface performs relevance classification over the indexes and transfers the relevant documents to the user.]

Indexing and storing

[Diagram: original documents in the documents database go through text extraction into ‘pure text’ files; filtering (stopwords, stemming, thesaurus) produces the list of terms for the files to index; indexation and storing build the indexes, keeping links back to the documents.]
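The filtering stage above can be sketched in a few lines. This is an illustrative toy, assuming an English stopword list and naive suffix stripping; the real system would use a proper stemmer and a Spanish thesaurus:

```python
# Hypothetical sketch of the filtering stage: stopword removal plus a
# crude suffix-stripping stemmer. Stopwords and suffixes are illustrative,
# not those used in the original system.
STOPWORDS = {"the", "a", "of", "and", "is", "in", "to"}
SUFFIXES = ("ing", "ed", "es", "s")

def stem(word):
    # Strip the first matching suffix; a real system would use a proper
    # stemmer (e.g. Porter) or a dictionary of inflected forms.
    for suffix in SUFFIXES:
        if word.endswith(suffix) and len(word) > len(suffix) + 2:
            return word[: -len(suffix)]
    return word

def terms(text):
    # Tokenise, drop stopwords, then stem what remains.
    tokens = [t.lower() for t in text.split()]
    return [stem(t) for t in tokens if t not in STOPWORDS]

print(terms("the indexing of documents and filtering"))
# -> ['index', 'document', 'filter']
```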

Classification of Information Retrieval Systems

- Pre-established dictionary: inverse indexes (in words, in n-grams)
- Free dictionary, with vectorial representation: clustering (statistics, self-organising ANN), Latent Semantic Indexing

Inverse index
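A minimal sketch of the inverse-index idea: map each term to the set of documents that contain it (the document ids and texts here are made up for illustration):

```python
from collections import defaultdict

def build_inverse_index(docs):
    """Map each term to the set of document ids containing it:
    a minimal sketch of the inverse index."""
    index = defaultdict(set)
    for doc_id, text in docs.items():
        for term in text.lower().split():
            index[term].add(doc_id)
    return index

# Toy collection; answering a query is then a set lookup per term.
docs = {1: "civil code article", 2: "neural network article"}
index = build_inverse_index(docs)
print(sorted(index["article"]))  # -> [1, 2]
```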

Self-organising ANN

[Diagram: inputs, best matching unit (bmu), neighbourhood radius; Kohonen’s topological map and Fritzke’s growing topological maps.]
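One training step of a Kohonen map can be sketched as follows: find the best matching unit (bmu) and pull it, and its grid neighbours within the neighbourhood radius, toward the input. The map size, learning rate, and Gaussian neighbourhood are illustrative choices, not parameters from the original work:

```python
import numpy as np

rng = np.random.default_rng(0)
weights = rng.random((5, 5, 3))   # 5x5 Kohonen map with 3-d input weights

def som_step(weights, x, lr=0.5, radius=1.0):
    # Best matching unit: the node whose weight vector is closest to x.
    d = np.linalg.norm(weights - x, axis=2)
    bmu = np.unravel_index(np.argmin(d), d.shape)
    # Gaussian neighbourhood: influence decays with grid distance to the bmu.
    rows, cols = np.indices(d.shape)
    grid_dist2 = (rows - bmu[0]) ** 2 + (cols - bmu[1]) ** 2
    h = np.exp(-grid_dist2 / (2 * radius ** 2))
    # Move weights toward the input, scaled by neighbourhood and rate.
    weights += lr * h[:, :, None] * (x - weights)
    return bmu

x = np.array([0.9, 0.1, 0.5])
before = np.linalg.norm(weights - x, axis=2).min()
som_step(weights, x)
after = np.linalg.norm(weights - x, axis=2).min()
# The bmu has moved toward the input, so the minimum distance shrinks.
```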

Clustering statistics

[Scatter plot: distances between points define the clusters.]
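The distance-based grouping behind statistical clustering reduces to a simple step: assign each point to its nearest centroid. The points and centroids below are illustrative:

```python
import math

def assign_clusters(points, centroids):
    # Assign each point to the nearest centroid by Euclidean distance,
    # the basic operation behind distance-based clustering.
    return [min(range(len(centroids)),
                key=lambda i: math.dist(p, centroids[i]))
            for p in points]

points = [(0.1, 0.2), (0.0, 0.1), (5.0, 5.1), (4.9, 5.0)]
centroids = [(0.0, 0.0), (5.0, 5.0)]
print(assign_clusters(points, centroids))  # -> [0, 0, 1, 1]
```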

Latent Semantic Indexing (LSI)

Singular Value Decomposition of the m x n term-document matrix A:

A = U Σ V^t,  with U (m x r), Σ (r x r), V^t (r x n)

Keeping only the k largest singular values gives the rank-k approximation A_k = U_k Σ_k V_k^t, where the rows of U_k are term vectors and the columns of V_k^t are document vectors. Queries, new documents, and new terms are folded into this k-dimensional space.
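The truncated SVD can be sketched with NumPy. The 4x3 term-document matrix and the query vector below are toy values for illustration:

```python
import numpy as np

# Toy 4-term x 3-document matrix of term frequencies (illustrative values).
A = np.array([[2., 0., 1.],
              [1., 1., 0.],
              [0., 2., 1.],
              [0., 1., 2.]])

# Full decomposition A = U diag(s) Vt.
U, s, Vt = np.linalg.svd(A, full_matrices=False)

k = 2                                 # keep the k largest singular values
A_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]   # rank-k approximation

# A query (bag-of-terms vector) is folded into the k-dim concept space:
q = np.array([1., 0., 0., 1.])
q_k = q @ U[:, :k] @ np.diag(1.0 / s[:k])
```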

ANN for classification

- Competitive networks (e.g. self-organising): one processor in the output layer gives a non-null response
- Radial basis networks: a continuous response, generally in one layer
- Multilayer perceptrons: similar to radial networks, except in the activation function and the operations made at the connections
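The contrast between the last two network types comes down to the unit's operation: a radial basis unit responds to the distance from a centre, while a perceptron unit responds to a weighted sum passed through an activation such as tanh. A minimal sketch with made-up parameters:

```python
import math

def rbf_unit(x, centre, width):
    # Radial basis unit: Gaussian response to the distance from a centre.
    d2 = sum((xi - ci) ** 2 for xi, ci in zip(x, centre))
    return math.exp(-d2 / (2 * width ** 2))

def mlp_unit(x, weights, bias):
    # Perceptron unit: tanh response to a weighted sum of the inputs.
    return math.tanh(sum(wi * xi for wi, xi in zip(weights, x)) + bias)

x = [0.5, 0.5]
print(rbf_unit(x, centre=[0.5, 0.5], width=1.0))   # 1.0 at the centre
print(mlp_unit(x, weights=[1.0, -1.0], bias=0.0))  # 0.0 (weighted sum is 0)
```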

Proposal

An Information Retrieval System built as a neural network: the input layer receives a dictionary word in binary representation (w1, w2, w3, …, wn), and each processor in the output layer represents a document (doc1, doc2, doc3, …, doc n).

Dictionary: COES, a Spanish dictionary developed by Santiago Rodriguez and Jesús Carretero. Documents: articles of the Spanish Civil Code.
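One plausible reading of "word in binary representation" is encoding each dictionary word by the binary digits of its index; the tiny dictionary below is hypothetical, standing in for the COES dictionary:

```python
def encode_word(word, dictionary, bits):
    """Encode a dictionary word as the binary digits of its index,
    one possible sketch of the binary input coding described above."""
    idx = dictionary.index(word)
    return [(idx >> b) & 1 for b in reversed(range(bits))]

# Hypothetical 4-word dictionary; the real system used COES.
dictionary = ["articulo", "codigo", "civil", "contrato"]
print(encode_word("civil", dictionary, bits=4))  # index 2 -> [0, 0, 1, 0]
```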

Test and Results I

Neural network with Radial Basis Functions
Error function: mean squared error, entropy
No. of documents: 93; No. of words in dictionary: 140

Results:
- The error function tends to false minima (the gradient is essentially zero)
- The neural network needs one processor for each word in the dictionary, i.e., the network is not compact

Conclusion: the ordinary RBF approach is not appropriate; a change of approach or of network is needed. We present another network: the MLP.

Test and Results II

Multilayer Perceptron with tanh activation function
Error function: mean squared error, entropy
No. of documents: 10; No. of words in dictionary: 14
Architectures: 10x5x10, 10x7x10, 10x10x10
Optimisation methods: Conjugate Gradient, Quasi-Newton with linear and parabolic minimisation

Results:
- A 10x5x10 architecture can learn the training set
- The optimisation method can be of decisive importance
- The same method, in different programs, offers different results
- The error function does not make much of a difference

Conclusion: in order to gain insight into the optimisation process, we programmed the network ourselves.

Results:
- A 10x5x10 architecture almost learns the training set; with 10x10x10, learning is perfect
- Quasi-Newton with parabolic minimisation is the most efficient method
- Mean squared error offers better results than entropy
- Sorting the training set by number of occurrences, or scaling the output between 0 and 1, does not improve results
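The 10x5x10 architecture above can be sketched as a forward pass with tanh hidden and output layers and a mean-squared-error evaluation. This is only a structural sketch with random illustrative weights; the actual experiments trained such weights with Conjugate Gradient and Quasi-Newton methods:

```python
import numpy as np

rng = np.random.default_rng(1)

def mlp_forward(x, W1, b1, W2, b2):
    # One tanh hidden layer and a tanh output layer, as in 10x5x10.
    h = np.tanh(x @ W1 + b1)
    return np.tanh(h @ W2 + b2)

n_in, n_hidden, n_out = 10, 5, 10
W1 = rng.normal(scale=0.1, size=(n_in, n_hidden)); b1 = np.zeros(n_hidden)
W2 = rng.normal(scale=0.1, size=(n_hidden, n_out)); b2 = np.zeros(n_out)

x = np.eye(n_in)    # one binary-coded input pattern per row (illustrative)
t = np.eye(n_out)   # target: one output processor active per document
y = mlp_forward(x, W1, b1, W2, b2)
mse = np.mean((y - t) ** 2)   # the error an optimiser would minimise
```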

Test and Results III

Future work: grow the output layer with more documents; it will be necessary to increase the number of hidden neurons when the error becomes high.

Conclusions

- What an Information Retrieval System is and how it works
- Classification of IRS
- Proposal: a neural network in which each output-layer processor represents a document and the input layer receives words in binary representation
- Results: promising results of the MLP on a toy problem