Neural representation and decoding of the meanings of words (continued)
Some PubMed search tips that you might not already know
- Selecting by subfield, e.g. title
- Boolean searches and wildcards
- Different ways of referring to the same thing
- Review tag
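As an illustration of these tips, the sketch below lists a few example query strings. The search topics are made up for illustration; the field tags ([Title], [Title/Abstract]), Boolean operators, * wildcard, and review[Publication Type] filter are standard PubMed search syntax.

    # Illustrative PubMed queries (hypothetical search topics, standard PubMed syntax)
    queries = [
        'semantic decoding[Title]',                        # restrict a term to the Title field
        'decod*[Title/Abstract] AND fMRI',                 # * wildcard matches decode, decoding, decoder, ...
        '(fMRI OR "functional MRI") AND semantics',        # OR catches different names for the same thing
        '"semantic memory" AND review[Publication Type]',  # restrict results to review articles
    ]
    for q in queries:
        print(q)   # any of these can be pasted into the PubMed search box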
Ways of representing meaning
- Corpus semantics / distributional semantics: see how words co-occur with each other in large bodies of text
- Semantic networks: relations between words ("is-a", etc.), e.g. WordNet
- Embodied semantics: meanings are grounded in sense modalities
The Mitchell study: word stimuli and semantic features
- Stimuli: concrete nouns, e.g. hammer, shirt, dog, celery (60 words in all); 12 categories (tools, clothing, food, etc.), each with 5 words
- Semantic features: action verbs, e.g. push, move, taste, see (25 semantic features in all)
- Each noun has a 25-element semantic feature vector of its co-occurrence frequencies with the verbs, computed from a Google text corpus:
  hammer = 0.13*break + 0.93*touch + 0.01*eat + …
  celery = 0.00*break + 0.03*touch + 0.84*eat + …
Example co-occurrence features
Features for "cat": say/said/says (0.592), see/sees (0.449), eat/ate/eats (0.435), run/ran/runs (0.303), hear/hears/heard (0.208), open/opens/opened (0.175), smell/smells/smelled (0.163), clean/cleaned/cleans (0.146), move/moved/moves (0.088), listen/listens/listened (0.075), touch/touched/touches (0.075), …
http://www.cs.cmu.edu/~tom/science2008/semanticFeatureVectors.html
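A minimal sketch of how a feature vector of this kind could be computed: count how often the noun occurs near each verb in a corpus and normalize the counts. Mitchell et al. used a trillion-word Google corpus and grouped inflected verb forms; the window size and normalization below are illustrative assumptions, not the paper's exact procedure.

    from collections import Counter

    VERBS = ["break", "touch", "eat", "push", "move", "taste", "see"]  # Mitchell et al. used 25 such verbs

    def cooccurrence_vector(noun, tokens, window=5):
        """Normalized co-occurrence counts of `noun` with each verb
        within +/- `window` words (illustrative sketch only)."""
        counts = Counter()
        positions = [i for i, t in enumerate(tokens) if t == noun]
        for i in positions:
            context = tokens[max(0, i - window): i + window + 1]
            for verb in VERBS:
                counts[verb] += context.count(verb)
        total = sum(counts.values()) or 1
        return [counts[v] / total for v in VERBS]   # e.g. 0.13*break + 0.93*touch + ...

    tokens = "the chef will break the celery and eat it raw".split()
    print(cooccurrence_vector("celery", tokens))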
Mitchell model architecture: represent nouns in terms of semantic features (action verbs)
fMRI data and semantic features publicly available at http://www.cs.cmu.edu/~tom/science2008
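In Mitchell et al. (2008) the model predicts each voxel's activation for a noun as a weighted sum of that noun's 25 semantic features, with the weights learned from training nouns. Below is a minimal sketch using ordinary least squares on placeholder arrays; the regression details are simplified, and the real feature and fMRI matrices are at the URL above.

    import numpy as np

    # Placeholder data: 60 nouns x 25 semantic features, 60 nouns x n_voxels fMRI images.
    rng = np.random.default_rng(0)
    F = rng.random((60, 25))        # semantic feature vectors (co-occurrence with 25 verbs)
    Y = rng.random((60, 5000))      # one activation image per noun (5000 voxels, illustrative)

    train = np.arange(58)           # train on 58 nouns ...
    test = np.array([58, 59])       # ... hold out 2 for evaluation, as in the paper

    # For every voxel, learn a weight on each semantic feature (plain least squares here).
    W, *_ = np.linalg.lstsq(F[train], Y[train], rcond=None)   # shape (25, n_voxels)

    # Predict the fMRI image of an unseen noun from its semantic features alone.
    Y_pred = F[test] @ W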
Interpolating between stimuli, using a model of the stimulus space
Kriegeskorte, N. (2011). Pattern-information analysis: From stimulus decoding to computational-model testing. NeuroImage, 56(2), 411-421.
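Continuing the sketch above, the model can be tested in Mitchell et al.'s leave-two-out style: predict the images of the two held-out nouns and ask whether the correct pairing of predicted and observed images is more similar than the swapped pairing. The paper also restricted the comparison to the most stable voxels; this sketch keeps only the pairing logic.

    def similarity(a, b):
        """Cosine similarity between a predicted and an observed image."""
        return a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12)

    p1, p2 = Y_pred            # predicted images for the two held-out nouns
    o1, o2 = Y[test]           # observed images for the same nouns
    correct = similarity(p1, o1) + similarity(p2, o2)
    swapped = similarity(p1, o2) + similarity(p2, o1)
    print("decoded correctly:", correct > swapped)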
Corpus-based approach: pros and cons
Advantages:
- Works quite well in practice
- Used a lot in computational language processing
- Good for capturing semantic relations between single words
Disadvantages:
- Unclear how to relate it to neural representations
- Unclear how to handle logical relations between words
Semantic network http://www.visualthesaurus.com
WordNet
Founded by George Miller (of "the magical number seven" fame)
http://wordnet.princeton.edu/
Huth et al. (2012) Semantic space in cortex
Representing categories in WordNet
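WordNet's "is-a" (hypernym) relations can be explored programmatically, for example with NLTK's WordNet interface (assuming nltk and its WordNet data are installed). The sketch below walks a concrete noun up its chain of increasingly general categories.

    from nltk.corpus import wordnet as wn   # requires: nltk.download('wordnet')

    dog = wn.synsets("dog")[0]              # first sense: dog.n.01
    print(dog.definition())

    # Walk up the "is-a" (hypernym) hierarchy: dog -> canine -> carnivore -> ... -> entity
    node = dog
    while node.hypernyms():
        node = node.hypernyms()[0]
        print(node.name())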
Principal Components Analysis (PCA) http://web.media.mit.edu/~tristan/phd/dissertation/figures/PCA.jpg
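Huth et al. (2012) applied PCA to the voxel-wise semantic model weights to recover a small number of semantic dimensions shared across cortex. Below is a minimal sketch of PCA itself using scikit-learn on placeholder data; the array sizes and variable names are illustrative, not the study's.

    import numpy as np
    from sklearn.decomposition import PCA

    # Placeholder: one row per voxel, one column per semantic feature (illustrative sizes).
    rng = np.random.default_rng(0)
    weights = rng.standard_normal((10000, 100))

    pca = PCA(n_components=4)
    scores = pca.fit_transform(weights)      # each voxel projected onto the top 4 components
    print(pca.explained_variance_ratio_)     # fraction of variance captured by each component
    print(pca.components_.shape)             # (4, 100): each component is a direction in feature space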
Representing multiple semantic principal components
A closer look at semantic space
What do the components mean?
Highly distributed representations
How much of each region’s activation does the model explain?
Embodied theory of meaning
Words are represented in terms of bodily sense modalities: vision, hearing, movement, etc.
Barsalou, L. W. (2008). Grounded cognition. Annual Review of Psychology, 59, 617-645.
Pulvermüller, F. (2013). How neurons make meaning: Brain mechanisms for embodied and abstract-symbolic semantics. Trends in Cognitive Sciences, 17(9), 458-470.
Binder, J. R., & Desai, R. H. (2011). The neurobiology of semantic memory. Trends in Cognitive Sciences, 15(11), 527-536.
Embodied theory of meaning
Binder, J. R., & Desai, R. H. (2011). The neurobiology of semantic memory. Trends in Cognitive Sciences, 15(11), 527-536.
Embodied theory of meaning
Pulvermüller, F. (2013). How neurons make meaning: Brain mechanisms for embodied and abstract-symbolic semantics. Trends in Cognitive Sciences, 17(9), 458-470.
Example: somatotopic representation of motor words
Example: somatotopic representation of motor words Pulvermüller, F., Trends in Cog Sci (2013).
Embodied-looking activation shows up even when using corpus statistics
- "Gustatory cortex" activation for celery in Mitchell et al. (2008)
- Mouth / tongue areas
What about abstract words? Pulvermüller, F., Trends in Cog Sci (2013).
Higher level “abstraction” areas? Pulvermüller, F., Trends in Cog Sci (2013).
Lots of open questions!
- Composition of meaning: how does the brain build a representation of "The child threw the ball" out of its representations of "child", "threw", and "ball"?
- Systematicity / compositionality: the brain can recombine words into a potentially unlimited number of new sentences. How?
- Syntax: how does the brain represent "the cat chased the dog" vs. "the dog chased the cat"?