January 2004 · ADC’04
What Do You Want—Semantic Understanding? (You’ve Got to Be Kidding)
David W. Embley, Brigham Young University
Funded in part by the National Science Foundation
Presentation Outline
- Grand Challenge
- Meaning, Knowledge, Information, Data
- Fun and Games with Data
- Information Extraction Ontologies
- Applications
- Limitations and Pragmatics
- Summary and Challenges
Grand Challenge: Semantic Understanding
Can we quantify and specify the nature of this grand challenge?

“If ever there were a technology that could generate trillions of dollars in savings worldwide …, it would be the technology that makes business information systems interoperable.” (Jeffrey T. Pollock, VP of Technology Strategy, Modulant Solutions)

“The Semantic Web: … content that is meaningful to computers [and that] will unleash a revolution of new possibilities … Properly designed, the Semantic Web can assist the evolution of human knowledge …” (Tim Berners-Lee et al., Weaving the Web)

“20th Century: Data Processing. 21st Century: Data Exchange. The issue now is mutual understanding.” (Stefano Spaccapietra, Editor-in-Chief, Journal on Data Semantics)

“The Grand Challenge [of semantic understanding] has become mission critical. Current solutions … won’t scale. Businesses need economic growth dependent on the web working and scaling (cost: $1 trillion/year).” (Michael Brodie, Chief Scientist, Verizon Communications)
Why Semantic Understanding?
- Because we’re overwhelmed with data: point-and-click is too slow; “give me what I want when I want it.”
- Because it’s the key to revolutionary progress: automated interoperability and knowledge sharing, negotiation in e-business, and large-scale, in-silico experiments in e-science.
We succeed in managing information if we can “[take] data and [analyze] it and [simplify] it and [tell] people exactly the information they want, rather than all the information they could have.” (Jim Gray, Microsoft Research)
What Is Semantic Understanding?
Understanding: “to grasp or comprehend [what’s] intended or expressed.”
Semantics: “the meaning or the interpretation of a word, sentence, or other language form.” (Dictionary.com)
Can We Achieve Semantic Understanding?
“A computer doesn’t truly ‘understand’ anything.” But computers can manipulate terms “in ways that are useful and meaningful to the human user.” (Tim Berners-Lee)
Key point: it only has to be good enough. And that’s our challenge, and our opportunity.
Information Value Chain
Meaning ← Knowledge ← Information ← Data: translating data into meaning.
Foundational Definitions
- Meaning: knowledge that is relevant or activates
- Knowledge: information with a degree of certainty or community agreement (ontology)
- Information: data in a conceptual framework
- Data: attribute-value pairs
(Adapted from [Meadow92])
Data: Attribute-Value Pairs
- Fundamental for information, and thus for knowledge and meaning
Data Frame
- Extensive knowledge about a data item
- Everyday data: currency, dates, time, weights and measures
- Textual appearance, units, context, operators, I/O conversion
- An abstract data type with an extended framework
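The data-frame idea can be made concrete with a small sketch. Everything here (the class name, its fields, and the mileage pattern) is illustrative, not taken from the actual implementation:

```python
import re

class DataFrame:
    """Minimal sketch of a data frame: an abstract data type that knows
    how a kind of everyday data appears in text (a full data frame would
    also carry units, context expressions, and operators)."""
    def __init__(self, name, value_pattern, keywords=(), to_internal=lambda s: s):
        self.name = name
        self.value_re = re.compile(value_pattern)
        self.keywords = [re.compile(k) for k in keywords]
        self.to_internal = to_internal  # I/O conversion

    def recognize(self, text):
        # Yield (name, matched string, start, end) for each appearance.
        return [(self.name, m.group(), m.start(), m.end())
                for m in self.value_re.finditer(text)]

mileage = DataFrame("Mileage", r"\b\d{1,3}(?:,\d{3})*\b",
                    keywords=[r"\bmiles\b", r"\bmi\b"],
                    to_internal=lambda s: int(s.replace(",", "")))

hits = mileage.recognize("only 7,000 miles")
# hits -> [("Mileage", "7,000", 5, 10)]
```

The `to_internal` conversion is what lets extracted strings participate in calculations and comparisons later.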
?
Olympus C-750 Ultra Zoom
- Sensor Resolution: 4.2 megapixels
- Optical Zoom: 10x
- Digital Zoom: 4x
- Installed Memory: 16 MB
- Lens Aperture: F/8-2.8/3.7
- Focal Length min: 6.3 mm
- Focal Length max: 63.0 mm
Answer: Digital Camera
?
- Year: 2002
- Make: Ford
- Model: Thunderbird
- Mileage: 5,500 miles
- Features: Red, ABS, 6-CD changer, keyless entry
- Price: $33,000
- Phone: (916) 972-9117
Answer: Car Advertisement
?
Flight #: Delta 16; Class: Coach; From: JFK, 6:05 pm, 02-01-04; To: CDG, 7:35 am, 03-01-04; Stops: 0
Flight #: Delta 119; Class: Coach; From: CDG, 10:20 am, 09-01-04; To: JFK, 1:00 pm, 09-01-04; Stops: 0
Answer: Airline Itinerary
?
Monday, October 13, 2003
Group A (W-L-T, GF, GA, Pts): USA 3-0-0, 11, 1, 9; Sweden 2-1-0, 5, 3, 6; North Korea 1-2-0, 3, 4, 3; Nigeria 0-3-0, 0, 11, 0
Group B (W-L-T, GF, GA, Pts): Brazil 2-0-1, 8, 2, 7; …
Answer: World Cup Soccer
?
- Calories: 250 cal
- Distance: 2.50 miles
- Time: 23.35 minutes
- Incline: 1.5 degrees
- Speed: 5.2 mph
- Heart Rate: 125 bpm
Answer: Treadmill Workout
?
- Place: Bonnie Lake
- County: Duchesne
- State: Utah
- Type: Lake
- Elevation: 10,000 feet
- USGS Quad: Mirror Lake
- Latitude: 40.711ºN
- Longitude: 110.876ºW
Answer: Maps
Information Extraction Ontologies
[Diagram: Source → Target, via Information Extraction and Information Exchange]
What Is an Extraction Ontology?
An augmented conceptual-model instance:
- Object and relationship sets
- Constraints
- Data-frame value recognizers
A robust, ontology-based wrapper:
- Extracts information
- Works even when a site changes or when new sites come on-line
Extraction Ontology: Example
Car [-> object];
Car [0:1] has Year [1:*];
Car [0:1] has Make [1:*];
…
Car [0:*] has Feature [1:*];
PhoneNr [1:*] is for Car [0:1];
Year matches [4] constant {extract “\d{2}”; context “\b’[4-9]\d\b”; …}
…
Mileage matches [8] keyword {“\bmiles\b”, “\bmi\b.”, …}
…
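The `matches` declarations above are essentially regular-expression recognizers. A hedged Python sketch of how the Year context expression and the Mileage keywords might be applied (patterns slightly simplified from the slide):

```python
import re

# The ontology's value recognizers compile to ordinary regexes.
# Year: two digits extracted from an apostrophe-year context ('4x..'9x);
# Mileage: keyword phrases that anchor nearby numeric values.
year_context = re.compile(r"'([4-9]\d)\b")
mileage_keyword = re.compile(r"\bmiles\b|\bmi\b")

ad = "'97 CHEVY Cavalier, only 7,000 miles"
m = year_context.search(ad)
year = m.group(1) if m else None            # -> "97"
has_mileage_kw = bool(mileage_keyword.search(ad))  # -> True
```

Separating the extract expression from its context expression is what lets the recognizer find a value only where its surroundings make the interpretation plausible.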
Extraction Ontologies: An Example of Semantic Understanding
“Intelligent” symbol manipulation gives the “illusion of understanding” and obtains meaningful, useful results.
A Variety of Applications
1. Information Extraction
2. High-Precision Classification
3. Schema Mapping
4. Semantic Web Creation
5. Agent Communication
6. Ontology Generation
Application #1: Information Extraction
'97 CHEVY Cavalier, Red, 5 spd, only 7,000 miles. Previous owner heart broken! Asking only $11,995. #1415. JERRY SEINER MIDVALE, 566-3800 or 566-3888

Constant/Keyword Recognition (Descriptor | String | Position start | end):
Year|97|2|3
Make|CHEV|5|8
Make|CHEVY|5|9
Model|Cavalier|11|18
Feature|Red|21|23
Feature|5 spd|26|30
Mileage|7,000|38|42
KEYWORD(Mileage)|miles|44|48
Price|11,995|100|105
Mileage|11,995|100|105
PhoneNr|566-3800|136|143
PhoneNr|566-3888|148|155
Heuristics
- Keyword proximity
- Subsumed and overlapping constants
- Functional relationships
- Nonfunctional relationships
- First occurrence without constraint violation
Keyword Proximity
Both 7,000 and 11,995 are candidate Mileage values; the keyword “miles” (positions 44-48) is far closer to 7,000 (38-42) than to 11,995 (100-105), so 7,000 wins.
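A minimal sketch of the keyword-proximity heuristic, using the spans from the recognition table (7,000 at 38-42, 11,995 at 100-105, the keyword “miles” at 44-48):

```python
def closest_to_keyword(candidates, keyword_span):
    """Keyword-proximity heuristic (sketch): among candidate value
    spans, pick the one whose span lies nearest the keyword span."""
    ks, ke = keyword_span
    return min(candidates, key=lambda c: min(abs(c[1] - ks), abs(c[0] - ke)))

mileage_candidates = [(38, 42), (100, 105)]  # "7,000" and "11,995"
best = closest_to_keyword(mileage_candidates, (44, 48))
# best -> (38, 42), i.e. "7,000"
```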
Subsumed/Overlapping Constants
Make|CHEV (5-8) is subsumed by Make|CHEVY (5-9); the subsumed candidate is discarded in favor of the longer match.
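The subsumed-constants heuristic can be sketched as dropping any candidate whose span is strictly contained in another's, which is what eliminates CHEV in favor of CHEVY:

```python
def drop_subsumed(hits):
    """Subsumed/overlapping-constants heuristic (sketch): discard any
    hit whose span is strictly contained in another hit's span."""
    def inside(a, b):  # a's span strictly inside b's span
        return b[2] <= a[2] and a[3] <= b[3] and (a[2], a[3]) != (b[2], b[3])
    return [h for h in hits if not any(inside(h, o) for o in hits)]

hits = [("Make", "CHEV", 5, 8), ("Make", "CHEVY", 5, 9)]
kept = drop_subsumed(hits)
# kept -> [("Make", "CHEVY", 5, 9)]
```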
Functional Relationships
A car has at most one Year, Make, Model, Mileage, and Price (the [0:1] constraints), so at most one candidate can fill each of these slots.
Nonfunctional Relationships
A car may have many Features, so both Red and 5 spd are kept.
First Occurrence without Constraint Violation
Where candidates remain (7,000 vs. 11,995 for Mileage), take the first occurrence that violates no constraint: 7,000.
Database-Instance Generator
insert into Car values(1001, “97”, “CHEVY”, “Cavalier”, “7,000”, “11,995”, “566-3800”)
insert into CarFeature values(1001, “Red”)
insert into CarFeature values(1001, “5 spd”)
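A sketch of how the winning extractions become insert statements like those above; the helper name and column order are illustrative:

```python
def to_inserts(car_id, record, features):
    """Sketch of the database-instance generator: emit SQL for the
    winning extractions (one Car row plus one row per Feature)."""
    cols = ", ".join(f'"{record[k]}"' for k in
                     ("Year", "Make", "Model", "Mileage", "Price", "PhoneNr"))
    stmts = [f"insert into Car values({car_id}, {cols})"]
    stmts += [f'insert into CarFeature values({car_id}, "{f}")' for f in features]
    return stmts

record = {"Year": "97", "Make": "CHEVY", "Model": "Cavalier",
          "Mileage": "7,000", "Price": "11,995", "PhoneNr": "566-3800"}
stmts = to_inserts(1001, record, ["Red", "5 spd"])
```

A production generator would use parameterized queries rather than string interpolation; the point here is only the shape of the output.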
Application #2: High-Precision Classification
An Extraction Ontology Solution
Density Heuristic
Document 1: Car Ads; Document 2: Items for Sale or Rent. How densely do recognized car-ad values cover each document?
Expected Values Heuristic
Document 1 (Car Ads): Year 3, Make 2, Model 3, Mileage 1, Price 1, Feature 15, PhoneNr 3
Document 2 (Items for Sale or Rent): Year 1, Make 0, Model 0, Mileage 1, Price 0, Feature 0, PhoneNr 4
Vector Space of Expected Values (OV, D1, D2)
Year: 0.98, 16, 6
Make: 0.93, 10, 0
Model: 0.91, 12, 0
Mileage: 0.45, 6, 2
Price: 0.80, 11, 8
Feature: 2.10, 29, 0
PhoneNr: 1.15, 15, 11
Similarity to OV: D1 = 0.996, D2 = 0.567
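The D1 and D2 similarity figures follow from treating the ontology's expected-value column (OV) and each document's hit counts as vectors and taking cosine similarity; this sketch reproduces the slide's numbers:

```python
import math

def cosine(u, v):
    """Cosine similarity between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

# Columns from the slide, in the order
# Year, Make, Model, Mileage, Price, Feature, PhoneNr:
ov = [0.98, 0.93, 0.91, 0.45, 0.80, 2.10, 1.15]
d1 = [16, 10, 12, 6, 11, 29, 15]  # car-ads document
d2 = [6, 0, 0, 2, 8, 0, 11]       # items-for-sale document

sim1 = round(cosine(ov, d1), 3)  # -> 0.996
sim2 = round(cosine(ov, d2), 3)  # -> 0.567
```

A document is classified as belonging to the ontology's domain when its similarity clears a threshold, so D1 is accepted as a car-ads page and D2 is rejected.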
Grouping Heuristic
Document 1 (Car Ads): recognized values cluster into per-ad groups such as {Year, Make, Model, Price}, {Year, Model}, {Year, Make, Model, Mileage}, …
Document 2 (Items for Sale or Rent): sparser clusters such as {Year, Mileage}, {Mileage, Year, Price}, …
Grouping
Expected number in a group = floor(∑ Ave) = 4 (for our example)
Grouping measure = (sum of distinct 1-max object sets in each group) / (number of groups × expected number in a group)
Car Ads: (3 + 3 + 4 + 4) / (4 × 4) = 0.875
Sale Items: (2 + 3 + 2 + 1) / (4 × 4) = 0.500
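The grouping measure is simple enough to compute directly; this sketch reproduces both figures from the per-group counts:

```python
def grouping_measure(distinct_sets_per_group, expected_in_group):
    """Grouping heuristic (sketch): sum of distinct 1-max object sets
    in each group over (number of groups * expected number in a group)."""
    return (sum(distinct_sets_per_group)
            / (len(distinct_sets_per_group) * expected_in_group))

car_ads = grouping_measure([3, 3, 4, 4], 4)    # -> 0.875
sale_items = grouping_measure([2, 3, 2, 1], 4) # -> 0.5
```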
Application #3: Schema Mapping
Problem: Different Schemas
Target database schema: {Car, Year, Make, Model, Mileage, Price, PhoneNr}, {PhoneNr, Extension}, {Car, Feature}
Different source table schemas:
- {Run #, Yr, Make, Model, Tran, Color, Dr}
- {Make, Model, Year, Colour, Price, Auto, Air Cond., AM/FM, CD}
- {Vehicle, Distance, Price, Mileage}
- {Year, Make, Model, Trim, Invoice/Retail, Engine, Fuel Economy}
Solution: Remove Internal Factoring
Discover nesting: Make, (Model, (Year, Colour, Price, Auto, Air Cond, AM/FM, CD)*)*
Unnest: μ(Model, Year, Colour, Price, Auto, Air Cond, AM/FM, CD)* then μ(Year, Colour, Price, Auto, Air Cond, AM/FM, CD)* applied to Table
[Screenshot: ACURA table with legend]
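Unnesting can be sketched as distributing each factored parent value over its child rows; the ACURA data here is a hypothetical toy fragment, not the slide's actual table:

```python
def unnest(nested):
    """Sketch of the unnest operator (mu): flatten one level of
    internal factoring, where a parent value is shared by child rows."""
    return [(parent, *child) for parent, children in nested for child in children]

# Toy fragment: Make factored over (Model, Year) rows.
factored = [("ACURA", [("Integra", "1998"), ("Legend", "1995")])]
rows = unnest(factored)
# rows -> [("ACURA", "Integra", "1998"), ("ACURA", "Legend", "1995")]
```

Applying it twice, as on the slide, removes both levels of the Make/Model factoring.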
Solution: Replace Boolean Values
Apply β operators (β Auto, β Air Cond., β AM/FM, β CD) so that a “Yes” cell becomes the attribute name itself as a value, e.g. “Yes” under CD becomes “CD”. [Screenshot: ACURA table with legend]
Solution: Form Attribute-Value Pairs
Pair each attribute (Auto, Air Cond., AM/FM, CD, …) with its cell values. [Screenshot: ACURA table with legend]
Solution: Adjust Attribute-Value Pairs
[Screenshot: ACURA table with legend]
Solution: Do Extraction
[Screenshot: ACURA table with legend]
Solution: Infer Mappings
Target: {Car, Year, Make, Model, Mileage, Price, PhoneNr}, {PhoneNr, Extension}, {Car, Feature}. Each row is a car.
π Model μ(Year, Colour, Price, Auto, Air Cond, AM/FM, CD)* Table
π Make μ(Model, Year, Colour, Price, Auto, Air Cond, AM/FM, CD)* μ(Year, Colour, Price, Auto, Air Cond, AM/FM, CD)* Table
π Year Table
Note: mappings produce sets for attributes. Joining to form records is trivial because we have OIDs for table rows (e.g. for each Car).
Solution: Do Extraction
π Model μ(Year, Colour, Price, Auto, Air Cond, AM/FM, CD)* Table
Solution: Do Extraction
π Price Table
Solution: Do Extraction
ρ Colour←Feature π Colour Table ∪ ρ Auto←Feature π Auto β Auto Table ∪ ρ Air Cond.←Feature π Air Cond. β Air Cond. Table ∪ ρ AM/FM←Feature π AM/FM β AM/FM Table ∪ ρ CD←Feature π CD β CD Table
Application #4: Semantic Web Creation
The Semantic Web
Goal: make web content accessible to machines. What prevents this from working?
- Lack of content
- Lack of tools to create useful content
- Difficulty of converting the web to the Semantic Web
Converting the Web to the Semantic Web
Superimposed Information
Application #5: Agent Communication
The Problem
Agents must (1) share ontologies, (2) speak the same language, and (3) pre-agree on message format. Requiring these assumptions precludes agents from interoperating on the fly.
“The holy grail of semantic integration in architectures” is to “allow two agents to generate needed mappings between them on the fly without a priori agreement and without them having built-in knowledge of any common ontology.” [Uschold 02]
Solution
Eliminate all three assumptions (shared ontologies, common language, pre-agreed message format). This requires:
- Dynamically capturing a message’s semantics
- Matching a message with a service
- Translating (developing mutual understanding)
MatchMaking System (MMS)
Application #6: Ontology Generation
TANGO: Table Analysis for Generating Ontologies
- Recognize and normalize table information
- Construct mini-ontologies from tables
- Discover inter-ontology mappings
- Merge mini-ontologies into a growing ontology
Recognize Table Information
[Table: Country, Population (July 2001 est.), and religion percentages (Albanian Orthodox, Roman Catholic, Shi’a Muslim, Sunni Muslim, Muslim, other)]
Afghanistan: population 26,813,057; religions 15%, 84%, 1%
Albania: population 3,510,484; religions 20%, 70%, 30%
Construct Mini-Ontology
[Mini-ontology of Country, Population, and Religion constructed from the table above]
Discover Mappings
Merge
Limitations and Pragmatics
- Data-rich, narrow domains
- Ambiguities ~ context assumptions
- Incompleteness ~ implicit information
- Common-sense requirements
- Knowledge prerequisites
…
Busiest Airport in 2003?
- Chicago: 928,735 landings (Nat. Air Traffic Controllers Assoc.); 931,000 landings (Federal Aviation Admin.)
- Atlanta: 58,875,694 passengers (Sep., latest numbers available)
- Memphis: 2,494,190 metric tons (Airports Council Int’l)
Ambiguous: whom do we trust, and how do they count? And note the important qualification: the passenger figure runs only through September.
Dow Jones Industrial Average (High, Low, Last, Chg)
30 Indus: 10527.03, 10321.35, 10409.85, +85.18
20 Transp: 3038.15, 2998.60, 3008.16, +9.83
15 Utils: 268.78, 264.72, 266.45, +1.72
66 Stocks: 3022.31, 2972.94, 2993.12, +19.65
[Graphic: 10,409.85, +44.07 shown with graphics and icons]
Both figures were reported on the same date: one is weekly, the other daily. Implicit information: “weekly” is stated in the upper corner of the page; “daily” is not stated at all.
“Mad Cow” Hurts Utah Jobs
“Utah stands to lose 1,200 jobs from Asian countries’ import bans on beef products, …”
Common sense: a cow can’t hurt jobs.
“Mad Cow” Hurts Utah Jobs
Knowledge required for understanding:
- Mad Cow disease was discovered in Washington
- Washington state (not DC), which is in the western US
- Humans can get the disease by eating contaminated beef
- Utah is in the western US
- Beef cattle are regionally linked (somehow?)
- People in Asian countries don’t want to get sick
Some Key Ideas
- Data, information, and knowledge
- Data frames: knowledge about everyday data items; recognizers for data in context
- Ontologies: resilient extraction ontologies; shared conceptualizations
- Limitations and pragmatics
Some Research Issues
- Building a library of open-source data recognizers
- Creating corpora of test data for extraction, integration, table understanding, …
- Precisely finding and gathering relevant information: subparts of larger data; scattered data (linked, factored, implied); data behind forms in the hidden web
- Improving concept matching: indirect matching; calculations and unit conversions
…
Some Research Challenges
- Automating ontology construction
- Converting web data to Semantic Web data
- Accommodating different views
- Developing effective personal software agents
…
www.deg.byu.edu