Ontology Summit 2007 Population Framework and Survey Analysis
Ken Baclawski



Objectives
– Outreach to the communities that have an interest in ontologies
– Collection of terminology related to ontologies from as many communities as possible
– Understanding of the different types of artifacts that fall broadly within the range of ontologies
– Ultimately, help develop better methods for comparing, combining and mapping ontologies to one another

Mechanisms
1. Survey solicited via broadcast to Ontolog and other collegial mailing lists
2. Respondents' input collected via a web form, with results openly available on the wiki and in CSV and XLS formats
3. Survey analysis/synthesis presented on the wiki
4. Presentation at the face-to-face workshop
5. Group breakout session at the workshop
6. Follow-up with detailed assessment criteria on the wiki

Results
– Reached more than twice as many communities as originally anticipated
– Much larger diversity of terminology than previously realized
– The framework dimensions were revised based partly on the population analysis:
  – Dimensions were added and dropped
  – Assessment criteria were tested and refined

Unexpected Benefits
The original focus was on assessment criteria for ontology artifacts. The survey also helped to understand who was participating in the summit:
– Large number of communities
– Large variety of domains
– Diverse collection of ontology artifacts
Concerns and issues of the communities were articulated prior to the summit:
– Avoided neglecting any communities
– Helped foster an atmosphere of inclusiveness at the summit

What worked well
– The survey was very effective at meeting its objectives
– The survey had many unexpected benefits
– The wiki enabled effective communication of complex survey analyses that would be difficult to convey over a mailing list
– Improved productivity at the workshop

What didn't work well
– Survey design could have been improved, given more time for community input:
  – Respondents did not always understand what was being asked
  – Responses were often misplaced
– A skilled analyst is necessary to extract and organize survey data:
  – Questions were necessarily open-ended
  – One must expect the unexpected
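To illustrate why open-ended questions resist full automation, here is a minimal keyword-bucketing sketch. The category names and keywords are hypothetical examples, not the summit's actual categories; the point is that any answer the keywords miss still falls through to manual review by an analyst.

```python
# Hypothetical categories and keywords for free-text survey answers.
CATEGORIES = {
    "formal ontology": ("owl", "first-order", "description logic"),
    "taxonomy": ("hierarchy", "classification", "taxonomy"),
    "glossary": ("glossary", "term list", "vocabulary"),
}

def categorize(response):
    """Assign a free-text answer to matching categories.

    Returns ["needs review"] when no keyword matches, modeling the
    residue that only a human analyst can organize.
    """
    text = response.lower()
    hits = [name for name, keywords in CATEGORIES.items()
            if any(kw in text for kw in keywords)]
    return hits or ["needs review"]
```

Even this toy version shows the limits: keyword matching cannot recognize an unexpected artifact type, so the analyst's judgment remains essential.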

Lessons learned
– Surveys can complement online discussions and other collaborative tools
– Breakout sessions were very helpful for improving productivity at the meeting