AN EVALUATION OF CROWDSOURCING AND PARTICIPATORY ARCHIVES PROJECTS FOR ARCHIVAL DESCRIPTION AND TRANSCRIPTION
Robert P. Spindler
June 2014
ARCHIVAL DESCRIPTION, DIGITAL LIBRARIES MOVING IN DIFFERENT DIRECTIONS

Acquisitions by Repository (linear feet)
Department of Archives and Special Collections, 1990-2014

Fiscal Year | Arizona Coll. | Chicano Coll. | Child Drama | Labriola Center | Special Coll. | Univ. Archives | Visual Literacy | TOTAL
1990-91     |      50.50 |   163.00 |        |       |       |  327.00 |  15.00 |   555.50
1991-92     |     499.20 |   200.85 |        |       |       |  238.60 |  15.00 |   953.65
1992-93     |     308.70 |    19.95 |        |       |       |  125.73 |   1.30 |   455.68
1993-94     |     239.97 |    43.67 |        |  0.06 |       |  207.76 |   0.50 |   491.96
1994-95     |   3,772.41 |    99.18 |        |  0.04 |       |  140.42 |  17.65 | 4,029.70
1995-96     |     130.16 |    49.00 |        |  0.97 |       |  217.10 |   1.27 |   398.50
1996-97     |     185.82 |    25.30 |        |  3.37 |       |  229.20 |   0.87 |   444.56
1997-98     |     167.50 |    56.00 |        |  5.31 |       |  258.58 |  12.00 |   499.39
1998-99     |     450.91 |    38.15 |        |  1.62 |       |  381.92 |  28.77 |   901.37
1999-2000   |      80.08 |    72.85 |        | 40.04 |       |  161.77 |   3.00 |   357.74
2000-01     |     104.09 |    14.02 |        |  8.68 |       |  289.61 |  15.00 |   431.40
2001-02     |     211.35 |    26.56 |        | 18.04 |       |  123.82 |   3.04 |   382.81
2002-03     |     576.08 |    70.54 | 172.00 | 19.12 |       |  312.93 |   6.00 | 1,156.67
2003-04     |     265.03 |    27.27 |  34.38 | 22.50 | 22.71 |  573.95 |   1.00 |   946.84
2004-05     |     169.48 |    33.12 | 593.33 | 14.25 | 33.03 |  476.06 |   0.50 | 1,319.77
2005-06     |      46.54 |    10.54 | 313.45 | 38.92 | 84.18 |  153.87 |        |   647.50
2006-07     |      41.65 |    28.41 |  20.00 | 20.00 | 15.33 |  354.86 |   1.50 |   481.75
2007-08     |     177.87 |    45.51 |  63.00 | 13.25 |  8.50 |  229.44 |  11.00 |   548.57
2008-09     |     168.16 |    46.00 |  60.50 |  2.00 | 17.98 |  169.76 |   6.00 |   470.40
2009-10     |      26.26 |    54.75 | 262.25 | 20.20 |  7.50 |  293.61 |   0.00 |   665.57
2010-11     |      67.91 |    23.00 |  31.30 |  5.25 |  9.00 |   45.21 |   1.70 |   261.57
2011-12     |     130.75 |    71.00 | 152.50 | 11.75 | 11.00 |  165.85 |   0.25 |   543.10
2012-13     |   2,109.65 |     8.25 |  62.25 | 62.25 | 11.27 |   39.33 |   0.00 | 2,293.00
2013-14     |     104.48 |     1.00 |  96.50 | 15.00 |  3.00 |   48.45 |   0.00 |   268.43
TOTAL       |  10,084.55 | 1,227.92 | 1,861.46 | 322.62 | 223.50 | 5,564.83 | 141.35 | 19,505.43
MUSEUMS, THEN LIBRARIES, THEN ARCHIVES…

What Happens After "Here Comes Everybody": An Examination of Participatory Archives
Society of American Archivists Annual Meeting, 2011

Dr. Robert B. Townsend (Chair & Commentator), Deputy Director, American Historical Association
Kate Theimer, ArchivesNext: "Exploring the Participatory Archives"
Dr. Elizabeth Yakel, University of Michigan: "Credibility in the Participatory Archives"
Alexandra Eveleigh, University College London: "Crowding Out the Archivist? A British Perspective on Participatory Archives"
CROWDSOURCING, CROWDFUNDING AND SOCIAL BEHAVIOR

Crowdsourcing has its origins in early 21st-century crowdfunding initiatives. There are important distinctions between crowdsourcing, social engagement, and participatory archives. While large numbers of individuals visit crowdsourcing projects, few make sustained and useful contributions. Powerful feelings of ownership, belonging, and connectedness are derived from feedback provided by the crowdsourcing system or the associated community, and these feelings, along with a sense of shared authority, motivate the most dedicated participants.
ISTO HUVILA AND SHARED AUTHORITY (2008)

Huvila's progressive view of participatory archives is characterized by "decentralised curation, radical user orientation, and contextualization of both records and the entire archival process."

"Rethinking the relationship between official and unofficial knowledge is probably the main challenge that cultural institutions have to face when undertaking a crowdsourcing process."

Huvila, Isto. "Participatory Archive: Towards Decentralised Curation, Radical User Orientation, and Broader Contextualisation of Records Management." Archival Science 8, no. 1 (2008): 15-36.
Carletti, Laura, Gabriella Giannachi, Dominic Price, and Derek McAuley. "Digital Humanities and Crowdsourcing: An Exploration." MW2013: Museums and the Web 2013 Conference, April 2013.
THEIMER, EVELEIGH AND PARTICIPATORY ARCHIVES

Participatory archives seek public contributions of work or information that expand our useful knowledge of culture and history. This is more than the conversational social engagement seen on Facebook or Flickr.
METHODS – IMPROVING QUALITY

Project developers have experimented with a number of methods to improve the quality of knowledge or metadata production by combining social participation with standards- or systems-based solutions. Projects seem to be moving toward separate professionally curated and socially curated spaces, although the linkages between those spaces remain clumsy and manual in most current applications.
METHODS – IMPROVING QUALITY

Mediation can improve quality, but it is labor-intensive and can leave the host institution vulnerable to claims of censorship, especially when the rules of engagement are not clearly stated in advance. Participants may also expect that their posts will be permanently preserved. Peer mediation can be more effective than professional mediation.
METHODS – IMPROVING QUALITY

Several technologies can be used to improve quality, such as heat maps, transcription version comparisons, personalization features, and reward systems. Open-source gaming solutions for improving metadata quality are now available. "Computational techniques" can be applied to extract, normalize, and disambiguate terms used in social tags.
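The last point can be sketched concretely. Below is a minimal illustration, in Python, of normalizing user-contributed tags and folding near-duplicate spellings into one canonical term, using only the standard library's difflib. The tag list, similarity threshold, and function name are illustrative assumptions, not any particular project's actual pipeline.

```python
# A sketch of tag normalization and disambiguation (illustrative only):
# lowercase and trim each tag, then merge spellings that are close to an
# already-seen canonical form, so "Phenix" and "phoenix " map to "phoenix".
from difflib import get_close_matches

def normalize_tags(raw_tags, cutoff=0.85):
    """Return a mapping from each raw tag to a canonical normalized form."""
    canonical = []   # canonical vocabulary, in first-seen order
    mapping = {}     # raw tag -> canonical tag
    for tag in raw_tags:
        cleaned = " ".join(tag.lower().split())  # trim, collapse whitespace
        # Look for an existing canonical term that is "close enough"
        match = get_close_matches(cleaned, canonical, n=1, cutoff=cutoff)
        if match:
            mapping[tag] = match[0]
        else:
            canonical.append(cleaned)
            mapping[tag] = cleaned
    return mapping

tags = ["Phoenix", "phoenix ", "Phenix", "Tempe"]
print(normalize_tags(tags))
# {'Phoenix': 'phoenix', 'phoenix ': 'phoenix', 'Phenix': 'phoenix', 'Tempe': 'tempe'}
```

The cutoff controls how aggressively variants are merged: lower values collapse more spellings at the cost of false merges, which in a real project would likely be reviewed by mediators rather than applied automatically.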