Tags: How do users describe their information? Michelle Frisque, Head, Information Systems, Galter Health Sciences Library, Northwestern University, Chicago, IL.

OBJECTIVE: This poster examines how users tag medical articles in research-oriented social bookmarking tools such as CiteULike and Connotea.

METHODS (REVISED): Since the original abstract was written, CiteULike has undergone changes, so the methodology had to be revised in order to get a sufficient number of results to analyze. The revised methodology:
- Identify a specialty
- Identify 4 journals within the specialty to search
- Randomly select 50 articles from each journal for review
- Each article must have at least 3 people who tagged it (taggers)
- Articles must be indexed in PubMed
- There was no restriction on publication date
- A journal must have a minimum of 10 articles meeting the above criteria in order to be included

The taggers' usernames and the tags they used for the selected articles were gathered and analyzed to determine how taggers describe the articles they collect. The user-generated terms were then compared with the MeSH descriptors for the selected articles to see whether there was any overlap between user-created and assigned descriptors for each article. The original plan was to compare the tags on the articles gathered from the CiteULike search with the tags on those same articles in Connotea, another free online citation manager. However, after comparing the results there was not enough data to analyze.

THE FINAL SEARCH: NEUROSCIENCE. The journals that were analyzed:
- Nature Neuroscience (Nat Neurosci)
- Annual Review of Neuroscience (Annu Rev Neurosci)
- Nature Reviews: Neuroscience (Nat Rev Neurosci)
- Neuron

UNSUCCESSFUL SEARCHES: The first step in the process was to identify a specialty. This was harder than originally planned because most searches did not return enough data to analyze.
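The selection criteria above can be sketched as a simple filter. This is a hypothetical illustration, assuming article records with `taggers` and `in_pubmed` fields; it is not the actual data or tooling used in the study:

```python
import random

# Hypothetical sketch of the revised selection criteria.
# The article records and field names here are illustrative assumptions.

def meets_criteria(article):
    """An article qualifies if at least 3 distinct users tagged it
    and it is indexed in PubMed (publication date is unrestricted)."""
    return len(article["taggers"]) >= 3 and article["in_pubmed"]

def select_articles(journal_articles, sample_size=50, seed=0):
    """Randomly sample up to `sample_size` qualifying articles; a
    journal is included only if at least 10 of its articles qualify."""
    qualifying = [a for a in journal_articles if meets_criteria(a)]
    if len(qualifying) < 10:
        return []  # journal excluded
    random.seed(seed)
    return random.sample(qualifying, min(sample_size, len(qualifying)))

journal = [{"taggers": {"u1", "u2", "u3"}, "in_pubmed": True}]
print(len(select_articles(journal)))  # fewer than 10 qualify -> 0
```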
Here are some examples of specialties that were chosen and journals that were searched without success:
- Cardiology: Circulation, Circulation research, American journal of cardiology, Journal of the American College of Cardiology…
- Emergency Medicine: Academic emergency medicine, American journal of emergency medicine, Annals of emergency medicine, Resuscitation
- Obstetrics and Gynecology: American journal of obstetrics and gynecology, Clinical obstetrics and gynecology, Fertility and sterility…
- Neurology: Annals of neurology, Archives of neurology, Neurosurgery, Lancet neurology, Brain, Stroke, Neuroscientist

THE RESULTS: CiteULike is a free web-based tool that helps scientists, researchers, and academics store, organize, share, and discover links to academic research papers. It was founded in 2004 and is not associated with a publisher.

Overview: 52 articles across 4 journals were analyzed.

Top Taggers (across all journals): There was a total of 121 taggers.
Tagger / Articles Tagged
- Gilmcher_lab: 16
- Klouie: 12
- Balicea: 9
- ReadingLab: 7
- mrkrause: 6
- garyfeng: 6

Top Tags: There were a total of 274 unique tags, used 424 times.
Tag / Times Used
- review: 19
- decisionmaking*: 10
- notag: 8
- attention: 8
- monkey: 7
- neurophysiology: 7
* includes decisionmaking and decision_making because CiteULike does not allow spaces in tags

[Charts: How many journals did each tagger tag? How many tags are also MeSH terms? What taggers tag.]

CONCLUSION: The data gathered show that the majority of taggers did not use MeSH terminology to describe their chosen articles. Of the 430 tags used, only 13% were also MeSH terms. However, I do not think there is enough data within this data set to determine how the larger research community describes its information. While this study examined the tagging practices of over 120 individuals, only 30% of those taggers had tagged more than one article.
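The tag-versus-MeSH comparison described above can be sketched as follows. This is a minimal illustration, assuming a simple normalization that collapses separator variants (as with decisionmaking/decision_making, since CiteULike forbids spaces in tags); the tags and MeSH descriptors shown are examples, not the study data:

```python
# Hypothetical sketch of comparing user tags against MeSH descriptors.
# The normalization rule and sample terms are illustrative assumptions.

def normalize(term):
    """Collapse case and separator variants, e.g. 'decision_making',
    'Decision Making', and 'decisionmaking' all map to one form."""
    return term.lower().replace("_", "").replace("-", "").replace(" ", "")

def mesh_overlap(tags, mesh_terms):
    """Fraction of tag uses that match an assigned MeSH descriptor
    after normalization."""
    mesh = {normalize(t) for t in mesh_terms}
    matches = sum(1 for t in tags if normalize(t) in mesh)
    return matches / len(tags)

tags = ["decision_making", "monkey", "review", "notag"]
mesh = ["Decision Making", "Haplorhini"]
print(f"{mesh_overlap(tags, mesh):.0%}")  # -> 25%
```

Note that only one of the four sample tags matches a descriptor: "monkey" describes the same concept as the MeSH term "Haplorhini" but uses different vocabulary, which is exactly the kind of mismatch the poster's 13% overlap figure reflects.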