
1 User Training vs. System Training Dr. Mark Hepworth Department of Information Science

2 People
Zipf's law: people will choose the path of least resistance or "effort" … seeking behaviour stops as soon as minimally acceptable results are found (found at [Accessed ]).
But how does "acceptable" get defined?

3 Young learners
Are no different! And perform to expectation!
Schooling has encouraged pragmatic, passive, surface learning … fit for assessment.
Schooling provides little opportunity to develop independent, self-directed learning skills and appropriate attitudes.
Educators do not seem to be aware of the complexity of information seeking, management and use processes. Employers …
Search engines have fostered:
a shallow attitude to information retrieval
a naïve belief in the power of the machine
an uncritical approach to information retrieval.

4 Learners (beginning semester 1)
'I don't really understand what "database" means. Is there any difference between the information that we search by using the normal search engine and the databases that we found via Metalib?'

5 Learners (mid semester 1)
'the information retrieved from Google was not as specific as a database' [compared to Emerald]
'the Boolean AND function allowed me to select a subject: information science AND (in the title) security system … to narrow my search'
'I used the Computer & Information Systems database … the information returned was reliable'
'I used ABI … you could be more restrictive … there are a lot of variables you can adjust … this makes it easier to locate material that you are looking for'
'Emerald has many features that allow the search to be refined … the search results can be broadened or limited through using Boolean searching … phrase … combining terms made the information more specific'

6 The penny drops!
I used the Web of Science advanced search facility to research children's information seeking behaviour in libraries. Data has to be entered in the search box in a precise way, using commands to search for words in topics, titles etc. You can also search for authors, year published etc. in the initial search. You can use the Boolean operators – AND, NOT, OR and SAME. I searched for information seeking, which brought up 7344 articles on many different subjects and user groups, in medicine as well as in libraries and other organisations. I then searched for articles containing the words children and library in the initial search, and this brought up 919 document types. There was the facility on the database to combine these two searches, which brought up 27 records. This was an easier number to deal with, and there were several document types which looked at children's use of libraries and developing library services for younger users. In order to refine the search more I could have selected publication years, document types, subject areas, e.g. information science and library science, authors and many more. In each case the number of documents relating to that area was stated. I found using this database easy once you knew how to use the commands. It is also extremely powerful and user friendly, and actually very satisfying, as you can often find whole documents. I found some quite interesting ones I might use in my assignment. A disadvantage of using this database, however, was its American bias: out of the 27 document types I found, 19 were American and only one was from the UK.
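The "combine these two searches" facility the student describes is, under the hood, just set algebra on result sets: Boolean AND is intersection, OR is union, NOT is difference. A minimal sketch, with an invented in-memory index and made-up document IDs (not Web of Science's actual internals):

```python
# Toy inverted index: search term -> set of matching document IDs.
# All IDs and terms here are invented for illustration.
index = {
    "information seeking": {1, 2, 3, 4, 5},
    "children":            {2, 4, 6, 7},
    "library":             {2, 4, 5, 7},
}

def search(term):
    """Return the set of document IDs matching a single term."""
    return index.get(term, set())

# Boolean AND = intersection, OR = union, NOT = difference.
combined  = search("information seeking") & search("children") & search("library")
broadened = search("children") | search("library")
excluded  = search("library") - search("children")

print(sorted(combined))   # the small, manageable combined result set
```

This is why combining a 7344-hit search with a 919-hit search can collapse to 27 records: the intersection keeps only documents present in both sets.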

7 Experience the benefits – see the value
Students are NOT opposed to effort when it pays off (look at the range of variables embraced in game playing).
Students CAN use, and DO see the VALUE of, more complex IR systems.
BUT they need to be taught in a problem-based, contextually relevant, interesting way.

8 There are common problems associated with the search process
Define the topic you want to search on
Identify useful words/concepts
Determine what kind of information is needed
Identify sources
Select sources
Understand functionality
Develop a search strategy
Choose the type of search
Choose a search technique
View/browse results
Select items
Refine the search
Capture information
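The steps above are not a straight line: viewing results feeds back into refining the search. That feedback loop can be sketched as a few lines of Python. The toy corpus, the `execute` helper and the "acceptable result count" threshold are all invented for illustration:

```python
# Toy corpus standing in for a bibliographic database.
DOCS = [
    "children information seeking in libraries",
    "information security systems",
    "library services for young users",
]

def execute(terms):
    """A toy AND search: keep documents containing every term."""
    return [d for d in DOCS if all(t in d for t in terms)]

def search_loop(terms, max_results=2):
    """Refine by adding one term at a time until the result set is
    small enough — i.e. until a minimally acceptable result appears
    (Zipf's principle of least effort, from slide 2)."""
    active = []
    results = DOCS
    for term in terms:
        if len(results) <= max_results:
            break            # "acceptable" reached; searching stops here
        active.append(term)
        results = execute(active)
    return results

print(search_loop(["information", "libraries"]))
```

The interesting design point is the stopping rule: users stop refining as soon as results look "good enough", which is exactly why the definition of "acceptable" matters so much.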

9 Common user experiences
Too much
Too little
Irrelevant
Caused by:
Not knowing where to go
Not knowing how the systems work
Wrong terms – unfamiliar with the domain

10 20 years for the penny to drop!
Search engines seemed to ignore the online and CD-ROM experience.
Naïve belief in their algorithms (eventually they had help systems and an advanced search!)
BUT they have changed: a podcast of a marketing manager of an IR firm who knew of Marcia Bates … had read information behaviour research!

11 Better times
Federated systems (less need to know where to go)
Faceted classification (different views, enables narrowing down, prompts the user)
Tag clouds (help with limited domain knowledge, prompt search terms, help to narrow and broaden the search)
Possibly less IR training, quicker converts
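Faceted classification and tag clouds are both frequency summaries computed over result metadata, which is why they can prompt users who lack domain knowledge. A minimal sketch using Python's standard library; the records and field names are invented for illustration:

```python
from collections import Counter

# Invented result records with the kind of metadata a rich bibliographic
# record might carry.
records = [
    {"year": 2006, "country": "US", "tags": ["children", "libraries"]},
    {"year": 2007, "country": "UK", "tags": ["information seeking"]},
    {"year": 2006, "country": "US", "tags": ["libraries", "metadata"]},
]

# Faceted classification: count records per value of a metadata field,
# so the interface can offer "year: 2006 (2), 2007 (1)" for narrowing.
year_facet = Counter(r["year"] for r in records)

# Tag cloud: count tag occurrences across the result set; the counts
# drive display size and suggest search terms the user didn't know.
tag_cloud = Counter(tag for r in records for tag in r["tags"])

print(year_facet.most_common())
print(tag_cloud.most_common())
```

Note how both features depend entirely on the metadata fields being present — which is the point made on the next slide about the richness of the record.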

12 Room for improvement
It's not just about the retrieval process.
It doesn't have to be all done in the background – give people control.
However, within the IR remit we need to allow:
better exploration of the subject domain
selection of multiple terms
more control over narrowing and broadening
more prompting for narrowing and broadening
Dependent on the richness of the record, i.e. the metadata (the richer the descriptions of the information objects, the more that can be extracted)

13 Connecting with information in more depth
Connecting with information – topic investigation: identify key words, map the domain, see connections; using more subject-specific sources.
Source orientation: identifying sources & systems, understanding functionality, evaluating sources; constructing more specific searches using the full functionality of the systems.
Behaviour: scanning, browsing, chaining; mind maps, concept tables; organising, storing information.
Thinking: analysis, evaluation, synthesis, reflection (on process, your thinking, knowledge, feelings, learning style); refining & interpreting; questioning & challenging; constructing new knowledge; knowing enough?

14 Still a need for training
To effectively use the full range of information resources, higher-level skills are required.
To be an effective, empowered learner you:
Need to understand the information landscape
Need to understand the information and knowledge creation and storage process
Need to understand the value of information
Need to understand that IR is part of learning, and it's messy
For some tasks precision and power are required
and so on … use, communication … information literacy

15 Joining forces – it is happening!
Information & computer scientists – people's information behaviour, information literacy and information retrieval
Information retrieval & electronic publishing industry professionals – the products, the needs, the technology
Information professionals (librarians, information specialists) – the environment, the needs, information literacy
MORE HOLISTIC: systems that support the wider information literacy experience

16 User training or system training?
Questions, comments, observations
