Search-Logger: Analyzing Exploratory Search Tasks

Search-Logger: Analyzing Exploratory Search Tasks, 2011. [Conference proceedings]

Download: SAC2011.pdf (PDF, 1MB)

English abstract

In this paper, we focus on a specific class of search cases: exploratory search tasks. To describe and quantify their complexity, we present a new methodology and corresponding tools to evaluate user behavior when carrying out exploratory search tasks. These tools consist of a client called Search-Logger and a server-side database with a frontend and an analysis environment. The client is a plug-in for the Firefox web browser. The assembled Search-Logger tools can be used to carry out user studies of search tasks independent of a laboratory environment. The client collects implicit user information by logging a number of significant user events; explicit information is gathered via user feedback in the form of questionnaires before and after each search task. We also present the results of a pilot user study. Some of our main observations are: when carrying out exploratory search tasks, classic search engines are mainly used as an entry point to the web. Users subsequently work with several search systems in parallel, keep multiple browser tabs open, and frequently use the clipboard to memorize, analyze, and synthesize potentially useful data and information. Exploratory search tasks typically consist of several sessions and can span from hours up to weeks.
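To make the logging approach concrete, the sketch below shows how a browser-side client might capture significant user events (tab opened, page loaded, clipboard copy, query submitted) and ship them in batches to a server-side database. This is a minimal TypeScript illustration, not the authors' implementation; the event type names, the EventRecord shape, and the https://example.org/log endpoint are assumptions made for the example.

```typescript
// Minimal sketch of an event-logging client in the spirit of Search-Logger.
// Assumption: a server endpoint accepts JSON batches of event records.

interface EventRecord {
  type: "tab_opened" | "page_loaded" | "clipboard_copy" | "query_submitted";
  url?: string;      // page or search-engine URL, if applicable
  detail?: string;   // e.g. the submitted query or the length of copied text
  timestamp: number; // milliseconds since epoch
}

class SearchEventLogger {
  private buffer: EventRecord[] = [];

  constructor(private endpoint: string, private batchSize = 20) {}

  /** Record one implicit user event and flush when the batch is full. */
  log(event: Omit<EventRecord, "timestamp">): void {
    this.buffer.push({ ...event, timestamp: Date.now() });
    if (this.buffer.length >= this.batchSize) {
      void this.flush();
    }
  }

  /** Send all buffered events to the server-side database frontend. */
  async flush(): Promise<void> {
    if (this.buffer.length === 0) return;
    const batch = this.buffer.splice(0, this.buffer.length);
    await fetch(this.endpoint, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify(batch),
    });
  }
}

// Usage: a browser extension would call the logger from its event listeners,
// e.g. tabs.onCreated or a "copy" handler in a content script.
const logger = new SearchEventLogger("https://example.org/log");
logger.log({ type: "query_submitted", url: "https://www.google.com", detail: "exploratory search" });
logger.log({ type: "clipboard_copy", detail: "142 chars" });
```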

Item type: Conference proceedings
Keywords: Exploratory search tasks, Search-Logger, Search Engine
Subjects: L. Information technology and library technology > LM. Automatic text retrieval.
L. Information technology and library technology > LS. Search engines.
Depositing user: Dirk Lewandowski
Date deposited: 27 Jun 2012
Last modified: 02 Oct 2014 12:22
URI: http://hdl.handle.net/10760/17231
