A Comparative Assessment of Answer Quality on Four Question Answering Sites

Shachaf, Pnina. A Comparative Assessment of Answer Quality on Four Question Answering Sites. Journal of Information Science, 2011, vol. 20, n. 10, pp. 1-13. [Journal article (Paginated)]

Full text: Comparative Assessment on Answer Quality, 2011.pdf (draft version, 343kB)

English abstract

Question answering (Q&A) sites, where communities of volunteers answer questions, may provide faster, cheaper, and better services than traditional institutions. However, like other Web 2.0 platforms, user-created content raises concerns about information quality. At the same time, Q&A sites may provide answers of varying quality because they have different communities and technological platforms. This paper compares answer quality on four Q&A sites: Askville, WikiAnswers, Wikipedia Reference Desk, and Yahoo! Answers. Findings indicate that: 1) the use of similar collaborative processes on these sites results in a wide range of outcomes, with significant differences in answer accuracy, completeness, and verifiability; 2) answer multiplication does not always produce better information: it yields more complete and verifiable answers but does not raise accuracy; and 3) a Q&A site's popularity does not correlate with its answer quality on any of the three measures.

Item type: Journal article (Paginated)
Keywords: Q&A Sites; Social Q&A; Community Question Answering; Social Reference; Information Quality; Crowd-sourcing
Subjects: A. Theoretical and general aspects of libraries and information. > AB. Information theory and library theory.
B. Information use and sociology of information > BA. Use and impact of information.
B. Information use and sociology of information > BC. Information in society.
B. Information use and sociology of information > BG. Information dissemination and diffusion.
H. Information sources, supports, channels. > HI. Electronic Media.
H. Information sources, supports, channels. > HL. Databases and database Networking.
H. Information sources, supports, channels. > HQ. Web pages.
I. Information treatment for information services > ID. Knowledge representation.
I. Information treatment for information services > IJ. Reference work.
L. Information technology and library technology > LC. Internet, including WWW.
L. Information technology and library technology > LZ. None of these, but in this section.
Depositing user: Amanda Ferrara
Date deposited: 07 Oct 2013 15:34
Last modified: 02 Oct 2014 12:28
URI: http://hdl.handle.net/10760/20328

