Information Quality on Yahoo! Answers

Fichman, P. (2013). Information Quality on Yahoo! Answers. In: Approaches and Processes for Managing the Economics of Information Systems. Idea Group Publishing. (In press) [Book chapter]

Available files (draft versions):

- fichmanMarch2013submit.pdf — chapter text (210kB)
- Figure1.tif.pdf — Fig. 1 (105kB)
- Figure2.tif.pdf — Fig. 2 (34kB)
- Figure3.tif.pdf — Fig. 3 (27kB)
- Table1.tiff.pdf — Table 1 (97kB)
- Table2.tiff.pdf — Table 2 (160kB)
- Table3.tiff.pdf — Table 3 (107kB)
- Table4.tiff.pdf — Table 4 (565kB)
- Table5 (1).tiff.pdf — Table 5 (94kB)

English abstract

Along with the proliferation of the social web, question and answer (QA) sites attract millions of users around the globe: on these sites, users post questions while others provide answers. QA sites vary in scope, size, and answer quality; the most popular is Yahoo! Answers. This chapter examines the quality of information produced by the crowd on Yahoo! Answers, working from the assumption that, given enough eyeballs, all questions can get good answers. Findings illustrate a process of answer quality improvement through crowd-sourcing questions. Improvement is achieved both by gathering multiple answers to a given question instead of a single answer, and through a mechanism of answer evaluation, by which users rank the best answer to each question. Both processes contribute significantly to the quality of answers one can expect to find on Yahoo! Answers.
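The "enough eyeballs" intuition behind the abstract can be made concrete with a back-of-the-envelope sketch. This is an illustrative assumption, not the chapter's model: if each answer were independently "good" with some probability p, the chance that a question with n answers receives at least one good answer is 1 - (1 - p)^n, which rises quickly with n.

```python
def p_at_least_one_good(p: float, n: int) -> float:
    """Probability that at least one of n independent answers is good,
    assuming each answer is good with probability p (an illustrative
    independence assumption, not a claim from the chapter)."""
    return 1.0 - (1.0 - p) ** n

# Even with a modest per-answer quality rate, a handful of answers
# makes at least one good answer very likely.
for n in (1, 3, 5, 10):
    print(n, round(p_at_least_one_good(0.3, n), 3))
```

Under this toy assumption, ten answers at a 30% per-answer quality rate already yield a good answer over 97% of the time, which is one way to read why multiple answers plus best-answer selection improve outcomes.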

Item type: Book chapter
Keywords: CQA, Yahoo! Answers, social reference, collaborative question answering site, answer quality, information quality
Subjects: B. Information use and sociology of information
B. Information use and sociology of information > BA. Use and impact of information.
B. Information use and sociology of information > BI. User interfaces, usability.
H. Information sources, supports, channels. > HZ. None of these, but in this section.
L. Information technology and library technology > LZ. None of these, but in this section.
Depositing user: Amanda Ferrara
Date deposited: 07 Oct 2013 15:32
Last modified: 02 Oct 2014 12:28
URI: http://hdl.handle.net/10760/20246


