Question-answering systems as efficient sources of terminological information: an evaluation

Olvera-Lobo, María-Dolores and Gutiérrez-Artacho, Juncal. Question-answering systems as efficient sources of terminological information: an evaluation. Health Information and Libraries Journal, 2010, vol. 27, n. 4, pp. 268-276. [Journal article (Paginated)]

English abstract

Background: Question-answering (QA) systems offer a new alternative to information retrieval systems, since users frequently need to retrieve specific information in answer to a factual question rather than a whole document.
Objectives: The study evaluates the efficiency of QA systems as terminological sources for physicians, specialised translators and users in general. It assesses the performance of one open-domain QA system, START, and one restricted-domain QA system, MedQA.
Method: The study collected two hundred definitional questions (What is…?), either general or specialised, from the health website WebMD. The sources used by START and MedQA to retrieve answers were studied, and a range of evaluation measures (precision, Mean Reciprocal Rank, Total Reciprocal Rank, First Hit Success) was then applied to assess the quality of the answers.
Results: Both systems proved useful for retrieving valid definitional healthcare information, each giving an acceptable proportion of coherent and precise responses. The answers supplied by MedQA were more reliable than those of START in the sense that they came from specialised clinical or academic sources, most of them with links to further research articles.
Conclusions: The results show the potential of this type of tool for information access in general and for the retrieval of health information in particular. QA systems may be considered a good, reliable and reasonably precise alternative for alleviating information overload, and both systems can help professionals and general users obtain healthcare information.
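
Note on the evaluation measures (an illustrative sketch, not part of the article): the Python example below shows one way precision, Mean Reciprocal Rank, Total Reciprocal Rank and First Hit Success could be computed from binary correctness judgements of each system's ranked answers, following the standard definitions of these measures. All function and variable names are assumptions introduced here for illustration.

# Illustrative sketch (hypothetical, not the authors' implementation): computing
# precision, Mean Reciprocal Rank (MRR), Total Reciprocal Rank (TRR) and
# First Hit Success (FHS) from binary correctness judgements of ranked answers.

from typing import Dict, List, Sequence


def reciprocal_rank(judgements: Sequence[bool]) -> float:
    """1/rank of the first correct answer, or 0 if no answer is correct."""
    for rank, correct in enumerate(judgements, start=1):
        if correct:
            return 1.0 / rank
    return 0.0


def total_reciprocal_rank(judgements: Sequence[bool]) -> float:
    """Sum of 1/rank over all correct answers (rewards useful redundancy)."""
    return sum(1.0 / rank for rank, correct in enumerate(judgements, start=1) if correct)


def first_hit_success(judgements: Sequence[bool]) -> float:
    """1 if the top-ranked answer is correct, otherwise 0."""
    return 1.0 if judgements and judgements[0] else 0.0


def precision(judgements: Sequence[bool]) -> float:
    """Fraction of the returned answers that are correct."""
    return sum(judgements) / len(judgements) if judgements else 0.0


def evaluate(question_set: List[Sequence[bool]]) -> Dict[str, float]:
    """Average each measure over a set of questions."""
    n = len(question_set)
    return {
        "precision": sum(precision(q) for q in question_set) / n,
        "MRR": sum(reciprocal_rank(q) for q in question_set) / n,
        "TRR": sum(total_reciprocal_rank(q) for q in question_set) / n,
        "FHS": sum(first_hit_success(q) for q in question_set) / n,
    }


if __name__ == "__main__":
    # Hypothetical judgements for three definitional questions.
    sample = [
        [True, False, True],    # correct answers at ranks 1 and 3
        [False, False, True],   # first correct answer at rank 3
        [False, False, False],  # no correct answer returned
    ]
    print(evaluate(sample))

In short, MRR averages the reciprocal rank of the first correct answer per question, TRR additionally rewards questions for which several correct answers are returned, and FHS only credits a correct answer in the top position.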

Item type: Journal article (Paginated)
Keywords: decision support techniques, evaluation studies as topic, information storage and retrieval, natural language processing, MedQA, START
Subjects: L. Information technology and library technology
Depositing user: Maria Dolores/ M.D. Olvera Lobo
Date deposited: 12 Nov 2010
Last modified: 02 Oct 2014 12:17
URI: http://hdl.handle.net/10760/15083

References

1 Jackson, P. & Schilder, F. Natural Language Processing: Overview. In: Brown, K. (ed.). Encyclopedia of Language & Linguistics, 2nd ed. Amsterdam: Elsevier Press, 2005: 503–518.

2 Access to Text REtrieval Conference (TREC). Available from: http://trec.nist.gov/

3 Voorhees, E.M. The TREC 8 Question Answering Track Report. In: Voorhees, E.M. & Harman, D.K. (eds). Proceedings of the 8th Text REtrieval Conference, vol. 500-246 in NIST Special Publication, NIST, Gaithersburg, Md, 1999: 107−130.

4 Access to START (Natural Language Question Answering System). Available from: http://start.csail.mit.edu/

5 Access to MedQA. Available from: http://monkey.ims.uwm.edu:8080/MedQA/

6 Blair-Goldensohn, S.B. & Schlaikjer, A.H. Answering Definitional Questions: A Hybrid Approach. New Directions In Question Answering, 2004, 4: 47-58.

7 Costa, L.F. & Santos, D. Question Answering Systems: a partial answer. SINTEF, Oslo, 2007.

8 Zweigenbaum, P. Question answering in biomedicine. In: De Rijke, M. & Webber, B. (eds). Proceedings Workshop on Natural Language Processing for Question Answering, Budapest: ACL, EACL 2003: 1−4.

9 Cui, H., Kan, M.Y., Chua, T.S. & Xiao, J. A Comparative Study on Sentence Retrieval for Definitional Question Answering. SIGIR Workshop on Information Retrieval for Question Answering, Sheffield, 2004.

10 Mollá, D. & Vicedo, J.L. Question-Answering in Restricted Domains. Menlo Park, California, AAAI Press, 2005.

11 Tsur, O. Definitional Question-Answering Using Trainable Text Classifiers. PhD Thesis. Institute for Logic, Language and Computation (ILLC), University of Amsterdam, 2003.

12 Sing, G.O., Ardil, C., Wong, W. & Sahib, S. Response Quality Evaluation in Heterogeneous Question Answering System: A Black-box Approach. Proceedings of World Academy of Science, Engineering and Technology, 2005: 9.

13 Fahmi, I. Automatic term and relation extraction for medical question answering system. Groningen Dissertations of Linguistics, 2009: 72.

14 Alfonseca, E., De Boni, M., Jara, J.L. & Manandhar, S. A prototype Question Answering system using syntactic and semantic information for answer retrieval. In: Proceedings of the 10th Text Retrieval Conference (TREC-10). Gaithersburg, 2002.

15 Jacquemart, P. & Zweigenbaum, P. Towards a Medical Question-Answering System: a Feasibility Study. In: Le Beux, P. & Baud, R. (eds). Proceedings of Medical Informatics Europe (MIE '03), vol. 95 of Studies in Health Technology and Informatics, St. Malo, France, 2003: 463–468.

16 Access to asked (Automatic Multilingual Question Answering System). Available from: http://asked.jp/edw/pc/

17 Access to WolframAlpha computational knowledge engine. Available from: http://www.wolframalpha.com/

18 Katz, B., Felshin, S., Yuret, D., Ibrahim, A., Lin, J., Marton, G., McFarland, A.J. & Temelkuran, B. Omnibase: Uniform Access to Heterogeneous Data for Question Answering. In: Proceedings of the 7th International Workshop on Applications of Natural Language to Information Systems (NLDB 2002). 2002: 230–234.

19 Access to NSIR (Question Answering System). Available from: http://tangra.si.umich.edu/clair/NSIR/html/nsir.cgi/

20 Access to QuaLiM (Question Answering Demo). Available from: http://demos.inf.ed.ac.uk:8080/qualim/

21 Access to Google. Available from: http://www.google.com/

22 Access to Wikipedia. Available from: http://www.wikipedia.org/

23 Crouch, D., Saurí, R. & Fowler, A. AQUAINT Pilot Knowledge-Based Evaluation: Annotation Guidelines. Tech. rep., Palo Alto Research Center, 2005.

24 Lee, M., Cimino, J., Zhu, H.R., Sable, C., Shanker, V., Ely, J. & Yu, H. Beyond Information Retrieval – Medical Question Answering. AMIA, Washington DC, 2006.

25 Yu, H., Lee, M., Kaufman, D., Ely, J., Osheroff, J.A., Hripcsak, G. & Cimino, J. Development, implementation, and a cognitive evaluation of a definitional question answering system for physicians. Journal of Biomedical Informatics, 2007: 4, 236-251.

26 Ely, J.W., Osheroff, P.N., Ebell, M., Bergus, G., Barcey, L., Chambliss, M. & Evans, E. Analysis of questions asked by family doctors regarding patient care. British Medical Journal, 1999: 319, 358–361.

27 Yu, H. & Kaufman, D. A cognitive evaluation of four online search engines for answering definitional questions posed by physicians. Pacific Symposium on Biocomputing, 2007: 12, 328-339.

28 Blair-Goldensohn, S.B., McKeown, K.R. & Schlaikjer, A.H. A Hybrid Approach for QA Track Definitional Questions. Proceedings of TREC, 2003: 336-343.

29 Access to WebMD. Available from: http://www.webmd.com/

30 Katz, B., Felshin, S., Yuret, D., Ibrahim, A., Lin, J., Marton, G., McFarland, A.J. & Temelkuran, B. Omnibase: Uniform Access to Heterogeneous Data for Question Answering. In: Proceedings of the 7th International Workshop on Applications of Natural Language to Information Systems (NLDB 2002). June 2002: 230–234.

31 Ely, J.W, Osheroff, J.A., Gorman, P.N., Ebell, M.H., Chambliss, M.L., Pifer, E.A. & Stavri, P.Z. A taxonomy of generic clinical questions: classification study. British Medical Journal, 2000: 321, 429–432.

32 Katz, B., Felshin, S., Yuret, D., Ibrahim, A., Lin, J., Marton, G., McFarland, A.J. & Temelkuran, B. Omnibase: Uniform Access to Heterogeneous Data for Question Answering. In: Proceedings of the 7th International Workshop on Applications of Natural Language to Information Systems (NLDB 2002). 2002: 230–234.

33 Access to Cross Language Evaluation Forum (CLEF). Available from: http://www.clef-campaign.org/

34 Cao, Y.G., Ely, J., Antieau, L. & Yu, H. Evaluation of the Clinical Question Answering Presentation. Proceedings of the Workshop on BioNLP, 2009: 171–178.

35 Radev, D.R., Qi, H., Wu, H. & Fan, W. Evaluating Web-based Question Answering Systems. Technical Report, University of Michigan, 2001.

36 Access to American Medical Association (AMA). Available from: http://www.ama-assn.org/

37 Access to Internet Movie Database (IMDb). Available from: http://www.imdb.com/

38 Access to Yahoo. Available from: http://www.yahoo.com/

39 Access to Webopedia. Available from: http://www.webopedia.com/

40 Access to Merriam-Webster. Available from: http://www.merriam-webster.com/

41 Access to Medline. Available from: http://www.ncbi.nlm.nih.gov/pubmed/

42 Access to Dictionary of Cancer Terms. Available from: http://www.cancer.gov/dictionary/

43 Access to Dorland’s Illustrated Medical Dictionary. Available from: http://www.dorlands.com/wsearch.jsp/

44 Access to MedlinePlus. Available from: http://medlineplus.gov/

45 Access to Glossary of Technical and Popular Medical Terms. Available from: http://users.ugent.be/~rvdstich/eugloss/welcome.html/

46 Access to National Immunization Program Glossary. Available from: http://www.cdc.gov/vaccines/about/terms.htm/

47 Buitelaar, P., Cimiano, P., Frank, A., Hartung, M. & Racioppa, S. Ontology-based information extraction and integration from heterogeneous data sources. Int. J. Human-Computer Studies, 2008: 66, 759–788.

48 Cruchet, S., Gaudinat, A., Rindflesch, T. & Boyer, C. What about trust in the Question Answering world? AMIA 2009 Annual Symposium, 2009.

