Question-answering systems as efficient sources of terminological information: An evaluation

Olvera-Lobo, María-Dolores and Gutiérrez-Artacho, Juncal. Question-answering systems as efficient sources of terminological information: An evaluation. Health Information & Libraries Journal, 2010, vol. 27, n. 4, pp. 268–276. [Journal article (Paginated)]

Text (Journal article)
Question-answering_systems_as_efficient_sources_of.pdf - Published version
Available under License Creative Commons Attribution.

English abstract

Question-answering systems (QA systems) stand as a new alternative to information retrieval systems. Users frequently need to retrieve specific information in answer to a factual question rather than a whole document. The study evaluates the efficiency of QA systems as terminological sources for physicians, specialised translators and users in general. It assesses the performance of one open-domain QA system, START, and one restricted-domain QA system, MedQA. The study collected two hundred definitional questions (What is…?), either general or specialised, from the health website WebMD. The sources used by START and MedQA to retrieve answers were studied, and a range of evaluation measures (precision, Mean Reciprocal Rank, Total Reciprocal Rank, First Hit Success) was then applied to score the quality of the answers. Both systems proved useful in the retrieval of valid definitional healthcare information, with an acceptable degree of coherent and precise responses from each. The answers supplied by MedQA were more reliable than those of START in the sense that they came from specialised clinical or academic sources, most of them providing links to further research articles. The results show the potential of this type of tool in the more general realm of information access and in the retrieval of health information. QA systems may be considered a good, reliable and reasonably precise alternative for alleviating information overload, and both systems can help professionals and general users obtain healthcare information.
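The evaluation measures named in the abstract are standard in question-answering evaluation. As a rough, hypothetical sketch (not the scoring code used in the study), the Python functions below compute precision, Mean Reciprocal Rank, Total Reciprocal Rank and First Hit Success from per-question relevance judgements, following the common definitions used in web QA evaluation (e.g. Radev et al., cited below); the exact variants applied in the article may differ.

# Hedged sketch: common definitions of the four evaluation measures named in
# the abstract; not the authors' actual scoring code.
from typing import List

def precision(judgements: List[List[bool]]) -> float:
    # Fraction of all returned answers judged correct, pooled over questions.
    flat = [j for per_question in judgements for j in per_question]
    return sum(flat) / len(flat) if flat else 0.0

def mean_reciprocal_rank(judgements: List[List[bool]]) -> float:
    # Average over questions of 1/rank of the first correct answer (0 if none).
    total = 0.0
    for per_question in judgements:
        for rank, correct in enumerate(per_question, start=1):
            if correct:
                total += 1.0 / rank
                break
    return total / len(judgements)

def total_reciprocal_rank(judgements: List[List[bool]]) -> float:
    # Like MRR, but sums 1/rank over every correct answer to a question,
    # rewarding systems that return several good answers.
    per_q = [sum(1.0 / r for r, ok in enumerate(q, start=1) if ok) for q in judgements]
    return sum(per_q) / len(judgements)

def first_hit_success(judgements: List[List[bool]]) -> float:
    # Share of questions whose top-ranked answer is correct.
    return sum(1 for q in judgements if q and q[0]) / len(judgements)

# Example: relevance judgements, in rank order, for three definitional questions.
runs = [[True, False, True], [False, True], [False, False, False]]
print(precision(runs), mean_reciprocal_rank(runs),
      total_reciprocal_rank(runs), first_hit_success(runs))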

Item type: Journal article (Paginated)
Keywords: decision support techniques, evaluation studies as topic, information storage and retrieval, natural language processing, MedQA, START
Subjects: L. Information technology and library technology
Depositing user: Maria Dolores/ M.D. Olvera Lobo
Date deposited: 24 Jul 2018 08:38
Last modified: 24 Jul 2018 08:38
URI: http://hdl.handle.net/10760/32948

References

Jackson, P. & Schilder, F. Natural language processing: overview. In: Brown, K. (ed). Encyclopedia of Language & Linguistics, 2nd edn. Amsterdam: Elsevier Press, 2005: 503–518

Access to Text REtrieval Conference (TREC). Accessible at: http://trec.nist.gov/

Voorhees, E. M. The TREC 8 Question Answering Track Report. In: Voorhees, E. M. & Harman, D. K. (eds). Proceedings of the Eighth Text REtrieval Conference, vol. 500-246 in NIST Special Publication, Gaithersburg, MD: NIST, 1999: 107–130

Access to START (Natural Language Question Answering System). Accessible at: http://start.csail.mit.edu/

Access to MedQA. Accessible at: http://monkey.ims.uwm.edu:8080/MedQA/

Blair-Goldensohn, S. B. & Schlaikjer, A. H. Answering definitional questions: a hybrid approach. New Directions In Question Answering 2004, 4, 47–58

Costa, L. F. & Santos, D. Question Answering Systems: A Partial Answer. Oslo: SINTEF, 2007

Zweigenbaum, P. Question answering in biomedicine. In: De Rijke, M. & Webber, B. (eds). Proceedings Workshop on Natural Language Processing for Question Answering. Budapest: ACL, EACL, 2003: 1–4

Cui, H., Kan, M. Y., Chua, T. S. & Xiao, J. A comparative study on sentence retrieval for definitional question answering. SIGIR Workshop on Information Retrieval for Question Answering, Sheffield, 2004

Mollá, D. & Vicedo, J. L. Question-Answering in Restricted Domains. Menlo Park, CA: AAAI Press, 2005

Tsur, O. Definitional Question-Answering Using Trainable Text Classifiers. PhD Thesis. Amsterdam: Institute of Logic Language and Computation (ILLC), University of Amsterdam, 2003

Sing, G. O., Ardil, C., Wong, W. & Sahib, S. Response quality evaluation in heterogeneous question answering system: a black-box approach. International Journal of Information Technology, 2, 4, 2006

Fahmi, I. Automatic term and relation extraction for medical question answering system. Groningen Dissertations of Linguistics, 2009, 72

Alfonseca, E., De Boni, M., Jara, J. L. & Manandhar, S. A prototype question answering system using syntactic and semantic information for answer retrieval. In: Voorhees, E. M. & Harman, D. K. (eds). Proceedings of the 10th Text Retrieval Conference (TREC-10). Gaithersburg, MD: NIST, 2002

Jacquemart, P. & Zweigenbaum, P. Towards a medical question-answering system: a feasibility study. In: Beux, P. L. & Baud, R. (eds). Proceedings of Medical Informatics Europe (MIE ‘03), vol. 95 of Studies in Health Technology and Informatics, San Palo, CA, 2003: 463–468

Access to asked (Automatic Multilingual Question Answering System). Accessible at: http://asked.jp/edw/pc/

Access to WolframAlpha computational knowledge engine. Accessible at: http://www.wolframalpha.com/

Katz, B., Felshin, S., Yuret, D., Ibrahim, A., Lin, J., Marton, G., McFarland, A. J. & Temelkuran, B. Omnibase: uniform access to heterogeneous data for question answering. In: Johannesson, P. (ed). Proceedings of the Seventh International Workshop on Applications of Natural Language to Information Systems (NLDB 2002), Stockholm, Sweden, Lecture Notes in Computer Science, Springer Verlag, 2002: 230–234

Access to NSIR (Question Answering System). Accessible at: http://tangra.si.umich.edu/clair/NSIR/html/nsir.cgi/

Access to QuaLiM (Question Answering Demo). Accessible at: http://demos.inf.ed.ac.uk:8080/qualim/

Access to Google. Accessible at: http://www.google.com/

Access to Wikipedia. Accessible at: http://www.wikipedia.org/

Crouch, D., Saurí, R. & Fowler, A. AQUAINT pilot knowledge-based evaluation: annotation guidelines. Tech. rep., Palo Alto Research Center, 2005

Lee, M., Cimino, J., Zhu, H. R., Sable, C., Shanker, V., Ely, J. & Yu, H. Beyond Information Retrieval – Medical Question Answering. Washington, DC: AMIA, 2006

Yu, H., Lee, M., Kaufman, D., Ely, J., Osheroff, J. A., Hripcsak, G. & Cimino, J. Development, implementation, and a cognitive evaluation of a definitional question answering system for physicians. Journal of Biomedical Informatics 2007, 4, 236–251

Ely, J. W., Osheroff, P. N., Ebell, M., Bergus, G., Barcey, L., Chambliss, M. & Evans, E. Analysis of questions asked by family doctors regarding patient care. British Medical Journal 1999, 319, 358–361

Yu, H. & Kaufman, D. A cognitive evaluation of four online search engines for answering definitional questions posed by physicians. Pacific Symposium on Biocomputing 2007, 12, 328–339

Blair-Goldensohn, S. B., McKeown, K. R. & Schlaikjer, A. H. A hybrid approach for QA track definitional questions. In: Proceedings of the 12th Text Retrieval Conference (TREC 2003), Gaithersburg, Maryland, 2003: 336–343

Access to WebMD. Accessible at: http://www.webmd.com/

Ely, J. W., Osheroff, J. A., Gorman, P. N., Ebell, M. H., Chambliss, M. L., Pifer, E. A. & Stavri, P. Z. A taxonomy of generic clinical questions: classification study. British Medical Journal 2000, 321, 429–432.

Access to Cross Language Evaluation Forum (CLEF). Accessible at: http://www.clef-campaign.org/

Cao, Y. G., Ely, J., Antieau, L. & Yu, H. Evaluation of the clinical question answering presentation. Proceedings of the Workshop on BioNLP, Boulder, Colorado, 2009, 171–178

Radev, D. R., Qi, H., Wu, H. & Fan, W. Evaluating Web-Based Question Answering Systems. Technical Report. University of Michigan, 2001

Access to American Medical Association (AMA). Accessible at: http://www.ama-assn.org/

Access to Internet Movie Database (IMDb). Accessible at: http://www.imdb.com/

Access to Yahoo. Accessible at: http://www.yahoo.com/

Access to Webopedia. Accessible at: http://www.webopedia.com/

Access to Merriam-Webster. Accessible at: http://www.merriam-webster.com/

Access to Medline. Accessible at: http://www.ncbi.nlm.nih.gov/pubmed/

Access to Dictionary of Cancer Terms. Accessible at: http://www.cancer.gov/dictionary/

Access to Dorland’s Illustrated Medical Dictionary. Accessible at: http://www.dorlands.com/wsearch.jsp/

Access to MedlinePlus. Accessible at: http://medlineplus.gov/

Access to Glossary of Technical and Popular Medical Terms. Accessible at: http://users.ugent.be/~rvdstich/eugloss/welcome.html/

Access to National Immunization Program Glossary. Accessible at: http://www.cdc.gov/vaccines/about/terms.htm/

Buitelaar, P., Cimiano, P., Frank, P., Hartung, M. & Racioppa, S. Ontology-based information extraction and integration from heterogeneous data sources. International Journal of Human-Computer Studies 2008, 66, 759–788

Cruchet, S., Gaudinat, A., Rindflesch, T. & Boyer, C. What about trust in the Question Answering world? AMIA 2009 Annual Symposium, 2009.

