Is the expertise of evaluation panels congruent with the research interests of the research groups: A quantitative approach based on barycenters

Rahman, A. I. M. Jakaria, Guns, Raf, Rousseau, Ronald and Engels, Tim C.E. Is the expertise of evaluation panels congruent with the research interests of the research groups: A quantitative approach based on barycenters. 2015. [Preprint]

Rahman et. al.__Is the expertise of evaluation panels congruent.pdf

Download (7MB)

English abstract

Discipline-specific research evaluation exercises are typically carried out by panels of peers, known as expert panels. To the best of our knowledge, no methods are available to measure overlap in expertise between an expert panel and the units under evaluation. This paper explores bibliometric approaches to determine this overlap, using two research evaluations of the departments of Chemistry (2009) and Physics (2010) of the University of Antwerp as test cases. We explore the usefulness of overlay mapping on a global map of science (with Web of Science subject categories) to gauge overlap of expertise and introduce a set of methods to determine an entity’s barycenter according to its publication output. Barycenters can be calculated starting from a similarity matrix of subject categories (N dimensions) or from a visualization thereof (2 dimensions). We compare the results of the N-dimensional method with those of two 2-dimensional ones (Kamada-Kawai maps and VOS maps) and find that they yield very similar results. The distance between barycenters is used as an indicator of expertise overlap. The results reveal that there is some discrepancy between the panel’s and the groups’ publications in both the Chemistry and the Physics departments. The panels were not as diverse as the groups that were assessed. The match between the Chemistry panel and the Department was better than that between the Physics panel and the Department.
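The core computation described above is the barycenter in the sense of Jin and Rousseau (2001): the publication-weighted average of subject-category positions, with the Euclidean distance between two barycenters serving as the indicator of expertise overlap. The sketch below is a minimal illustration of that idea, not the authors' code; the function name, the toy category coordinates, and the publication counts are all hypothetical. Real inputs would be publication counts per Web of Science subject category, paired with category coordinates taken either from a 2-dimensional Kamada-Kawai or VOS map or from the rows of the N-dimensional similarity matrix.

    import numpy as np

    def barycenter(weights, coords):
        """Weighted mean ("barycenter") of a set of points.

        weights : (n,) publication counts per subject category
        coords  : (n, d) coordinates of each category, where d = 2 for a
                  Kamada-Kawai or VOS map, or d = n when each category is
                  represented by its row in the similarity matrix.
        """
        w = np.asarray(weights, dtype=float)
        c = np.asarray(coords, dtype=float)
        return (w[:, None] * c).sum(axis=0) / w.sum()

    # Hypothetical toy data: four subject categories on a 2-D science map.
    map_xy = np.array([[0.0, 0.0],
                       [1.0, 0.0],
                       [0.0, 1.0],
                       [1.0, 1.0]])

    panel_counts = [10, 5, 0, 1]   # panel publications per category
    group_counts = [2, 8, 3, 7]    # research-group publications per category

    b_panel = barycenter(panel_counts, map_xy)
    b_group = barycenter(group_counts, map_xy)

    # Euclidean distance between the two barycenters: a smaller distance
    # is read as greater overlap in expertise.
    distance = np.linalg.norm(b_panel - b_group)
    print(b_panel, b_group, distance)

On this toy input the panel's barycenter lies at (0.375, 0.0625) and the groups' at (0.75, 0.5), giving a distance of about 0.58; a smaller distance would indicate a closer match between the panel's expertise and the groups' publication output.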

Item type: Preprint
Keywords: Research assessment; Research evaluation; Expert panel; Research group; Barycenter; Overlay map; Matching research expertise; Similarity matrix; VOS-map; Kamada-Kawai map
Subjects: B. Information use and sociology of information > BB. Bibliometric methods
Depositing user: A. I. M. Jakaria Rahman
Date deposited: 10 Sep 2015 19:17
Last modified: 10 Sep 2015 19:17
URI: http://hdl.handle.net/10760/25705

References

Borg, I., & Groenen, P. J. F. (2005). Modern Multidimensional Scaling. New York, NY: Springer New York. Retrieved from http://link.springer.com/10.1007/0-387-28981-X

Bornmann, L. (2014). Assigning publications to multiple subject categories for bibliometric analysis: an empirical case study based on percentiles. Journal of Documentation, 70(1), 52–61.

Butler, L., & McAllister, I. (2011). Evaluating University research performance using metrics. European Political Science, 10(1), 44–58. http://doi.org/10.1057/eps.2010.13

De Nooy, W., Mrvar, A., & Batagelj, V. (2012). Exploratory Social Network Analysis with Pajek (2nd edition). Cambridge; New York: Cambridge University Press.

Egghe, L., & Rousseau, R. (1990). Introduction to Informetrics. Elsevier Science Publishers. Retrieved from https://uhdspace.uhasselt.be/dspace/handle/1942/587

Engels, T. C. E., Goos, P., Dexters, N., & Spruyt, E. H. J. (2013). Group size, h-index, and efficiency in publishing in top journals explain expert panel assessments of research group quality and productivity. Research Evaluation, 22(4), 224–236. http://doi.org/10.1093/reseval/rvt013

Jin, B., & Rousseau, R. (2001). An introduction to the barycentre method with an application to China’s mean centre of publication. Libri, 51(4), 225–233. http://doi.org/10.1515/LIBR.2001.225

Kamada, T., & Kawai, S. (1989). An algorithm for drawing general undirected graphs. Information Processing Letters, 31(1), 7–15. http://doi.org/10.1016/0020-0190(89)90102-6

Langfeldt, L. (2004). Expert panels evaluating research: decision-making and sources of bias. Research Evaluation, 13(1), 51–62. http://doi.org/10.3152/147154404781776536

Lawrenz, F., Thao, M., & Johnson, K. (2012). Expert panel reviews of research centers: The site visit process. Evaluation and Program Planning, 35(3), 390–397. http://doi.org/10.1016/j.evalprogplan.2012.01.003

Leydesdorff, L., & Bornmann, L. (2015). The operationalization of “fields” as WoS subject categories (WCs) in evaluative bibliometrics: The cases of “library and information science” and “science & technology studies.” Journal of the Association for Information Science and Technology. http://doi.org/10.1002/asi.23408

Leydesdorff, L., Carley, S., & Rafols, I. (2013). Global maps of science based on the new Web-of-Science categories. Scientometrics, 94(2), 589–593. http://doi.org/10.1007/s11192-012-0784-8

Leydesdorff, L., & Rafols, I. (2009). A global map of science based on the ISI subject categories. Journal of the American Society for Information Science and Technology, 60(2), 348–362. http://doi.org/10.1002/asi.20967

Leydesdorff, L., Rafols, I., & Chen, C. (2013). Interactive overlays of journals and the measurement of interdisciplinarity on the basis of aggregated journal–journal citations. Journal of the American Society for Information Science and Technology, 64(12), 2573–2586. http://doi.org/10.1002/asi.22946

Li, D., & Agha, L. (2015). Big names or big ideas: Do peer-review panels select the best science proposals? Science, 348(6233), 434–438. http://doi.org/10.1126/science.aaa0185

Nedeva, M., Georghiou, L., Loveridge, D., & Cameron, H. (1996). The use of co-nomination to identify expert participants for Technology Foresight. R&D Management, 26(2), 155–168. http://doi.org/10.1111/j.1467-9310.1996.tb00939.x

Rafols, I., Porter, A. L., & Leydesdorff, L. (2010). Science Overlay Maps: A New Tool for Research Policy and Library Management. Journal of the American Society for Information Science and Technology, 61(9), 1871–1887. http://doi.org/10.1002/asi.21368

Rahman, A. I. M. J., Guns, R., Rousseau, R., & Engels, T. C. E. (2014). Assessment of expertise overlap between an expert panel and research groups. In Ed Noyons (Ed.), Context Counts: Pathways to Master Big and Little Data. Proceedings of the Science and Technology Indicators Conference 2014 Leiden (pp. 295–301). Leiden: Universiteit Leiden.

Rehn, C., Kronman, U., Gornitzki, C., Larsson, A., & Wadskog, D. (2014). Bibliometric handbook for Karolinska Institutet. Stockholm, Sweden: Karolinska Institute.

Rons, N., De Bruyn, A., & Cornelis, J. (2008). Research evaluation per discipline: a peer-review method and its outcomes. Research Evaluation, 17(1), 45–57. http://doi.org/10.3152/095820208X240208

Rousseau, R. (1989). Kinematical statistics of scientific output. Part I: Geographical approach. Revue Française de Bibliométrie, 4, 50–64.

Rousseau, R. (2008). Triad or Tetrad: another representation. ISSI Newsletter, 4(1), 5–7.

Van Eck, N. J., & Waltman, L. (2007). VOS: A New Method for Visualizing Similarities Between Objects. In R. Decker & H.-J. Lenz (Eds.), Advances in Data Analysis (pp. 299–306). Springer Berlin Heidelberg. Retrieved from http://link.springer.com/chapter/10.1007/978-3-540-70981-7_34

Van Eck, N. J., Waltman, L., Dekker, R., & van den Berg, J. (2010). A Comparison of Two Techniques for Bibliometric Mapping: Multidimensional Scaling and VOS. Journal of the American Society for Information Science and Technology, 61(12), 2405–2416. http://doi.org/10.1002/asi.21421

Van Leeuwen, T. N., & Medina, C. C. (2012). Redefining the field of economics: Improving field normalization for the application of bibliometric techniques in the field of economics. Research Evaluation, 21(1), 61–70.

Verleysen, F. T., & Engels, T. C. E. (2013). Measuring internationalisation of book publishing in the Social Sciences and Humanities using the barycentre method. In Proceedings of the 14th International Society of Scientometrics and Informetrics Conference (15–19 July 2013, Vienna, Austria) (pp. 1170–1175).

Verleysen, F. T., & Engels, T. C. E. (2014). Barycenter representation of book publishing internationalization in the Social Sciences and Humanities. Journal of Informetrics, 8(1), 234–240. http://doi.org/10.1016/j.joi.2013.11.008

VSNU. (2003). Standard Evaluation Protocol 2003-2009 for Public Research Organisations. Utrecht/den Haag/Amsterdam: VSNU, NWO and KNAW.

VSNU. (2009). Standard Evaluation Protocol 2009-2015: Protocol for research assessment in the Netherlands. Utrecht/den Haag/Amsterdam: VSNU, NWO and KNAW.

Zhou, Q., Rousseau, R., Yang, L., Yue, T., & Yang, G. (2012). A general framework for describing diversity within systems and similarity between systems with applications in informetrics. Scientometrics, 93(3), 787–812. http://doi.org/10.1007/s11192-012-0767-9

