Assessment of expertise overlap between an expert panel and research groups

Rahman, A. I. M. Jakaria and Guns, Raf and Rousseau, Ronald and Engels, Tim C. E. (2014). Assessment of expertise overlap between an expert panel and research groups. In Science and Technology Indicators Conference, Leiden (the Netherlands), 3-5 September 2014. [Conference paper]


English abstract

Discipline-specific research evaluation exercises are typically carried out by committees of peers, known as expert panels. Currently, no methods are available to measure the overlap in expertise between a panel and the units of assessment. This research-in-progress paper explores a bibliometric approach to determining the overlap of expertise, using the 2010 research evaluation of nine physics research groups at the University of Antwerp as a test case. Overlay maps were applied to visualize the extent to which the groups and the panel members publish in different Web of Science subject categories. There appears to be a moderate disparity between the panel's and the groups' expertise: the panel was not as diverse as the groups it was asked to assess. Future research will focus on journal-level overlay maps, similarity testing, and a comparison with other disciplines.
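
The paper quantifies overlap visually through overlay maps; the similarity testing it lists as future work could, for example, compare publication profiles directly. The sketch below is a minimal illustration and not the authors' method: it assumes publication counts per Web of Science subject category are available for the panel and for a research group, the category names and counts are hypothetical, and cosine similarity is chosen purely as one possible measure.

from collections import Counter
from math import sqrt

def cosine_similarity(profile_a: Counter, profile_b: Counter) -> float:
    """Cosine similarity between two publication profiles, where each profile
    maps a Web of Science subject category to a publication count."""
    shared = set(profile_a) & set(profile_b)
    dot = sum(profile_a[c] * profile_b[c] for c in shared)
    norm_a = sqrt(sum(v * v for v in profile_a.values()))
    norm_b = sqrt(sum(v * v for v in profile_b.values()))
    if norm_a == 0 or norm_b == 0:
        return 0.0
    return dot / (norm_a * norm_b)

# Hypothetical category counts for an expert panel and one research group.
panel = Counter({"Physics, Particles & Fields": 40, "Physics, Nuclear": 25})
group = Counter({"Physics, Condensed Matter": 30, "Physics, Applied": 20,
                 "Physics, Particles & Fields": 5})

print(f"Expertise overlap (cosine): {cosine_similarity(panel, group):.2f}")

A value near 1 would indicate that the panel and the group publish in largely the same subject categories, while a value near 0 would point to the kind of disparity reported in the abstract.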

Item type: Conference paper
Keywords: Research assessment, Expert panel, Research group
Subjects: B. Information use and sociology of information > BB. Bibliometric methods
Depositing user: A. I. M. Jakaria Rahman
Date deposited: 07 May 2015 08:22
Last modified: 07 May 2015 08:22
URI: http://hdl.handle.net/10760/25047

References

Butler, L., & McAllister, I. (2011). Evaluating university research performance using metrics. European Political Science, 10(1), 44–58.

Engels, T. C. E., Goos, P., Dexters, N., & Spruyt, E. H. J. (2013). Group size, h-index, and efficiency in publishing in top journals explain expert panel assessments of research group quality and productivity. Research Evaluation, 22(4), 224–236.

Langfeldt, L. (2004). Expert panels evaluating research: decision-making and sources of bias. Research Evaluation, 13(1), 51–62.

Lawrenz, F., Thao, M., & Johnson, K. (2012). Expert panel reviews of research centers: The site visit process. Evaluation and Program Planning, 35(3), 390–397.

Leydesdorff, L., Carley, S., & Rafols, I. (2013). Global maps of science based on the new Web-of-Science categories. Scientometrics, 94(2), 589–593.

Leydesdorff, L., Rafols, I., & Chen, C. (2013). Interactive overlays of journals and the measurement of interdisciplinarity on the basis of aggregated journal–journal citations. Journal of the American Society for Information Science and Technology, 64(12), 2573–2586.

Nedeva, M., Georghiou, L., Loveridge, D., & Cameron, H. (1996). The use of co-nomination to identify expert participants for Technology Foresight. R&D Management, 26(2), 155–168.

