Ranking institutions within a university based on their scientific performance: A percentile-based approach

Zornic, Nikola; Bornmann, Lutz; Maricic, Milica; Markovic, Aleksandar; Martic, Milan; Jeremic, Veljko (2015). "Ranking institutions within a university based on their scientific performance: A percentile-based approach". El profesional de la información, v. 24, n. 5, pp. 551-566.

Full text (published version): 38910-121943-1-PB.pdf (2 MB), available under a Creative Commons Attribution license.

English abstract

In recent years, university rankings have attracted considerable attention and sparked scientific debate. However, few studies on the topic examine the scientific performance of the institutions that make up a university, such as institutes, schools, and faculties. The aim of this study is therefore to design an appropriate framework for evaluating and ranking institutions within a university. The devised methodology ranks institutions based on the number of published papers, the mean normalized citation score (MNCS), and four percentile-based indicators, using the I-distance method. We applied the proposed framework to the University of Belgrade (UB), the largest and best-ranked university in Serbia, comparing 31 faculties and 11 institutes. Specifically, we provide an in-depth percentile-based analysis of UB papers indexed in the Science Citation Index Expanded (SCIe) and the Social Science Citation Index (SSCI) for the period 2008-2011. The results clearly show considerable discrepancies in two respects: first, regarding the question of the leading author, and second, when analyzing the percentile rank classes (PRs) of groups of faculties.
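The I-distance method mentioned above, introduced by Ivanovic, aggregates several indicators into a single score by measuring each unit's distance from a fictive minimal entity while discounting information shared between indicators. The minimal Python sketch below illustrates the idea under two simplifying assumptions: it uses the squared I-distance variant, and it substitutes ordinary Pearson correlations for the partial correlation coefficients of the full formula. The indicator matrix and the toy institutions are hypothetical placeholders, not the study's data.

import numpy as np

def i_distance(X, reference=None):
    """Squared I-distance of each unit from a fictive minimal entity.

    X         -- (n_units, k_indicators) array; indicators ordered by
                 assumed importance, most important first.
    reference -- baseline unit; defaults to the per-indicator minima.

    Simplification: ordinary Pearson correlations stand in for the
    partial correlations r_{ji.1..j-1} of the full I-distance formula.
    """
    X = np.asarray(X, dtype=float)
    if reference is None:
        reference = X.min(axis=0)            # fictive minimal entity
    sigma2 = X.var(axis=0, ddof=1)           # per-indicator variances
    corr = np.corrcoef(X, rowvar=False)      # indicator correlation matrix
    d2 = (X - reference) ** 2                # squared discriminate effects
    scores = np.zeros(X.shape[0])
    for i in range(X.shape[1]):
        # Discount indicator i by the information already carried
        # by the indicators that precede it.
        weight = np.prod([1.0 - corr[j, i] ** 2 for j in range(i)])
        scores += d2[:, i] / sigma2[i] * weight
    return scores                            # larger score = better rank

# Toy data: four hypothetical institutions x three indicators
# (papers, MNCS, share of top-10% papers); values are invented.
X = np.array([[120, 1.10, 0.12],
              [ 80, 0.95, 0.08],
              [200, 1.30, 0.15],
              [ 50, 0.70, 0.05]])
print(np.argsort(-i_distance(X)))            # institutions, best to worst

Because later indicators are discounted by their correlation with earlier ones, the ordering of indicators encodes their relative importance; in the study itself, the six bibliometric indicators listed in the abstract feed this calculation.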

Spanish abstract

En los últimos años, el tema de los rankings universitarios ha atraído mucha atención y ha provocado debates científicos. Sin embargo, pocos estudios sobre este tema se centran en la actuación científica de las instituciones de las universidades, como los institutos, escuelas y facultades. Por esta razón, el objetivo de este estudio es diseñar un marco adecuado para la evaluación y clasificación de las instituciones dentro de una universidad. La metodología ideada clasifica las instituciones según el número de trabajos publicados, la puntuación media de citación normalizada (MNCS), y cuatro indicadores basados en percentiles utilizando el método de la I-distancia. Aplicamos el marco propuesto a la Universidad de Belgrado (UB), que es la universidad mayor y mejor clasificada de Serbia. Se compararon 31 facultades y 11 institutos y se proporciona un análisis basado en percentiles de los artículos de la UB indexados en el Science Citation Index Expanded (SCIe) y el Social Science Citation Index (SSCI) para el período 2008-2011. Los resultados muestran claramente discrepancias considerables en dos ocasiones: primera, cuando se trata del autor líder, y segunda, cuando se utilizan los tramos de percentil (RP) de grupos de facultades.

Item type: Journal article (Paginated)
Keywords: Bibliometrics; Percentile; Percentile rank classes; Scientific productivity; Scientific output; University rankings; Institutes; Schools; Faculties; I-distance; Bibliometría; Percentiles; Clases de rangos de percentil; Tramos de percentil; Productividad científica; Clasificación universitaria; Ranking universitario; Institutos; Facultades; I-distancia
Subjects: B. Information use and sociology of information > BB. Bibliometric methods
B. Information use and sociology of information > BG. Information dissemination and diffusion
Depositing user: Rita Josefa Muñoz Ojeda
Date deposited: 28 Nov 2019 21:08
Last modified: 28 Nov 2019 21:08
URI: http://hdl.handle.net/10760/39285


