Page 1 of results: 530 digital items found in 0.002 seconds
Results filtered by Publisher: Cornell University

Altmetrics (Chapter from Beyond Bibliometrics: Harnessing Multidimensional Indicators of Scholarly Impact)

Priem, Jason
Source: Cornell University; Publisher: Cornell University
Type: Scientific Journal Article
Published on 06/07/2015
Search relevance: 27.16%
This chapter discusses altmetrics (short for "alternative metrics"), an approach to uncovering previously invisible traces of scholarly impact by observing activity in online tools and systems. I argue that citations, while useful, miss many important kinds of impacts, and that the increasing scholarly use of online tools like Mendeley, Twitter, and blogs may allow us to measure these hidden impacts. Next, I define altmetrics and discuss research on altmetric sources, both research mapping the growth of these sources and scientometric research measuring activity on them. Following a discussion of the potential uses of altmetrics, I consider the limitations of altmetrics and recommend areas ripe for future research. Comment: Published in Cronin, B., & Sugimoto, C. R. (2014). Beyond Bibliometrics: Harnessing Multidimensional Indicators of Scholarly Impact (1st edition). Cambridge, Massachusetts: The MIT Press. https://mitpress.mit.edu/books/beyond-bibliometrics
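
As a concrete illustration of the online traces described above, the sketch below looks up altmetric indicators for a single article. This is a minimal sketch, not part of the chapter: the Altmetric.com detail endpoint, the example DOI, and the JSON field names are assumptions made for illustration only.

    # Minimal sketch (Python): look up altmetric indicators for one DOI.
    # The endpoint, the DOI, and the field names are illustrative assumptions.
    import requests

    doi = "10.1371/journal.pone.0000000"  # hypothetical DOI, for illustration only
    resp = requests.get(f"https://api.altmetric.com/v1/doi/{doi}", timeout=10)

    if resp.status_code == 200:
        data = resp.json()
        # Field names below are assumed from the aggregator's JSON payload.
        print("Altmetric score: ", data.get("score"))
        print("Tweeters:        ", data.get("cited_by_tweeters_count"))
        print("Mendeley readers:", data.get("readers", {}).get("mendeley"))
    else:
        print("No altmetric record found (HTTP status", resp.status_code, ")")
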

Comment: Bibliometrics in the Context of the UK Research Assessment Exercise

Silverman, Bernard W.
Source: Cornell University; Publisher: Cornell University
Type: Scientific Journal Article
Published on 19/10/2009
Search relevance: 27.16%
Research funding and reputation in the UK have, for over two decades, been increasingly dependent on a regular peer review of all UK departments. This is now to move to a system based more on bibliometrics. Assessment exercises of this kind influence the behavior of institutions, departments, and individuals, and therefore bibliometrics will have effects beyond simple measurement. [arXiv:0910.3529] Comment: Published in Statistical Science (http://www.imstat.org/sts/) at http://dx.doi.org/10.1214/09-STS285A by the Institute of Mathematical Statistics (http://www.imstat.org)

Mining Scientific Papers for Bibliometrics: a (very) Brief Survey of Methods and Tools

Atanassova, Iana; Bertin, Marc; Mayr, Philipp
Source: Cornell University; Publisher: Cornell University
Type: Scientific Journal Article
Published on 06/05/2015
Search relevance: 27.33%
The Open Access movement in scientific publishing and search engines like Google Scholar have made scientific articles more broadly accessible. During the last decade, the availability of scientific papers in full text has become increasingly widespread, thanks to the growing number of publications on online platforms such as ArXiv and CiteSeer. The efforts to provide articles in machine-readable formats and the rise of Open Access publishing have resulted in a number of standardized formats for scientific papers (such as NLM-JATS, TEI, DocBook). Our aim is to stimulate research at the intersection of Bibliometrics and Computational Linguistics in order to study the ways Bibliometrics can benefit from large-scale text analytics and sense mining of scientific papers, thus exploring the interdisciplinarity of Bibliometrics and Natural Language Processing. Comment: 2 pages, paper accepted for the 15th International Society of Scientometrics and Informetrics Conference (ISSI)
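
As a small illustration of what working with such machine-readable formats involves, here is a minimal sketch that extracts the title and reference entries from a JATS-like XML fragment. The fragment and tag names follow the JATS convention but are assumptions for illustration; they are not a tool or dataset described in the paper.

    # Minimal sketch (Python): pull basic structure out of a JATS-like fragment.
    # The fragment below is a made-up example, not real article data.
    import xml.etree.ElementTree as ET

    jats = """
    <article>
      <front>
        <article-meta>
          <title-group>
            <article-title>An Example Article</article-title>
          </title-group>
        </article-meta>
      </front>
      <back>
        <ref-list>
          <ref id="r1"><mixed-citation>Author A. (2010). Some cited work.</mixed-citation></ref>
          <ref id="r2"><mixed-citation>Author B. (2012). Another cited work.</mixed-citation></ref>
        </ref-list>
      </back>
    </article>
    """

    root = ET.fromstring(jats)
    title = root.findtext(".//article-title")
    references = [r.findtext("mixed-citation") for r in root.findall(".//ref")]

    print("Title:", title)
    print("References found:", len(references))
    for entry in references:
        print(" -", entry)
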

Editorial for the First Workshop on Mining Scientific Papers: Computational Linguistics and Bibliometrics

Atanassova, Iana; Bertin, Marc; Mayr, Philipp
Source: Cornell University; Publisher: Cornell University
Type: Scientific Journal Article
Published on 17/06/2015
Search relevance: 27.63%
The workshop "Mining Scientific Papers: Computational Linguistics and Bibliometrics" (CLBib 2015), co-located with the 15th International Society of Scientometrics and Informetrics Conference (ISSI 2015), brought together researchers in Bibliometrics and Computational Linguistics in order to study the ways Bibliometrics can benefit from large-scale text analytics and sense mining of scientific papers, thus exploring the interdisciplinarity of Bibliometrics and Natural Language Processing (NLP). The goals of the workshop were to answer questions like: How can we enhance author network analysis and Bibliometrics using data obtained by text analytics? What insights can NLP provide on the structure of scientific writing, on citation networks, and on in-text citation analysis? This workshop is a first step toward fostering reflection on this interdisciplinarity and on the benefits that Bibliometrics and Natural Language Processing can derive from each other. Comment: 4 pages, Workshop on Mining Scientific Papers: Computational Linguistics and Bibliometrics at ISSI 2015

How to calculate the practical significance of citation impact differences? An empirical example from evaluative institutional bibliometrics using adjusted predictions and marginal effects

Bornmann, Lutz; Williams, Richard
Source: Cornell University; Publisher: Cornell University
Type: Scientific Journal Article
Published on 09/01/2013
Search relevance: 27.16%
Evaluative bibliometrics is concerned with comparing research units by using statistical procedures. According to Williams (2012), an empirical study should be concerned with the substantive and practical significance of the findings as well as the sign and statistical significance of effects. In this study we will explain what adjusted predictions and marginal effects are and how useful they are for institutional evaluative bibliometrics. As an illustration, we will calculate a regression model using publications (and citation data) produced by four universities in German-speaking countries from 1980 to 2010. We will show how these predictions and effects can be estimated and plotted, and how this makes it far easier to get a practical feel for the substantive meaning of results in evaluative bibliometric studies. We will focus particularly on Average Adjusted Predictions (AAPs), Average Marginal Effects (AMEs), Adjusted Predictions at Representative Values (APRVs) and Marginal Effects at Representative Values (MERVs).
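
To make adjusted predictions and average marginal effects more concrete, here is a minimal sketch using simulated data and the statsmodels package. The variables, model specification, and data are illustrative assumptions, not the authors' actual dataset or regression.

    # Minimal sketch (Python): average marginal effects (AMEs) and adjusted
    # predictions for a logistic regression on simulated data. Everything here
    # is an illustrative assumption, not the study's actual model.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(42)
    n = 2000
    df = pd.DataFrame({
        "year": rng.integers(1980, 2011, size=n),   # publication year
        "university": rng.integers(1, 5, size=n),   # four universities, coded 1-4
    })
    # Simulate a binary indicator of belonging to a highly cited class of papers.
    logit_p = -40 + 0.02 * df["year"] + 0.3 * (df["university"] == 1)
    df["highly_cited"] = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

    # Logistic regression of the citation-impact indicator on year and university.
    model = smf.logit("highly_cited ~ year + C(university)", data=df).fit(disp=False)

    # AMEs: each observation's marginal effect, averaged over the sample.
    print(model.get_margeff(at="overall").summary())

    # Average adjusted prediction at a representative value (year fixed at 2005).
    print("AAP at year=2005:", model.predict(df.assign(year=2005)).mean())
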