About the size of Google Scholar
The emergence of academic search engines has revived and increased the interest in the size of the academic web, since their aspiration is to index the entirety of current academic knowledge.
A 2011 study suggests that highly tweeted articles were 11 times more likely to be highly cited than less-tweeted articles.
While social media is a valuable tool for outreach and the sharing of ideas, there is a danger that this form of communication is gaining too high a value and that we are losing sight of key metrics of scientific value, such as citation indices.
This paper provides a glimpse of the genesis of altmetrics in measuring the efficacy of scholarly communications. It also highlights available altmetric tools and the social platforms linked to them, which are widely used in deriving altmetric scores for scholarly publications.
After a successful six-month pilot program, we’re pleased to roll out Altmetric today across all Wiley journals.
Measures of research impact are improving, but universities should be wary of their limits.
This release is based on citations from all articles that were indexed in Google Scholar as of mid-June 2013 and covers articles published in 2009–2013.
New citation analyses reveal a who’s who of the most impactful scientific researchers.
Whilst metrics may capture some partial dimensions of research ‘impact’, they cannot be used as any kind of proxy for measuring research ‘quality’.
The expanding economies of South America have led to a significant rise in scientific output over the past two decades, and research spending has increased in most countries. But given the region's share of the world's population and GDP, publication rates still fall short of what would be expected.
A European research collaboration aimed at understanding the ways in which researchers are evaluated by their peers and by institutions, and at assessing how the science system can be improved and enhanced.
It's a common complaint among academics: today's researchers are publishing too much, too fast. But just how fast is the mass of scientific output actually growing?
Paper challenging the perception of citations as an objective, socially unbiased measure of scientific success.
Comment on the paper "Predicting publication success for biologists".
At the new Meta-Research Innovation Center at Stanford, or Metrics, John P.A. Ioannidis and Steven N. Goodman, both professors of medicine at Stanford, plan to study how research is done, and how it can be done better.
Scientists go to great lengths to ensure that data are collected and analysed properly, so why do they apply different standards to data about the number of times research papers have been cited and viewed?
Analyses the presence and possibilities of altmetrics for bibliometric and performance analysis.
So much science, so little time. Amid an ever-increasing mountain of research articles, data sets and other output, hard-pressed research funders and employers need shortcuts to identify and reward the work that matters.
Every organization that funds research wants to support science that makes a difference. But there is no simple formula for identifying truly important research. And the job is becoming more difficult.