Tag Archive for 'journal metrics'
At the 2013 SSP (Society for Scholarly Publishing) annual conference I was on a panel about research assessment and metrics. As part of my presentation I shared some findings from a large survey conducted by Elsevier’s Research and Academic Relations group (methodology in the slides).
One finding the Twitter back-channel picked up on was the surprising statistic that, in this random sample, only 1% of academics were familiar with altmetrics. I followed this up with a more optimistic statistic: both researchers under 35 and researchers from developing nations were more likely to view various types of potential altmetrics as useful. My primary point in this section of the talk was that we need to focus on raising awareness among these demographics if altmetrics are to gain legitimacy in the researcher community.
Also discussed are DORA, Journal Metrics (SNIP, SJR), Snowball Metrics, and more. I summarized the primary takeaway points as follows:
- Choose methods + metrics appropriate to level and impact type being assessed (DORA)
- Don’t confuse level with type (ALMs, i.e. article-level metrics, ≠ altmetrics) – Tip: Embed free Scopus Cited-By counts at the article level
- Awareness of metrics correlates with acceptance, so raising awareness matters
- APAC + younger researchers open to new metrics
- Don’t use just one metric, promote a variety of metrics – Tip: Embed free SNIP/SJR on journal pages
- Choose transparent and standard methods + metrics – Tip: Learn best practices from Snowball Metrics free ebook
From article about SNIP – “Across a subject field as broad as scholarly communication, assessing journal impact by citations to a journal in a two-year time frame is obviously going to favor those subjects that cite heavily, and rapidly. Some fields, particularly those in the life sciences, tend to conform to this citation pattern better than others, leading to some widely recognized distortions.”
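The distortion described above is exactly what source normalization is meant to correct. As a rough illustration only (this is not the actual CWTS/Scopus methodology, and all numbers are invented), dividing a journal’s raw citations-per-paper by its field’s citation potential puts fast-citing and slow-citing fields on a comparable scale:

```python
# Toy illustration of source-normalized impact in the spirit of SNIP.
# NOT the official Scopus/CWTS formula; all figures below are invented.

def raw_impact_per_paper(citations_received, papers_published):
    """Average citations per paper -- an unnormalized, impact-factor-like value."""
    return citations_received / papers_published

def snip_like(citations_received, papers_published, field_citation_potential):
    """Divide raw impact by the field's citation potential (how heavily and
    quickly papers in that field tend to cite), so journals from fast-citing
    and slow-citing fields become comparable."""
    rip = raw_impact_per_paper(citations_received, papers_published)
    return rip / field_citation_potential

# A life-sciences journal in a fast-citing field and a mathematics journal
# in a slow-citing one: raw impact (3.0 vs 1.0) favors the former, but the
# normalized values come out equal.
bio_score = snip_like(900, 300, field_citation_potential=3.0)   # 3.0 / 3.0 = 1.0
math_score = snip_like(150, 150, field_citation_potential=1.0)  # 1.0 / 1.0 = 1.0
print(bio_score, math_score)
```

The point of the sketch is only that the ratio, not the raw count, is what gets compared across fields.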
From article: “Prestige measured by quantity of citations is one thing, but when it is based on the quality of those citations, you get a better sense of the real value of research to a community. Research Trends talks to Prof. Félix de Moya about SCImago Journal Rank (SJR), which ranks journals based on where their citations originate.”
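Ranking journals by where their citations originate is essentially a PageRank-style computation over the journal citation network. Here is a minimal sketch of that idea with a made-up three-journal network and a simplified iteration (not the actual SJR algorithm, which uses Scopus data and additional refinements):

```python
# Prestige-weighted ranking sketch in the spirit of SJR: a citation from a
# high-prestige journal transfers more prestige than one from an obscure
# journal. Simplified PageRank iteration over toy data; not the real SJR.

def prestige_ranks(citations, damping=0.85, iterations=100):
    """citations[i][j] = number of citations journal i gives to journal j."""
    n = len(citations)
    ranks = [1.0 / n] * n
    for _ in range(iterations):
        new_ranks = []
        for j in range(n):
            # Each journal i distributes its current prestige across the
            # journals it cites, in proportion to its citation counts.
            inflow = sum(
                ranks[i] * citations[i][j] / sum(citations[i])
                for i in range(n) if sum(citations[i]) > 0
            )
            new_ranks.append((1 - damping) / n + damping * inflow)
        ranks = new_ranks
    return ranks

# Three journals: 0 and 1 both cite only journal 2; journal 2 cites only
# journal 0.  Journal 2 is the most cited, and journal 0 outranks journal 1
# because journal 0's citations come from the prestigious journal 2.
matrix = [
    [0, 0, 5],   # journal 0 -> journal 2
    [0, 0, 5],   # journal 1 -> journal 2
    [10, 0, 0],  # journal 2 -> journal 0
]
ranks = prestige_ranks(matrix)
print(ranks)  # ranks[2] > ranks[0] > ranks[1]
```

Journals 0 and 1 receive different scores despite journal 0 receiving the same *number* of citing journals as journal 1 receives (zero vs. one, actually fewer citations in total than journal 2): the quality of the citing source, not just the count, drives the rank.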
From article: “Bibliometric indicators are not without their own controversies (1, 2) and recently there has been an explosion of new metrics, accompanying a shift in the mindset of the scientific community towards a multidimensional view of journal evaluation.”
Elsevier’s Scopus Partners with CWTS and SCImago to Offer Multidimensional Evaluation of Research Journals
From press release: “Elsevier… announced that its flagship product Scopus has successfully partnered with the Centre for Science and Technology Studies (CWTS) and the SCImago Research Group, endorsing two complementary journal metrics, SNIP and SJR. The metrics will be freely available online at www.journalmetrics.com, and integrated into Scopus, allowing researchers around the world to analyze journals within the abstract and citation database. The indicators will offer a greater currency and flexibility in journal performance measurement than any single-metric method currently available.”
Download SNIP and SJR values for Scopus journals.
“How do SJR and SNIP compare to the Impact Factor? They offer new perspectives in journal evaluation that look at the context in which a journal is performing and normalize for citation behaviour. Find more information at www.journalmetrics.com”
“This short video explains how SCImago Journal Rank (SJR) and Source Normalized Impact per Paper (SNIP) are calculated. SJR and SNIP consider the context in which a journal is performing, looking at differences between research areas with different citation behaviours.”
Looks like an interesting introductory course to the various concepts of scientific publishing and Science 2.0. (found because it linked to my Identity 2.0 post)