Editorial, November 2022

Beyond the Citation Counts: An Insight into Altmetrics

By Muhammed Mubarak1, Shabana Seemee2

Affiliations

  1. Department of Histopathology, Sindh Institute of Urology and Transplantation (SIUT), Karachi, Pakistan
  2. Department of Publication, College of Physicians and Surgeons Pakistan, Karachi, Pakistan
doi: 10.29271/jcpsp.2022.11.1379

An objective, unbiased and uniformly acceptable assessment of the impact of academic research and its quality has wide-ranging implications for the concerned stakeholders and users. The latter were initially few in number but have lately increased as the scope and use of research have widened beyond academic institutions. The main stakeholders include, among others, academia, researchers, practitioners, funding agencies, the general public, institutions, corporations, special interest groups, governments, and policymakers. Similarly, the scope of research output has expanded in this digital age and includes not only journal research articles but also preprints, documents, books, book chapters, datasets, clinical trial records, and news stories. Historically, the most notable output of research work was the publication or dissemination of its results in the form of an article, often in a scholarly journal. In addition, the assessment of the impact of research relied almost entirely on citation counts. Multiple research metrics based on citation counts have been used in the past to gauge the importance of scholarly publications, and many more are being developed, but none has achieved universal approval.1,2

At present, the parameter most commonly and widely used to assess the impact of research is the Journal Impact Factor (JIF). It is also the oldest metric, initially created to select journals for the Institute for Scientific Information (ISI) list of journals. However, it has many limitations and drawbacks. More recently, CiteScore by Scopus has emerged as its main competitor.3 From 2021, Clarivate Analytics has also promulgated yet another citation-based metric, the Journal Citation Indicator (JCI). It is complementary to the JIF and overcomes many of its limitations.4 Although citation-counting metrics are useful, they are neither sufficient nor varied enough on their own. Citation-counting methods are slow and have failed to meet the challenges posed by the newer forms, avenues, and uses of research products. Some author-level metrics, such as the h-index, are even slower: an article's first citation may take years to appear. Citation methods also have a narrow scope; important articles may remain uncited. They disregard impact outside the academic field and overlook the context of, and reasons for, citations.

The JIF, which measures an individual journal's average citations per article, is wrongly used to judge the impact of individual articles. It is also disconcerting that the precise details of the JIF calculation are a trade secret, and the possibility of significant gaming cannot be entirely excluded. From the above, it is obvious that the traditional citation count-based methods have failed in this era of digitisation, social media and internet connectivity, and a diverse scholarly ecosystem. Traditional metrics tell only part of the story; in particular, they fail to capture the societal impact of research. They are limited in scope and breadth, as they report only academic engagement.
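For orientation, the two-year JIF is essentially a journal-level average. The exact rules for counting "citable items" are proprietary to Clarivate, so the expression below is the commonly quoted formulation rather than the official calculation:

\mathrm{JIF}_{Y} = \frac{\text{citations received in year } Y \text{ to items published in years } Y-1 \text{ and } Y-2}{\text{citable items published in years } Y-1 \text{ and } Y-2}

As a purely illustrative example, a journal that published 200 citable items across 2020 and 2021, which together attracted 300 citations during 2022, would have a 2022 JIF of 300/200 = 1.5. Because this is an average over the whole journal, it says little about the citation performance of any individual article, which is precisely the misuse described above.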

The number of views, reads, downloads, likes, tweets, retweets, Facebook posts and shares on social media and blogs, and many more novel measures are emerging as alternative means of determining the impact of research and are being used to produce newer alternative metrics: altmetrics. Altmetrics are defined as non-traditional metrics and qualitative indicators of online attention to, and engagement with, digitally published research and scholarship, complementary to the historical, citation-based metrics such as the JIF. Altmetrics are not a single class of indicator; they are broad and varied and include, among others, records of interest, interaction and attention, measures of dissemination and propagation, and measures of impact and influence. Each of these reflects a different aspect of the impact of research. Altmetrics have a much broader scope, covering outputs well beyond journal articles; they are immediate and engage a much broader audience. By utilising more indicators, a better understanding of the holistic impact of research can be achieved.5-7

The movement for using alternative metrics started in 2010 with the burgeoning rise in online scholarly literature, particularly because the existing metrics were either insufficient or biased. The Altmetrics manifesto opens with the line, "No one can read everything".8 Altmetrics typically focus on the article rather than the journal, and the same approach can be used to evaluate authors, institutions, publishers, countries, and other entities. Altmetrics are now commonly available (Altmetric.com, Plum Analytics, PLOS Metrics, ImpactStory), are utilised by several publishers, and are displayed on the websites of scientific journals. Their main advantage is immediacy, since usage and interest can be measured from the time of first publication, which is now often online. They cover usage and sharing among the general public as well as scholars. Most publishers and journals have adopted altmetrics; according to one estimate, over 10,000 journals now use them. As with other metrics, the source of the data and the method of calculation need to be considered for their meaningful use.9
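Providers such as Altmetric.com also expose these indicators programmatically. As a rough illustration of how such data can be retrieved, the sketch below queries what is, at the time of writing, Altmetric's free Details Page API for a single DOI; the endpoint, field names, and response format are assumptions based on the provider's public documentation and may change or require registration, and other providers (Plum Analytics, PLOS Metrics, ImpactStory) use different interfaces.

import json
import urllib.error
import urllib.request

def fetch_altmetric_summary(doi):
    """Return a small summary of online attention for a DOI, or None
    if the provider has no recorded attention for it (HTTP 404)."""
    url = "https://api.altmetric.com/v1/doi/" + doi
    try:
        with urllib.request.urlopen(url, timeout=10) as response:
            data = json.load(response)
    except urllib.error.HTTPError as err:
        if err.code == 404:  # no online attention recorded for this DOI
            return None
        raise
    # Keep only a few illustrative counts; the full response carries many more fields.
    return {
        "title": data.get("title"),
        "altmetric_score": data.get("score"),
        "tweeters": data.get("cited_by_tweeters_count", 0),
        "news_outlets": data.get("cited_by_msm_count", 0),
        "mendeley_readers": (data.get("readers") or {}).get("mendeley", 0),
    }

if __name__ == "__main__":
    # DOI of reference 1 of this editorial, used purely as an example.
    print(fetch_altmetric_summary("10.29271/jcpsp.2021.11.1261"))

Comparing the counts returned by different providers, or by the same provider over time, illustrates the point made above: the data source and the method of calculation must be understood before such numbers are used meaningfully.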

Like other metrics, altmetrics are not completely protected from manipulation and statistical gimmickry. Social media have the potential to amplify small signals, and mass tweets, likes, or mentions can easily be purchased or automated. The value of a mention can be vague; mentioners may be unidentified or concealed behind an alias, and heavily mentioned articles often feature odd titles or other characteristics that do not reflect true academic merit. The majority of mentions may be associated with very few articles, following the familiar, skewed, Bradford-type distribution. Altmetrics tend to favour more recent research compared to citation counts, and they do not necessarily correlate with scholarly quality. Altmetrics are not a replacement for, but rather a complement to, the traditional methods of determining the impact of research; they help expand our view of research attention and engagement.9-11

In summary, to get a holistic picture of scholarly impact, multiple forms of metrics, both traditional and alternative, should be used. Although altmetrics are in their infancy and many questions remain unanswered, it is worth investing in this novel initiative, given the limitations of traditional metrics and the rapid evolution of scholarly communication. The sheer speed, depth, and breadth of altmetrics merit widespread validation and standardisation studies to prove their full scientific value.

REFERENCES

  1. Mubarak M, Seemee S. Journal Prestige Index: Expanding the Horizons of Assessment of Research Impact. J Coll Physicians Surg Pak 2021; 31(11):1261-2. doi: 10.29271/jcpsp.2021.11.1261.
  2. Kaldas M, Michael S, Hanna J, Yousef GM. Journal impact factor: a bumpy ride in an open space. J Investig Med 2020; 68(1):83-7.
  3. Baker DW. Introducing CiteScore, Our Journal's Preferred Citation Index: Moving Beyond the Impact Factor. Jt Comm J Qual Patient Saf 2020; 46(6):309-10.
  4. Mubarak M, Seemee S. The Journal Citation Indicator: Have we found the Holy Grail for optimal assessment of research impact? J Pak Med Assoc 2022; 72(7):1270-1.
  5. Gasparyan AY, Nurmashev B, Yessirkepov M, Udovik EE, Baryshnikov AA, Kitas GD. The Journal Impact Factor: Moving Toward an Alternative and Combined Scientometric Approach. J Korean Med Sci 2017; 32(2):173-9.
  6. Butler JS, Kaye ID, Sebastian AS, Wagner SC, Morrissey PB, Schroeder GD, Kepler CK, Vaccaro AR. The Evolution of Current Research Impact Metrics: From Bibliometrics to Altmetrics? Clin Spine Surg 2017; 30(5):226-8. doi: 10.1097/BSD.0000000000000531.
  7. Fassoulaki A, Vassi A, Kardasis A, Chantziara V. Altmetrics versus traditional bibliometrics: Short-time lag and short-time life? Eur J Anaesthesiol 2020; 37(10):944-6. doi: 10.1097/EJA.0000000000001208.
  8. Priem J, Taraborelli D, Groth P, Neylon C. Altmetrics: A manifesto, 26 October 2010. http://altmetrics.org/manifesto. Accessed: 28 June 2022.
  9. Ali MJ. Understanding the Altmetrics. Semin Ophthalmol 2021; 36(5-6):351-3. doi: 10.1080/08820538.2021.1930806.
  10. Dardas LA, Woodward A, Scott J, Xu H, Sawair FA. Measuring the social impact of nursing research: An insight into altmetrics. J Adv Nurs 2019; 75(7):1394-405. doi: 10.1111/jan.13921.
  11. Davies A. Citations and altmetrics attention scores: tweets matter. Eur Heart J 2020; 41(34):3226-7. doi: 10.1093/eurheartj/ehaa413.