SIG-MET Events


Day 1: Saturday, October 23, 2021,  8:00 – 12:00 EDT

8:00-8:10 Opening Remarks
8:10-10:10 Paper Session I
8:10-8:30 Storyteller: The papers co-citing Sleeping Beauty and Prince before awakening
Takahiro Miura & Ichiro Sakata
8:30-8:50 Scientists without interdisciplinary knowledge background prefer interdisciplinary collaboration
Chunli Wei & Jiang Li
8:50-9:10 Evaluating the impact of collaboration enablers through metadata analytics
Jian Qin, Jeff Hemsley & Sarah Bratt
9:10-9:30 The impact of acknowledgement on co-authorship and citation: Who were acknowledged in Nobel Prize Laureates’ publications?
Zhijie Zhu, Lingxin Zhang, Jiangen He & Wen Lou
9:30-9:50 How are data repositories used to share research data? Preliminary evidence from the Public Library of Science (PLoS) data availability statements
Chenyue Jiao & Kai Li
9:50-10:00 Break
10:00-12:00 Paper Session II
10:00-10:20 Semantic provenance of innovation research
Yixuan Long, Liyue Chen, Huifang Yi & Xiwen Liu
10:20-10:40 Understanding news altmetrics: A first look at the source platform
Houqiang Yu, Jiatong Li & Xueting Cao
10:40-11:00 Specific and frequent topics of JASIST in the 21st century
Gerson Pech & Catarina Delgado
11:00-11:20 Ranking of articles using open-access citation-metadata
Bilal Butt, Muhammad Rafi, Muhammad Sabih, Aashir Iftikhar, Maaz Ahmed & Syed Faran Mustafa
11:20-11:40 Honorific awards as leading and trailing research indicators
Anthony J. Olejniczak, Michael Rohlinger & George E. Walker
11:40-12:00 Analyzing data collaborations as the ‘missing link’ in scientific collaboration indicators using metadata analytics
Sarah Bratt, Jian Qin & Jeff Hemsley


Day 2: Sunday, October 24, 2021,  8:00 – 12:00 EDT

8:00-9:40 Paper Session III
8:00-8:20 Meta-evaluation of machine translation evaluation methods
Lifeng Han
8:20-8:40 A transformer-based model for detecting algorithms from multidisciplinary scientific articles
Lianjie Xiao, Yafei Li & Kai Qin
8:40-9:00 A time dimension of paper influence evaluation research: Improvement based on AMMAA algorithm
Na Jia & Yisheng Yu
9:00-9:20 Using machine learning and disambiguated author identifiers to improve record linkage for funding program evaluation
Brandon Sepulvado, Joshua Y. Lerner & Jennifer Hamilton
9:20-9:40 Monte Carlo modelling of confidence intervals in translation quality evaluation (TQE) and post-editing distance (PED) measurement
Alexandra Alekseeva, Serge Gladkoff, Irina Sorokina & Lifeng Han
9:40-9:50 Break
9:50-11:30 Paper Session IV
9:50-10:10 The gender disparity of postgraduate education in China
Wei Quan
10:10-10:30 ICSR Lab: Where scientometrics meets big data
Andrew Plume
10:30-10:50 Is novel research worth doing? Evidence from journal peer review
Misha Teplitskiy, Hao Peng, Andrea Blasco & Karim Lakhani
10:50-11:10 Research subjects in the international research collaboration measurement domain
Ba Xuan Nguyen
11:10-11:30 The employment gap of Canada's Indigenous community in the AI R&D induced industries: a bibliometric analysis
Amanda Kolopanis, Gita Ghiasi, Matthew Harsh, Vincent Larivière & Tanja Tajmel
11:30-12:00 Closing Remarks (including announcement of the awards)


NEWS: Extension of early-bird registration for METRICS 2021

The early-bird rate for the workshop (free for ASIS&T members, $25 for non-members, $10 for non-member students) has been extended to Oct. 16.
Please register for the workshop by Oct. 16; the registration fee will increase by $25 thereafter.


Please note that the promotion applies only to the Metrics 2021 workshop; the early-bird rate for the ASIS&T Annual Meeting has already expired.
Due to the pandemic, Metrics 2021 will be a virtual workshop scheduled for Oct. 23 and 24. The workshop program will be available soon.


The ASIS&T Special Interest Group for Metrics (SIG/MET) invites contributions to the METRICS 2021 workshop, which will be held prior to the 84th ASIS&T Annual Meeting.

The workshop continues the successful SIG/MET workshop series held annually since 2011 by providing an opportunity to present and discuss research in the fields of informetrics, scientometrics, bibliometrics, altmetrics, quantitative science studies, and information retrieval among experienced researchers, young academics, and practitioners. We invite abstracts describing empirical or theoretical work related to, but not limited to, the following topics:

  • New indicators, methods, and tools
  • Scholarly communication
  • Social media metrics (altmetrics)
  • Bibliometric-enhanced information retrieval
  • Open access, open science
  • Patent analysis
  • Research evaluation


The following four types of submission are accepted:

  • Research presentations, for completed or in-progress research.
  • Posters for work in early stages or best presented visually.
  • Tutorials for practical information on a tool or method.
  • Panels for discussions on a specific topic.

Please indicate the type of submission by naming the file in the following format: Metrics21_FirstAuthorLastName_SubmissionType (Presentation, Poster, Tutorial, Panel). All submissions should be in the form of a two-page extended abstract using APA style. Where appropriate, up to three figures/tables can be provided. Do not include any author names on the file you upload.

The abstracts of accepted papers and posters, as well as the presentation slides, will be published on Figshare. Figshare allocates DOIs to uploaded content, and each publication will be linked from the SIG/MET website to enhance the visibility and retrievability of the presented research.

Accepted papers may be published in a dedicated issue of the journal Data Science and Informetrics (DSI). If the author(s) choose this option, an extended version (at least six pages, excluding references) of the accepted paper should be submitted to DSI.

Please submit your abstract as a PDF at

Submissions will be evaluated based on their relevance to the workshop and their methodological soundness (where applicable), and brief feedback will be given in narrative format.


The best paper will be selected by a committee from all accepted workshop papers whose first author is not a student, regardless of topic.

For the best student paper award, the first author of the paper entered into this contest must be a full-time student at the time of submission, irrespective of ASIS&T or SIG/MET membership.

NOTE: If you are eligible and would like to be considered for the student paper award, please indicate (student) in the file name, as Metrics21_FirstAuthorLastName_SubmissionType(student), even if the co-authors are non-students.

The winners of both awards will be selected from the accepted submissions through a double-blind peer review process. The awards will be decided before the presentations take place, but authors must present at the workshop to qualify.


  • Submissions due: August 17, 2021
  • Notifications: September 15, 2021
  • Workshop: October 23–24, 2021, 8:00am – 12:00pm EDT


  • Fei Shu, Hangzhou Dianzi University, Hangzhou, China
    • Email:
  • Pei-Ying Chen, Indiana University, Bloomington, USA
    • Email:
  • Shenmeng Xu, University of North Carolina at Chapel Hill, Chapel Hill, USA
    • Email:

