ASIS&T SIG/MET WORKSHOP SPONSORED BY ELSEVIER’S ICSR
Day 1: Saturday, October 23, 2021, 8:00 – 12:00 EDT
8:10-10:10  Paper Session I

8:10-8:30   Storyteller: The papers co-citing Sleeping Beauty and Prince before awakening
            Takahiro Miura & Ichiro Sakata
8:30-8:50   Scientists without interdisciplinary knowledge background prefer interdisciplinary collaboration
            Chunli Wei & Jiang Li
8:50-9:10   Evaluating the impact of collaboration enablers through metadata analytics
            Jian Qin, Jeff Hemsley & Sarah Bratt
9:10-9:30   The impact of acknowledgement on co-authorship and citation: Who were acknowledged in Nobel Prize Laureates' publications?
            Zhijie Zhu, Lingxin Zhang, Jiangen He & Wen Lou
9:30-9:50   How are data repositories used to share research data? Preliminary evidence from the Public Library of Science (PLoS) data availability statements
            Chenyue Jiao & Kai Li
10:00-12:00  Paper Session II

10:00-10:20  Semantic provenance of innovation research
             Yixuan Long, Liyue Chen, Huifang Yi & Xiwen Liu
10:20-10:40  Understanding news altmetrics: A first look at the source platform
             Houqiang Yu, Jiatong Li & Xueting Cao
10:40-11:00  Specific and frequent topics of JASIST in the 21st century
             Gerson Pech & Catarina Delgado
11:00-11:20  Ranking of articles using open-access citation-metadata
             Bilal Butt, Muhammad Rafi, Muhammad Sabih, Aashir Iftikhar, Maaz Ahmed & Syed Faran Mustafa
11:20-11:40  Honorific awards as leading and trailing research indicators
             Anthony J. Olejniczak, Michael Rohlinger & George E. Walker
11:40-12:00  Analyzing data collaborations as the 'missing link' in scientific collaboration indicators using metadata analytics
             Sarah Bratt, Jian Qin & Jeff Hemsley
Day 2: Sunday, October 24, 2021, 8:00 – 12:00 EDT
8:00-9:40   Paper Session III

8:00-8:20   Meta-evaluation of machine translation evaluation methods
8:20-8:40   A transformer-based model for detecting algorithms from multidisciplinary scientific articles
            Lianjie Xiao, Yafei Li & Kai Qin
8:40-9:00   A time dimension of paper influence evaluation research: Improvement based on the AMMAA algorithm
            Na Jia & Yisheng Yu
9:00-9:20   Using machine learning and disambiguated author identifiers to improve record linkage for funding program evaluation
            Brandon Sepulvado, Joshua Y. Lerner & Jennifer Hamilton
9:20-9:40   Monte Carlo modelling of confidence intervals in translation quality evaluation (TQE) and post-editing distance (PED) measurement
            Alexandra Alekseeva, Serge Gladkoff, Irina Sorokina & Lifeng Han
9:50-11:30   Paper Session IV

9:50-10:10   The gender disparity of postgraduate education in China
10:10-10:30  ICSR Lab: Where scientometrics meets big data
10:30-10:50  Is novel research worth doing? Evidence from journal peer review
             Misha Teplitskiy, Hao Peng, Andrea Blasco & Karim Lakhani
10:50-11:10  Research subjects in the international research collaboration measurement domain
             Ba Xuan Nguyen
11:10-11:30  The employment gap of Canada's Indigenous community in the AI R&D induced industries: a bibliometric analysis
             Amanda Kolopanis, Gita Ghiasi, Matthew Harsh, Vincent Larivière & Tanja Tajmel

11:30-12:00  Closing Remarks (including announcement of the awards)
NEWS: Extension of early-bird registration for METRICS 2021
CALL FOR ABSTRACTS
The ASIS&T Special Interest Group for Metrics (SIG/MET) invites contributions to the METRICS 2021 workshop, which will be held prior to the 84th ASIS&T Annual Meeting.
The workshop continues the successful SIG/MET workshop series, held annually since 2011, by providing an opportunity for experienced researchers, early-career academics, and practitioners to present and discuss research in informetrics, scientometrics, bibliometrics, altmetrics, quantitative science studies, and information retrieval. We invite abstracts describing empirical or theoretical work related, but not limited, to:
- New indicators, methods, and tools
- Scholarly communication
- Social media metrics (altmetrics)
- Bibliometric-enhanced information retrieval
- Open access, open science
- Patent analysis
- Research evaluation
The following four types of submission are accepted:
- Research presentations, for completed or in-progress research.
- Posters for work in early stages or best presented visually.
- Tutorials for practical information on a tool or method.
- Panels for discussions on a specific topic.
Please indicate the type of submission in the file name, using the format Metrics21_FirstAuthorLastName_SubmissionType (where SubmissionType is Presentation, Poster, Tutorial, or Panel). All submissions should be two-page extended abstracts in APA style. Where appropriate, up to three figures/tables may be included. Do not include any author names in the file you upload.
The abstracts of accepted papers and posters, as well as the presentation slides, will be published on figshare (http://figshare.com). Figshare assigns DOIs to uploaded content, and each publication will be linked from the SIG/MET website to enhance the visibility and retrievability of the presented research.
Accepted papers may also be published in a dedicated issue of the journal Data Science and Informetrics (DSI). Authors who choose this option should submit an extended version of their paper (at least six pages, excluding references) to DSI.
Please submit your abstract as a PDF at https://easychair.org/conferences/?conf=metrics2021
Submissions will be evaluated on their relevance to the workshop and their methodological soundness (where applicable); brief feedback will be provided in narrative form.
The best paper will be selected by a committee from all accepted workshop papers whose first author is not a student, regardless of topic.
For the best student paper, the first author of an entered paper must be a full-time student at the time of submission, irrespective of ASIS&T or SIG/MET membership.
NOTE: If you are eligible and would like to be considered for the student paper award, please append (student) to the file name, i.e., Metrics21_FirstAuthorLastName_SubmissionType(student), even if the co-authors are not students.
The winners of both awards will be decided through a double-blind peer review of the accepted submissions. The awards will be decided before the presentations take place, but authors must present at the workshop to qualify.
- Submissions due: August 17, 2021
- Notifications: September 15, 2021
- Workshop: October 23–24, 2021, 8:00am – 12:00pm EDT
- Fei Shu, Hangzhou Dianzi University, Hangzhou, China
  Email: firstname.lastname@example.org
- Pei-Ying Chen, Indiana University, Bloomington, USA
  Email: email@example.com
- Shenmeng Xu, University of North Carolina at Chapel Hill, Chapel Hill, USA
  Email: firstname.lastname@example.org