Course Code: RPE17001   Course Title: Research and Publication Ethics   L: 1  T: 0  P: 1  O: -  L+T+P: 2  C: 2

Instructional objectives
At the end of this course the learner is expected:
1. To understand the philosophy of science and ethics, research integrity and publication ethics; to identify research misconduct and predatory publications.
2. To understand indexing and citation databases, open access publications, research metrics (citations, h-index, impact factor, etc.); to understand the usage of plagiarism tools.

Student Outcomes
At the end of the course the student will have awareness about publication ethics and publication misconduct.
Unit I: PHILOSOPHY AND ETHICS (3 Hrs.)
Introduction to philosophy: definition, nature and scope, concept, branches - Ethics: definition,
moral philosophy, nature of moral judgements and reactions.
Unit II: SCIENTIFIC CONDUCT (5 Hrs.)
Ethics with respect to science and research - Intellectual honesty and research integrity -
Scientific misconducts: Falsification, Fabrication and Plagiarism (FFP) - Redundant
Publications: duplicate and overlapping publications, salami slicing - Selective reporting and
misrepresentation of data.
Unit III: PUBLICATION ETHICS (7 Hrs.)
Publication ethics: definition, introduction and importance - Best practices / standards setting
initiatives and guidelines: COPE, WAME, etc. - Conflicts of interest - Publication misconduct:
definition, concept, problems that lead to unethical behaviour and vice versa, types - Violation of
publication ethics, authorship and contributorship - Identification of publication misconduct,
complaints and appeals - Predatory publisher and journals.
Unit IV: OPEN ACCESS PUBLISHING (4 Hrs.)
Open access publications and initiatives - SHERPA/RoMEO online resource to check publisher
copyright & self-archiving policies - Software tool to identify predatory publications developed
by SPPU - Journal finder / journal suggestion tools viz. JANE, Elsevier Journal Finder,
Springer Journal Suggester, etc.
Unit V: PUBLICATION MISCONDUCT (4 Hrs.)
Group Discussion (2 Hrs.) : a) Subject specific ethical issues, FFP, authorship b) Conflicts of
interest c) Complaints and appeals: examples and fraud from India and abroad
Software tools (2 Hrs.) : Use of plagiarism software like Turnitin, Urkund and other open source
software tools.
Unit VI: DATABASES AND RESEARCH METRICS (7 Hrs.)
Databases (4 Hrs): Indexing databases, Citation databases: Web of Science, Scopus, etc.
Research Metrics (3 Hrs.): Impact Factor of a journal as per the Journal Citation Reports, SNIP, SJR,
IPP, CiteScore - Metrics: h-index, g-index, i10-index, altmetrics.
*Units 1, 2 and 3 are to be covered in Theory mode and Units 4, 5 and 6 are to be covered in
Practice mode.
References
1. Nicholas H. Steneck. Introduction to the Responsible Conduct of Research. Office of
Research Integrity. 2007. Available at: https://ori.hhs.gov/sites/default/files/rcrintro.pdf
2. The Student's Guide to Research Ethics By Paul Oliver Open University Press, 2003
3. Responsible Conduct of Research By Adil E. Shamoo; David B. Resnik Oxford
University Press, 2003
4. Ethics in Science Education, Research and Governance Edited by Kambadur Muralidhar,
Amit Ghosh Ashok Kumar Singhvi. Indian National Science Academy, 2019. ISBN :
978-81-939482-1-7.
5. Anderson B.H., Dursaton, and Poole M.: Thesis and assignment writing, Wiley Eastern
1997.
6. Bijorn Gustavii: How to write and illustrate scientific papers? Cambridge University
Press.
7. Bordens K.S. and Abbott, B.b.: Research Design and Methods, Mc Graw Hill, 2008.
8. Graziano, A., M., and Raulin, M.,L.: Research Methods – A Process of Inquiry, Sixth
Edition, Pearson, 2007.
Course Nature : Theory and Practice
Assessment Method (Max. Marks: 100)
In Semester — Assessment Tools: Group Discussion (10 marks), Tutorials (5), Quiz (5),
Assignments (10); Total: 30%
End Semester — Weightage: 70%
Total: 100%
1. Various forms of Scientific Misconduct
Scientific misconduct can occur at any point in a research protocol:
different types of misconduct may be encountered at different stages,
from the origination of the research study itself to the publication of
the results.
Common Types of Scientific Misconduct
Listed below are the top 10 transgressions that peer reviewers and journal
editors look for, incorporating content from both the World Association of
Medical Editors (WAME) and the US Office of Research Integrity:
1. Misappropriation of Ideas – taking the intellectual property of
others, perhaps as a result of reviewing someone else’s article or
manuscript, or grant application and proceeding with the idea as your
own.
2. Plagiarism – utilizing someone else’s words, published work,
research processes, or results without giving appropriate credit via full
citation.
3. Self-plagiarism – recycling or re-using your own work without
appropriate disclosure and/or citation.
4. Impropriety of Authorship – claiming undeserved authorship on
your own behalf, excluding material contributors from co-authorship,
including non-contributors as authors, or submitting multi-author
papers to journals without the consensus of all named authors.
5. Failure to Comply with Legislative and Regulatory
Requirements – willful violations of rules concerning the safe use of
chemicals, care of human and animal test subjects, inappropriate use
of investigative drugs or equipment, and inappropriate use of research
funds.
6. Violation of Generally Accepted Research Practices – this can
include the proposal of the research study, manipulation of
experiments to generate preferred results, deceptive statistical or
analytical practices to generate preferred results, or improper
reporting of results to present a misleading outcome.
7. Fabrication of Data – rather than manipulating the experiments or the
data to generate preferred results (falsification), this transgression invents
the data entirely.
8. Failure to Support Validation of Your Research – by refusing to
supply complete datasets or research material needed to facilitate
validation of your results through a replication study.
9. Failure to Respond to Known Cases of Unsuccessful Validation
Attempts – published research that is found to be flawed should be
retracted from the journal that published it.
10. Inappropriate Behavior in Relation to Suspected
Misconduct – failure to cooperate with any claims of misconduct
made against you, failure to report known or suspected misconduct,
destruction of any evidence related to any claim of misconduct,
retaliation against any persons involved in a claim of misconduct,
knowingly making false claims of misconduct.
In terms of severity, any misconduct that damages the integrity of the
research process, specifically the steps of the Scientific Method, is
considered to be a greater transgression than any subsequent misconduct in
the publication of research results. Obviously, falsification of data is a much
larger transgression than excluding an eligible co-author. However, since
many of the instances of misconduct listed above can carry severe penalties,
including loss of licensure and imprisonment, every effort must be made to
distinguish between honest human error and deliberate intent to defraud.
Fabrication is making up results and recording or reporting them. This is
sometimes referred to as "drylabbing".[10] A milder form of fabrication is
the inclusion of references that give an argument the appearance of
widespread acceptance but are actually fake or do not support the
argument.[11]
Falsification is manipulating research materials, equipment, or processes or
changing or omitting data or results such that the research is not accurately
represented in the research record.
Plagiarism is the appropriation of another person's ideas, processes, results,
or words without giving appropriate credit. One form is the appropriation of
the ideas and results of others, publishing them so as to make it appear that
the author performed all the work by which the data were obtained. A subset is
citation plagiarism – willful or negligent failure to appropriately credit other
or prior discoverers, so as to give an improper impression of priority. This
is also known as "citation amnesia", the "disregard syndrome" and
"bibliographic negligence".[12] Arguably, this is the most common type of
scientific misconduct. Sometimes it is difficult to tell whether authors
intentionally ignored a highly relevant citation or simply lacked knowledge
of the prior work. Discovery credit can also be inadvertently reassigned from
the original discoverer to a better-known researcher. This is a special case
of the Matthew effect.[13]
Plagiarism-fabrication – the act of taking a figure from an unrelated
publication and reproducing it exactly in a new publication, claiming that it
represents new data.
Self-plagiarism – or multiple publication of the same content with different
titles or in different journals is sometimes also considered misconduct;
scientific journals explicitly ask authors not to do this. It is referred to as
"salami" (i.e. many identical slices) in the jargon of medical journal editors.
According to some editors this includes publishing the same article in a
different language.[14]
Redundant publication (also described as 'salami publishing'): this refers
to the situation in which one study is split into several parts and submitted
to two or more journals, or in which the findings have previously been
published elsewhere without proper cross-referencing, permission or justification.
Publication overlapping —the presentation of redundant ideas or data in
multiple papers by the same authors—is a practice that warrants serious
discussion. ... For example, authors may ask the same question with
different datasets, or they may ask different questions with the same dataset.
Ghostwriting – the phenomenon where someone other than the named
author(s) makes a major contribution. Typically, this is done to mask
contributions from authors with a conflict of interest.
2. Publication misconduct (COPE - Committee on Publication Ethics)
COPE was founded in 1997 to address breaches of research and publication
ethics. A voluntary body providing a discussion forum and advice for
scientific editors, it aims to find practical ways of dealing with the issues,
and to develop good practice. The guidelines were developed from a
preliminary version drafted by individual members of the committee, which
was then submitted to extensive consultation. They address:
1. study design and ethical approval,
2. data analysis,
3. authorship,
4. conflicts of interest,
5. the peer review process,
6. redundant publication,
7. plagiarism,
8. duties of editors,
9. media relations,
10. advertising.
1. Study design and ethical approval
Definition: Good research should be well justified, well planned,
appropriately designed, and ethically approved. To conduct research to a
lower standard may constitute misconduct.
Action
(1) Laboratory and clinical research should be driven by protocol; pilot
studies should have a written rationale.
(2) Research protocols should seek to answer specific questions, rather than
just collect data.
(3) Protocols must be carefully agreed by all contributors and collaborators,
including, if appropriate, the participants.
(4) The final protocol should form part of the research record.
(5) Early agreement on the precise roles of the contributors and
collaborators, and on matters of authorship and publication, is advised.
2. Data analysis
Definition: Data should be appropriately analyzed, but inappropriate
analysis does not necessarily amount to misconduct. Fabrication and
falsification of data do constitute misconduct.
Action
(1) All sources and methods used to obtain and analyse data, including any
electronic pre-processing, should be fully disclosed; detailed explanations
should be provided for any exclusions.
(2) Methods of analysis must be explained in detail, and referenced, if they
are not in common use.
(3) The post hoc analysis of subgroups is acceptable, as long as this is
disclosed. Failure to disclose that the analysis was post hoc is unacceptable.
(4) The discussion section of a paper should mention any issues of bias
which have been considered, and explain how they have been dealt with in
the design and interpretation of the study.
3. Authorship
Definition: There is no universally agreed definition of authorship, although
attempts have been made. As a minimum, authors should take responsibility
for a particular section of the study.
Action
(1) The award of authorship should balance intellectual contributions to the
conception, design, analysis and writing of the study against the collection
of data and other routine work. If there is no task that can reasonably be
attributed to a particular individual, then that individual should not be
credited with authorship.
(2) To avoid disputes over attribution of academic credit, it is helpful to
decide early on in the planning of a research project who will be credited as
authors, as contributors, and who will be acknowledged.
(3) All authors must take public responsibility for the content of their paper.
The multidisciplinary nature of much research can make this difficult, but
this can be resolved by the disclosure of individual contributions.
(4) Careful reading of the target journal’s “Advice to Authors” is advised,
in the light of current uncertainties.
4. Conflicts of interest
Definition
Conflicts of interest comprise those which may not be fully apparent and
which may influence the judgment of author, reviewers, and editors. They
have been described as those which, when revealed later, would make a
reasonable reader feel misled or deceived. They may be personal,
commercial, political, academic or financial. “Financial” interests may
include employment, research funding, stock or share ownership, payment
for lectures or travel, consultancies and company support for staff.
Action
(1) Such interests, where relevant, must be declared to editors by
researchers, authors, and reviewers.
(2) Editors should also disclose relevant conflicts of interest to their readers.
If in doubt, disclose. Sometimes editors may need to withdraw from the
review and selection process for the relevant submission.
5. Peer review
Definition
Peer reviewers are external experts chosen by editors to provide written
opinions, with the aim of improving the study. Working methods vary from
journal to journal, but some use open procedures in which the name of the
reviewer is disclosed, together with the full or “edited” report.
Action
(1) Suggestions from authors as to who might act as reviewers are often
useful, but there should be no obligation on editors to use those suggested.
(2) The duty of confidentiality in the assessment of a manuscript must be
maintained by expert reviewers, and this extends to reviewers’ colleagues
who may be asked (with the editor’s permission) to give opinions on specific
sections.
(3) The submitted manuscript should not be retained or copied.
(4) Reviewers and editors should not make any use of the data, arguments,
or interpretations, unless they have the authors' permission.
6. Redundant publication
Definition
Redundant publication occurs when two or more papers, without full cross
reference, share the same hypothesis, data, discussion points, or
conclusions.
Action
(1) Published studies do not need to be repeated unless further confirmation
is required.
(2) Previous publication of an abstract during the proceedings of meetings
does not preclude subsequent submission for publication, but full disclosure
should be made at the time of submission.
(3) Re-publication of a paper in another language is acceptable, provided
that there is full and prominent disclosure of its original source at the time
of submission.
(4) At the time of submission, authors should disclose details of related
papers, even if in a different language, and similar papers in press.
7. Plagiarism
Definition
Plagiarism ranges from the unreferenced use of others' published and
unpublished ideas, including research grant applications, to submission
under "new" authorship of a complete paper, sometimes in a different
language. It may occur at any stage of planning, research, writing, or
publication; it applies to both print and electronic versions.
Action
(1) All sources should be disclosed, and if large amounts of other people's
written or illustrative material are to be used, permission must be sought.
8. Duties of editors
Definition
Editors are the stewards of journals. They usually take over their journal
from the previous editor(s) and always want to hand over the journal in good
shape. Most editors provide direction for the journal and build a strong
management team. They must consider and balance the interests of many
constituents, including readers, authors, staff, owners, editorial board
members, advertisers and the media.
Actions
(1) Editors’ decisions to accept or reject a paper for publication should be
based only on the paper’s importance, originality, and clarity, and the
study’s relevance to the remit of the journal.
(2) Studies that challenge previous work published in the journal should be
given an especially sympathetic hearing.
(3) Studies reporting negative results should not be excluded.
(4) All original studies should be peer reviewed before publication, taking
into full account possible bias due to related or conflicting interests.
9. Media relations
Definition
Medical research findings are of increasing interest to the print and
broadcast media. Journalists may attend scientific meetings at which
preliminary research findings are presented, leading to their premature
publication in the mass media.
Action
(1) Authors approached by the media should give as balanced an account of
their work as possible, ensuring that they point out where evidence ends and
speculation begins.
(2) Simultaneous publication in the mass media and a peer reviewed journal
is advised, as this usually means that enough evidence and data have been
provided to satisfy informed and critical readers.
(3) Where this is not possible, authors should help journalists to produce
accurate reports, but refrain from supplying additional data.
(4) All efforts should be made to ensure that patients who have helped with
the research should be informed of the results by the authors before the mass
media, especially if there are clinical implications.
10. Advertising
Definition
Many scientific journals and meetings derive significant income from
advertising. Reprints may also be lucrative.
Action
(1) Editorial decisions must not be influenced by advertising revenue or
reprint potential: editorial and advertising administration must be clearly
separated.
(2) Advertisements that mislead must be refused, and editors must be willing
to publish criticisms, according to the same criteria used for material in the
rest of the journal.
(3) Reprints should be published as they appear in the journal unless a
correction is to be added.
WAME (World Association of Medical Editors)
Recommendations on...
Conflict of Interest in Peer-Reviewed Medical Journals
Study Design and Ethics
Authorship
Peer Review
Editorial Decisions
Originality, Prior Publication, and Media Relations
Plagiarism
Advertising
Responding to Allegations of Possible Misconduct
Relation of the Journal to the Sponsoring Society (if applicable)
What is a predatory publisher?
A predatory publisher is an opportunistic publishing venue that exploits the
academic need to publish but offers little reward for those using their
services.
The academic "publish or perish" scenario, combined with the relative ease of website
creation, has inadvertently created a market ripe for the exploitation of academic
authors. Some publishers are predatory on purpose, while others may make mistakes
due to neglect, mismanagement, or inexperience. While the motivations and methods
vary, predatory publishers have common characteristics:
• Their primary goal is to make money (i.e. there will be fees).
• They do not care about the quality of the work published (i.e. no or little editing
or peer-review).
• They make false claims or promises (i.e. claims of impact factors and indexing).
• They engage in unethical business practices (i.e. not as advertised).
• They fail to follow accepted standards or best practices of scholarly publishing
(various).
3. Open Access publishing
The Open Access (OA) initiative emerged as a revolutionary movement that
promotes free access to scholarly publications over the Internet, removes
the price and permission barriers and ensures the widest possible
dissemination of research. OA exists where there is free, immediate and
unrestricted availability of digital content.
Some of the salient features of OA are:
• Open access literature is digital, free of charge and free of most
copyright and licensing restrictions;
• OA is compatible with copyright, peer review, revenue, print,
preservation, prestige, career advancement, indexing and supportive
services associated with conventional scholarly literature;
• the OA campaign focuses on literature that authors give to the world
without expectation of payment;
• OA is compatible with peer review, and all the major OA initiatives
for scientific and scholarly literature insist on its importance.
Some of the major open statements or declarations made during the past
decade are given below:
ARIIC Open Access Statement (Australian Research Information
Infrastructure Committee)
[www.caul.edu.au/scholcomm/OpenAccessARIICstatement.doc]
Berlin Declaration on Open Access to Knowledge in the Sciences and
Humanities [http://oa.mpg.de/openaccess-berlin/berlindeclaration.html]
Bethesda Statement on Open Access
[www.earlham.edu/~peters/fos/bethesda.htm]
Budapest Open Access Initiative Statement
[www.soros.org/openaccess/]
ERCIM Statement on Open Access (European Research Consortium
for Informatics and Mathematics)
[www.ercim.org/publication/Ercim_News/enw64/ercim-oa.html]
IFLA Statement on Open Access to Scholarly Literature and Research
Documentation
OECD Declaration on Access to Research Data from Public Funding
Washington DC Principles for Free Access to Science: A Statement
from Not-for-Profit Publishers [www.dcprinciples.org/statement.htm]
Wellcome Trust Position Statement in support of open and unrestricted
access to published research
[www.wellcome.ac.uk/doc_WTD002766.html]
WSIS Declaration of Principles and Plan of Action (World Summit on
the Information Society)
[www.itu.int/wsis/docs/geneva/official/dop.html &
http://www.itu.int/wsis/docs/geneva/official/poa.html]
About Sherpa Romeo
Sherpa Romeo is an online resource that aggregates and presents
publisher and journal open access policies from around the world. Every
registered publisher or journal held in Romeo is carefully reviewed and
analyzed by a specialist team, who provide summaries of self-archiving
permissions and the conditions of rights given to authors on a journal-by-
journal basis where possible.
The policy information provided through this service primarily aims to
serve the academic research community. Since the service launched over
15 years ago, publisher policies and the open access sector have changed
a lot. Open access policy can be complex and varies according to
geographical location, the institution, and the various routes to open
access — all of which affect how and where you can publish your
research.
JANE
The Journal/Author Name Estimator (JANE) is a free online
bibliographic journal selection tool. Journal selection tools, also known
as journal matching or journal comparison tools, are popular resources
that help authors determine the most appropriate, in-scope journals in
which to publish their manuscripts. JANE is one of the earliest journal
selection tools, debuting in 2007 [1]. The resource is web-based and
allows users to input keywords, abstract text, or author names and view
related articles based on the user-supplied terms. At the time of this
writing, no formal mobile app or browser extension has been developed
for the resource. A beta application programming interface (API) is
freely available to users who want to integrate JANE into their own
applications.
Features
JANE’s simple search interface allows users to easily input data into an
open text box. The home page search field defaults to a larger expanded
“Title and/or Abstract” search box. Users can also select the keyword
link to be taken to a smaller text box where keyword terms can be
searched. Keyword searching also works when text is entered in the
“Title and/or Abstract” search box.
Both search boxes include a “Show extra options” button, where users
can limit results by language (English, French, German, Italian,
Japanese, Russian, and Spanish); publication type (case reports, various
phases of clinical trials, meta-analyses, reviews, etc.); open access
journal options; and journals only indexed for PubMed Central.
Each search box also includes options to either “Find journals,” “Find
authors,” or “Find articles” depending on the query users want to search.
“Find journals” retrieves a list of journals that are most similar to the
user’s input terms. Journals are sorted by confidence, with Eigenfactor
article influence (AI) metrics displayed when available. A “Show
articles” link is also displayed, which retrieves a list of relevant articles
from each journal, listed by confidence (Figure 2). Users can also select
individual articles from the results list, which opens a new browser tab
taking users to the full record in PubMed.
The “Find authors” link displays a list of published authors based on the
input data. There is an email option to directly contact each author. The
“Show articles” option works similarly to the “Find journals” option and
retrieves a list of articles published by the relevant authors, based on
input data. Again, articles link directly out to PubMed in a separate
browser tab. The “Find articles” option retrieves articles that are most
similar to input.
Elsevier journal finder
Enter the title and abstract of your paper to easily find journals that could
be best suited for publishing it. Journal Finder uses smart search technology
and field-of-research-specific vocabularies to match your paper to
scientific journals.
4. Plagiarism checking tools
• Turnitin
• Unicheck
• PlagScan
• Grammarly Business
• PlagiarismCheck.org
• ProWritingAid
• Noplag
• URKUND
• Copyleaks Plagiarism Checker
• Duplichecker
• Viper
To use Turnitin:
1. Go to Assignments.
2. Click Add. This displays the Add Assignment form.
3. Under Assignment, name the assignment, add dates and select "Single
Uploaded File Only" ...
4. Under Turnitin Service, checkmark "Use Turnitin". ...
5. Select the appropriate Turnitin options. ...
6. Complete the Assignment form, then click Post.
How do I use the Turnitin plagiarism service to check the originality of
student submitted papers?
Tufts University has a University-wide contract with the Turnitin plagiarism detection
service.
Instructors can arrange to have papers submitted to the Canvas Assignments tool
checked by Turnitin for potentially unoriginal content; Turnitin compares submitted
papers against several databases using a proprietary algorithm.
Turnitin scans the Internet and its own databases, and also has licensing agreements
with large academic proprietary databases. Instructors receive (in the Canvas
Assignments tool) an "Originality Report" from Turnitin for each student submission.
The Tufts contract with Turnitin is an instructor-only service. Students cannot check
their own papers via Turnitin without an instructor-created assignment.
Note:
When Turnitin is used on an assignment, students will have a note on their
assignment form indicating that the assignment will be checked by the Turnitin
plagiarism service.
The Turnitin interface will not allow a student to resubmit to Turnitin on the same
assignment.
1. Go to Assignments.
2. Click Add assignment (+). This displays the Create New Assignment form.
3. Add the Assignment title and directions.
4. Under Submission Type, click in the dropdown box and select External tool.
5. Click Find. This displays the list of Canvas External tools.
6. Click Turnitin, then click Select. This connects the Assignment submissions to the
Turnitin service.
7. Add the desired assign dates/times, then click Save (or Save and Publish). This creates
an Assignment in which student submissions will be sent to Turnitin and an Originality
Report will be returned to the Canvas Assignment.
8. After saving the Assignment, go back to the Assignment tool and click on the name of
the assignment. This displays the details page for this assignment, including a Turnitin
Properties panel.
9. Click on the Turnitin menu icon, then click Settings. This displays the Turnitin Settings
for this Assignment.
10. Click Optional Settings. This displays the Turnitin Optional Settings for this
Assignment.
11. Select sources to compare.
12. Select Originality Report generation and resubmissions. Note: the default is "Generate
Reports immediately (resubmissions are not allowed)".
13. Select exclusions.
14. Decide if students will receive a copy of the Originality Report. Allow students to view
report: if checked, students will be able to see a copy of the Turnitin originality report (in
addition to the instructor). A link to the originality report (when it is completed) will be
located on the student's view of the completed assignment. Note: students will know in
advance of submitting their assignment file that the instructor is using the Turnitin
service, regardless of their access to the returned originality report.
15. Decide what should happen with the paper once the Originality Report is completed.
Selecting "Do not store" means that after the originality check has been performed, the
paper will not go into Turnitin's paper repository. This can be a good choice if the
assignment is for a "draft" paper; if this option is selected, the second version of the
paper submitted will not be checked against the draft version of the same paper.
Selecting "Standard Paper Repository" means that after the originality check has been
performed, the paper will go into Turnitin's paper repository, and other papers submitted
to Turnitin (anywhere in the world) will be checked against it.
16. Click Submit (do not make any additional edits to Optional Settings).
Urkund:
Step 5: Enter username/e-mail id with password.
Step 6: Click on the "Upload documents" tab on the right side.
Step 7: Click on "Select analysis address or enter below". The address can be drawn from the tab or
typed. It is the same address that was sent to the registered e-mail. It is confidential.
Step 8: A subject and message can be entered, but they are not mandatory.
Step 9: Upload the document through the tab "Drop files here or click", then submit it.
Analysis report
Step 10: Click on the URKUND logo at the top left to go to the home screen.
Step 11: When the URKUND analysis is complete, a link to the analysis report is automatically sent to
the supervisor, either via e-mail or through the home screen.
• The Analysis view of Urkund aligns the submitted document (1) and the sources (2) left to
right. Start by reading through the highlights on the document side and comparing them
with the text on the source side.
• Highlighted text (3) means that the text is also present in another source. The colour of the
highlight represents the approximate similarity of the two texts.
• A highlight can be either active or inactive. If a highlight is considered correct or irrelevant,
it is easily deactivated by removing the check from the "Active" box (4). Important: leave it
active if there are good reasons and the selection needs to/can be discussed with the student,
or if the highlight needs to be moved to the next step in the examination process.
• The easiest way to move between the different highlights is to use the arrow buttons (5)
on the function bar. The arrows move the review mode from one highlight to the next
or previous one in the analyzed document.
5. RESEARCH METRICS
Bibliometrics, or research impact analysis, is the quantitative method of
citation and content analysis for scholarly journals, books and
researchers. The quantitative impact of a given publication is
appraised by measuring the number of times that work is cited by
other resources.
How to Find Citation Impact?
To tell your impact story, you need to find the citation counts of your research papers
through citation databases. These are key instruments that allow a user to understand the
impact of an individual published paper or of a researcher's body of work. Citation databases
can be used for the following:
• To show the impact an article has had by giving the number of times it has been cited since it
was published.
• To compile the references that the author of the publication used.
• To identify and read the most influential publications in a particular field.
• To find related work and to track the development of a certain publication.
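One researcher-level metric commonly derived from these citation counts is the h-index (listed in the course objectives): a researcher has index h if h of their papers have each been cited at least h times. A minimal sketch in Python; the citation counts below are made-up, purely for illustration:

```python
def h_index(citations):
    """Return the h-index: the largest h such that the author has
    at least h papers with h or more citations each."""
    # Sort citation counts in descending order, then find the last
    # position where the count is still >= its 1-based rank.
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, count in enumerate(counts, start=1):
        if count >= rank:
            h = rank
        else:
            break
    return h

# Hypothetical citation counts for one author's papers
papers = [25, 8, 5, 3, 3, 1, 0]
print(h_index(papers))  # → 3 (three papers have at least 3 citations each)
```

The same counts sorted differently give the same result, since the function sorts internally; only the multiset of citation counts matters.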
WHAT IS ALTMETRICS?
The term altmetrics was coined by Jason Priem in
2010, as a generalization of article-level metrics, and
rooted in the Twitter #altmetrics hashtag.
Altmetrics is the study of new metrics for analysing
and informing scholarship based on the social web.
SOME DEFINITIONS
According to Euan Adie, the founder of Altmetric.com: "Altmetrics
indicate the quantity and quality of online attention in multiple channels,
including social media, blog posts, and news coverage".
According to the Australian Open Access Support Group (AOASG),
altmetrics are quantitative indicators of public reach and influence;
they provide a more comprehensive understanding of impact across sectors,
including public impact.
Alternative metrics (called altmetrics to distinguish them from
bibliometrics) are considered an interesting option for assessing the
social impact of research, as they offer new ways to measure (public)
engagement with research output.
Altmetrics is a term to describe web-based metrics for the impact of
scholarly material, with an emphasis on social media outlets as sources
of data.
WHAT DO ALTMETRICS MEASURE?
Altmetrics measure the number of times a research output
gets cited, tweeted about, liked, shared, bookmarked,
viewed, downloaded, mentioned, favourited,
reviewed, or discussed on various kinds of web
platforms.
They harvest these web-influence data from a wide
variety of web sources and platforms, including open-access
journal platforms, scholarly citation
databases, web-based research-sharing services, and
social media.
ALTMETRIC COLLECTS DATA
Altmetric collects data from: Twitter, Facebook,
Google+, policy documents, mainstream media,
blogs, Mendeley, CiteULike, PubPeer, Publons,
Reddit, Wikipedia, sites running Stack Exchange
(Q&A), reviews on F1000, and YouTube.
ALTMETRICS CLASSIFICATION
Altmetrics are a very broad group of metrics, capturing
various aspects of the impact a paper or other work can have. A
classification of altmetrics was proposed by ImpactStory in
September 2012, and a very similar classification is used by
the Public Library of Science:
Viewed – HTML views and PDF downloads
Discussed – journal comments, science blogs, Wikipedia,
Twitter, Facebook and other social media
Saved – Mendeley, CiteULike and other social bookmarks
Cited – citations in the scholarly literature, tracked by Web
of Science, Scopus, CrossRef and others
Recommended – for example used by F1000Prime
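The classification above can be sketched as a simple tally: each tracked event is mapped to one of the five categories and the counts are summed. The event-source names and numbers below are hypothetical, chosen only to illustrate the grouping:

```python
from collections import Counter

# Map event sources to the ImpactStory/PLOS-style categories.
# The source names here are illustrative, not an official list.
CATEGORY_OF = {
    "html_view": "Viewed", "pdf_download": "Viewed",
    "tweet": "Discussed", "facebook_share": "Discussed",
    "blog_mention": "Discussed", "wikipedia_link": "Discussed",
    "mendeley_save": "Saved", "citeulike_bookmark": "Saved",
    "scholarly_citation": "Cited",
    "f1000_recommendation": "Recommended",
}

def classify_events(events):
    """Tally raw altmetric events into the five broad categories."""
    tally = Counter()
    for source, count in events.items():
        category = CATEGORY_OF.get(source)
        if category:  # ignore sources we don't recognise
            tally[category] += count
    return dict(tally)

# Hypothetical event counts for one article
events = {"html_view": 420, "pdf_download": 180, "tweet": 35,
          "mendeley_save": 12, "scholarly_citation": 4}
print(classify_events(events))
# → {'Viewed': 600, 'Discussed': 35, 'Saved': 12, 'Cited': 4}
```

Real altmetrics services apply much richer logic (deduplication, weighting by source), but the grouping step itself is just this kind of mapping.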
ALTMETRICS TELL A STORY
What type of attention is this research receiving?
Where has this article received the most traction?
Which countries are engaging most with the content?
Has this article influenced policy, spurred new research, or engaged a
new audience?
Are reactions to the article positive or negative?
ALTMETRICS TOOLS
Altmetric.com (www.altmetric.com)
Altmetrics.org
ImpactStory (http://impactstory.org)
PlumX (https://plu.mx/)
ReaderMeter (http://readermeter.org)
PLoS Article Level Metrics (http://article-level-metrics.plos.org)
Publish or Perish (http://www.harzing.com/pop.htm)
ScienceCard (www.sciencecard.org)
PaperCritic (http://www.papercritics.com)
Crowdometer (www.crowdometer.org)
ALTMETRIC.COM
Altmetric.com (www.altmetric.com) began as a London-based
start-up founded by Euan Adie in 2011.
Its mission is "to make article level metrics easy".
The portal provides:
1. Explorer
2. Altmetric for Institutions
3. Bookmarklet
4. Badge
Individual users and librarians can use Altmetric.com
with a free account, while a commercial licence is required
for publishers, funders and institutions.
USE OF ALTMETRICS IN LIBRARIES
Academic Libraries: Altmetrics provide opportunities to spot
trends and make informed decisions based on deep quantitative
evidence in academic libraries.
Collection Development: Altmetrics can help in collection
development by capturing bookmarks and favourites on SlideShare,
followers on GitHub, groups in Mendeley, usage downloads,
reviews on Amazon and SourceForge, links from Wikipedia,
comments on YouTube, and social-media signals such as tweets,
shares, recommendations on Figshare, and ratings on SourceForge.
Institution Support: Altmetrics can make libraries and
librarians central to a new educational role, helping
researchers and institutions understand and manage the
impact of their own research and scholarly communication.
ADVANTAGES OF ALTMETRICS
A more nuanced understanding of impact, showing us
which scholarly products are read, discussed, saved and
recommended as well as cited.
Often more timely data, showing evidence of impact in
days instead of years.
A window on the impact of web-native scholarly
products like datasets, software, blog posts, videos and
more.
Indications of impacts on diverse audiences including
scholars but also practitioners, clinicians, educators and
the general public.
DISADVANTAGES OF ALTMETRICS
Altmetrics don't tell the whole story.
Like any metric, there is a potential for gaming of
altmetrics.
Altmetrics are relatively new; more research into their
use is needed.
CONCLUSION
Altmetrics tools are a boon for every researcher,
institution and publisher. Altmetrics are the only metrics
applied at the article level in the social-networking environment.
They are useful and may well be considered reliable, and
can represent an interesting and relevant complement to
citations. Hence librarians and research scholars should
know about altmetrics in order to identify the important
research articles in their respective fields. Librarians
could play an active role here.