The Crossref Grant Linking System (GLS) has been facilitating the registration, sharing, and re-use of open funding metadata for six years now, and we have reached some important milestones recently! What started as an interest in identifying funders through the Open Funder Registry evolved into a more nuanced and comprehensive way to share and re-use open funding data systematically. That’s how, in collaboration with the funding community, the Crossref Grant Linking System was developed. Open funding metadata is fundamental to the transparency and integrity of the research endeavour, so we are happy to see it included in the Research Nexus.
Crossref and the Public Knowledge Project (PKP) have been working closely together for many years, sharing resources and supporting our overlapping communities of organisations involved in communicating research. Now we’re delighted to share that we have agreed on a new set of objectives for our partnership, centred on further development of the tools that our shared community relies upon, as well as building capacity to enable richer metadata registration for organisations using the Open Journal Systems (OJS).
To mark Crossref’s 25th anniversary, we launched our first Metadata Awards to highlight members with the best metadata practices.
GigaScience Press, based in Hong Kong, was the leader among small publishers, defined as organisations with less than USD 1 million in publishing revenue or expenses. We spoke with Scott Edmunds, Ph.D., Editor-in-Chief at GigaScience Press, about how discoverability drives their high metadata standards.
What motivates your organisation/team to work towards high-quality metadata? What objectives does it support for your organisation?
Our objective is to communicate science openly and collaboratively, without barriers, to solve problems in a data- and evidence-driven manner through Open Science publishing. High-quality metadata helps us meet these objectives by improving the discoverability, transparency, and provenance of the work we publish. It is an integral part of the FAIR principles and the UNESCO Open Science Recommendation, playing a role in increasing the accessibility of research for both humans and machines. As one of the authors of the FAIR principles paper and an advisor to the Make Data Count project, I have also personally been careful to practice what I preach.
To work out which version you’re on, take a look at the website address that you use to access iThenticate. If you go to ithenticate.com, then you are using v1. If you use a bespoke URL such as https://crossref-[your member ID].turnitin.com/, then you are using iThenticate 2.0.
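The rule above boils down to which host name appears in your access URL. As a minimal sketch (the function name and the "unknown" fallback are my own, not part of iThenticate), that check could look like this:

```python
def ithenticate_version(url: str) -> str:
    """Classify an iThenticate access URL as v1 or v2, following the
    rule described above: turnitin.com hosts are iThenticate 2.0,
    ithenticate.com hosts are v1. Illustrative only."""
    # Take the host part of the URL, ignoring scheme and path.
    host = url.split("//")[-1].split("/")[0].lower()
    if host.endswith("turnitin.com"):
        return "iThenticate 2.0"
    if host.endswith("ithenticate.com"):
        return "iThenticate v1"
    return "unknown"

print(ithenticate_version("https://crossref-1234.turnitin.com/"))  # iThenticate 2.0
print(ithenticate_version("https://www.ithenticate.com/"))         # iThenticate v1
```

The member ID in the bespoke URL varies per organisation; only the domain matters for telling the two versions apart.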
Use doc-to-doc comparison to compare a primary uploaded document with up to five comparison uploaded documents. Any documents that you upload to doc-to-doc comparison will not be indexed and will not be searchable against any future submissions.
Uploading a primary document to doc-to-doc comparison will cost you a single document submission, but the comparison documents uploaded will not cost you any submissions.
Start from Folders, go to the Submit a document menu, and click Doc-to-Doc Comparison.
The doc-to-doc comparison screen allows you to choose one primary document and up to five comparison documents. Choose the destination folder for the documents you will upload. The Similarity Report for the comparison will be added to the same folder.
For your primary document, provide the author’s first name, last name, and document title. If you do not provide these details, the filename will be used for the title, and the author details will stay blank.
If you have administrator permissions, you can assign the Similarity Report for the comparison to a reporting group by selecting one from the Reporting Group drop-down. Learn more about reporting groups.
Click Choose File, and select the file you want to upload as your primary document. See the file requirements for both the primary and comparison documents on the right of the screen.
You can choose up to five comparison documents to check against your primary document. These do not need titles or author details, but each filename must be unique. Click Choose Files, and select the files you would like to upload as comparison documents. To remove a file from the comparison before you upload it, click the X icon next to the file. To upload your files for comparison, click Upload.
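The upload rules described above (one primary document, at most five comparison documents, unique comparison filenames, and the filename standing in for a missing title) can be summarised in a short validation sketch. All names here (`Upload`, `validate_comparison`) are hypothetical, for illustration only, and not part of any iThenticate API:

```python
from dataclasses import dataclass

MAX_COMPARISON_DOCS = 5  # limit described above

@dataclass
class Upload:
    filename: str
    title: str = ""
    author_first: str = ""
    author_last: str = ""

    @property
    def display_title(self) -> str:
        # If no title is given, the filename is used and
        # the author details stay blank.
        return self.title or self.filename

def validate_comparison(primary: Upload, comparisons: list) -> list:
    """Return a list of problems with a doc-to-doc comparison upload."""
    errors = []
    if not comparisons:
        errors.append("at least one comparison document is required")
    if len(comparisons) > MAX_COMPARISON_DOCS:
        errors.append(f"no more than {MAX_COMPARISON_DOCS} comparison documents allowed")
    names = [c.filename for c in comparisons]
    if len(names) != len(set(names)):
        errors.append("comparison filenames must be unique")
    return errors
```

For example, `validate_comparison(Upload("paper.docx"), [Upload("a.pdf"), Upload("a.pdf")])` would flag the duplicate filename before anything is uploaded.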
Once your document has been uploaded and compared against the comparison documents, it will appear in your chosen destination folder.
This upload will have ‘Doc-to-Doc Comparison’ beneath the document title to show that this is a comparison upload and has not been indexed.
The upload will be given a Similarity Score against the selected comparison documents, which is also displayed in the report column. Click the similarity percentage to open the doc-to-doc comparison in the Document Viewer.
The Document Viewer is separated into three sections:
Along the top of the screen, the paper information bar shows details about the primary document, including document title, author, date the report was processed, word count, number of comparison documents provided, and how many of those documents matched with the primary document.
The left panel shows the paper text, i.e. the text of your primary document. Matching text is highlighted in red.
Your comparison documents will appear in the sources panel to the right, showing instances of matching text within the submitted documents.
By default, the doc-to-doc comparison will open the Document Viewer in the All Sources view. This view lists all the comparison documents you uploaded. Each comparison document has a percentage showing the amount of content within it that is similar to the primary document. If a comparison document has no text matching the primary document, 0% is shown next to it.
Doc-to-doc comparison can also be viewed in Match Overview mode. In this view, the comparison documents are listed with highest match percentage first, and all the sources are shown together, color-coded, on the paper text.
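To build intuition for what these per-document percentages mean, here is a toy sketch that scores overlap as the share of word n-grams in the primary document that also appear in a comparison document. This is purely illustrative: Turnitin’s actual matching algorithm is proprietary and far more sophisticated, and the function name and n-gram approach here are my own assumptions:

```python
def similarity_percent(primary: str, comparison: str, n: int = 5) -> int:
    """Rough word n-gram overlap between two texts, as a percentage
    of the primary document's n-grams. A toy stand-in for the idea
    behind a Similarity Score, not iThenticate's algorithm."""
    def ngrams(text: str) -> set:
        words = text.lower().split()
        return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

    source = ngrams(primary)
    if not source:
        return 0
    matched = len(source & ngrams(comparison))
    return round(100 * matched / len(source))

text = "the quick brown fox jumps over the lazy dog"
print(similarity_percent(text, text))  # 100
```

A comparison document sharing no five-word run with the primary document would score 0%, matching the 0% case described above.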
Page maintainer: Kathleen Luschek Last updated: 2020-May-19