SocArXiv joins preprint services in endorsing OSTP memo

DALL·E open science collage

The SocArXiv steering committee joins the preprint services arXiv and bioRxiv/medRxiv in their recent statements in support of the U.S. Office of Science and Technology Policy (OSTP) memo that directs the federal government to make outputs from government-funded research publicly accessible without charge or embargo. We endorse these statements, and reproduce them below.

arXiv OSTP memorandum response

April 11, 2023

The recent Office of Science and Technology Policy “Nelson Memorandum” on “Ensuring Free, Immediate, and Equitable Access to Federally Funded Research”1 is a welcome affirmation of the public right to access government funded research results, including publication of articles describing the research, and the data behind the research. The policy is likely to increase access to new and ongoing research, enable equitable access to the outcome of publicly funded research efforts, and enable and accelerate more research. Improved immediate access to research results may provide significant general social and economic benefits to the public.

Funding Agencies can expedite public access to research results through the distribution of electronic preprints of results in open repositories, in particular existing preprint distribution servers such as arXiv,2 bioRxiv,3 and medRxiv.4 Distribution of preprints of research results enables rapid and free accessibility of the findings worldwide, circumventing publication delays of months, or, in some cases, years. Rapid circulation of research results expedites scientific discourse, shortens the cycle of discovery and accelerates the pace of discovery.5

Distribution of research findings by preprints, combined with curation of the archive of submissions, provides universal access for both authors and readers in perpetuity. Authors can provide updated versions of the research, including “as accepted,” with the repositories openly tracking the progress of the revision of results through the scientific process. Public access to the corpus of machine readable research manuscripts provides innovative channels for discovery and additional knowledge generation, including links to the data behind the research, open software tools, and supplemental information provided by authors.

Preprint repositories support a growing and innovative ecosystem for discovery and evaluation of research results, including tools for improved accessibility and research summaries. Experiments in open review and crowdsourced commenting can be layered over preprint repositories, providing constructive feedback and alternative models to the increasingly archaic process of anonymous peer review.

Distribution of research results by preprints provides a well tested path for immediate, free, and equitable access to research results. Preprint archives can support and sustain an open and innovative ecosystem of tools for research discovery and verification, providing a long term and sustainable approach for open access to publicly funded research.

1White House OSTP Public Access Memo

2arXiv website

3bioRxiv website

4medRxiv website

5NIH Preprint Pilot; “The Pace of Artificial Intelligence Innovations: Speed, Talent, and Trial-and-Error”


bioRxiv and medRxiv response to the OSTP memo – an open letter to US funding agencies

2023-04-11

The preprint servers bioRxiv and medRxiv welcome the recent Office of Science and Technology Policy (OSTP) memo advising US government agencies to make publications and data from research funded by US taxpayers publicly accessible immediately, without embargo or cost. This new policy will stimulate research, increase equitability, and generate health, environmental and social benefits not only in the US but all around the world.

Agencies can enable free public access to research results simply by mandating that reports of federally funded research are made available as “preprints” on servers such as arXiv, bioRxiv, medRxiv, and chemRxiv, before being submitted for journal publication. This will ensure that the findings are freely accessible to anyone anywhere in the world. An important additional benefit is the immediate availability of the information, avoiding the long delays associated with evaluation by traditional scientific journals (typically around one year). Scientific inquiry then progresses faster, as has been particularly evident for COVID research during the pandemic.

Prior access mandates in the US and elsewhere have focused on articles published by academic journals. This complicated the issue by making it a question of how to adapt journal revenue streams and led to the emergence of new models based on article-processing charges (APCs). But APCs simply move the access barrier to authors: they are a significant financial obstacle for researchers in fields and communities that lack the funding to pay them. A preprint mandate would achieve universal access for both authors and readers upstream, ensuring the focus remains on providing access to research findings, rather than on how they are selected and filtered.

Mandating public access to preprints rather than articles in academic journals would also future-proof agencies’ access policies. The distinction between peer-reviewed and non-peer-reviewed material is blurring as new approaches make peer review an ongoing process rather than a judgment made at a single point in time. Peer review can be conducted independently of journals through initiatives like Review Commons. And traditional journal-based peer review is changing: for example, eLife, supported by several large funders, peer reviews submitted papers but no longer distinguishes accepted from rejected articles. The author’s “accepted” manuscript that is the focus of so-called Green Open Access policies may therefore no longer exist. Because of such ongoing change, mandating the free availability of preprints would be a straightforward and strategically astute policy for US funding agencies.

A preprint mandate would underscore the fundamental, often overlooked, point that it is the results of research to which the public should have access. The evaluation of that research by journals is part of an ongoing process of assessment that can take place after the results have been made openly available. Preprint mandates from the funders of research would also widen the possibilities for evolution within the system and avoid channeling it towards expensive APC-based publishing models. Furthermore, since articles on preprint servers can be accompanied by supplementary data deposits on the servers themselves or linked to data deposited elsewhere, preprint mandates would also provide mechanisms to accomplish the other important OSTP goal: availability of research data.

Richard Sever and John Inglis
Co-Founders, bioRxiv and medRxiv
Cold Spring Harbor Laboratory, New York, NY 11724

Harlan Krumholz and Joseph Ross
Co-founders, medRxiv
Yale University, New Haven, CT 06520

Tag papers on SocArXiv to create communities of scholarship

Even with the very simple technique of tagging papers, we can facilitate enhanced collaboration among scholars and public sharing of scholarship — and when those two goals are met together, it is to the benefit of both.

When you submit a paper to SocArXiv, you have the opportunity to add tags. (You can also do this after your paper is posted, by going back to Edit Paper.) Those tags are then easily searchable for you or others. For example, if you go to socarxiv.org and type:

tags:("covid-19")

into the search bar, you get this page, with the clunky URL:

https://osf.io/preprints/socarxiv/discover?q=tags%3A(%22covid-19%22)

which lists the 600+ papers that have used the COVID-19 tag. (Unfortunately, the tags aren’t clickable links on SocArXiv paper pages, but you can search for them.)
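The clunky URL is just the search query percent-encoded. As a small illustration (a sketch, not an official SocArXiv API; the `tag_search_url` helper name is ours), the pattern can be reproduced in Python with the standard library:

```python
from urllib.parse import quote

BASE = "https://osf.io/preprints/socarxiv/discover"

def tag_search_url(tag: str) -> str:
    """Build a SocArXiv tag-search URL of the form shown above.

    The query is tags:("<tag>"); quote() percent-encodes the colon
    and quotation marks, while safe='()' leaves the parentheses
    as-is, matching the URL format the site produces.
    """
    query = f'tags:("{tag}")'
    return f"{BASE}?q={quote(query, safe='()')}"

print(tag_search_url("covid-19"))
# → https://osf.io/preprints/socarxiv/discover?q=tags%3A(%22covid-19%22)
```

The same helper works for any tag you agree on with collaborators, such as a conference or session tag.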

Scholarship communities

This simple tagging tool allows for relatively spontaneous grouping of scholarship, as when someone says, “We need to organize the recent work on police violence,” and people start uploading and tagging their work. But it just as well facilitates more organized efforts. Just as such groupings use Twitter hashtags to pull people together, we can do the same thing with scholarship using SocArXiv. Groups that might benefit from this tool include:

  • Working groups on a research topic
  • Panels for an upcoming conference
  • Departments or groups within departments
  • Sections of a professional association
  • Scholar-activist groups

Any such group can simply share the instructions above and notify participants of the associated tag. For example, if you are organizing a workshop or conference, you can make the meeting more productive by encouraging people to post their papers in advance, and use a common tag, such as “CairoMeeting2023,” or even “CairoMeeting2023-session-12”. Then you can share the URL for the search on that tag in your preparatory materials or program.

We’re happy to help get you off the ground with your collaborative work. Feel free to contact us.

Mexico City ivermectin updates

Imagen TV screenshot

We posted our decision to withdraw the paper, “Ivermectin and the odds of hospitalization due to COVID-19,” here. Since then there have been new developments. I will update this page if needed. -Philip Cohen

As of February 5.

Statements

  • The lead author on the paper, José Merino, tweeted a link to a letter to me over the names of six of the original seven authors. The letter called the decision to withdraw their paper “unethical, colonialist, and authoritarian,” and demanded my resignation. You can read the statement here.
  • The Secretaría de Salud de la Ciudad de México posted a statement (in Spanish; English translation here), arguing that the use of ivermectin to treat COVID-19 was “supported by the scientific evidence available worldwide in 2020,” before the availability of vaccines, due to its documented effectiveness, low cost, and lack of harmful effects. Distributing the medicine was not an experiment, they wrote. In addition, about SocArXiv, they wrote: “This study was kept on the SocArxiv portal for almost a year, it always had code and data available for replication, and its conclusions are very similar to other works (Ascencio-Montiel et al. 2022).” (Note, that study, which used the same data from the Mexico City COVID-19 health kit distribution, acknowledged that the kits “included, besides an information brochure and a pulse oximeter, medications such as azithromycin, ivermectin, acetaminophen and aspirin” — and the study made no claims about the effects of ivermectin itself, and the data doesn’t indicate who took which medicines.)

On withdrawing “Ivermectin and the odds of hospitalization due to COVID-19,” by Merino et al.

4 February 2022

Preamble by Philip N. Cohen, director of SocArXiv

SocArXiv’s steering committee has decided to withdraw the paper, “Ivermectin and the odds of hospitalization due to COVID-19: evidence from a quasi-experimental analysis based on a public intervention in Mexico City,” by Jose Merino, Victor Hugo Borja, Oliva Lopez, José Alfredo Ochoa, Eduardo Clark, Lila Petersen, and Saul Caballero. [10.31235/osf.io/r93g4]

The paper is a report on a program in Mexico City that gave people medical kits when they tested positive for COVID-19, containing, among other things, ivermectin tablets. The conclusion of the paper is, “The study supports ivermectin-based interventions to assuage the effects of the COVID-19 pandemic on the health system.”

The lead author of the paper, José Merino, head of the Digital Agency for Public Innovation (DAPI), a government agency in Mexico City, tweeted about the paper: “Es una GRAN noticia poder validar una política pública que permitió reducir impactos en salud por covid19” (translation: “It is GREAT news to be able to validate a public policy that allowed reducing health impacts from covid19”). The other authors are officials at the Mexican Social Security Institute and the Mexico City Ministry of Health, and employees at the DAPI.

We have written about this paper previously. We wrote, in part:

“Depending on which critique you prefer, the paper is either very poor quality or else deliberately false and misleading. PolitiFact debunked it here, partly based on this factcheck in Portuguese. We do not believe it provides reliable or useful information, and we are disappointed that it has been very popular (downloaded almost 10,000 times so far). … We do not have a policy to remove papers like this from our service, which meet submission criteria when we post them but turn out to be harmful. However, we could develop one, such as a petition process or some other review trigger. This is an open discussion.”

The paper has now been downloaded more than 11,000 times, among our most-read papers of the past year. Since we posted that statement, the paper has received more attention. In particular, an article in Animal Politico in Mexico reported that the government of Mexico City has spent hundreds of thousands of dollars on ivermectin, which it still distributes (as of January 2022) to people who test positive for COVID-19. In response, University of California-San Diego sociology professor Juan Pablo Pardo-Guerra posted an appeal to SocArXiv asking us to remove the “deeply problematic and unethical” paper and ban its authors from our platform. The appeal, in a widely shared Twitter thread, argued that the authors, through their agency dispensing the medication, unethically recruited experimental subjects, apparently without informed consent, and thus the study is an unethical study; they did not declare a conflict of interest, although they are employees of agencies that carried out the policy. The thread was shared or liked by thousands of people. The article and response to the article prompted us to revisit this paper. On February 1, I promised to bring the issue to our Steering Committee for further discussion.

I am not a medical researcher, although I am a social scientist reasonably well-versed in public health research. I won’t provide a scholarly review of research on ivermectin. However, it is clear from the record of authoritative statements by global and national public health agencies that, at present, ivermectin should not be used as a treatment or preventative for COVID-19 outside of carefully controlled clinical studies, which this clearly was not. These are some of those statements, reflecting current guidance as of 3 February 2022.

  • World Health Organization: “We recommend not to use ivermectin, except in the context of a clinical trial.”
  • US Centers for Disease Control and Prevention: “ivermectin has not been proven as a way to prevent or treat COVID-19.”
  • US National Institutes of Health: “There is insufficient evidence for the COVID-19 Treatment Guidelines Panel (the Panel) to recommend either for or against the use of ivermectin for the treatment of COVID-19.”
  • European Medicines Agency: “use of ivermectin for prevention or treatment of COVID-19 cannot currently be recommended outside controlled clinical trials.”
  • US Food and Drug Administration: “The FDA has not authorized or approved ivermectin for use in preventing or treating COVID-19 in humans or animals. … Currently available data do not show ivermectin is effective against COVID-19.”

For reference, the scientific flaws in the paper are enumerated at the links above from PolitiFact, partly based on this factcheck from Estado in Portuguese, which included expert consultation. I also found this thread from Omar Yaxmehen Bello-Chavolla useful.

In light of this review, a program to publicly distribute ivermectin to people infected with COVID-19, outside of a controlled study, seems unethical. The paper is part of such a program, and currently serves as part of its justification.

To summarize, there remains insufficient evidence that ivermectin is effective in treating COVID-19; the study is of minimal scientific value at best; the paper is part of an unethical program by the government of Mexico City to dispense hundreds of thousands of doses of an inappropriate medication to people who were sick with COVID-19, which possibly continues to the present; the authors of the paper have promoted it as evidence that their medical intervention is effective. This review is intended to help the SocArXiv Steering Committee reach a decision on the request to remove the paper (we set aside the question of banning the authors from future submissions, which is reserved for people who repeatedly violate our rules). The statement below followed from this review.

SocArXiv Steering Committee statement on withdrawing the paper by Merino et al. (10.31235/osf.io/r93g4).

This is the first time we have used our prerogative as service administrators to withdraw a paper from SocArXiv. Although we reject many papers, according to our moderation policy, we don’t have a policy for unilaterally withdrawing papers after they have been posted. We don’t want to make policy around a single case, but we do want to respond to this situation.

We are withdrawing the paper, and replacing it with a “tombstone” that includes the paper’s metadata. We are doing this to prevent the paper from causing additional harm, and taking this incident as an impetus to develop a more comprehensive policy for future situations. The metadata will serve as a reference for people who follow citations to the paper to our site.

Our grounds for this decision are several:

  1. The paper is spreading misinformation, promoting an unproved medical treatment in the midst of a global pandemic.
  2. The paper is part of, and justification for, a government program that unethically dispenses (or did dispense) unproven medication apparently without proper consent or appropriate ethical protections according to the standards of human subjects research.
  3. The paper is medical research – purporting to study the effects of a medication on a disease outcome – and is not properly within the subject scope of SocArXiv.
  4. The authors did not properly disclose their conflicts of interest.

We appreciate that of the thousands of papers we have accepted and now host on our platform, there may be others that have serious flaws as well.

We are taking this unprecedented action because this particular bad paper appears to be more important, and therefore potentially more harmful, than other flawed work. In administering SocArXiv, we generally err on the side of inclusivity, and do not provide peer review or substantive vetting of the papers we host. Taking such an approach suits us philosophically, and also practically, since we don’t have staff to review every paper fully. But this approach comes with the responsibility to respond when something truly harmful gets through. In light of demonstrable harms like those associated with this paper, and in response to a community groundswell beseeching us to act, we are withdrawing this paper.

We reiterate that our moderation process does not involve peer review, or substantive evaluation, of the research papers that we host. Our moderation policy confirms only that papers (1) are scholarly, (2) are in research areas that we support, (3) are plausibly categorized, (4) are correctly attributed, (5) are in languages that we moderate, and (6) are in text-searchable formats. Posting a paper on SocArXiv is not in itself an indication of good quality – but it is often a sign that researchers are acting in good faith and practicing open scholarship for the public good. We urge readers to consider this incident in the context of the greater good that open science and preprints in general, and our service in particular, do for researchers and the communities they serve.

We welcome comments and suggestions from readers, researchers, and the public. Feel free to email us at socarxiv@gmail.com, or contact us on our social media accounts at Twitter or Facebook.

When SocArXiv gets bad papers

Detail from AI-generated art using the prompt “bad paper” with Wombo.

Two recent incidents at SocArXiv prompted the Steering Committee to offer some comment on our process and its outcomes.

Ivermectin research

On May 4, 2021, our moderators accepted a paper titled, “Ivermectin and the odds of hospitalization due to COVID-19: evidence from a quasi-experimental analysis based on a public intervention in Mexico City,” by a group of authors from the Mexican Social Security Institute, Ministry of Health in Mexico City, and Digital Agency for Public Innovation in Mexico City. The paper reports on a “quasi-experimental” analysis purporting to find “significant reduction in hospitalizations among [COVID-19] patients who received [a] ivermectin-based medical kit” in Mexico City. The paper is a “preprint” insofar as the paper was not peer reviewed or published in a peer-reviewed journal at the time it was submitted, but because it has not subsequently been published in such a venue, it is really just a “paper.” (We call all the papers on SocArXiv “papers,” and let authors describe their status themselves, either on the title page, or by linking to a version published somewhere else.)

Depending on which critique you prefer, the paper is either very poor quality or else deliberately false and misleading. PolitiFact debunked it here, partly based on this factcheck in Portuguese. We do not believe it provides reliable or useful information, and we are disappointed that it has been very popular (downloaded almost 10,000 times so far).

This has prompted us to clarify that our moderation process does not involve peer review, or substantive evaluation, of the research papers that we host. From our Frequently Asked Questions page:

Papers are moderated before they appear on SocArXiv, a process we expect to take less than two days. Our policy involves a six-point checklist, confirming that papers (1) are scholarly, (2) are in research areas that we support, (3) are plausibly categorized, (4) are correctly attributed, (5) are in languages that we moderate, and (6) are in text-searchable formats (such as PDF or docx). In addition, we seek to accept only papers that authors have the right to share, although we do not check copyrights in the moderation process. For details, view the moderation policy.

Posting a paper on SocArXiv is not in itself an indication of good quality. We host many papers of top quality – and their inclusion in SocArXiv is a measure of good practice. But there are bad papers as well, and the system does not explicitly differentiate them for readers. In addition to not verifying the quality of the papers we host, we also don’t evaluate the supporting materials authors provide. In the case of the ivermectin paper, the authors declared that their data is publicly available with a link to a Google sheet (as well as a Github repository that is no longer available). They also declared no conflict of interest.

We do not have a policy to remove papers like this from our service, which meet submission criteria when we post them but turn out to be harmful. However, we could develop one, such as a petition process or some other review trigger. This is an open discussion.

Fraudulent papers

To our knowledge, the ivermectin paper is not fraudulent. However, we do not verify the identities of authors who submit papers. The submitting author must have an account on the Open Science Framework, our host platform, but getting an OSF account just requires a working email address. OSF users can enter ORCID or social media account handles on their profiles, but to our knowledge these are not verified by OSF. OSF does allow logins with ORCID or institutional identities, but as moderators at SocArXiv we don’t have a way of knowing how a user has created their account or logged in. Our submission process requires authors to affirm that they have permission to post the paper, but we don’t independently verify the connections between authors.

In short, both OSF and SocArXiv are vulnerable to people posting work that is not their own, or using fake identities. The unvarnished truth is that we don’t have the resources of the government, the coercive power of an employer, or the capital of a big company necessary to police this issue.

Recently, someone posted one fraudulent paper on SocArXiv, and attempted to post another, before we detected the fraud in our moderation process. The papers submitted listed a common author, but different (apparently) fake co-authors. In one case, we contacted the listed co-author (a real person) who confirmed that they were not aware of the paper and had not consented to its being posted. With a little research, we found papers under the name of this author at SSRN, ResearchGate, arXiv, and Paperswithcode, which also seem to be fake. (We reported this to the administrators of OSF, who deleted the related accounts.)

It did not appear that these papers had any important content, but rather just existed to be papers, maybe to establish someone’s fake identity, test AI algorithms or security systems, or whatever. Their existence doesn’t hurt real researchers much, but they could be part of either a specific plan that would be more harmful, or a general degradation of the research communication ecosystem.

With regard to this kind of fraud, we do not have a consistently applied defense in our moderation workflow. If we suspect foul play, we poke around and then reject the papers and report it if we find something bad. But, again, we don’t have the resources to fully prevent this happening. However, we are developing a new policy that will require all papers to have at least one author linked to a real ORCID account. Although this will add time to the moderation process of each paper (since OSF does not attach ORCIDs to specific papers), we plan to experiment with this approach to see if it helps without adding too much time and effort. (As always, we are looking for more volunteer moderators — just contact us!)

User responses

We do offer several ways for readers to communicate to us and to each other about the quality of papers on our system. Readers may annotate or comment on papers using the Hypothesis tool, or they may endorse papers using the Plaudit button. (Both of these are free with registration, using ORCID for identification.) If you read a paper you believe is good, just click the Plaudit button — that will tell future readers that you have endorsed it. Neither of these tools generates automatic notifications to SocArXiv or to the authors, however — they just communicate to the next reader. If you see something that you suspect is fraudulent or harmful, feel free to email us directly at socarxiv@gmail.com.

We encourage readers to take advantage of these affordances. And we are open to suggestions.

Using SocArXiv to improve the impact of your conference paper

When you upload your conference paper, you give your audience the opportunity to engage with your work more seriously: read the paper, study the research materials you attach to it, and cite it — giving you formal precedence for your work and increasing its reach and impact. Later, if you publish it in a journal or some other venue, you can post a new version and people using the link will automatically be directed to the latest version (and see a link to the journal version, if there is one).

In addition, for the conference itself, you can use tags when you upload your paper to create communities of scholarship. Give it the ASA2021 tag for the American Sociological Association conference, for example. Then people can browse all the openly uploaded conference papers as they prepare their schedules, at this link: https://osf.io/preprints/socarxiv/discover?q=tags%3A(%22ASA2021%22).

Or, get the members of your panel to all use a tag like ASA2021-101 (for session 101, e.g.), and give out this URL for a link to all the papers: https://osf.io/preprints/socarxiv/discover?q=tags%3A(%22ASA2021-101%22).

(To make your own tag link, just go to SocArXiv.org, enter tags:("your tag") in the search bar, and copy the URL on the results page.)

If you give out a link directly to your paper, or a tag to your panel session, before the conference, you encourage a deeper level of engagement during your session and signal your embrace of transparent and accountable social science. (You can also upload your slides in the associated project if you want to share those.) Then, share a link directly to your paper, or all the papers, at the session itself.

Scholarship communities

Beyond one conference, this simple tagging allows for relatively spontaneous grouping of scholarship, as when someone says, “We need to organize the recent work on police violence,” and people start uploading and tagging their work. But it just as well facilitates more organized efforts. Just as such groupings use Twitter hashtags to pull people together, we can do the same thing with scholarship using SocArXiv. Groups that might benefit from this tool include:

  • Working groups on a research topic
  • Panels for an upcoming conference
  • Departments or groups within departments
  • Sections of the American Sociological Association or others
  • Scholar-activist groups

Any such group can simply share the instructions above and notify participants of the associated tag. The link to the tag will always generate a web page listing the associated papers.

This simple functionality is already very powerful, but we are always looking for ways to improve it and offer more options. People trying it out now will help with this development process. We hope you’ll try it out.

Don’t wait for your association to act

Yes, it would be better if the American Sociological Association (or other lagging associations) would provide SocArXiv’s level of service to conference participants, with archiving, DOIs, permanent links, versioning, commenting, and supporting materials. But you don’t have to wait for them to catch up. We provide this for free thanks to support from the University of Maryland Libraries, the nonprofit Center for Open Science, and the volunteers who work on our service.

University of Maryland Libraries becomes the institutional home of SocArXiv

The word “sustainable” over an image of green beans.
Photo: Flickr CC, https://flic.kr/p/7T3X56.

This announcement comes from the UMD Libraries.

The University of Maryland (UMD) Libraries is pleased to announce that it has become the institutional home of SocArXiv, an interdisciplinary, open access repository of scholarship. The new partnership between the Libraries and SocArXiv ensures the future development and sustainability of the repository, which had previously received seed funding from the libraries at the University of California, Los Angeles (UCLA) and the Massachusetts Institute of Technology (MIT), with additional support from the Sloan Foundation, the Open Society Foundations, and the College of Behavioral and Social Sciences at UMD. Working with partners, the UMD Libraries will sponsor SocArXiv to help sustain shared infrastructure for open scholarship and to provide equitable access to this diverse collection of research for scholars at UMD and around the world.

Founded in 2016, SocArXiv is a digital repository of research papers that is free for authors and readers alike. SocArXiv is governed by a volunteer steering committee of scholars and library community leaders, with University of Maryland sociology professor Philip N. Cohen as the founding director. As of April 2021, the repository holds almost 8,000 papers in all fields of social and behavioral sciences, arts and humanities, education, and law. It provides a shared platform for social scientists and other scholars to upload working papers, preprints, and published papers, with the option to link data and code. Papers in multiple languages are moderated by an international team of volunteer academic researchers. Since the COVID-19 pandemic began, the pace of new papers posting at SocArXiv has increased, and there are now more than 500 papers related to the pandemic.

SocArXiv is based on the Open Science Framework (OSF) platform of the nonprofit Center for Open Science (COS). This arrangement will continue under the new partnership between the UMD Libraries and SocArXiv. Eric Olson, Institutional Product Owner at COS, said: “We believe that transparent and reproducible research and research products lead to better outcomes. By helping to sustain SocArXiv, UMD Libraries and its partners will continue to advance the shared platforms, tools, and systems that promote open science and open access in the research community.”

“We are delighted to be joining the Libraries at UMD, which is a leader in the growing movement for open scholarship,” said Cohen. “As the first preprint service available on the COS platform, SocArXiv has been an innovator in this arena during an exciting period. We are grateful for the Libraries’ support and look forward to working with the team here to build the future of academy-owned scholarly communication infrastructure.”

“SocArXiv fits into the UMD Libraries’ strategies related to enhancing open access and supporting academy-owned infrastructure for scholarly communication,” added Adriene Lim, Dean of Libraries. “It has an outstanding reputation in the field and we’re proud to be the institutional home and sustain this valuable resource for the entire research community. We look forward to working with Dr. Cohen, COS, and SocArXiv’s steering committee in the future to enhance equitable access for research, teaching, and learning.”

The Libraries also manages the Digital Repository at the University of Maryland (DRUM), which hosts material from UMD researchers, including theses and dissertations as well as research articles. In the future, SocArXiv hopes to integrate submission of Maryland researchers’ content with DRUM, extending the reach of UMD’s research output and leveraging other benefits offered by SocArXiv.

To learn more about SocArXiv, visit SocOpen.org and SocArXiv.org.

ABOUT THE UMD LIBRARIES
As the largest university library system in the Washington D.C.-Baltimore area, the University of Maryland (UMD) Libraries serve more than 41,000 students and 14,000 faculty and staff of the flagship College Park campus. The Libraries’ extensive collections, programs, and services enable student success, support teaching, research, and creativity, and enrich the intellectual and cultural life of the community. A member of the Big Ten Academic Alliance and the Association of Research Libraries, the UMD Libraries was honored with the 2020 Excellence in Academic Libraries award in the university category from the Association of College and Research Libraries.
Last update: May 05, 2021

Talk: How we know: COVID-19, preprints, and the information ecosystem

I recorded a 16-minute talk on the scientific process, science communication, and how preprints fit in to the information ecosystem around COVID-19.

It’s called, “How we know: COVID-19, preprints, and the information ecosystem.” The video is on YouTube here, also embedded below, and the slides, with references, are up here.

Happy to have your feedback, in the comments or any other way.

Disciplines and fields served at the SocArXiv 5k mark

As SocArXiv approaches 5000 papers (there are 4895 at this moment), here is a snapshot of the disciplines represented on our server:

5k disciplines

Sociology accounts for one-third of our papers. The original steering committee group consisted of sociologists and library leaders, and much of our outreach was in sociology, so that is not surprising. In addition, there are other paper servers active in other areas. Nevertheless, we are happy to have this diversity, and welcome papers from all the fields we cover — social sciences, arts and humanities, education, and law.

Within the field of sociology, we have a wide representation across subfields. Note that here we use the categories generated by the American Sociological Association’s list of sections:

5k subfields

 

SocArXiv policy on withdrawing papers

The Center for Open Science has added withdrawal functionality to its preprint service platform. We are glad to have this capacity, but we will not be permitting the withdrawal of papers in routine cases. Withdrawing is a convenient option if an author makes an error in the submission process, for example accidentally submitting the wrong version; if a paper has not yet been approved, we are happy to accommodate such requests. However, if a paper has already been accepted, and thus entered the scholarly record, we will follow the policy below.

Unfortunately, authors now see a large “Withdraw Paper” button on the page where they edit their paper entries. We are working with COS to change how this option is presented to authors, and also to make users aware of our policy. Posting a paper on SocArXiv is easy, which brings great benefit to the thousands of people who have shared their work. However, authors should be aware that posting papers is generally nonreversible. We offer this policy and its explanation to help further this understanding.

Dog leaping fearlessly off a dock into water
Photo by Emery Way https://flic.kr/p/5JMYz7


SocArXiv Withdrawal Policy

May 25, 2019

In case of revision, the current version will be found here.

The Center for Open Science (COS), which hosts SocArXiv, has enabled the withdrawal of papers from its paper services. Authors who wish to withdraw their papers may request a withdrawal from the SocArXiv moderators, according to the terms of this policy.

Permission for withdrawal will only be granted in the very rare circumstance in which we have a legal obligation to remove a paper, such as if it contains private personal information or it is subject to a substantiated copyright claim. In cases where a paper is withdrawn, it will be replaced by a “tombstone” page (here is an example), which includes the original paper’s metadata (author, title, abstract, DOI, etc.), and the reason for withdrawal. After that point, the paper will be locked to further modification.

Papers that infringe on copyrights will be removed in accordance with the Digital Millennium Copyright Act, under the Center for Open Science terms of use, available here.

If authors wish to withdraw papers for other reasons — for example, if they are not confident of the findings or otherwise no longer endorse the paper — they should post a new “version” of the paper that is a single page announcing the withdrawal. They may, for example, request that readers do not further cite, use, or distribute previous versions (which will remain available under the list of previous versions). Instructions on how to post a new version are available here; we are happy to help authors do this.

This policy is very similar to the retraction of an article by an academic journal, which only rarely involves removal of access to the original paper, instead generally relying on a notification of retraction in its place.

Instructions for requesting a withdrawal are available here: http://help.osf.io/m/preprints/l/1069374-withdrawing-a-preprint

Why doesn’t SocArXiv let authors decide when to withdraw a paper?

Papers on SocArXiv are part of the scholarly record. Upon being posted, they are given a Digital Object Identifier (DOI) and a persistent URL from COS. The link is automatically tweeted by our announcement account, and the system also generates a citation reference. The document is immediately citable and retrievable by human or machine agents. In short, posting a paper on SocArXiv is a research event that cannot be undone by deleting the document. Preserving the scholarly record is our obligation to the scholarly community.

Authors who post papers on SocArXiv are notified, at the final point of submission, that they will be “unable to delete the preprint file, but [they] can update or modify it.” Authors also are required to confirm that all contributors have agreed to share the paper, and that they have the right to share it. (All co-authors have the same rights to distribute a copyrighted work, unless a subsequent agreement has intervened, so an objection to the posting by a co-author is not the basis for removal.)

The Internet has made it possible to distribute work without relinquishing the original digital file, which makes it possible to delete the version readers access — a privilege that was not available when research was distributed in printed form. However, the Internet has also made it difficult or impossible to remove all traces or copies of a digital document. This is a challenging environment for authors.

We are sympathetic to the desire of some authors to remove copies of their earlier work from circulation, for a variety of reasons, and we appreciate that our policy may cause frustration. We hope authors will carefully consider it before they post their work.

Our policy is very similar to that employed by the older and more established preprint servers, arXiv and bioRxiv.

bioRxiv’s FAQ page reads:

Can I remove an article that has already posted on bioRxiv?

No. Manuscripts posted on bioRxiv receive DOI’s and thus are citable and part of the scientific record. They are indexed by services such as Google Scholar, Microsoft Academic Search, and Crossref, creating a permanent digital presence independent of bioRxiv records. Consequently, bioRxiv’s policy is that papers cannot be removed. Authors may, however, have their article marked as “Withdrawn” if they no longer stand by their findings/conclusions or acknowledge fundamental errors in the article. In these cases, a statement explaining the reason for the withdrawal is posted on the bioRxiv article page to which the DOI defaults; the original article is still accessible via the article history tab. In extremely rare, exceptional cases, papers are removed for legal reasons.

At this writing, just 32 out of 50,401 preprints on bioRxiv have been withdrawn, a rate of 6 per 10,000.

On arXiv, the instructions read:

Articles that have been announced and made public cannot be completely removed. A withdrawal creates a new version of the paper marked as withdrawn. That new version displays the reason for the withdrawal and does not link directly to the full text. Previous versions will still be accessible, including the full text.

On the other hand, at least one paper service, Elsevier’s SSRN (formerly the Social Science Research Network), allows authors to delete their papers from its repository immediately, for any reason (FAQ). Similarly, some authors choose to distribute their work on their own websites, where they have more complete control over the contents. We believe these approaches put the needs of the author over those of the research community. While a reasonable choice in some cases, this represents a philosophy different from ours.

We want an open, equitable, inclusive scholarly ecosystem in which people are free to share and use information as freely as possible. We have created this policy to serve that goal.