SocArXiv moratorium on AI-topic papers, policy in formation

In light of record submission rates and a large volume of AI-generated slop papers, SocArXiv recently implemented a policy requiring submitting authors to have ORCID accounts linked from their OSF profiles, and narrowed our focus to social science subjects (see this announcement). Today we are taking two more steps:

1. We are pausing new submissions about AI topics for 90 days. That is, papers testing AI models, proposing AI models, theorizing about the future of AI, and so on. We will make exceptions for papers that are already accepted for publication (or already published) in peer-reviewed scholarly journals. And we will make exceptions, on a case-by-case basis, for empirical social science research about AI in society – for example, a study of how AI use affects workers in an organization. The purpose of this pause is to make it faster and easier for moderators to reject these papers, and to encourage their authors to find other ways of distributing their work.

If your empirical social science research paper on an AI topic is rejected and you would like to appeal, please email us a link to the paper at socarxiv@gmail.com with a short note of explanation. We apologize for requiring this step.

2. We are developing a policy for AI-related work. We need a better, formal policy on AI-generated and LLM-assisted content. We have formed a committee of volunteers from our social science and library science networks to gather existing policies from other services and publications, and decide what policy is right for us. This includes the values we want to support, the work we are able to do, and the technical needs and requirements we have in doing our moderation and hosting. We hope this policy will be ready to implement when the 90-day pause on AI-related papers ends.

If you have expertise or suggestions for us in this work, we would appreciate hearing from you.

SocArXiv submission rule changes

Context

SocArXiv is experiencing record high submission rates. In addition, now that we have paper versioning – which is great – our moderators have to approve every paper revision. As a result, our volunteer workload is increasing.

In addition, we are receiving many non-research, spam, and AI-generated submissions. We do not have a technological way of identifying these, and it is time-consuming to read and assess them according to our moderation rules.

We also don’t have moderation workflow tools that allow us to, for example, sort incoming papers by subject, to get them to specific expert moderators. So all our moderators look at all papers as they come in. That encourages us to think about narrowing the range of subjects we accept.

The two rule changes below are intended to help manage the increased moderator burden. More policy changes may follow if the volume keeps increasing.

1. ORCID requirement

We require the submitting author to have a publicly accessible ORCID linked from the OSF profile page, with a name that matches that on the paper and the OSF account.

In the case of non-bibliographic submitters (e.g., a research assistant submitting for a supervisor), the first author must have an ORCID. We can make exceptions for institutional submitters upon request, such as journals that upload papers for their authors.

At present we are not requiring additional verification or specific trust markers on the ORCID (such as email or employer verification), just the existence of an account that lists the author’s name. It’s not a foolproof identity verification, obviously, but it adds a step for scammers, and also helps identify pseudonymous authors, which we do not permit. We may take advantage of ORCID’s trust markers program in the future and require additional elements on the ORCID record.

We are happy to host papers by independent scholars, but a disproportionate share of non-research, spam, and AI-generated submissions come from independent scholars, many of whom do not have ORCIDs. We urge those scholars, like those with institutional affiliations, to get an ORCID. This is a good practice that we should all endorse.

2. Focus on social sciences

At its founding, SocArXiv did not want to maintain disciplinary boundaries. Our intention was to be the big paper server for all of the social sciences, and we couldn’t draw an easy line between the social sciences and some humanities subjects, especially history, philosophy, religious studies, and some area studies, which are humanities in the taxonomy we use but overlap significantly with the social sciences. It was more logical just to accept them all.

As the volume has increased, this has become less practical. In addition, a lot of junk and AI submissions are in the areas of religion, philosophy, and various language studies. We also don’t have moderators working in arts and humanities, and our moderators trained in social sciences are not expert at reviewing these papers. Finally, there is an excellent, open humanities archive: Knowledge Commons (KC Works), which is freely available for humanities scholars. With approval from that service, we will now direct authors to their site for papers we are rejecting in arts and humanities subjects.

We continue to accept papers in education and law, which are also generally adjacent to social science.

For a limited time we will accept revisions of papers we already host in arts and humanities, but urge those authors to include links to Knowledge Commons or somewhere else that can host their work in the future.

We will assess papers that carry both arts/humanities and social science subject identifiers, and reject those we determine are principally arts/humanities.

We will continue to host all work we have already accepted.

SocArXiv wins Kohli Foundation for Sociology 2025 Infrastructure Prize


University of Maryland Libraries news release

May 7, 2025

The Kohli Foundation for Sociology has announced that SocArXiv is the winner of its 2025 Infrastructure Prize for Sociology. This honor especially recognizes the leadership of SocArXiv founder Professor Philip N. Cohen of the Department of Sociology.

SocArXiv (pronounced “sosh-archive”) is an innovative open archive of the social sciences, providing a free, nonprofit, open access platform for social scientists to upload working papers, preprints, and published papers, with the option to link research materials such as data and code.

SocArXiv now hosts 17,000 papers across all disciplines in social sciences, arts, humanities, education, and law, in many languages. It is institutionally housed by the UMD Libraries and operates on the Open Science Framework platform of the Center for Open Science.

“With the support of the UMD Libraries, and our platform at the Center for Open Science, we do it all at no charge for authors or readers, on a surprisingly small annual budget and a dedicated team of volunteers working a couple of hours a week,” Cohen said.

“The UMD Libraries and SocArXiv are partners in supporting open and equitable access to research for the public good. We are proud to serve as SocArXiv’s institutional home and congratulate SocArXiv on this well-deserved recognition,” said Interim Dean of the University Libraries Daniel Mack.

The Kohli Infrastructure Prize honors scholars, projects, and organizations that have made exceptional contributions to the development of substantial infrastructures that advance sociological knowledge, and comes with an award of 10,000 euros (more than $11,000).

SocArXiv is one of several efforts in which Cohen is involved that focus on reforming the system of scholarly communication. He often speaks about how scholars can productively engage with public audiences to improve their work and deepen its impact. He explores related topics in his new book, “Citizen Scholar: Public Engagement for Social Scientists” (Columbia University Press, 2025).

“SocArXiv has been a labor of love for me since 2016, helping thousands of researchers to get their work done, but the efforts of myself and our volunteers often go unnoticed. So it’s wonderful to have this recognition,” Cohen said. “SocArXiv is a nonprofit, academy-owned and directed, collaborative project of volunteers from sociology and other social sciences, and members of the library community. This recognition honors that collective effort to pave a new way forward in scholarly communication. There is a lot we can do ourselves—faster, better, and cheaper than we can under the big academic publishers—and we’re proving it every day.”

“These papers are often available before journal publication, allowing them to disseminate further, and faster,” Cohen said.

The platform’s many-hands-on-deck approach has facilitated knowledge sharing even during difficult times. For example, a draft paper on learning losses amid COVID-related school closures was uploaded to SocArXiv and downloaded tens of thousands of times, serving as a resource and point of reference for countless scholars and researchers before it was ultimately published in PNAS. This kind of open exchange of information and ideas has helped researchers to be more nimble and more productive.

Looking ahead, Cohen said he and his collaborators are working to make the repository a place for hosting peer review projects as well, so that people who want to conduct peer review in a transparent setting—with evolving versions, reviews, and replies—can integrate their work into SocArXiv.

As collaboration and knowledge sharing are priorities of the College of Behavioral and Social Sciences, Dean Susan Rivera said this recognition of Cohen and the networks that SocArXiv has forged is especially meaningful.

“Our college community congratulates Professor Cohen and his collaborators. It is fitting that their impactful work promoting open science is being honored in this significant way,” Rivera said. “This important platform is a point of pride for BSOS.”

Cohen has been invited to represent SocArXiv at the foundation’s awards ceremony in November at the European University Institute in Fiesole, Florence, Italy. Cohen has also been invited to be a guest on a Kohli Foundation podcast.

“A key priority of SocArXiv is to open up social science, to reach more people more effectively, to improve research, and build the future of scholarly communication,” Cohen said. “We hope that the visibility this award brings will draw more scholars and users to the platform.”

New versioning capacity at SocArXiv

News! SocArXiv’s host OSF now supports better versioning of papers, making it easier to cite, share, and update papers – and new versions are moderated. This will also support new, more transparent publishing models. For details and guidance, see this guide from our hosts at OSF: https://help.osf.io/article/673-update-a-new-version-of-a-preprint. Ask us if you need help!

SocArXiv joins preprint services in endorsing OSTP memo

DallE open science collage

The SocArXiv steering committee joins the preprint services arXiv and bioRxiv/medRxiv in their recent statements in support of the U.S. Office of Science and Technology Policy (OSTP) memo that directs the federal government to make outputs from government-funded research publicly accessible without charge or embargo. We endorse these statements, and reproduce them below.

arXiv OSTP memorandum response

April 11, 2023

The recent Office of Science and Technology Policy “Nelson Memorandum” on “Ensuring Free, Immediate, and Equitable Access to Federally Funded Research” [1] is a welcome affirmation of the public right to access government funded research results, including publication of articles describing the research, and the data behind the research. The policy is likely to increase access to new and ongoing research, enable equitable access to the outcome of publicly funded research efforts, and enable and accelerate more research. Improved immediate access to research results may provide significant general social and economic benefits to the public.

Funding agencies can expedite public access to research results through the distribution of electronic preprints of results in open repositories, in particular existing preprint distribution servers such as arXiv [2], bioRxiv [3], and medRxiv [4]. Distribution of preprints of research results enables rapid and free accessibility of the findings worldwide, circumventing publication delays of months, or, in some cases, years. Rapid circulation of research results expedites scientific discourse, shortens the cycle of discovery and accelerates the pace of discovery [5].

Distribution of research findings by preprints, combined with curation of the archive of submissions, provides universal access for both authors and readers in perpetuity. Authors can provide updated versions of the research, including “as accepted,” with the repositories openly tracking the progress of the revision of results through the scientific process. Public access to the corpus of machine readable research manuscripts provides innovative channels for discovery and additional knowledge generation, including links to the data behind the research, open software tools, and supplemental information provided by authors.

Preprint repositories support a growing and innovative ecosystem for discovery and evaluation of research results, including tools for improved accessibility and research summaries. Experiments in open review and crowdsourced commenting can be layered over preprint repositories, providing constructive feedback and alternative models to the increasingly archaic process of anonymous peer review.

Distribution of research results by preprints provides a well-tested path for immediate, free, and equitable access to research results. Preprint archives can support and sustain an open and innovative ecosystem of tools for research discovery and verification, providing a long-term and sustainable approach for open access to publicly funded research.

[1] White House OSTP Public Access Memo

[2] arXiv website

[3] bioRxiv website

[4] medRxiv website

[5] NIH Preprint Pilot


bioRxiv and medRxiv response to the OSTP memo – an open letter to US funding agencies

April 11, 2023

The preprint servers bioRxiv and medRxiv welcome the recent Office of Science and Technology Policy (OSTP) memo advising US government agencies to make publications and data from research funded by US taxpayers publicly accessible immediately, without embargo or cost. This new policy will stimulate research, increase equitability, and generate health, environmental and social benefits not only in the US but all around the world.

Agencies can enable free public access to research results simply by mandating that reports of federally funded research are made available as “preprints” on servers such as arXiv, bioRxiv, medRxiv, and chemRxiv, before being submitted for journal publication. This will ensure that the findings are freely accessible to anyone anywhere in the world. An important additional benefit is the immediate availability of the information, avoiding the long delays associated with evaluation by traditional scientific journals (typically around one year). Scientific inquiry then progresses faster, as has been particularly evident for COVID research during the pandemic.

Prior access mandates in the US and elsewhere have focused on articles published by academic journals. This complicated the issue by making it a question of how to adapt journal revenue streams and led to the emergence of new models based on article-processing charges (APCs). But APCs simply move the access barrier to authors: they are a significant financial obstacle for researchers in fields and communities that lack the funding to pay them. A preprint mandate would achieve universal access for both authors and readers upstream, ensuring the focus remains on providing access to research findings, rather than on how they are selected and filtered.

Mandating public access to preprints rather than articles in academic journals would also future-proof agencies’ access policies. The distinction between peer-reviewed and non-peer-reviewed material is blurring as new approaches make peer review an ongoing process rather than a judgment made at a single point in time. Peer review can be conducted independently of journals through initiatives like Review Commons. And traditional journal-based peer review is changing: for example, eLife, supported by several large funders, peer reviews submitted papers but no longer distinguishes accepted from rejected articles. The author’s “accepted” manuscript that is the focus of so-called Green Open Access policies may therefore no longer exist. Because of such ongoing change, mandating the free availability of preprints would be a straightforward and strategically astute policy for US funding agencies.

A preprint mandate would underscore the fundamental, often overlooked, point that it is the results of research to which the public should have access. The evaluation of that research by journals is part of an ongoing process of assessment that can take place after the results have been made openly available. Preprint mandates from the funders of research would also widen the possibilities for evolution within the system and avoid channeling it towards expensive APC-based publishing models. Furthermore, since articles on preprint servers can be accompanied by supplementary data deposits on the servers themselves or linked to data deposited elsewhere, preprint mandates would also provide mechanisms to accomplish the other important OSTP goal: availability of research data.

Richard Sever and John Inglis
Co-Founders, bioRxiv and medRxiv
Cold Spring Harbor Laboratory, New York, NY 11724

Harlan Krumholz and Joseph Ross
Co-founders, medRxiv
Yale University, New Haven, CT 06520

Tag papers on SocArXiv to create communities of scholarship

Even with the very simple technique of tagging papers, we can facilitate enhanced collaboration among scholars and public sharing of scholarship — and when those two goals are met together, it is to the benefit of both.

When you submit a paper to SocArXiv, you have the opportunity to add tags. (You can also do this after your paper is posted, by going back to Edit Paper.) Those tags are then easily searchable by you or others. For example, if you go to socarxiv.org and type:

tags:("covid-19")

into the search bar, you get this page, with the clunky URL:

https://osf.io/preprints/socarxiv/discover?q=tags%3A(%22covid-19%22)

which lists the 600+ papers that have used the COVID-19 tag. (Unfortunately, the tags aren’t clickable links on SocArXiv paper pages, but you can search for them.)
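
If you generate these links programmatically – for a course page, a departmental site, or a mailing list, say – all you need is to percent-encode the search query. Here is a minimal sketch in Python, assuming nothing beyond the URL pattern shown above; the helper name is ours, not part of any SocArXiv tooling:

    from urllib.parse import quote

    def tag_search_url(tag):
        """Build a SocArXiv tag-search URL like the one above."""
        query = 'tags:("{}")'.format(tag)
        # Keep the parentheses literal so the output matches the URL format above.
        return "https://osf.io/preprints/socarxiv/discover?q=" + quote(query, safe="()")

    print(tag_search_url("covid-19"))
    # -> https://osf.io/preprints/socarxiv/discover?q=tags%3A(%22covid-19%22)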

Scholarship communities

This simple tagging tool allows for relatively spontaneous grouping of scholarship, as when someone says, “We need to organize the recent work on police violence,” and people start uploading and tagging their work. But it facilitates more organized efforts just as well. Just as groups use Twitter hashtags to pull people together, we can do the same thing with scholarship using SocArXiv. Groups that might benefit from this tool include:

  • Working groups on a research topic
  • Panels for an upcoming conference
  • Departments or groups within departments
  • Sections of a professional association
  • Scholar-activist groups

Any such group can simply share the instructions above and notify participants of the associated tag. For example, if you are organizing a workshop or conference, you can make the meeting more productive by encouraging people to post their papers in advance and use a common tag, such as “CairoMeeting2023,” or even “CairoMeeting2023-session-12”. Then you can share the URL for the search on that tag in your preparatory materials or program.

We’re happy to help get you off the ground with your collaborative work. Feel free to contact us.

Mexico City ivermectin updates

Imagen TV screenshot

We posted our decision to withdraw the paper, “Ivermectin and the odds of hospitalization due to COVID-19,” here. Since then there have been new developments. I will update this page if needed. -Philip Cohen

As of February 5.

Statements

  • The lead author on the paper, José Merino, tweeted a link to a letter to me over the names of six of the original seven authors. The letter called the decision to withdraw their paper “unethical, colonialist, and authoritarian,” and demanded my resignation. You can read the statement here.
  • The Secretaría de Salud de la Ciudad de México posted a statement (in Spanish; English translation here), arguing that the use of ivermectin to treat COVID-19 was “supported by the scientific evidence available worldwide in 2020,” before the availability of vaccines, due to its documented effectiveness, low cost, and lack of harmful effects. Distributing the medicine was not an experiment, they wrote. In addition, about SocArXiv, they wrote: “This study was kept on the SocArxiv portal for almost a year, it always had code and data available for replication, and its conclusions are very similar to other works (Ascencio-Montiel et al. 2022).” (Note: that study, which used the same data from the Mexico City COVID-19 health kit distribution, acknowledged that the kits “included, besides an information brochure and a pulse oximeter, medications such as azithromycin, ivermectin, acetaminophen and aspirin”. That study made no claims about the effects of ivermectin itself, and the data do not indicate who took which medicines.)

On withdrawing “Ivermectin and the odds of hospitalization due to COVID-19,” by Merino et al.

4 February 2022

Preamble by Philip N. Cohen, director of SocArXiv

SocArXiv’s steering committee has decided to withdraw the paper, “Ivermectin and the odds of hospitalization due to COVID-19: evidence from a quasi-experimental analysis based on a public intervention in Mexico City,” by Jose Merino, Victor Hugo Borja, Oliva Lopez, José Alfredo Ochoa, Eduardo Clark, Lila Petersen, and Saul Caballero. [10.31235/osf.io/r93g4]

The paper is a report on a program in Mexico City that gave people medical kits when they tested positive for COVID-19, containing, among other things, ivermectin tablets. The conclusion of the paper is, “The study supports ivermectin-based interventions to assuage the effects of the COVID-19 pandemic on the health system.”

The lead author of the paper, José Merino, head of the Digital Agency for Public Innovation (DAPI), a government agency in Mexico City, tweeted about the paper: “Es una GRAN noticia poder validar una política pública que permitió reducir impactos en salud por covid19” (translation: “It is GREAT news to be able to validate a public policy that allowed reducing health impacts from covid19”). The other authors are officials at the Mexican Social Security Institute and the Mexico City Ministry of Health, and employees at the DAPI.

We have written about this paper previously. We wrote, in part:

“Depending on which critique you prefer, the paper is either very poor quality or else deliberately false and misleading. PolitiFact debunked it here, partly based on this factcheck in Portuguese. We do not believe it provides reliable or useful information, and we are disappointed that it has been very popular (downloaded almost 10,000 times so far). … We do not have a policy to remove papers like this from our service, which meet submission criteria when we post them but turn out to be harmful. However, we could develop one, such as a petition process or some other review trigger. This is an open discussion.”

The paper has now been downloaded more than 11,000 times, making it among our most-read papers of the past year. Since we posted that statement, the paper has received more attention. In particular, an article in Animal Politico in Mexico reported that the government of Mexico City has spent hundreds of thousands of dollars on ivermectin, which it still distributes (as of January 2022) to people who test positive for COVID-19. In response, University of California-San Diego sociology professor Juan Pablo Pardo-Guerra posted an appeal to SocArXiv asking us to remove the “deeply problematic and unethical” paper and ban its authors from our platform. The appeal, in a widely shared Twitter thread, argued that the study was unethical: the authors, through their agency dispensing the medication, recruited experimental subjects apparently without informed consent, and they did not declare a conflict of interest, although they are employees of agencies that carried out the policy. The thread was shared or liked by thousands of people. The article and the response to it prompted us to revisit the paper. On February 1, I promised to bring the issue to our Steering Committee for further discussion.

I am not a medical researcher, although I am a social scientist reasonably well-versed in public health research. I won’t provide a scholarly review of research on ivermectin. However, it is clear from the record of authoritative statements by global and national public health agencies that, at present, ivermectin should not be used as a treatment or preventative for COVID-19 outside of carefully controlled clinical studies, which this clearly was not. These are some of those statements, reflecting current guidance as of 3 February 2022.

  • World Health Organization: “We recommend not to use ivermectin, except in the context of a clinical trial.”
  • US Centers for Disease Control and Prevention: “ivermectin has not been proven as a way to prevent or treat COVID-19.”
  • US National Institutes of Health: “There is insufficient evidence for the COVID-19 Treatment Guidelines Panel (the Panel) to recommend either for or against the use of ivermectin for the treatment of COVID-19.”
  • European Medicines Agency: “use of ivermectin for prevention or treatment of COVID-19 cannot currently be recommended outside controlled clinical trials.”
  • US Food and Drug Administration: “The FDA has not authorized or approved ivermectin for use in preventing or treating COVID-19 in humans or animals. … Currently available data do not show ivermectin is effective against COVID-19.”

For reference, the scientific flaws in the paper are enumerated at the links above from PolitiFact, partly based on this factcheck from Estado in Portuguese, which included expert consultation. I also found this thread from Omar Yaxmehen Bello-Chavolla useful.

In light of this review, a program to publicly distribute ivermectin to people infected with COVID-19, outside of a controlled study, seems unethical. The paper is part of such a program, and currently serves as part of its justification.

To summarize, there remains insufficient evidence that ivermectin is effective in treating COVID-19; the study is of minimal scientific value at best; the paper is part of an unethical program by the government of Mexico City to dispense hundreds of thousands of doses of an inappropriate medication to people who were sick with COVID-19, which possibly continues to the present; the authors of the paper have promoted it as evidence that their medical intervention is effective. This review is intended to help the SocArXiv Steering Committee reach a decision on the request to remove the paper (we set aside the question of banning the authors from future submissions, which is reserved for people who repeatedly violate our rules). The statement below followed from this review.

SocArXiv Steering Committee statement on withdrawing the paper by Merino et al. (10.31235/osf.io/r93g4).

This is the first time we have used our prerogative as service administrators to withdraw a paper from SocArXiv. Although we reject many papers, according to our moderation policy, we don’t have a policy for unilaterally withdrawing papers after they have been posted. We don’t want to make policy around a single case, but we do want to respond to this situation.

We are withdrawing the paper, and replacing it with a “tombstone” that includes the paper’s metadata. We are doing this to prevent the paper from causing additional harm, and taking this incident as an impetus to develop a more comprehensive policy for future situations. The metadata will serve as a reference for people who follow citations to the paper to our site.

Our grounds for this decision are several:

  1. The paper is spreading misinformation, promoting an unproved medical treatment in the midst of a global pandemic.
  2. The paper is part of, and justification for, a government program that unethically dispenses (or did dispense) unproven medication apparently without proper consent or appropriate ethical protections according to the standards of human subjects research.
  3. The paper is medical research – purporting to study the effects of a medication on a disease outcome – and is not properly within the subject scope of SocArXiv.
  4. The authors did not properly disclose their conflicts of interest.

We appreciate that of the thousands of papers we have accepted and now host on our platform, there may be others that have serious flaws as well.

We are taking this unprecedented action because this particular bad paper appears to be more important, and therefore potentially more harmful, than other flawed work. In administering SocArXiv, we generally err on the side of inclusivity, and do not provide peer review or substantive vetting of the papers we host. Taking such an approach suits us philosophically, and also practically, since we don’t have staff to review every paper fully. But this approach comes with the responsibility to respond when something truly harmful gets through. In light of demonstrable harms like those associated with this paper, and in response to a community groundswell beseeching us to act, we are withdrawing this paper.

We reiterate that our moderation process does not involve peer review, or substantive evaluation, of the research papers that we host. Our moderation policy confirms only that papers are (1) scholarly, (2) in research areas that we support, (3) plausibly categorized, (4) correctly attributed, (5) in languages that we moderate, and (6) in text-searchable formats. Posting a paper on SocArXiv is not in itself an indication of good quality – but it is often a sign that researchers are acting in good faith and practicing open scholarship for the public good. We urge readers to consider this incident in the context of the greater good that open science and preprints in general, and our service in particular, do for researchers and the communities they serve.

We welcome comments and suggestions from readers, researchers, and the public. Feel free to email us at socarxiv@gmail.com, or contact us on Twitter or Facebook.

When SocArXiv gets bad papers

Detail from AI-generated art using the prompt “bad paper” with Wombo.

Two recent incidents at SocArXiv prompted the Steering Committee to offer some comment on our process and its outcomes.

Ivermectin research

On May 4, 2021, our moderators accepted a paper titled “Ivermectin and the odds of hospitalization due to COVID-19: evidence from a quasi-experimental analysis based on a public intervention in Mexico City,” by a group of authors from the Mexican Social Security Institute, the Ministry of Health in Mexico City, and the Digital Agency for Public Innovation in Mexico City. The paper reports on a “quasi-experimental” analysis purporting to find “significant reduction in hospitalizations among [COVID-19] patients who received [a] ivermectin-based medical kit” in Mexico City. The paper is a “preprint” insofar as it was not peer reviewed or published in a peer-reviewed journal at the time it was submitted; but because it has not subsequently been published in such a venue, it is really just a “paper.” (We call all the papers on SocArXiv “papers,” and let authors describe their status themselves, either on the title page or by linking to a version published somewhere else.)

Depending on which critique you prefer, the paper is either very poor quality or else deliberately false and misleading. PolitiFact debunked it here, partly based on this factcheck in Portuguese. We do not believe it provides reliable or useful information, and we are disappointed that it has been very popular (downloaded almost 10,000 times so far).

This has prompted us to clarify that our moderation process does not involve peer review, or substantive evaluation, of the research papers that we host. From our Frequently Asked Questions page:

Papers are moderated before they appear on SocArXiv, a process we expect to take less than two days. Our policy involves a six-point checklist, confirming that papers are (1) scholarly, (2) in research areas that we support, (3) plausibly categorized, (4) correctly attributed, (5) in languages that we moderate, and (6) in text-searchable formats (such as PDF or docx). In addition, we seek to accept only papers that authors have the right to share, although we do not check copyrights in the moderation process. For details, view the moderation policy.

Posting a paper on SocArXiv is not in itself an indication of good quality. We host many papers of top quality – and their inclusion in SocArXiv is a measure of good practice. But there are bad papers as well, and the system does not explicitly differentiate them for readers. In addition to not verifying the quality of the papers we host, we also don’t evaluate the supporting materials authors provide. In the case of the ivermectin paper, the authors declared that their data is publicly available with a link to a Google sheet (as well as a Github repository that is no longer available). They also declared no conflict of interest.

We do not have a policy to remove papers like this from our service, which meet submission criteria when we post them but turn out to be harmful. However, we could develop one, such as a petition process or some other review trigger. This is an open discussion.

Fraudulent papers

To our knowledge, the ivermectin paper is not fraudulent. However, we do not verify the identities of authors who submit papers. The submitting author must have an account on the Open Science Framework, our host platform, but getting an OSF account just requires a working email address. OSF users can enter ORCID or social media account handles on their profiles, but to our knowledge these are not verified by OSF. OSF does allow logins with ORCID or institutional identities, but as moderators at SocArXiv we don’t have a way of knowing how a user has created their account or logged in. Our submission process requires authors to affirm that they have permission to post the paper, but we don’t independently verify the connections between authors.

In short, both OSF and SocArXiv are vulnerable to people posting work that is not their own, or using fake identities. The unvarnished truth is that we don’t have the resources of the government, the coercive power of an employer, or the capital of a big company necessary to police this issue.

Recently, someone posted one fraudulent paper on SocArXiv, and attempted to post another, before we detected the fraud in our moderation process. The papers submitted listed a common author, but different (apparently) fake co-authors. In one case, we contacted the listed co-author (a real person) who confirmed that they were not aware of the paper and had not consented to its being posted. With a little research, we found papers under the name of this author at SSRN, ResearchGate, arXiv, and Paperswithcode, which also seem to be fake. (We reported this to the administrators of OSF, who deleted the related accounts.)

It did not appear that these papers had any important content, but rather just existed to be papers, maybe to establish someone’s fake identity, test AI algorithms or security systems, or whatever. Their existence doesn’t hurt real researchers much, but they could be part of either a specific plan that would be more harmful, or a general degradation of the research communication ecosystem.

With regard to this kind of fraud, we do not have a consistently applied defense in our moderation workflow. If we suspect foul play, we poke around, reject the papers, and report it if we find something bad. But, again, we don’t have the resources to fully prevent this from happening. However, we are developing a new policy that will require all papers to have at least one author linked to a real ORCID account. Although this will add time to the moderation of each paper (since OSF does not attach ORCIDs to specific papers), we plan to experiment with this approach to see if it helps without adding too much time and effort. (As always, we are looking for more volunteer moderators — just contact us!)
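
For the curious, here is a minimal sketch of what such a check could look like, using ORCID’s public API and the ISO 7064 MOD 11-2 check digit that ORCID iDs carry. This is an illustration under our own assumptions, not a description of OSF’s or SocArXiv’s actual tooling, and the helper names are hypothetical:

    import re

    import requests  # assumed available; any HTTP client would do

    def orcid_checksum_ok(orcid):
        """Check the format and MOD 11-2 check digit of an ORCID iD."""
        digits = orcid.replace("-", "")
        if not re.fullmatch(r"\d{15}[\dX]", digits):
            return False
        total = 0
        for ch in digits[:-1]:
            total = (total + int(ch)) * 2
        result = (12 - total % 11) % 11
        expected = "X" if result == 10 else str(result)
        return digits[-1] == expected

    def orcid_record_exists(orcid):
        """Ask ORCID's public API whether the iD resolves to a record."""
        r = requests.get(
            "https://pub.orcid.org/v3.0/{}/record".format(orcid),
            headers={"Accept": "application/json"},
            timeout=10,
        )
        return r.status_code == 200

    # Example, using ORCID's well-known test record:
    # orcid_checksum_ok("0000-0002-1825-0097") -> True

Passing a check like this shows only that a well-formed ORCID record exists – a speed bump for scammers, not identity verification.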

User responses

We do offer several ways for readers to communicate to us and to each other about the quality of papers on our system. Readers may annotate or comment on papers using the Hypothesis tool, or they may endorse papers using the Plaudit button. (Both of these are free with registration, using ORCID for identification.) If you read a paper you believe is good, just click the Plaudit button — that will tell future readers that you have endorsed it. Neither of these tools generates automatic notifications to SocArXiv or to the authors, however — they just communicate to the next reader. If you see something that you suspect is fraudulent or harmful, feel free to email us directly at socarxiv@gmail.com.

We encourage readers to take advantage of these affordances. And we are open to suggestions.

Using SocArXiv to improve the impact of your conference paper

When you upload your conference paper, you give your audience the opportunity to engage with your work more seriously: read the paper, study the research materials you attach to it, and cite it — giving you formal precedence for your work and increasing its reach and impact. Later, if you publish it in a journal or some other venue, you can post a new version and people using the link will automatically be directed to the latest version (and see a link to the journal version, if there is one).

In addition, for the conference itself, you can use tags when you upload your paper to create communities of scholarship. Give it the ASA2021 tag for the American Sociological Association conference, for example. Then people can browse all the openly uploaded conference papers as they prepare their schedules, at this link: https://osf.io/preprints/socarxiv/discover?q=tags%3A(%22ASA2021%22).

Or, get the members of your panel to all use a tag like ASA2021-101 (for session 101, e.g.), and give out this URL for a link to all the papers: https://osf.io/preprints/socarxiv/discover?q=tags%3A(%22ASA2021-101%22).

(To make your own tag link, just go to SocArXiv.org, enter tags:("your tag") in the search bar, and copy the URL on the results page.)
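
If you are coordinating many sessions, you can also generate all the session links at once rather than copying URLs by hand. Here is a minimal sketch in Python, using the same URL pattern as above (the conference tag and session numbers are hypothetical):

    from urllib.parse import quote

    BASE = "https://osf.io/preprints/socarxiv/discover?q="

    # One tag per session, e.g., ASA2021-101 for session 101.
    for session in (101, 102, 103):
        query = 'tags:("ASA2021-{}")'.format(session)
        print(session, BASE + quote(query, safe="()"))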

If you give out a link directly to your paper, or a tag for your panel session, before the conference, you encourage a deeper level of engagement during your session, and you signal your embrace of transparent and accountable social science. (You can also upload your slides in the associated project if you want to share those.) Then, share a link directly to your paper, or to all the papers, at the session itself.

Scholarship communities

Beyond one conference, this simple tagging allows for relatively spontaneous grouping of scholarship, as when someone says, “We need to organize the recent work on police violence,” and people start uploading and tagging their work. But it facilitates more organized efforts just as well. Just as groups use Twitter hashtags to pull people together, we can do the same thing with scholarship using SocArXiv. Groups that might benefit from this tool include:

  • Working groups on a research topic
  • Panels for an upcoming conference
  • Departments or groups within departments
  • Sections of the American Sociological Association or others
  • Scholar-activist groups

Any such group can simply share the instructions above and notify participants of the associated tag. The link to the tag will always generate a web page listing the associated papers.

This simple functionality is already very powerful, but we are always looking for ways to improve it and offer more options. People trying it out now will help with this development process. We hope you’ll try it out.

Don’t wait for your association to act

Yes, it would be better if the American Sociological Association (or other lagging associations) would provide SocArXiv’s level of service to conference participants, with archiving, DOIs, permanent links, versioning, commenting, and supporting materials. But you don’t have to wait for them to catch up. We provide this for free thanks to support from the University of Maryland Libraries, the nonprofit Center for Open Science, and the volunteers who work on our service.