Science Publishing DAO

From DAO Governance Wiki

Problems with Publishing

"Publish or Perish"[edit | edit source]

Big Science Publishing not only owns the journals that disseminate scientific knowledge but also controls the workflow of science itself (Figure 1)[1]. Publishers manage the reputation system that determines a scientist’s career, impact, and funding potential, and they own the overwhelming majority of scientific copyrights. Any scientist must work within their system – submitting the results of years of labor, passion, and thought – simply to survive, let alone have any impact.

However, science is meant to be decentralized and distributed. Decentralization and distribution are not only important as a matter of principle – more people having access to science – but are required for any successful scientific endeavor. It is no secret that for findings to be truly universal, they must be verified and replicated by as many people as possible across geographic, governmental, and institutional lines. A decentralized community of researchers seeking deeper truths is necessary for scientific knowledge to flourish.

Figure 1: Basic sketch of the status quo in academic publishing post-internet. Publishers have outsized control of the ecosystem relative to the value they offer (digital marketing), which has inadvertent negative effects on the efficacy and health of the Scientific Commons.

Boycotts

The current system is compromising the scientific method as we know it. This compromise is felt within science: numerous Nobel Laureates and Fields Medalists have used their positions to boycott the state of publishing and the use of bibliometrics. One example is the Cost of Knowledge[2] protest against Elsevier, the largest for-profit academic publisher in the world, initiated in 2012 by Fields Medalist Timothy Gowers of the University of Cambridge. Over 80% of the nearly 16,000 signatories pledged not to publish in Elsevier journals. However, four years after the launch of the Cost of Knowledge, only 38% of the original signatories were still maintaining their pledge, an indication of how difficult it is for even established scientists and institutions to reform the system. Despite the difficulties, attempts to combat the problems with Big Publishing continue: in 2019, the entire University of California (UC) system cancelled its subscriptions to Elsevier, followed by MIT, the University of North Carolina, and the State University of New York (SUNY) system. This push-and-pull is characteristic of the current landscape. Many scientists and research institutions want reform but cannot maintain their momentum against the forces of Big Publishing. In the meantime, the quality of research has declined, public trust in science has faltered, and scientific careers are full of painful moral compromises simply to exist. In its current state, the primary winners in the ecosystem are the publishers and university administrators.

The major ways that publishing companies exert a negative influence are:

* Paywalls – subscription restrictions that require payment for access, blocking readers from papers that the publishers themselves did not produce.

* Poor management of the reputation structure via inappropriate use of indicators such as the journal impact factor (IF) and the h-index.

The problem with paywalls has been widely recognized and has given rise to the Open Science Movement (Figure 2), which includes the Open Access Movement and the Guerilla Open Access Movement. Projects such as arXiv (a preprint repository), PLOS One (a major Open Access journal), the Allen Institute (a privately funded Open Science center), and Sci-Hub (pirated access to journals) are examples of non-traditional ways of publishing and accessing knowledge.

Government Involvement

Governments around the globe have also been funding Open Access programs such as SPARC, MIT’s Center for Collective Intelligence (MITCCI), the Center for Open Science (COS), Open Research Europe (ORE), and the Research Center for Open Science and Data Platform (RCOS). The US Federal Research Public Access Act of 2006 proposed requiring any US federal agency with a research budget greater than $100M to create an open-access repository – a provision that would apply to 11 US agencies, including NASA and the EPA. With greater and greater frequency, universities are mandating that their researchers publish exclusively in Open Access journals, some of which are government funded. These initiatives incite pushback from the commercial sector, which has backed legislation such as SOPA, the PROTECT IP Act, and the Research Works Act while sidestepping the “Open Access” political barrage by using established brands to create high-profit, high-profile Open Access journals.

ORE was started and continues to be funded by the European Commission (EC). On behalf of ORE, the EC contractually pays F1000, a publisher, a flat fee of 780 EUR per paper to market ORE-published papers on F1000’s platform. By contrast, publishing in Open Access journals can cost upwards of $10k per paper[3], with fees more typically running up to $8k. This focus on Open Science/Access, though romantic, has from a macro perspective inadvertently hurt not-for-profit professional societies and shifted costs from readers to authors while barely denting the wallets or position of Big Publishing.

Decentralized Science Movement

Since the emergence of decentralized autonomous organizations (DAOs) in 2016, there have been several attempts to address the problems of Open Science. A project called Open Access DAO recently launched (11/2021) with the goal of crowdsourcing funds to buy journals and quickly make their contents open access. Most of these blockchain-based solutions address the paper-access issue, but very rarely do any of the current projects address the governance or the reputation infrastructure of academic science. These neglected issues are arguably more responsible for the state of science today. Previous attempts to change the reputation and governance metrics via gamification have failed (see the Competitive/Collaborative DeSci Landscape section).

Impacts of Paper Pirating

Thus far, the pirating site Sci-Hub (launched 2011), from the Guerilla Open Access Movement, has been the most effective tool of the Open Science Movement. Sci-Hub is heavily used by scientific communities in developing countries such as China, India, Iran, and Kazakhstan to bypass untenable paywalls. Elsevier, the American Chemical Society, Wiley, and others have filed lawsuits against Sci-Hub and, since 2015, have gotten the site blocked in many countries, including the US and much of the West. However, as long as one country maintains the site, anyone with an internet connection can access it and its repository – something made even easier with VPN technology. In 2016, Sci-Hub founder Alexandra Elbakyan was listed #6 in “Nature’s 10” – Springer Nature’s annual list of ten “people who mattered” in science. The most effective solution for the Open Access Movement so far is an illegal one; we must do better.

Biblio- and Scientometrics

In the highly competitive world of academic research, scientists are forced to play the publishers’ game to survive as scientists. The h-index and the impact factor (IF) of the journals one publishes in are key elements of a scientist’s CV. The result is that researchers learn to game the metrics, maximizing citations by getting published in major journals. This in turn has led to collusion and questionable practices: citation rings (professors agreeing to cite each other’s papers as much as possible), p-hacking, paper mills, and so on.
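
Both indicators reduce a body of work to a single citation-derived number, which is precisely what makes them easy to game. Purely as a point of reference (this sketch is not part of the SPD paper), the standard definitions can be computed as follows:

```python
def h_index(citations_per_paper):
    """h-index: the largest h such that h papers have at least h citations each."""
    counts = sorted(citations_per_paper, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h


def impact_factor(citations_this_year, citable_items_prev_two_years):
    """Two-year journal impact factor: citations received this year to items
    published in the previous two years, divided by the number of those items."""
    return citations_this_year / citable_items_prev_two_years


# A researcher whose papers were cited [25, 18, 9, 6, 3, 1] times has h = 4.
print(h_index([25, 18, 9, 6, 3, 1]))   # -> 4

# A journal whose 200 articles from the previous two years drew 1,000 citations
# this year has an impact factor of 5.0.
print(impact_factor(1000, 200))        # -> 5.0
```

Because both numbers grow only with citations to published, positive-result papers, any behavior that inflates citation counts (citation rings, overselling, salami slicing) inflates the metric.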

Bibliometrics is the field of metrics used to analyze publications and their properties; in science, it is a discipline in and of itself. Neither the IF nor the h-index is respected as a metric by the bibliometrics community or, at this point, by most of the scientific community, yet these “invalid indicators” continue to be used. At present, university presidents and managers are generally more concerned with “branding” and the endless search for funding than with adhering to academic principles and values. Attracting students and making money are no small tasks, but keeping up with the rhetoric surrounding the globalization of the university market has kept administrators (non-scientists) busy and generally less attentive to the nuances of the metrics used to evaluate and hire their STEM professors. Administrators are using the culture’s current obsession with rankings and reductive single composite figures – simple for prospective students to digest – to steer the perceived meaning of these invalid indicators toward a preferred narrative. However, though bureaucrats bear some of the blame, the proliferation of the h-index has largely been a grassroots phenomenon. A deeper psychosocial analysis involves looking at why the prestige economy is perpetuated by scientists themselves, and at providing the right tools to replace it.

Publishing Incentives

High-end journals like Nature favor “illuminating, unexpected, surprising” positive results, as they state outright in their submission requirements.[4] The decentralized and distributed nature of science suffers when it depends on this unfortunate yet understandable for-profit, centralized, marketing-focused, news-media internet business model. Data and interpretation sharing amongst scientists is essential for scientific work, for education, and for the democratization of knowledge. However, “illuminating, unexpected, surprising” positive results account for only a very small percentage of usable research. Most hypotheses tested in research will yield negative results or will start with single observations; at present, these account for only ~10% of all published works. A narrative can be constructed only after a string of research has been completed, but to get published the story must be “big enough.” In order to get published, scientists will sit on research for years even after it is complete, spending their time crafting an exciting narrative, or they will oversell the significance of their research. We may be familiar with such narratives from clickbait articles such as “Studies show chocolate can help you lose weight,” but the problem affects all scientific disciplines, even harder sciences such as physics.

This hyperfocus on validated, positive results is a serious problem with the current publishing system: single-observation papers, negative-results papers, and replication papers provide important context, but they are not profitable – they are not attention-grabbing, and they are hard to accommodate within the publication volumes allowed by current “quality” standards. Single-observation studies are much more easily checked and distributed. Negative results provide important contextual information about a body of work. These types of research are essential players in the scientific research ecosystem, but they are effectively discouraged by the nature of the system. By using a reputation structure that incentivizes these types of research, the Scientific Publishing DAO will provide avenues for affordability, functionality, and replicability, improving public trust in scientific research and the health of falsifiable disciplines in general.
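
This section of the paper does not spell out that reputation structure, so the sketch below is purely hypothetical: the contribution categories and weights are assumptions invented for illustration, showing how a scoring rule could give negative results, replications, and single observations standing instead of filtering them out.

```python
# Hypothetical illustration only: the SPD paper has not (in this section) defined
# its reputation mechanism. The categories and weights below are assumptions made
# up for this sketch.

REPUTATION_WEIGHTS = {           # assumed values, not an SPD specification
    "positive_result": 1.0,
    "negative_result": 1.0,      # valued equally: negative results give context
    "replication": 1.2,          # small premium to encourage verification work
    "single_observation": 0.8,
}


def reputation_score(contributions):
    """Sum weighted credit over a researcher's contributions.

    `contributions` is a list of (category, review_quality) pairs, where
    review_quality is a 0-1 score assigned by peer reviewers rather than a
    citation count.
    """
    return sum(REPUTATION_WEIGHTS[category] * quality
               for category, quality in contributions)


# Under such a rule, a record built on replications and negative results is no
# longer at a structural disadvantage.
record = [("replication", 0.9), ("negative_result", 0.8), ("positive_result", 0.95)]
print(round(reputation_score(record), 2))   # -> 2.83
```

A real design would also have to define who assigns the review-quality scores and how the weights are governed; the point of the sketch is only that such a scoring rule, unlike citation-based metrics, need not privilege “surprising” positive results.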

Notes and references

  1. Bjorn Brembs, “Replacing Academic Journals,” September 2021. Available at https://doi.org/10.5281/zenodo.5564003 (retrieved 20 April 2023).
  2. The Cost of Knowledge: see Wikipedia (https://en.wikipedia.org/wiki/The_Cost_of_Knowledge) and the organization’s website (http://thecostofknowledge.com/).
  3. Open Research Europe: https://open-research-europe.ec.europa.eu/
  4. Springer Nature’s editorial criteria and processes page: https://www.nature.com/nature/for-authors/editorial-criteria-and-processes