Designed to replace journals and papers as the place to establish priority and record your work in full detail, Octopus is free to use and publishes all kinds of scientific work, whether it is a hypothesis, a method, data, an analysis or a peer review.
Publication is instant. Peer review happens openly. All work can be reviewed and rated.
Your personal page records everything you do and how it is rated by your peers.
Octopus puts authors in control of what they publish. There are no gatekeepers to the research record. But anyone logged into Octopus can rate publications, review them or red flag them if they have serious concerns.
Readers can see how many people, and who, have rated a publication and use these as cues of quality. They can also read reviews. This open peer review process allows a much more transparent scrutiny and evaluation of work than the current system.
Reviews are treated as original publications in their own right and can themselves be rated. This incentivizes insightful, collaborative critiquing.
Octopus encourages meritocracy, collaboration and a fast and effective scientific process.
Currently a demo version of Octopus is available at: https://octopuspublishing.org
Goals and intentions
Octopus is designed to replace journals and papers as the primary research record. The traditional system is slow and expensive, and the concept of "papers" is not a good way of disseminating scientific work in the 21st century. By forcing people to share their work only when they reach the end of what can be a very long research process, it slows the spread of scientific knowledge and encourages "questionable research practices" as researchers strive to produce the neat, compelling narratives that get work widely read. Good science isn't necessarily a good story. Good science can be the careful collection of a small amount of data, the careful analysis of data collected by someone else, or a good hypothesis (regardless of whether data later supports it).
Publishing in Octopus is free, fast, and fair. Why hold on to a hypothesis? Publish it now and establish priority – once it's out in Octopus it's yours. Why hold on to your data? Publish it now and, regardless of what analyses are done by you or others later, the credit for that data is yours.
Just like work put on preprint servers, publishing in Octopus doesn't stop you publishing an old-fashioned paper later.
Octopus has two systems by which readers can evaluate publications:
Rating. Every publication in Octopus can be rated by logged-in readers (i.e. people with an ORCiD). Each type of publication has 3 pre-defined criteria on which readers can rate it. These allow us as a scientific community to define what we consider "good science", and allow authors to get truly meritocratic feedback and reward for their work.
Every publication (including reviews) can be rated by others, and these will add to an author's individual page which is available for all individuals, institutions and funding bodies to see. Publishing quickly and well, and good collaborative reviewing is therefore rewarded.
It will be possible to see who has rated whom, to shed light on any poor practices.
Reviewing. In Octopus, reviews are a publication type in their own right, which any author can write, linking them to any other publication within Octopus. A review author does not have to use the 'ratings' system as well (although many probably will). Like all other publications, authorship of a review will be open, and the review will appear automatically on the author's individual page as part of their work record.
As mentioned above, reviews (like any other type of publication) can be rated by other readers.
Additionally, any publication can be "red flagged" if a reader has serious concerns. The flag will be visible to other readers and will alert the authors so that they can resolve any issues.
Review of code or data
Anyone can read anything on Octopus. Those logged in with an ORCiD can write and rate publications.
Tags or badges
Criteria for inclusion
The rating system will be a type of tag, showing the mean score of a publication on the three pre-defined criteria associated with that type of publication (including reviews).
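As a minimal sketch of how such a tag might be computed, the snippet below averages per-reader scores across a publication's three criteria. The criterion names and score values here are hypothetical illustrations, not Octopus's actual data model or API:

```python
from statistics import mean

# Hypothetical criteria for one publication type; Octopus's real
# pre-defined criteria may differ.
CRITERIA = ["Well defined", "Original", "Scientifically valid"]

def mean_scores(ratings):
    """Given one rating dict per logged-in reader (criterion -> score),
    return the mean score per criterion, rounded to two decimals."""
    return {c: round(mean(r[c] for r in ratings), 2) for c in CRITERIA}

# Two readers rate the same publication on all three criteria.
readers = [
    {"Well defined": 4, "Original": 5, "Scientifically valid": 3},
    {"Well defined": 5, "Original": 4, "Scientifically valid": 4},
]
print(mean_scores(readers))
# → {'Well defined': 4.5, 'Original': 4.5, 'Scientifically valid': 3.5}
```

The displayed tag would then simply render these three means alongside the number of raters.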