O'Peer is a "toy" demo of a cybernetic, democratic approach to Open Peer Review. It assumes all submissions are already freely available on the Internet; it recognizes (but may or may not reveal) the identity of every author/reviewer; and it (eventually) accumulates a "credibility index" for every registered participant by keeping a running average of their evaluations by others, weighted by the evaluators' credibility in the aspects evaluated.
The original idea (insofar as it IS original) is depicted in a presentation at
http://opeer.org/oPeer-foo.pdf
The possible extension of the concept to broader societal impact is discussed at
http://jick.ca/?cat=3
and at https://medium.com/predict/trust-2c9841f33994
The entire process is managed cybernetically as a database of submissions, reviews and the "credibility indices" of reviewers. All authors and referees are free to either identify themselves publicly or remain anonymous to readers, but the computer knows who they are. This is necessary in order to accumulate a "credibility index" for every author/reviewer based on others' evaluations of their submissions, reviews and comments. The algorithms will have to be refined and protected constantly to discourage "gaming" of the system.
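The credibility-weighted running average described above could be sketched in a few lines of code. This is only an illustration of the idea, not the demo site's actual implementation; the names (Participant, record_evaluation) and the 0-to-1 score scale are assumptions invented for the example.

```python
# Hypothetical sketch of the credibility-index update: each evaluation of a
# participant is folded into a running average, weighted by the evaluator's
# own (current) credibility. Names and score scale are illustrative only.

from dataclasses import dataclass, field

@dataclass
class Participant:
    name: str
    credibility: float = 0.5            # start everyone at a neutral midpoint
    _weight_sum: float = field(default=0.0, repr=False)
    _score_sum: float = field(default=0.0, repr=False)

def record_evaluation(target: Participant, evaluator: Participant,
                      score: float) -> None:
    """Fold one evaluation (score in [0, 1]) into the target's running
    average, weighted by the evaluator's credibility at evaluation time."""
    target._score_sum += evaluator.credibility * score
    target._weight_sum += evaluator.credibility
    target.credibility = target._score_sum / target._weight_sum

# A highly credible reviewer's score moves the average more than a novice's.
alice = Participant("alice")
expert = Participant("expert", credibility=0.9)
novice = Participant("novice", credibility=0.1)
record_evaluation(alice, expert, 0.8)
record_evaluation(alice, novice, 0.2)
print(round(alice.credibility, 2))  # weighted toward the expert's 0.8
```

A real system would store these sums in the database and, as noted above, would need hardening against collusion and other gaming of the weights.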
All submissions and reviews will be open access on the Internet. Rather than "tags and badges", each registered participant will, over time, accumulate detailed "credibility indices" based on others' evaluations of their submissions, reviews and comments.
So far almost no one has used my "toy" demo site -- luckily for me, as I'm running it on my own server for free! It's meant merely to illustrate how this might be done on a grand scale if the idea catches on. With luck, someone will "steal" my rudimentary implementation and turn it into a game-changing service. Whether they make money from it or get the government to fund it is of no concern to me.