In a nutshell

Hypothesis is a new effort to implement an old idea: a conversation layer over the entire web that works everywhere, without needing implementation by any underlying site. Our team creates open source software, pushes for standards, and fosters community. Using annotation, we enable sentence-level note taking or critique on top of news, blogs, scientific articles, books, terms of service, ballot initiatives, legislation, and more. Everything we build is guided by our principles: that it be open, neutral, and lasting. We are a non-profit organization funded through the generosity of sponsors such as the Knight, Mellon, Shuttleworth, Sloan, Helmsley, and Omidyar foundations. Our efforts are based on the annotation standards for digital documents developed by the W3C Web Annotation Working Group. We are partnering broadly with developers, publishers, academic institutions, researchers, journalists, and individuals to develop a platform for the next generation of read-write web applications.
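The W3C Web Annotation Data Model mentioned above represents each annotation as a JSON-LD document that anchors a body (the note) to a target (the annotated passage). A minimal sketch of such a document follows; the target URL and quoted text are illustrative placeholders, not values from any real annotation:

```python
import json

# Minimal annotation per the W3C Web Annotation Data Model (JSON-LD).
# The source URL and the "exact" quote are hypothetical examples.
annotation = {
    "@context": "http://www.w3.org/ns/anno.jsonld",
    "type": "Annotation",
    "body": {
        # The note itself, as plain text.
        "type": "TextualBody",
        "value": "This claim needs a citation.",
        "format": "text/plain",
    },
    "target": {
        # The document being annotated and the selected passage within it.
        "source": "https://example.com/article",
        "selector": {
            "type": "TextQuoteSelector",
            "exact": "the passage being annotated",
        },
    },
}

print(json.dumps(annotation, indent=2))
```

The `TextQuoteSelector` anchors the annotation to the quoted text rather than a fixed position, which is what lets annotations survive edits to the underlying page.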

Overview
Goals and intentions

Open annotation enables a conversation across web-based content through the creation of private, group, or public in-line annotations. Integration with manuscript submission systems can enable editors, reviewers, and authors to engage in all types of peer review (from traditional single-blind or double-blind to open) in either pre-publication or post-publication form. The tool can also be used outside of a manuscript submission system by publications that wish to undertake peer review experiments. One example is In Review, through which BioMed Central and Research Square offer authors the ability to opt into community feedback in parallel with traditional peer review. Feedback on preprints is an ideal use case for providing suggestions (public or in private groups) on early versions of articles. Because it works anywhere on the web, open annotation is an ideal tool to enable overlay journals. It can also raise the visibility of peer review reports posted as supplementary materials.

What is reviewed
  • Review of data or code

    Yes

  • Manuscript hosting

    No

Video
Review features
  • Notes

    -Works with either pre-publication or post-publication peer review

    -Enables open peer review

    -Enables traditional peer review either within manuscript submission system integrations or as a standalone tool. Publishers can assign reviewer accounts to preserve reviewer anonymity.

    -Can be used to post peer review reports or summaries

    -Can be used to provide feedback on content anywhere on the web in HTML, PDF, EPUB, or data formats.

  • Eligible reviewers/editors

    This depends on the site integration (publisher, preprint service, etc.). They may select an Open Group (world-readable, world-writeable) or a Restricted Group (world-readable, but writeable only by those indicated by the group owner).

  • Tags or badges

    Yes

  • Criteria for inclusion

    Anyone can use the tool through public or private groups. Should a publisher wish to host its own branded and moderated layer, there is a cost. Please contact us for details.

  • Explanation of cost

    We offer a “freemium” service for those wanting to sign up and use the basic annotation tool. If publishers wish to host their own branded and moderated annotation layer, there is a charge, as there is for integrating with SSO accounts. We also offer commercial products and services, including SaaS hosting, software development, and consulting for organizations and enterprises wanting to implement a more robust digital annotation solution. We charge an annual SaaS hosting fee covering 12 months of service, using the number of documents added per year as a proxy for publisher size. Publishers can deploy across all content or specify content types or specific journals, for example. Deployment extends back to volume 1, issue 1, or to the earliest copyright year for books. Pricing for software development and consulting services is negotiable, based on the deliverables and resource requirements of the proposed scope of work. For more information on pricing, contact Heather Staines, Director of Partnerships, at heather@hypothes.is.

Results
  • Number of scholarly outputs commented on

    10,000+

  • Metrics

    We use Metabase, a data analytics tool, to track usage and participation in Hypothesis on an aggregated basis, by sector, and by customer, including the total # of annotations, # of active annotators, # of new registered users, and # of new groups created. We also generate reports for customers that include information on publicly visible annotations (annotator, date, content annotated, text selection, annotation content) as well as private and group annotations (date, content annotated). We can also provide information on the top public annotators, top annotated documents, and more. We communicate regularly with our partners to see how they are using the tool and to solicit feedback for improvements and new features.

  • Results summary

    Hypothesis has integrations with all major hosting platforms, but an integration is not necessary for end users to use the tool. Annotated content (4.8M annotations as of 2/28/2019) includes both scholarly content and mainstream web content.
