What this registry is, and what it is not
We are living through a profound shift in how written content is produced. AI tools can now generate journalism, academic papers, blog posts, and social media content that is statistically indistinguishable from human writing. This is not a future problem. It is happening now, at major newspapers, at fact-checking organisations, at universities, in newsletters, and on social media. Award-winning publications and ethically rigorous institutions have been found using AI to draft content without disclosure.
In this environment, institutional affiliation and prestige are no longer reliable evidence of human authorship. A journalist at a respected outlet, an academic at a leading university, a blogger with a long track record, any of these may or may not be writing their own work. The question has to be asked about the work and the voice, not about the badge on the masthead.
The HumanProof Registry exists to answer that question honestly. We are a public directory of works, authors, and publications where we have found specific, assessable evidence that a human being was responsible for the creative and editorial decisions, the ideas, the voice, the argument, the judgment about what stays and what goes.
What “human-directed authorship” means to us: A human made the substantive creative and intellectual decisions in the work. Tools may have assisted (spell-checkers, grammar tools, research aids, translation software), but a human being shaped the content. The thinking is human. The voice is human. The responsibility is human.
What we can claim: That at the time of review, we found specific evidence supporting a reasonable inference of human authorship, and that our editors assessed and agreed with that inference.
What we cannot claim: That any work was produced without any software assistance whatsoever. That our assessment is permanent and cannot be revised. That absence from this registry means a work is AI-generated. The registry documents the affirmative case for human authorship — it does not adjudicate everything else.
The honest caveat: Inclusion in the HumanProof Registry means we found positive evidence of human authorship and our editors were satisfied by that evidence. It does not constitute a guarantee. We do not rely on AI detection tools as primary evidence, because they are unreliable and routinely produce false positives against human writing. If credible evidence emerges that a listed work involved undisclosed AI generation, we investigate and may remove the entry.
The three units of the registry
The registry contains three distinct types of entries. Each is clearly labelled and has different evidence requirements.
Works
A Work is a specific, individual piece of writing: an article, essay, blog post, newsletter issue, social media thread, academic paper, or book. This is the most precise unit of the registry. When a Work is listed, we are making a claim about that specific text, not about the author’s entire output or the publication’s general standards.
Works are listed with: title, author name, publication or platform, date, URL or reference, content type, and the evidence pathway that justified inclusion. Every Work entry displays which evidence pillars supported its inclusion.
For Works, the Recognised designation means: HPR editors reviewed the work and found that it is stylistically consistent with a demonstrable prior body of human-authored writing by the same author, and that no credible evidence of AI generation was found.
For Works, the Certified designation means: In addition to editorial review, the author submitted process documentation, drafts, revision history, notes, or other evidence of the human creative process, which was reviewed and verified by HPR’s peer reviewers.
Authors
An Author entry covers a person (a writer, journalist, essayist, academic, or other creator) whose body of work demonstrates a sustained, consistent human voice over time. An Author entry does not certify every piece that person has ever written. It represents our assessment that this person has a demonstrable, long-running human creative practice.
Authors are listed with: name, primary publication or platform, content type, years of active publishing, a brief description of their work and voice, links to representative works, and the evidence pathway for inclusion.
For Authors, the Recognised designation means: HPR editors reviewed a substantial sample of the author’s work across multiple years and found consistent evidence of a sustained human voice: distinctive stylistic patterns, personal references, idiosyncratic arguments, and a continuity of voice characteristic of a human creative practice.
For Authors, the Certified designation means: The author has additionally submitted to HPR’s full verification process, providing process documentation for one or more representative works, confirming their authorship under our declaration process.
Publications
A Publication entry covers an outlet, blog, newsletter, or platform, not an individual author. Publication entries are the broadest unit of the registry. They represent our assessment that a publication has maintained human editorial direction and human-authored content over a sustained period.
Publications are listed with: name, URL, content type, founding date or years active, description, editorial structure, and evidence pathway.
For Publications, the Recognised designation means: HPR editors reviewed the publication’s output across multiple years and found evidence of consistent human editorial voice and direction. This typically requires a long archive of content predating the widespread availability of AI writing tools (before early 2022).
For Publications, the Certified designation means: The publication has additionally provided formal documentation of its editorial process and human authorship standards, reviewed by HPR’s peer reviewers.
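For illustration only, the three entry types and two designations described above can be sketched as a minimal data model. This is not HPR’s actual schema; every class and field name here is an assumption:

```python
from dataclasses import dataclass, field
from enum import Enum

class EntryType(Enum):
    WORK = "Work"                 # a specific piece of writing
    AUTHOR = "Author"             # a person with a sustained body of work
    PUBLICATION = "Publication"   # an outlet, blog, newsletter, or platform

class Designation(Enum):
    RECOGNISED = "Recognised"     # editorial review + stylistic consistency
    CERTIFIED = "Certified"       # Recognised plus verified process documentation

@dataclass
class RegistryEntry:
    entry_type: EntryType
    designation: Designation
    name: str                     # title (Work) or name (Author/Publication)
    content_type: str
    # Which evidence pillars supported inclusion; displayed on every entry.
    evidence_pillars: list = field(default_factory=list)
    url: str = ""
```

The common shape reflects what the document says every entry shares: a type, a designation, and a visible record of the evidence pathway that justified inclusion.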
The two evidence pillars
Every entry in the registry must satisfy both pillars. Supporting evidence strengthens the case but does not replace either pillar.
Pillar 1. Stylistic consistency with a pre-AI body of work
The core question is: does this work read as part of a continuous, identifiable human creative practice that predates the availability of AI writing tools?
What HPR editors assess:
Voice and register. Every writer develops a characteristic voice, a way of opening sentences, a preference for certain rhythms, a tendency toward particular kinds of analogies or asides. Human voices are consistent over time in ways that are not easily replicated by AI, which tends toward generic register. Editors read for the specific texture of the voice.
Personal and biographical specificity. Human writers refer to their own lives, their own locations, their own histories and relationships. These references accumulate into a portrait of a specific person. Editors look for this specificity, the kind of detail that requires a body and a life, not a language model.
Argumentative idiosyncrasy. Human thinkers have intellectual obsessions, contradictions, and recurring preoccupations. They return to the same ideas from different angles. They disagree with themselves over time. Editors look for the fingerprints of individual thought across a body of work.
Continuity across years. A writer with ten years of archived work has a record that long predates the public availability of AI writing tools. If the voice and style are consistent across that arc, the current work is likely continuous with that human practice. Editors verify this arc.
Absence of AI-characteristic patterns. While HPR does not use AI detection tools as primary evidence, editors are trained to recognise the generic sentence structures, hedging patterns, and organisational conventions that are characteristic of AI-generated text.
Minimum requirements for Pillar 1: At least three years of publicly archived prior work by the same author or publication, with the earliest work predating January 2022. The voice across that archive must be assessably consistent with the work under review. Works with no retrievable archive prior to 2022 cannot satisfy Pillar 1 alone and require stronger supporting evidence.
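The minimum requirements above reduce to two mechanical checks, sketched below. The function and constant names are illustrative, not HPR tooling, and stylistic consistency itself remains a matter of editorial judgment that no check like this can replace:

```python
from datetime import date

AI_TOOLS_CUTOFF = date(2022, 1, 1)  # widespread availability of AI writing tools
MIN_ARCHIVE_YEARS = 3

def satisfies_pillar_one_minimum(archive_dates, review_date):
    """Check the Pillar 1 archive minimum: at least three years of
    publicly archived prior work, with the earliest item predating
    January 2022. Does NOT assess stylistic consistency."""
    if not archive_dates:
        return False  # no retrievable archive cannot satisfy Pillar 1 alone
    earliest = min(archive_dates)
    span_years = (review_date - earliest).days / 365.25
    return earliest < AI_TOOLS_CUTOFF and span_years >= MIN_ARCHIVE_YEARS
```

Note that an archive can span three years and still fail if its earliest item postdates the cutoff, which matches the document’s rule.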
What Pillar 1 cannot prove: It cannot prove that the specific work under review was written without AI assistance. A human writer could maintain a long human voice archive and then use AI for one piece. Pillar 1 establishes the general inference; Pillar 2 (editorial review) and supporting evidence (process documentation) address the specific work.
Pillar 2. HPR editorial review
Every entry passes a human editorial review by at least two members of HPR’s editorial team before it is listed.
What editors review: The work or body of work itself, the evidence of prior publishing history, any submitted process documentation, and any available contextual information about the author or publication. Editors are looking for a coherent, consistent case for human authorship: not perfection, but a persuasive preponderance of evidence.
What constitutes a pass: Editors agree that the available evidence supports a reasonable inference of human authorship and that no significant red flags exist. They do not need to be certain; they need to be satisfied.
What constitutes a fail: Significant inconsistency between the claimed prior work and the submission; absence of verifiable prior archive; evidence of AI-characteristic content without a plausible alternative explanation; or credible specific allegations of AI use that cannot be addressed by the available evidence.
Disagreement between editors: If two editors disagree, a third editor reviews independently. If the third review does not resolve the disagreement, the entry is declined pending additional evidence. HPR errs on the side of caution: a contested entry is not listed until the evidence is clear.
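The agreement-and-tiebreak rule can be expressed as a small decision function. This is a minimal sketch under the rules stated above, with illustrative names; a reviewer who cannot reach a clear view is modelled as returning no verdict:

```python
def editorial_decision(first, second, third=None):
    """Apply the stated review rule: two editors agreeing pass or fail
    the entry; on disagreement a third editor reviews independently;
    absent a clear third verdict, the entry is declined pending
    additional evidence. Verdicts are True (pass), False (fail),
    or None (no clear view)."""
    if first == second:
        return "listed" if first else "declined"
    if third is True:
        return "listed"
    # Third editor fails the entry, or reaches no clear view:
    # HPR errs on the side of caution and does not list contested entries.
    return "declined pending additional evidence"
```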
The role of judgment: Editorial review is not a mechanical checklist. It requires judgment, and we acknowledge that judgment can be wrong. This is why the registry is explicit about the evidence underlying each entry, why we have a review and removal process, and why we publish this methodology openly for scrutiny.
Supporting evidence types
The following types of evidence strengthen an entry’s case but do not replace the two pillars.
Process documentation: drafts, revision history, notes, recorded voice memos, and editorial correspondence provide direct evidence of the human creative process behind a specific work. This is required for Certified entries and encouraged for Recognised entries. It is the strongest supplementary evidence available.
Author’s public declaration: a public, attributable statement that the author does not use AI to generate their writing. Declarations carry more weight when they are specific, recent, and made in contexts where false declaration carries reputational or legal risk. A declaration made in a signed contract with a publisher carries more weight than a tweet.
Community corroboration: attestation from editors, colleagues, readers, or professional peers who can speak to an author’s writing process. This is difficult to systematise but can be decisive in borderline cases.
Institutional context: where an author publishes can be relevant context, but it is not treated as evidence of human authorship. An author’s institutional affiliation does not satisfy either pillar. It may inform the broader assessment but cannot substitute for direct evidence of the work or voice.
How entries are selected: the search methodology
The most important transparency question for any curated directory is: why these and not others? Our answer is honest: the registry is not comprehensive, cannot be comprehensive, and does not claim to be. It is a curated set of entries where we found sufficient evidence to make a reasonable claim of human authorship.
Proactive editorial scouting is the primary source of entries. HPR editors actively look for human voices, reading widely, following nominations from the journalism and writing community, and systematically working through the scouting sources described in Document 3. Scouting starts with sources that are likely to yield human voices but does not treat those sources as proxies for inclusion. Every candidate found through scouting must still satisfy both evidence pillars independently.
Community nominations, from readers, writers, editors, and the general public, are the second primary source. Anyone can nominate a Work, Author, or Publication. Authors may nominate themselves. Nominations are reviewed against the same evidence requirements as scouted entries.
Diversity of coverage is an explicit editorial goal. HPR actively works to include independent and smaller voices alongside well-known ones; writers from outside the English-speaking mainstream (while remaining English-language for this phase); writers from non-institutional settings (bloggers, newsletter writers, independent journalists) alongside those at major outlets; and writers across topics, not just the technology and media criticism fields that first motivated the registry.
Why the registry does not claim comprehensiveness: The number of human-authored works published every year vastly exceeds what any editorial team can review. Many excellent human writers will not appear in this registry, not because their work is not human, but because HPR has not yet reviewed it. Absence from the registry implies nothing about a work’s authorship.
Exclusion criteria
Publications with documented undisclosed AI content are excluded in their entirety for the period during which undisclosed AI was used. Works published before the documented period may remain under review on a case-by-case basis.
Works where credible, specific evidence of AI generation exists (AI detection analysis combined with other evidence, whistleblower testimony, author admission, or documented tool usage) are placed under review and temporarily suspended pending investigation. The bar for suspension is “credible specific evidence,” not vague allegation. Vague or bad-faith flags do not trigger suspension.
Anonymous authors without verifiable history cannot satisfy Pillar 1 and are excluded unless there is exceptional alternative evidence.
Content primarily consisting of aggregation, summarisation, or reproduction of other sources without substantial original contribution is excluded. The registry is for original human creative and intellectual work.
Contested entries are handled as follows: the entry is flagged as Under Review and temporarily removed from public display; the editorial team investigates within 30 days; the decision (reinstate, modify, or remove) is documented in the public changelog with reasoning.
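The contested-entry procedure amounts to a short workflow: flag, investigate within 30 days, document the decision. A sketch, assuming illustrative state and field names rather than any real HPR system:

```python
from datetime import date, timedelta

REVIEW_DEADLINE_DAYS = 30

def flag_entry(flagged_on):
    """Move a contested entry to Under Review: it is temporarily
    removed from public display and must be decided within 30 days."""
    return {
        "status": "under_review",
        "publicly_displayed": False,
        "decide_by": flagged_on + timedelta(days=REVIEW_DEADLINE_DAYS),
    }

def resolve_entry(entry, decision, reasoning, changelog):
    """Record the decision (reinstate, modify, or remove) in the public
    changelog with reasoning, and restore display unless removed."""
    assert decision in ("reinstate", "modify", "remove")
    changelog.append({"decision": decision, "reasoning": reasoning})
    entry["status"] = decision
    entry["publicly_displayed"] = decision != "remove"
    return entry
```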
Maintenance and integrity
Annual review cycle. All Recognised entries are reviewed on a 12-month rolling cycle to confirm that the underlying evidence remains valid and no new information has emerged that changes the assessment. Certified entries are reviewed every 24 months, supplemented by annual self-declaration renewal from the author or publication.
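The review cadence above is a simple date computation: 12 months for Recognised entries, 24 months plus an annual self-declaration renewal for Certified. A sketch with assumed function names (day-of-month is clamped to the 28th to keep the date arithmetic trivial):

```python
from datetime import date

def add_months(d, months):
    """Return the same day-of-month `months` later, clamping the day
    to the 28th so the sketch never produces an invalid date."""
    total = d.month - 1 + months
    return date(d.year + total // 12, total % 12 + 1, min(d.day, 28))

def next_reviews(designation, last_review):
    """Recognised: full review every 12 months. Certified: full review
    every 24 months plus an annual self-declaration renewal."""
    if designation == "Recognised":
        return {"full_review": add_months(last_review, 12)}
    if designation == "Certified":
        return {
            "full_review": add_months(last_review, 24),
            "self_declaration": add_months(last_review, 12),
        }
    raise ValueError(f"unknown designation: {designation}")
```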
Flagging concerns. Anyone may submit a concern about a listed entry via the registry’s contact form. Concerns must include: the specific entry, the nature of the concern, and any supporting evidence. Vague concerns without evidence are logged but do not trigger formal review.
Removal and changelog. When an entry is removed, the removal is recorded in a public changelog with the date, entry name, and reason. Removed entries remain in internal records for audit purposes but are no longer publicly displayed.
Conflict of interest. HPR editors may not review entries in which they have a personal, professional, or financial interest. Editors may not nominate their own work or the work of immediate family members. All editorial staff affiliations are publicly disclosed on the About page.
Limitations and honest caveats
The registry has known biases: it is English-language; it is more easily populated with authors who have long public online archives; it is more accessible to writers who know about HPR than to those who don’t; it reflects the editorial team’s own reading and network as a starting point. We acknowledge these biases and work actively against them, but we do not pretend they don’t exist.
The registry will become harder to maintain as AI use spreads and becomes less detectable. We expect the evidentiary standards to evolve; process documentation will become more important as stylistic evidence becomes less reliable. We commit to publishing updates to this methodology when our standards change, with a public record of what changed and why.
The registry is a beginning, not a solution. Human authorship cannot be fully protected by a directory. It requires cultural, legal, and institutional responses that go far beyond what HPR can provide. We are one instrument among many that are needed.
