STATE OF NEW YORK
________________________________________________________________________
10909
IN ASSEMBLY
April 8, 2026
___________
Introduced by M. of A. BURROUGHS -- read once and referred to the
Committee on Governmental Operations
AN ACT to amend the executive law, in relation to enacting the "Shirley
Myers White Right To Reconciliation and Digital Identity Repair Act"
THE PEOPLE OF THE STATE OF NEW YORK, REPRESENTED IN SENATE AND ASSEM-
BLY, DO ENACT AS FOLLOWS:
Section 1. This act shall be known and may be cited as the "Shirley
Myers White Right To Reconciliation and Digital Identity Repair Act".
§ 2. Legislative findings. The legislature finds and declares that:
1. In modern society, individuals are increasingly governed not only
by formal legal outcomes, but by persistent narratives introduced and
amplified through journalism, digital platforms, automated systems, and
institutional records.
2. When allegations, arrests, or adverse events are reported or
recorded, they often receive prominent and lasting visibility, while
resolutions such as acquittals, dismissals, or exonerations receive
little or no corresponding prominence.
3. The continued circulation of outdated, incomplete, or disproven
narratives can cause ongoing and disproportionate harm, including but
not limited to:
(a) barriers to employment, education, and housing;
(b) compromised personal safety and social standing;
(c) psychological distress and reputational damage; and
(d) structural exclusion through automated screening systems.
4. Advances in artificial intelligence, search algorithms, and auto-
mated decision-making tools have intensified this harm by scaling
outdated information, often without transparency, context, or a mech-
anism for correction.
5. Modern indexing and artificial intelligence systems materially
affect how narratives persist and influence decision-making.
6. Search engines, automated background checks, data aggregators, and
screening tools rely on indexed digital information to assess credibili-
ty, eligibility, and risk. These systems do not independently verify
accuracy, context, or legal resolution; they surface and prioritize
information based on availability and perceived relevance.
7. When arrests, allegations, or adverse events are indexed without
corresponding updates reflecting resolution or correction, automated
systems may continue to treat incomplete narratives as current truth. In
this environment, unresolved narratives can quietly govern access to
employment, housing, education, safety, and social participation long
after legal systems have corrected themselves.
8. In some instances, individuals or institutions vested with authori-
ty continue to assert or maintain narratives that contradict adjudicated
legal outcomes, resulting in narrative abuse by authority and undermin-
ing trust in public systems.
9. Social workers, legal practitioners, educators, and community advo-
cates routinely encounter individuals whose lives remain constrained by
narratives that no longer reflect legal truth or present reality.
10. Accountability and public safety are essential; however, justice
does not end at disposition, and ethical systems must distinguish
between accountability and permanent narrative punishment. The absence
of a clear, consistent right to reconciliation and digital identity
repair across systems creates inequity, erodes trust, and perpetuates
harm inconsistent with principles of dignity, proportionality, and
integrity.
§ 3. Legislative purpose. 1. The purpose of this act is to:
(a) establish a right to reconciliation and digital identity repair
for individuals whose legal matters have been resolved, or whose circum-
stances have materially changed;
(b) ensure that narratives introduced by institutions are accompanied
by mechanisms for accuracy, context, update, and completion;
(c) prevent individuals from being permanently governed by biased,
outdated, incomplete, or authority-enforced narratives;
(d) create system-appropriate obligations across distinct domains
where narrative harm occurs, while preserving accountability, public
safety, and freedom of the press; and
(e) affirm reconciliation as a matter of systems integrity, not pref-
erential treatment.
2. This act further recognizes that reconciliation must extend beyond
formal legal disposition into the indexed and automated systems that now
shape real-world outcomes. When institutions correct themselves, those
corrections must be capable of traveling with the narrative into the
systems that rely on digital identity to make consequential decisions.
§ 4. The executive law is amended by adding a new section 296-e to
read as follows:
§ 296-E. RIGHT TO RECONCILIATION AND DIGITAL IDENTITY REPAIR. 1. FOR
PURPOSES OF THIS SECTION, THE FOLLOWING TERMS SHALL HAVE THE FOLLOWING
MEANINGS:
(A) "NARRATIVE" MEANS ANY PUBLIC OR INSTITUTIONAL REPRESENTATION,
WHETHER WRITTEN, VISUAL, DIGITAL, OR AUTOMATED, THAT DESCRIBES, CHARAC-
TERIZES, OR IMPLIES AN INDIVIDUAL'S CONDUCT, RISK, CHARACTER, OR STATUS.
(B) "NARRATIVE HARM" MEANS MATERIAL HARM RESULTING FROM THE PERSIST-
ENCE, AMPLIFICATION, OR ENFORCEMENT OF A NARRATIVE THAT IS INACCURATE,
OUTDATED, INCOMPLETE, OR DISPROPORTIONATE TO VERIFIED FACTS OR ADJUDI-
CATED OUTCOMES.
(C) "RECONCILIATION" MEANS THE PROCESS BY WHICH NARRATIVES ARE
CORRECTED, CONTEXTUALIZED, OR COMPLETED TO REFLECT CURRENT LEGAL TRUTH
AND MATERIAL REALITY.
(D) "DIGITAL IDENTITY REPAIR" MEANS REASONABLE MEASURES TAKEN TO
ENSURE THAT AN INDIVIDUAL'S DIGITAL PRESENCE DOES NOT MATERIALLY MISREP-
RESENT THEIR LEGAL STATUS OR FACTUAL CIRCUMSTANCES.
(E) "NARRATIVE ABUSE BY AUTHORITY" MEANS THE CONTINUED ASSERTION OR
CIRCULATION OF A DISPROVEN, EXAGGERATED, OR OUTDATED NARRATIVE BY AN
INDIVIDUAL OR INSTITUTION VESTED WITH FORMAL AUTHORITY, DESPITE EVIDENCE
OR ADJUDICATION TO THE CONTRARY.
(F) "DISPOSITION" MEANS AN ACQUITTAL, DISMISSAL, EXONERATION,
COMPLETION OF SENTENCE, OR ANY FINAL LEGAL RESOLUTION.
2. THE RIGHT TO RECONCILIATION SHALL APPLY ACROSS THE FOLLOWING
DOMAINS, EACH WITH SYSTEM-APPROPRIATE OBLIGATIONS:
(A) JOURNALISM AND MEDIA; NARRATIVE INTRODUCTION WITHOUT COMPLETION.
(I) MEDIA ENTITIES THAT PUBLISH INFORMATION REGARDING ARRESTS, ALLEGA-
TIONS, OR CRIMINAL PROCEEDINGS SHALL ESTABLISH PROCEDURES TO:
(1) UPDATE OR CONTEXTUALIZE COVERAGE WHEN LEGAL OUTCOMES MATERIALLY
CHANGE;
(2) ENSURE THAT RESOLUTIONS RECEIVE REASONABLE VISIBILITY COMPARABLE
TO INITIAL REPORTING; AND
(3) MITIGATE ONGOING HARM WHEN CONTINUED PUBLICATION NO LONGER SERVES
A LEGITIMATE PUBLIC INTEREST.
(II) THIS SECTION SHALL NOT REQUIRE REMOVAL OF HISTORICAL RECORDS, BUT
SHALL AFFIRM THE RESPONSIBILITY TO RESTORE ACCURACY AND CONTEXT.
(B) ARTIFICIAL INTELLIGENCE AND AUTOMATED SYSTEMS; NARRATIVE AMPLIFI-
CATION WITHOUT STEWARDSHIP. ENTITIES UTILIZING AUTOMATED TOOLS FOR
SCREENING, RANKING, OR DECISION-MAKING SHALL:
(I) PROVIDE NOTICE WHEN ADVERSE DECISIONS RELY ON AUTOMATED NARRATIVE
SOURCES;
(II) ALLOW FOR THE CORRECTION OF MATERIALLY INACCURATE OR OUTDATED
INFORMATION; AND
(III) ENSURE HUMAN REVIEW WHEN NARRATIVE HARM IS ALLEGED.
(C) HIRING, HOUSING AND OPPORTUNITY SYSTEMS; PRE-ADJUDICATION EXCLU-
SION. (I) EMPLOYERS, HOUSING PROVIDERS, AND INSTITUTIONS SHALL NOT RELY
SOLELY ON UNCONTEXTUALIZED NARRATIVES WHERE LEGAL RESOLUTION OR
CORRECTION IS AVAILABLE.
(II) INDIVIDUALS SHALL HAVE THE RIGHT TO:
(1) UNDERSTAND ADVERSE DECISIONS BASED ON NARRATIVE INFORMATION;
(2) SUBMIT DOCUMENTATION OF RECONCILIATION OR RESOLUTION; AND
(3) REQUEST RECONSIDERATION WHERE APPROPRIATE.
(D) LAW ENFORCEMENT AND JUDICIAL ADJACENCY; NARRATIVE ABUSE BY AUTHOR-
ITY. (I) FOLLOWING LEGAL DISPOSITION, NO AUTHORITY SHALL KNOWINGLY MAIN-
TAIN OR DISSEMINATE NARRATIVES THAT MATERIALLY CONTRADICT ADJUDICATED
OUTCOMES.
(II) AGENCIES AND ENTITIES SHALL IMPLEMENT SAFEGUARDS TO ENSURE
RECORDS, STATEMENTS, AND COMMUNICATIONS REFLECT LEGAL TRUTH AND PROPOR-
TIONALITY.
(E) DIGITAL LIFE, RELATIONSHIPS AND SOCIAL OPPORTUNITY; NARRATIVE
COMPRESSION IN EVERYDAY LIFE. (I) INDIVIDUALS SHALL HAVE ACCESS TO
REASONABLE MECHANISMS TO CONTEST AND CORRECT MATERIALLY HARMFUL DIGITAL
NARRATIVES THAT MISREPRESENT RESOLVED MATTERS.
(II) DIGITAL PLATFORMS SHALL BE ENCOURAGED TO SUPPORT CONTEXTUALIZA-
TION AND DISPUTE RESOLUTION CONSISTENT WITH THIS SECTION.
(III) SOCIAL MEDIA PLATFORMS, AS SUCH TERM IS DEFINED PURSUANT TO
SUBDIVISION FIVE OF SECTION ELEVEN HUNDRED OF THE GENERAL BUSINESS LAW,
SHALL REMOVE ANY ACCUSATIONS ABOUT AN INDIVIDUAL THAT ARE NOT FACTUAL
AND MAY JEOPARDIZE SUCH INDIVIDUAL'S REPUTATION.
3. (A) NOTHING IN THIS SECTION SHALL BE CONSTRUED TO:
(I) ERASE ACCOUNTABILITY FOR HARM;
(II) INTERFERE WITH LAWFUL INVESTIGATIONS;
(III) SUPPRESS LAWFUL SPEECH;
(IV) REQUIRE FALSE STATEMENTS OR MISREPRESENTATION OF FACTS;
(V) REGULATE OR RESTRICT FREEDOM OF SPEECH, PERSONAL EXPRESSION, OR
PRIVATE COMMUNICATION; OR
(VI) REGULATE SOCIAL MEDIA PLATFORMS, SEARCH ENGINES, OR THIRD-PARTY
TECHNOLOGY PROVIDERS, OR TO REQUIRE SUCH ENTITIES TO REMOVE, SUPPRESS,
OR ALTER CONTENT.
(B) RESPONSIBILITIES UNDER THIS SECTION SHALL APPLY SOLELY TO THE
INSTITUTION, AGENCY, OR ENTITY THAT ORIGINATED, AUTHORIZED, OR MAIN-
TAINED THE OFFICIAL NARRATIVE OR RECORD AT ISSUE, INCLUDING CONTENT
PUBLISHED ON OFFICIAL WEBSITES, PRESS RELEASES, PUBLIC RECORDS, OR
INSTITUTIONAL SOCIAL MEDIA ACCOUNTS.
4. THE DIVISION OF HUMAN RIGHTS SHALL DEVELOP GUIDANCE CONSISTENT WITH
THIS SECTION AND SHALL CONSULT WITH STAKEHOLDERS, INCLUDING SOCIAL WORK
PROFESSIONALS, LEGAL EXPERTS, AND CIVIL LIBERTIES ORGANIZATIONS IN THE
DEVELOPMENT OF SUCH GUIDANCE.
§ 5. This act shall take effect immediately.