A08129 Summary:

BILL NO: A08129
 
SAME AS: S08209
 
SPONSOR: Vanel
 
COSPNSR: Bichotte Hermelyn, Hyndman, Blumencranz
 
MLTSPNSR: Levenberg
 
Add Art IV §§401 - 409, St Tech L
 
Enacts the New York artificial intelligence bill of rights to provide residents of the state with rights and protections to ensure that any system making decisions without human intervention that impact their lives does so lawfully, properly, and with meaningful oversight.

A08129 Actions:

BILL NOA08129
 
10/13/2023 referred to science and technology
01/03/2024 referred to science and technology

A08129 Committee Votes:

There are no committee votes for this bill in this legislative session.

A08129 Floor Votes:

There are no votes for this bill in this legislative session.

A08129 Memo:

NEW YORK STATE ASSEMBLY
MEMORANDUM IN SUPPORT OF LEGISLATION
submitted in accordance with Assembly Rule III, Sec 1(f)
 
BILL NUMBER: A8129
 
SPONSOR: Vanel
TITLE OF BILL:

An act to amend the state technology law, in relation to enacting the New York artificial intelligence bill of rights

PURPOSE OR GENERAL IDEA OF BILL:

Enacts the New York artificial intelligence bill of rights.

SUMMARY OF SPECIFIC PROVISIONS:

§ 401: Sets forth key terms.

§ 402: States that the rights within this article apply to New York residents and govern developers of systems impacting civil rights, liberties, privacy, opportunities, and essential services.

§ 403: Explains that the rights in this article should be interpreted as a whole, and not exclusive of one another.

§ 404: Right to safe and effective systems.

§ 405: Right to not face algorithmic discrimination.

§ 406: Right to not face abusive data practices; right to have agency over one's data.

§ 407: Right to know when an automated system is being used; right to understand how and why an automated system contributed to outcomes that impacted a person.

§ 408: Right to opt out of an automated system; right to work with a human in the place of an automated system.

§ 409: Provides that operators of systems who violate any of the rights may be liable to the people of the state for a penalty of not less than three times such damages caused. Provides that there be no private cause of action arising out of the provisions of this section.

JUSTIFICATION:

Systems which use artificial intelligence to make automated decisions are widely used today. However, the capabilities of such systems are limited in that they are unable to use a human-level thought process to make these decisions. Over the next few years, we can expect these systems to further increase in their abilities and to make even more complex decisions with a human level of deliberation. As a consequence, we are heading into a future where many important decisions, such as ones which affect a person's livelihood and freedom, and society's liberties and democracy, will be made by machines with little human input. While these machines have the ability to be more efficient and more objective than any human, we must still recognize the legitimate concerns that can be raised in the realms of fairness, transparency, and accountability. We must therefore ensure that people are protected from private companies and government entities who create systems that make automated decisions that affect people's lives. As such, it is necessary to safeguard these rights through a New York state artificial intelligence bill of rights. This bill of rights is modeled after the White House Office of Science and Technology Policy's white paper titled "Blueprint for an AI Bill of Rights."

PRIOR LEGISLATIVE HISTORY:

New bill.

FISCAL IMPLICATIONS:

LBD

EFFECTIVE DATE:

This act shall take effect on the ninetieth day after it shall have become a law. Effective immediately, the addition, amendment, and/or repeal of any rule or regulation necessary for the implementation of this act on its effective date are authorized to be made and completed on or before such effective date.

A08129 Text:



 
                STATE OF NEW YORK
        ________________________________________________________________________
 
                                          8129
 
                               2023-2024 Regular Sessions
 
                   IN ASSEMBLY
 
                                    October 13, 2023
                                       ___________
 
        Introduced  by M. of A. VANEL -- read once and referred to the Committee
          on Science and Technology
 
        AN ACT to amend the state technology law, in relation  to  enacting  the
          New York artificial intelligence bill of rights
 
          The  People of the State of New York, represented in Senate and Assem-
        bly, do enact as follows:

     1    Section 1. Short title. This act shall be known and may  be  cited  as
     2  the "New York artificial intelligence bill of rights".
     3    §  2.  Legislative  intent.  This  legislature  hereby finds that this
     4  generation of humans is the first in history  to  have  the  ability  to
     5  create  technologies that can make decisions which previously could have
     6  only been made by humans. States and  countries  across  the  world  are
     7  grappling  with  critical questions of how we can use these technologies
     8  to solve our problems, how we can avoid or manage the new problems  that
     9  these  technologies  may  create,  and how we can control these powerful
    10  technologies.
    11    Therefore,  the  legislature  declares  that  any  New  York  resident
    12  affected  by  any  system making decisions without human intervention be
    13  entitled to certain rights and protections to  ensure  that  the  system
    14  impacting  their  lives  do  so  lawfully, properly, and with meaningful
    15  oversight.
    16    Among these rights and protections are  (i)  the  right  to  safe  and
    17  effective  systems; (ii) protections against algorithmic discrimination;
    18  (iii) protections against abusive data practices; (iv) the right to have
    19  agency over one's data; (v) the right to know when an  automated  system
    20  is  being  used;  (vi)  the right to understand how and why an automated
    21  system contributed to outcomes that impact one; (vii) the right  to  opt
    22  out of an automated system; and (viii) the right to work with a human in
    23  the place of an automated system.
    24    The  legislature also finds that automated systems will continue to be
    25  developed and evolve both within the state and outside the state. It  is
 
         EXPLANATION--Matter in italics (underscored) is new; matter in brackets
                              [ ] is old law to be omitted.
                                                                   LBD13117-01-3

        A. 8129                             2
 
     1  therefore  critical that New York does not overburden the development of
     2  innovative systems that better the state and its  residents,  nor  drive
     3  the development of such systems to foreign states or countries with less
     4  appropriate regulation, nor threaten the security of our state, country,
     5  and its people.
     6    To these ends, the legislature declares that the white paper published
     7  by  the  White  House Office of Science and Technology titled "Blueprint
     8  for an AI Bill of Rights" in October of 2022 is  commensurate  with  the
     9  goals of this state in relation to artificial intelligence.
    10    § 3. The state technology law is amended by adding a new article IV to
    11  read as follows:
    12                                 ARTICLE IV
    13                   ARTIFICIAL INTELLIGENCE BILL OF RIGHTS
 
    14  Section 401. Definitions.
    15          402. Application.
    16          403. Construction.
    17          404. Safe and effective systems.
    18          405. Algorithmic discrimination practices.
    19          406. Data privacy.
    20          407. Notice and explanation.
    21          408. Human alternatives, consideration, and fallback.
    22          409. Penalties; no private cause of action.
    23    § 401. Definitions. As used in this article, the following terms shall
    24  have the following meanings:
    25    1.  "Civil  rights, civil liberties, and privacy" or "rights, opportu-
    26  nity, and access" means such rights and protections provided for in  the
    27  United  States  Constitution,  federal law, the laws and constitution of
    28  the state of New York, and privacy and other freedoms that exist in both
    29  the public and private sector contexts, which shall include,  but  shall
    30  not be limited to:
    31    (a) freedom of speech;
    32    (b) voting rights;
    33    (c) protections from discrimination;
    34    (d) protections from excessive or unjust punishment; and
    35    (e) protections from unlawful surveillance.
    36    2. "Equal opportunity" means equal access to education, housing, cred-
    37  it, employment, and other programs.
    38    3. "Access to critical resources or services" means such resources and
    39  services  that are fundamental for the well-being, security, and equita-
    40  ble participation of New York residents in society, which shall include,
    41  but shall not be limited to:
    42    (a) healthcare;
    43    (b) financial services;
    44    (c) safety;
    45    (d) social services;
    46    (e) non-deceptive information about goods and services; and
    47    (f) government benefits.
    48    4. "Algorithmic discrimination" means circumstances where an automated
    49  system contributes to an unjustified different treatment or impact which
    50  disfavors people based on their age, color, creed, disability,  domestic
    51  violence  victim status, gender identity or expression, familial status,
    52  marital status, military status, national origin,  predisposing  genetic
    53  characteristics, pregnancy-related condition, prior arrest or conviction
    54  record,  race,  sex,  sexual orientation, or veteran status or any other
    55  classification protected by law.

        A. 8129                             3
 
     1    5. "Automated system" means any  system,  software,  or  process  that
     2  affects  New York residents and that uses computation as a whole or part
     3  of a system to determine outcomes, make or aid decisions, inform  policy
     4  implementation, collect data or observations, or otherwise interact with
     5  New  York residents or communities. Automated systems shall include, but
     6  not be limited to, systems derived from machine learning, statistics, or
     7  other data processing or artificial intelligence techniques,  and  shall
     8  exclude passive computing infrastructure.
     9    6.  "Passive  computing infrastructure" shall include any intermediary
    10  technology that does not influence or determine  the  outcome  of  deci-
    11  sions,  make  or  aid  in  decisions,  inform  policy implementation, or
    12  collect data or observations, including web  hosting,  domain  registra-
    13  tion, networking, caching, data storage, or cybersecurity.
    14    7.  "Communities"  means  neighborhoods,  social  network connections,
    15  families, people connected by affinity, identity, or shared  traits  and
    16  formal  organizational ties. This includes Tribes, Clans, Bands, Ranche-
    17  rias, Villages, and other Indigenous communities.
    18    8. "Social network" means  any  connection  of  persons  which  exists
    19  online or offline.
    20    9.  "Families"  means  any  relationship, whether by blood, choice, or
    21  otherwise, where one or more persons assume a caregiver role, primary or
    22  shared, for one or more others, or where  individuals  mutually  support
    23  and are committed to each other's well-being.
    24    10.  "Equity"  means  the  consistent  and  systematic fair, just, and
    25  impartial treatment of all New York residents. Systemic, fair, and  just
    26  treatment  shall  take into account the status of New York residents who
    27  belong to underserved communities that have been denied such  treatment,
    28  such as Black, Latino, and Indigenous and Native American persons, Asian
    29  Americans  and  Pacific Islanders and other persons of color; members of
    30  religious minorities; women, girls, and non-binary people; lesbian, gay,
    31  bisexual,  transgender,  queer,  and  intersex  persons;  older  adults;
    32  persons  with disabilities; persons who live in rural areas; and persons
    33  otherwise adversely affected by persistent poverty or inequality.
    34    11. "Sensitive data" means any data and metadata:
    35    (a) that pertains to a New York resident in a sensitive domain;
    36    (b) that are generated by technologies in a sensitive domain;
    37    (c) that can be used to infer data from a sensitive domain;
    38    (d) about a New York resident, such as disability-related data, genom-
    39  ic data, biometric data, behavioral data, geolocation data, data related
    40  to the criminal justice system, relationship history,  or  legal  status
    41  such as custody and divorce information, and home, work, or school envi-
    42  ronmental data;
    43    (e)  that  has  the  reasonable  potential to be used in ways that are
    44  likely to expose New York residents to meaningful harm, such as  a  loss
    45  of privacy or financial harm due to identity theft; or
    46    (f) that is generated by a person under the age of eighteen.
    47    12.  "Sensitive  domain"  means a particular area, field, or sphere of
    48  activity in which activities being conducted can cause  material  harms,
    49  including  significant  adverse effects on human rights such as autonomy
    50  and dignity, as well as civil liberties and civil rights.
    51    13. "Surveillance technology" means products or services marketed  for
    52  or  that  can  be  lawfully used to detect, monitor, intercept, collect,
    53  exploit, preserve, protect, transmit, or retain data, identifying infor-
    54  mation, or communications concerning New York residents or groups.

        A. 8129                             4
 
     1    14. "Underserved communities" means communities that have been system-
     2  atically denied a full opportunity to participate in aspects of  econom-
     3  ic, social, and civic life.
     4    §  402. Application. The rights contained within this article shall be
     5  construed as applying to New York residents against  persons  developing
     6  automated  systems  that  have  the potential to meaningfully impact New
     7  York residents':
     8    1. civil rights, civil liberties, and privacy;
     9    2. equal opportunities; or
    10    3. access to critical resources or services.
    11    § 403. Construction. The rights contained within this article shall be
    12  construed as harmonious and mutually supportive.
    13    § 404. Safe and effective systems. 1.  New  York  residents  have  the
    14  right  to  be  protected  from  unsafe or ineffective automated systems.
    15  These systems must be developed in collaboration with  diverse  communi-
    16  ties,  stakeholders,  and  domain  experts  to  identify and address any
    17  potential concerns, risks, or impacts.
    18    2. Automated systems shall undergo pre-deployment testing, risk  iden-
    19  tification  and mitigation, and shall also be subjected to ongoing moni-
    20  toring that demonstrates they are safe  and  effective  based  on  their
    21  intended  use,  mitigation of unsafe outcomes including those beyond the
    22  intended use, and adherence to domain-specific standards.
    23    3. If an automated system fails  to  meet  the  requirements  of  this
    24  section,  it  shall  not  be  deployed  or,  if already in use, shall be
    25  removed. No automated system shall be designed  with  the  intent  or  a
    26  reasonably  foreseeable possibility of endangering the safety of any New
    27  York resident or New York communities.
    28    4. Automated systems shall be designed to proactively protect New York
    29  residents from harm stemming from unintended, yet foreseeable,  uses  or
    30  impacts.
    31    5. New York residents are entitled to protection from inappropriate or
    32  irrelevant  data use in the design, development, and deployment of auto-
    33  mated systems, and from the compounded harm of its reuse.
    34    6. Independent evaluation and reporting that confirms that the  system
    35  is  safe  and  effective, including reporting of steps taken to mitigate
    36  potential harms, shall be performed and the results made public whenever
    37  possible.
    38    § 405. Algorithmic discrimination practices. 1. No New  York  resident
    39  shall face discrimination by algorithms, and all automated systems shall
    40  be used and designed in an equitable manner.
    41    2. The designers, developers, and deployers of automated systems shall
    42  take proactive and continuous measures to protect New York residents and
    43  communities from algorithmic discrimination, ensuring the use and design
    44  of these systems in an equitable manner.
    45    3.  The  protective  measures  required  by this section shall include
    46  proactive equity assessments as part of the system design, use of repre-
    47  sentative data, protection against proxies for demographic features, and
    48  assurance of accessibility for New York residents with  disabilities  in
    49  design and development.
    50    4. Automated systems shall undergo pre-deployment and ongoing dispari-
    51  ty testing and mitigation, under clear organizational oversight.
    52    5. Independent evaluations and plain language reporting in the form of
    53  an  algorithmic  impact  assessment, including disparity testing results
    54  and  mitigation  information,  shall  be  conducted  for  all  automated
    55  systems.

        A. 8129                             5
 
     1    6.  New  York  residents shall have the right to view such evaluations
     2  and reports.
     3    §  406.  Data  privacy.  1. New York residents shall be protected from
     4  abusive data practices via built-in protections and shall maintain agen-
     5  cy over the use of their personal data.
     6    2. Privacy violations shall be mitigated through design  choices  that
     7  include  privacy  protections  by default, ensuring that data collection
     8  conforms to reasonable expectations and  that  only  strictly  necessary
     9  data for the specific context is collected.
    10    3. Designers, developers, and deployers of automated systems must seek
    11  and   respect   the  decisions  of  New  York  residents  regarding  the
    12  collection, use, access, transfer, and deletion of  their  data  in  all
    13  appropriate ways and to the fullest extent possible. Where not possible,
    14  alternative privacy by design safeguards must be implemented.
    15    4.  Automated systems shall not employ user experience or design deci-
    16  sions that obscure user choice or burden  users  with  default  settings
    17  that are privacy-invasive.
    18    5.  Consent  shall  be  used to justify the collection of data only in
    19  instances where it can be  appropriately  and  meaningfully  given.  Any
    20  consent  requests  shall be brief, understandable in plain language, and
    21  provide New York residents with agency  over  data  collection  and  its
    22  specific context of use.
    23    6.  Any  existing practice of complex notice-and-choice for broad data
    24  use shall be transformed, emphasizing clarity and user comprehension.
    25    7. Enhanced protections and restrictions shall be established for data
    26  and inferences related to sensitive domains. In sensitive domains, indi-
    27  vidual data and related inferences may only be used for necessary  func-
    28  tions, safeguarded by ethical review and use prohibitions.
    29    8.  New  York  residents  and  New York communities shall be free from
    30  unchecked surveillance; surveillance technologies shall  be  subject  to
    31  heightened  oversight,  including  at least pre-deployment assessment of
    32  their potential harms and scope limits  to  protect  privacy  and  civil
    33  liberties.
    34    9.  Continuous surveillance and monitoring shall not be used in educa-
    35  tion, work, housing, or  any  other  contexts  where  the  use  of  such
    36  surveillance  technologies  is likely to limit rights, opportunities, or
    37  access.
    38    10. Whenever possible, New York residents shall have access to report-
    39  ing that confirms respect for  their  data  decisions  and  provides  an
    40  assessment of the potential impact of surveillance technologies on their
    41  rights, opportunities, or access.
    42    § 407. Notice and explanation. 1. New York residents shall be informed
    43  when  an  automated  system  is  in  use and New York residents shall be
    44  informed how and why the system  contributes  to  outcomes  that  impact
    45  them.
    46    2.  Designers,  developers,  and  deployers of automated systems shall
    47  provide  accessible  plain  language  documentation,   including   clear
    48  descriptions  of the overall system functioning, the role of automation,
    49  notice of system use, identification of the individual  or  organization
    50  responsible  for  the system, and clear, timely, and accessible explana-
    51  tions of outcomes.
    52    3. The provided notice shall be kept up-to-date, and  New  York  resi-
    53  dents impacted by the system shall be notified of any significant chang-
    54  es to use cases or key functionalities.

        A. 8129                             6
 
     1    4.  New  York residents shall have the right to understand how and why
     2  an outcome impacting them was determined by an  automated  system,  even
     3  when the automated system is not the sole determinant of the outcome.
     4    5.  Automated  systems shall provide explanations that are technically
     5  valid, meaningful to the individual and any other persons  who  need  to
     6  understand  the  system  and proportionate to the level of risk based on
     7  the context.
     8    6. Summary reporting, including plain language information about these
     9  automated systems and assessments of the clarity and quality  of  notice
    10  and explanations, shall be made public whenever possible.
    11    §  408.  Human  alternatives, consideration, and fallback. 1. New York
    12  residents shall have the right to opt out of  automated  systems,  where
    13  appropriate,  in  favor  of  a human alternative. The appropriateness of
    14  such an option shall be determined based on reasonable expectations in a
    15  given context, with a focus on ensuring broad accessibility and protect-
    16  ing the public from particularly harmful impacts. In some  instances,  a
    17  human or other alternative may be mandated by law.
    18    2.  New  York  residents shall have access to a timely human consider-
    19  ation and remedy through a fallback and escalation process if  an  auto-
    20  mated  system  fails,  produces  an  error, or if they wish to appeal or
    21  contest its impacts on them.
    22    3. The human consideration and fallback process shall  be  accessible,
    23  equitable,  effective,  maintained,  accompanied by appropriate operator
    24  training, and should not impose an unreasonable burden on the public.
    25    4. Automated  systems  intended  for  use  within  sensitive  domains,
    26  including  but  not  limited to criminal justice, employment, education,
    27  and health, shall additionally be tailored  to  their  purpose,  provide
    28  meaningful access for oversight, include training for New York residents
    29  interacting  with  the  system,  and incorporate human consideration for
    30  adverse or high-risk decisions.
    31    5. Summary reporting, which  includes  a  description  of  such  human
    32  governance  processes and an assessment of their timeliness, accessibil-
    33  ity, outcomes, and effectiveness, shall be made publicly available when-
    34  ever possible.
    35    § 409. Penalties; no private cause of action. 1. Where an operator  of
    36  an  automated system violates or causes a violation of any of the rights
    37  stated within this article, such operator shall be liable to the  people
    38  of  this  state  for  a  penalty  not less than three times such damages
    39  caused.
    40    2. The penalty provided for in subdivision one of this section may  be
    41  recovered  by  an action brought by the attorney general in any court of
    42  competent jurisdiction.
    43    3. Nothing set forth in this article shall be construed  as  creating,
    44  establishing,  or  authorizing a private cause of action by an aggrieved
    45  person against an operator of an automated system who has  violated,  or
    46  is alleged to have violated, any provision of this article.
    47    §  4.  This  act shall take effect on the ninetieth day after it shall
    48  have become a law.