Regulates automated decision-making by government agencies; requires agencies to conduct impact assessments; requires disclosure of automated decision-making tools utilized by governmental agencies.
STATE OF NEW YORK
________________________________________________________________________
8295--A
2025-2026 Regular Sessions
IN ASSEMBLY
May 12, 2025
___________
Introduced by M. of A. OTIS -- read once and referred to the Committee
on Science and Technology -- committee discharged, bill amended,
ordered reprinted as amended and recommitted to said committee
AN ACT to amend the state technology law, in relation to automated decision-making by government agencies
The People of the State of New York, represented in Senate and Assembly, do enact as follows:
Section 1. The state technology law is amended by adding a new article 5 to read as follows:

ARTICLE V
AUTOMATED DECISION-MAKING IN GOVERNMENT AGENCIES

Section 501. Definitions.
        502. Disclosure of automated decision-making tools by government agencies.
        503. Impact assessments.
        504. Submission to the governor and legislature.

§ 501. Definitions. For the purpose of this article:
1. "Automated decision-making tool" shall mean any software that uses algorithms, computational models, or artificial intelligence techniques, or a combination thereof, to automate, support, or replace human decision-making. "Automated decision-making tool" shall not include any software used primarily for basic computerized processes, such as calculators, spellcheck tools, autocorrect functions, spreadsheets, electronic communications, or any tool that relates only to internal management affairs such as ordering office supplies or processing payments, and that do not materially affect the rights, liberties, benefits, safety or welfare of any individual within the state. "Automated decision-making tools" shall not include "automated employment decision-making tools" as defined in section four hundred one of this chapter.
23 2. "Meaningful human review" means review, oversight and control of
24 the automated decision-making process by one or more individuals who
understand the risks, limitations, and functionality of, and are trained to use, the automated decision-making tool and who have the authority to intervene or alter the decision under review, including but not limited to the ability to approve, deny, or modify any decision recommended or made by the automated tool.
3. "Government agency" shall mean: (a) the state or civil division thereof; (b) a county, city, town or village; (c) a school district, board of cooperative educational services, vocational education and extension board or a school district as enumerated in section one of chapter five hundred sixty-six of the laws of nineteen hundred sixty-seven, as amended; (d) the state university of New York; (e) the city university of New York; (f) a public improvement or special district including police or fire districts; (g) a public authority, commission or public benefit corporation; or (h) any other public corporation, agency, instrumentality or unit of government which exercises governmental power under the laws of this state.

§ 502. Disclosure of automated decision-making tools by government agencies. Any state agency that utilizes an automated decision-making tool, as defined in section five hundred one of this article, shall publish a list of such automated decision-making tools on such state agency's website no later than the thirtieth of December next succeeding the date on which this section takes effect, and annually thereafter. Such disclosure shall include:
1. a description of the automated decision-making tool utilized by such state agency;
2. the date that the state agency's use of such automated decision-making tool began;
3. a summary of the purpose and use of such automated decision-making tool; and
4. any other information deemed relevant by the agency.

§ 503. Impact assessments. 1. Government agencies seeking to utilize or apply an automated decision-making tool permitted under section five hundred two of this article with continued and operational meaningful human review shall conduct or have conducted an impact assessment substantially completed and bearing the signature of one or more individuals responsible for meaningful human review for the lawful application and use of such automated decision-making tool. Following the first impact assessment, an impact assessment shall be conducted in accordance with this section at least once every two years. An impact assessment shall be conducted prior to any material change to the automated decision-making tool that may change the outcome or effect of such tool. Such impact assessments shall include:
(a) a description of the objectives of the automated decision-making tool;
(b) an evaluation of the ability of the automated decision-making tool to achieve its stated objectives;
(c) a description and evaluation of the objectives and development of the automated decision-making tool, including:
(i) a summary of the underlying algorithms, computational models, and artificial intelligence tools that are used within the automated decision-making tool; and
(ii) the design and training data used to develop the automated decision-making tool process;
(d) testing for:
(i) accuracy, fairness, bias and discrimination, and an assessment of whether the use of the automated decision-making tool produces discriminatory results on the basis of a consumer's or a class of consumers' actual or perceived race, color, ethnicity, religion, national origin, sex, gender, gender identity, sexual orientation, familial status, biometric information, lawful source of income, or disability and outlines mitigations for any identified performance differences in outcomes across relevant groups impacted by such use;
(ii) any cybersecurity vulnerabilities and privacy risks resulting from the deployment and use of the automated decision-making tool, and the development or existence of safeguards to mitigate the risks;
(iii) any public health or safety risks resulting from the deployment and use of the automated decision-making tool;
(iv) any reasonably foreseeable misuse of the automated decision-making tool and the development or existence of safeguards against such misuse;
(e) the extent to which the deployment and use of the automated decision-making tool requires input of sensitive and personal data, how that data is used and stored, and any control users may have over their data; and
(f) the notification mechanism or procedure, if any, by which individuals impacted by the utilization of the automated decision-making tool may be notified of the use of such automated decision-making tool and of the individual's personal data, and informed of their rights and options relating to such use.
2. Notwithstanding the provisions of this article or any other law, if an impact assessment finds that the automated decision-making tool produces discriminatory or biased outcomes, the government agency shall cease any utilization, application, or function of such automated decision-making tool, and of any information produced using such tool.

§ 504. Submission to the governor and legislature. 1. Each impact assessment conducted pursuant to this article shall be submitted to the governor, the temporary president of the senate, and the speaker of the assembly at least thirty days prior to the implementation of the automated decision-making tool that is the subject of such assessment.
2. (a) The impact assessment of an automated decision-making tool shall be published on the website of the relevant government agency.
(b) If the government agency makes a determination that the disclosure of any information required in the impact assessment would result in a substantial negative impact on health or safety of the public, infringe upon the privacy rights of individuals, or significantly impair the government agency's ability to protect its information technology or operational assets, such government agency may redact such information, provided that an explanatory statement on the process by which the government agency made such determination is published along with the redacted impact assessment.
(c) If the impact assessment covers any automated decision-making tool that includes technology that is used to prevent, detect, protect against or respond to security incidents, identity theft, fraud, harassment, malicious or deceptive activities or other illegal activity, preserve the integrity or security of tools, or to investigate, report or prosecute those responsible for any such malicious or deceptive action, such government agency may redact such information for the purposes of this subdivision, provided that an explanatory statement on the process by which the government agency made such determination is published along with the redacted impact assessment.

§ 2. The state technology law is amended by adding a new section 103-f to read as follows:

§ 103-f. Automated decision-making tool inventory. 1. The office shall maintain an inventory of state automated decision-making tools. The office shall issue guidance to state agencies identifying the data elements to be collected and submitted to the office for such inventory, including but not limited to the purpose and uses of such automated decision-making tools. The inventory shall be posted on the New York state open data website on the thirtieth of December next succeeding the date on which this section takes effect, and annually thereafter. State agencies shall submit information required by the office at least sixty days in advance of the annual publication date. The office may withhold certain information if it determines disclosure of this information would jeopardize the security of information technology assets, or as prescribed by article six of the public officers law.
2. For purposes of this section, "automated decision-making tool" shall have the same meaning as the term is defined in section five hundred one of this chapter.
3. The office may request and shall receive from any state agency any information or assistance necessary to carry out its powers and duties under this section.
4. The office shall submit a copy of the automated decision-making tool inventory to the governor, the temporary president of the senate, and the speaker of the assembly.

§ 3. Disclosure of existing automated decision-making tools. Any government agency that, directly or indirectly, utilizes an automated decision-making tool, as defined in section 501 of the state technology law, shall submit to the legislature a disclosure on the use of such tool, no later than one year after the effective date of this section. Such disclosure shall include:
(a) a description of the automated decision-making tool utilized by such agency;
(b) a list of any software vendors related to such automated decision-making tool;
(c) the date that the use of such tool began;
(d) a summary of the purpose and use of such tool, including a description of human decision-making and discretion supported or replaced by the automated decision-making tool;
(e) whether any impact assessments for the automated decision-making tool were conducted and the dates and summaries of the results of such assessments where applicable; and
(f) any other information deemed relevant by the agency.

§ 4. This act shall take effect immediately, provided that section one of this act shall take effect one year after it shall have become a law.