ALGORITHMIC DECISION-MAKING DISCLOSURE
ORGANIZATION INFORMATION
| Field | Information |
|---|---|
| Organization Name | [ORGANIZATION NAME] |
| Address | [FULL ADDRESS] |
| Privacy Contact | [NAME, EMAIL, PHONE] |
| AI/Technology Contact | [NAME, EMAIL, PHONE] |
| Disclosure Effective Date | [DATE] |
| Last Updated | [DATE] |
| Version | [VERSION NUMBER] |
PURPOSE OF THIS DISCLOSURE
This disclosure explains how [ORGANIZATION NAME] uses automated decision-making systems, including artificial intelligence (AI) and algorithmic tools, to make or assist with decisions that may affect you. We are committed to transparency about our use of these technologies.
This disclosure is provided in accordance with:
☐ EU General Data Protection Regulation (GDPR) Article 22
☐ EU Artificial Intelligence Act (Regulation (EU) 2024/1689)
☐ Colorado AI Act (SB 24-205)
☐ California Consumer Privacy Act (CCPA/CPRA)
☐ California AI Transparency Act (SB 942)
☐ Illinois Artificial Intelligence Video Interview Act
☐ NYC Local Law 144 (Automated Employment Decision Tools)
☐ Other applicable laws: [SPECIFY]
SECTION 1: OVERVIEW OF AUTOMATED DECISION-MAKING
1.1 What is Automated Decision-Making?
Automated decision-making (ADM) refers to the process of making decisions wholly or partly by technological means. This includes:
- Fully Automated Decisions: Decisions made entirely by technology without any human review
- AI-Assisted Decisions: Decisions where technology provides recommendations or scores that humans consider
- Algorithmic Profiling: Creating predictions about individuals based on analysis of their data
1.2 Our Commitment to Transparency
We believe you have the right to know when automated systems influence decisions about you. This disclosure provides:
- Information about which automated systems we use
- How these systems work and what data they use
- What decisions they influence
- Your rights regarding these decisions
- How to request human review or appeal decisions
SECTION 2: AUTOMATED SYSTEMS WE USE
2.1 System Inventory
The following automated decision-making systems are currently in use:
SYSTEM 1: [SYSTEM NAME]
| Attribute | Description |
|---|---|
| System Name | [NAME] |
| Provider | ☐ Internal ☐ Third-Party: [VENDOR NAME] |
| Decision Area | [e.g., Employment, Credit, Insurance, Services] |
| Risk Level | ☐ High-Risk ☐ Limited Risk ☐ Minimal Risk |
| Human Involvement | ☐ Fully Automated ☐ AI-Assisted with Human Review |
Purpose and Use:
[Describe the purpose of this system and how it is used in decision-making]
Data Inputs:
The system uses the following types of information:
- ☐ Personal identifying information
- ☐ Demographic information
- ☐ Employment/educational history
- ☐ Financial information
- ☐ Behavioral data
- ☐ Historical transaction data
- ☐ Third-party data
- ☐ Other: [SPECIFY]
How the System Works:
[Provide a plain-language explanation of how the system processes data and reaches outputs]
Decisions Affected:
This system influences the following decisions:
- [DECISION TYPE 1]
- [DECISION TYPE 2]
- [DECISION TYPE 3]
Human Oversight:
☐ All outputs are reviewed by a human before final decisions
☐ Outputs above certain thresholds trigger human review
☐ Humans can override system recommendations
☐ Periodic audits of system decisions
☐ Other: [DESCRIBE]
SYSTEM 2: [SYSTEM NAME]
| Attribute | Description |
|---|---|
| System Name | [NAME] |
| Provider | ☐ Internal ☐ Third-Party: [VENDOR NAME] |
| Decision Area | [e.g., Employment, Credit, Insurance, Services] |
| Risk Level | ☐ High-Risk ☐ Limited Risk ☐ Minimal Risk |
| Human Involvement | ☐ Fully Automated ☐ AI-Assisted with Human Review |
Purpose and Use:
[Describe the purpose of this system and how it is used in decision-making]
Data Inputs:
The system uses the following types of information:
- ☐ Personal identifying information
- ☐ Demographic information
- ☐ Employment/educational history
- ☐ Financial information
- ☐ Behavioral data
- ☐ Historical transaction data
- ☐ Third-party data
- ☐ Other: [SPECIFY]
How the System Works:
[Provide a plain-language explanation of how the system processes data and reaches outputs]
Decisions Affected:
This system influences the following decisions:
- [DECISION TYPE 1]
- [DECISION TYPE 2]
- [DECISION TYPE 3]
Human Oversight:
☐ All outputs are reviewed by a human before final decisions
☐ Outputs above certain thresholds trigger human review
☐ Humans can override system recommendations
☐ Periodic audits of system decisions
☐ Other: [DESCRIBE]
SYSTEM 3: [SYSTEM NAME]
[Repeat format for additional systems]
2.2 Decision Categories
We use automated decision-making in the following areas:
Employment Decisions
☐ We use automated systems for employment-related decisions
| Decision Type | System Used | Fully Automated? | Human Review? |
|---|---|---|---|
| ☐ Resume screening | [SYSTEM] | ☐ Yes ☐ No | ☐ Yes ☐ No |
| ☐ Video interview analysis | [SYSTEM] | ☐ Yes ☐ No | ☐ Yes ☐ No |
| ☐ Skills assessment | [SYSTEM] | ☐ Yes ☐ No | ☐ Yes ☐ No |
| ☐ Background screening | [SYSTEM] | ☐ Yes ☐ No | ☐ Yes ☐ No |
| ☐ Promotion decisions | [SYSTEM] | ☐ Yes ☐ No | ☐ Yes ☐ No |
| ☐ Performance evaluation | [SYSTEM] | ☐ Yes ☐ No | ☐ Yes ☐ No |
| ☐ Compensation decisions | [SYSTEM] | ☐ Yes ☐ No | ☐ Yes ☐ No |
| ☐ Termination decisions | [SYSTEM] | ☐ Yes ☐ No | ☐ Yes ☐ No |
Financial/Credit Decisions
☐ We use automated systems for financial decisions
| Decision Type | System Used | Fully Automated? | Human Review? |
|---|---|---|---|
| ☐ Credit decisions | [SYSTEM] | ☐ Yes ☐ No | ☐ Yes ☐ No |
| ☐ Loan applications | [SYSTEM] | ☐ Yes ☐ No | ☐ Yes ☐ No |
| ☐ Interest rate setting | [SYSTEM] | ☐ Yes ☐ No | ☐ Yes ☐ No |
| ☐ Credit limit decisions | [SYSTEM] | ☐ Yes ☐ No | ☐ Yes ☐ No |
| ☐ Fraud detection | [SYSTEM] | ☐ Yes ☐ No | ☐ Yes ☐ No |
Insurance Decisions
☐ We use automated systems for insurance decisions
| Decision Type | System Used | Fully Automated? | Human Review? |
|---|---|---|---|
| ☐ Underwriting | [SYSTEM] | ☐ Yes ☐ No | ☐ Yes ☐ No |
| ☐ Premium calculation | [SYSTEM] | ☐ Yes ☐ No | ☐ Yes ☐ No |
| ☐ Claims processing | [SYSTEM] | ☐ Yes ☐ No | ☐ Yes ☐ No |
| ☐ Risk assessment | [SYSTEM] | ☐ Yes ☐ No | ☐ Yes ☐ No |
Service Access Decisions
☐ We use automated systems for service-related decisions
| Decision Type | System Used | Fully Automated? | Human Review? |
|---|---|---|---|
| ☐ Account approval | [SYSTEM] | ☐ Yes ☐ No | ☐ Yes ☐ No |
| ☐ Service eligibility | [SYSTEM] | ☐ Yes ☐ No | ☐ Yes ☐ No |
| ☐ Pricing/offers | [SYSTEM] | ☐ Yes ☐ No | ☐ Yes ☐ No |
| ☐ Content recommendations | [SYSTEM] | ☐ Yes ☐ No | ☐ Yes ☐ No |
Healthcare Decisions
☐ We use automated systems for healthcare-related decisions
| Decision Type | System Used | Fully Automated? | Human Review? |
|---|---|---|---|
| ☐ Treatment recommendations | [SYSTEM] | ☐ Yes ☐ No | ☐ Yes ☐ No |
| ☐ Diagnosis support | [SYSTEM] | ☐ Yes ☐ No | ☐ Yes ☐ No |
| ☐ Coverage determinations | [SYSTEM] | ☐ Yes ☐ No | ☐ Yes ☐ No |
| ☐ Appointment scheduling | [SYSTEM] | ☐ Yes ☐ No | ☐ Yes ☐ No |
SECTION 3: THE LOGIC BEHIND OUR SYSTEMS
3.1 General Methodology
Our automated decision-making systems generally work by:
- Data Collection: Gathering relevant information from various sources
- Data Processing: Analyzing information using algorithms and models
- Output Generation: Producing scores, recommendations, or decisions
- Human Review: [If applicable] Review by qualified personnel
- Final Decision: Implementation of the decision
3.2 Factors Considered
Our systems consider factors such as:
Factors That May Positively Influence Decisions:
- [FACTOR 1]
- [FACTOR 2]
- [FACTOR 3]
Factors That May Negatively Influence Decisions:
- [FACTOR 1]
- [FACTOR 2]
- [FACTOR 3]
Factors We Do NOT Consider:
We do not use the following factors in our automated decisions:
- Race, ethnicity, or national origin
- Religion
- Gender identity or sexual orientation
- Disability status (except where legally permitted and relevant)
- [OTHER PROTECTED CHARACTERISTICS]
3.3 Limitations and Accuracy
We acknowledge that our automated systems:
- Are not perfect and may make errors
- May not account for all relevant individual circumstances
- Require ongoing monitoring and adjustment
- Benefit from human oversight and review
Known Limitations:
[Describe known limitations of the systems]
Accuracy Information:
[Provide available accuracy metrics or performance information]
SECTION 4: DATA SOURCES AND RETENTION
4.1 Data Sources
Our automated systems use data from:
| Source Type | Description | System(s) Using |
|---|---|---|
| ☐ Information you provide directly | [DESCRIBE] | [SYSTEMS] |
| ☐ Information from your use of our services | [DESCRIBE] | [SYSTEMS] |
| ☐ Information from third parties | [DESCRIBE] | [SYSTEMS] |
| ☐ Publicly available information | [DESCRIBE] | [SYSTEMS] |
| ☐ Information from affiliated companies | [DESCRIBE] | [SYSTEMS] |
4.2 Data Retention
| Data Type | Retention Period | Purpose |
|---|---|---|
| Input data | [PERIOD] | [PURPOSE] |
| System outputs | [PERIOD] | [PURPOSE] |
| Decision records | [PERIOD] | [PURPOSE] |
| Audit logs | [PERIOD] | [PURPOSE] |
SECTION 5: YOUR RIGHTS
5.1 Right to Information
You have the right to:
☐ Know when automated decision-making is used
☐ Understand the logic involved in the decision
☐ Know the significance and potential consequences
☐ Receive information about the data used
5.2 Right to Human Review
You may request human review of automated decisions:
When Human Review is Available:
- For decisions that significantly affect you
- When required by applicable law
- When our policies permit appeal
How to Request Human Review:
[PROVIDE CONTACT INFORMATION AND PROCESS]
Response Time: We will respond to human review requests within [TIMEFRAME]
5.3 Right to Contest Decisions
You may contest automated decisions by:
- Submitting a written request to [CONTACT]
- Explaining why you believe the decision is incorrect
- Providing any additional information or context
- Awaiting our review and response
Appeal Process:
[DESCRIBE THE APPEAL PROCESS]
5.4 Right to Opt Out
In certain circumstances, you may have the right to opt out of automated decision-making:
☐ Opt-out available for: [DESCRIBE CIRCUMSTANCES]
☐ Opt-out not available because: [EXPLAIN LEGAL BASIS]
To Opt Out:
[PROVIDE PROCESS FOR OPTING OUT]
Consequences of Opting Out:
[DESCRIBE ANY CONSEQUENCES OR ALTERNATIVE PROCESSES]
5.5 Additional Rights Under Specific Laws
Under GDPR (EU/EEA residents):
- Right not to be subject to decisions based solely on automated processing that produce legal or similarly significant effects
- Right to obtain human intervention
- Right to express your point of view
- Right to contest the decision
Under Colorado AI Act (Colorado residents):
- Right to be informed of AI use in consequential decisions
- Right to an explanation of how AI contributed to the decision
- Right to appeal and obtain human review
- Right to correct personal data used in the decision
Under CCPA/CPRA (California residents):
- Right to know about automated decision-making
- Right to access the categories of information used
- Right to opt out of certain profiling activities
SECTION 6: FAIRNESS AND BIAS PREVENTION
6.1 Our Commitment to Fairness
We are committed to ensuring our automated systems operate fairly and do not discriminate. Our measures include:
☐ Regular bias testing and audits
☐ Diverse data sets for system training
☐ Human review of high-stakes decisions
☐ Ongoing monitoring for disparate impact
☐ Third-party audits (where applicable)
☐ Employee training on fair use of AI systems
6.2 Bias Testing and Audits
Testing Frequency: [DESCRIBE FREQUENCY]
Testing Methodology: [DESCRIBE METHODOLOGY]
Most Recent Audit Date: [DATE]
Audit Results: [SUMMARIZE OR PROVIDE LINK TO RESULTS]
6.3 Reporting Concerns
If you believe you have experienced unfair treatment due to our automated systems:
Contact: [CONTACT INFORMATION]
What to Include:
- Your name and contact information
- Description of the decision or interaction
- Why you believe it was unfair
- Any supporting documentation
Response Time: We will investigate and respond within [TIMEFRAME]
SECTION 7: CONTACT INFORMATION
7.1 Questions About This Disclosure
For questions about this disclosure or our automated decision-making practices:
| Contact Type | Information |
|---|---|
| Email | [EMAIL ADDRESS] |
| Phone | [PHONE NUMBER] |
| Mail | [MAILING ADDRESS] |
| Online Form | [URL] |
7.2 To Exercise Your Rights
To request human review, contest a decision, or exercise other rights:
| Contact Type | Information |
|---|---|
| Email | [EMAIL ADDRESS] |
| Phone | [PHONE NUMBER] |
| Online Form | [URL] |
7.3 To Report Concerns
To report concerns about fairness or potential discrimination:
| Contact Type | Information |
|---|---|
| Email | [EMAIL ADDRESS] |
| Phone | [PHONE NUMBER] |
| Anonymous Reporting | [MECHANISM IF AVAILABLE] |
7.4 Data Protection Authority (EU/EEA)
For GDPR-related concerns, you may also contact:
Supervisory Authority: [NAME OF AUTHORITY]
Website: [URL]
SECTION 8: UPDATES TO THIS DISCLOSURE
We will update this disclosure as our automated decision-making practices change or as laws evolve.
How We Notify You of Changes:
☐ Posted on our website
☐ Email notification
☐ Notice in our app/service
☐ Other: [DESCRIBE]
Change Log:
| Date | Version | Description of Changes |
|---|---|---|
| [DATE] | [VERSION] | Initial disclosure |
| [DATE] | [VERSION] | [DESCRIBE CHANGES] |
| [DATE] | [VERSION] | [DESCRIBE CHANGES] |
SECTION 9: DEFINITIONS
Algorithm: A set of rules or instructions that a computer follows to solve a problem or make a decision.
Artificial Intelligence (AI): Computer systems designed to perform tasks that typically require human intelligence, such as learning, problem-solving, and decision-making.
Automated Decision-Making: The process of making or supporting decisions by technological means, ranging from fully automated decisions (no human involvement) to AI-assisted decisions subject to human review.
Consequential Decision: A decision that has a material legal or similarly significant effect on an individual, such as decisions about employment, credit, insurance, housing, education, or criminal justice.
High-Risk AI System: An AI system classified as high-risk under the EU AI Act or similar regulations, typically involving significant impacts on health, safety, or fundamental rights.
Human-in-the-Loop: A process where a human reviews and can override automated system outputs before they become final decisions.
Profiling: Automated processing of personal data to evaluate certain aspects about a person, including analyzing or predicting their performance, economic situation, health, preferences, interests, behavior, or location.
APPENDIX A: LEGAL BASIS FOR AUTOMATED DECISION-MAKING
EU/GDPR Legal Basis
Under GDPR Article 22, we rely on the following legal bases for automated decision-making:
☐ Contract Necessity: The automated decision is necessary for entering into or performing a contract with you.
☐ Legal Obligation: The automated decision is authorized by EU or member state law.
☐ Explicit Consent: You have given explicit consent to the automated decision-making.
☐ Not Applicable: Our systems do not make decisions based solely on automated processing that produce legal or similarly significant effects (human review is always involved).
U.S. Legal Basis
Our automated decision-making practices comply with applicable U.S. federal and state laws, including:
- Fair Credit Reporting Act (FCRA) for consumer reports
- Equal Credit Opportunity Act (ECOA) for credit decisions
- Fair Housing Act for housing-related decisions
- Title VII and state employment discrimination laws
- State-specific AI and automated decision-making laws
APPENDIX B: REQUEST FORMS
Human Review Request Form
Your Information:
Name: _________________________________
Contact Email: _________________________________
Contact Phone: _________________________________
Decision Information:
Decision Date: _________________________________
Decision Type: _________________________________
Reference/ID Number (if any): _________________________________
Your Request:
☐ I request human review of this automated decision
☐ I request an explanation of the logic involved
☐ I request access to the data used in the decision
☐ I wish to contest the decision
☐ I wish to provide additional information
☐ Other: _________________________________
Reason for Request:
[SPACE FOR EXPLANATION]
Supporting Documentation Attached:
☐ Yes (describe): _________________________________
☐ No
Signature: _________________________________
Date: _________________________________
Opt-Out Request Form
Your Information:
Name: _________________________________
Contact Email: _________________________________
Contact Phone: _________________________________
Opt-Out Request:
I request to opt out of automated decision-making for:
☐ All automated decisions (where legally permitted)
☐ Specific decision types: _________________________________
☐ Specific systems: _________________________________
Acknowledgment:
I understand that opting out may result in:
[LIST CONSEQUENCES]
☐ I acknowledge the above and wish to proceed with my opt-out request
Signature: _________________________________
Date: _________________________________
This Algorithmic Decision-Making Disclosure template is provided for informational purposes. Organizations should customize it based on their specific automated systems, applicable laws, and legal counsel advice.
About This Template
Jurisdiction-Specific
This template is drafted for general use across U.S. jurisdictions and includes provisions addressing EU requirements. State-specific versions with local statutory references are also available.
How It's Made
Drafted using current statutory databases and regulatory compliance standards. Each template includes proper legal citations, defined terms, and standard protective clauses.
Important Notice
This template is provided for informational purposes. It is not legal advice. We recommend having an attorney review any legal document before signing, especially for high-value or complex matters.
Last updated: February 2026