Right to Opt Out of Profiling

A consumer right recognized by 18 of the 20 comprehensive US state privacy laws to decline being subject to automated decision-making that produces legal or similarly significant effects — such as denial of credit, housing, insurance, or employment.

Overview

The Right to Opt Out of Profiling lets a consumer refuse to be subject to solely automated decisions that produce legal or similarly significant effects. "Profiling" in this context is narrower than the everyday meaning: it refers to automated processing that evaluates, analyzes, or predicts aspects of a natural person and feeds into a consequential decision.

The right is rooted in GDPR Article 22 but has been adopted, with broader carve-outs, by nearly every US state privacy law. The CCPA/CPRA regulatory overlay from the California Privacy Protection Agency adds a stricter layer in the form of the ADMT (Automated Decision-Making Technology) regulations.

The right does not require the controller to abandon automation. It requires the controller to give the consumer an opt-out mechanism and, depending on the state, a human-review alternative when the decision is significant enough.

When It Applies

The opt-out right applies when all three conditions are met:

  1. Solely or largely automated processing — a human does not meaningfully review the decision before it takes effect
  2. Profiling — the system evaluates, analyzes, or predicts aspects of the consumer (creditworthiness, reliability, behavior, health, preferences, movements)
  3. Legal or similarly significant effects — denial of, or access to, financial services, housing, insurance, education, employment, healthcare, essential goods, or criminal-justice outcomes

Examples that trigger the right:

  • Algorithmic underwriting for insurance premiums or eligibility
  • Automated tenant-screening decisions in commercial real estate
  • Resume-filtering systems in recruitment
  • Algorithmic credit scoring used for loan approval
  • Dynamic pricing based on consumer attributes

Examples that typically do not trigger the right:

  • Content recommendation on a streaming service (no legal or similarly significant effect)
  • Fraud-detection scoring that flags a transaction for human review (not "solely" automated)
  • Aggregate analytics used for business planning (no individual decision)
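
Expressed in code, the test is a simple conjunction of the three conditions. A minimal TypeScript sketch; the types, field names, and effect categories are hypothetical illustrations, not statutory definitions:

    // Hypothetical three-part trigger test; the types and categories
    // are illustrative, not statutory text.
    type SignificantEffect =
      | "financial_services" | "housing" | "insurance" | "education"
      | "employment" | "healthcare" | "essential_goods" | "criminal_justice";

    interface Decision {
      humanMeaningfullyReviews: boolean;  // condition 1
      evaluatesPersonalAspects: boolean;  // condition 2 ("profiling")
      effect: SignificantEffect | "none"; // condition 3
    }

    function profilingOptOutApplies(d: Decision): boolean {
      return (
        !d.humanMeaningfullyReviews &&   // solely or largely automated
        d.evaluatesPersonalAspects &&    // profiling
        d.effect !== "none"              // legal or similarly significant effect
      );
    }

    // Algorithmic underwriting triggers the right...
    profilingOptOutApplies({
      humanMeaningfullyReviews: false,
      evaluatesPersonalAspects: true,
      effect: "insurance",
    }); // true

    // ...while fraud scoring that only flags for human review does not.
    profilingOptOutApplies({
      humanMeaningfullyReviews: true,
      evaluatesPersonalAspects: true,
      effect: "financial_services",
    }); // false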

Variations Across Jurisdictions

From docs/privacy-knowledge/consolidated/COMPLIANCE_MATRIX.md Table 1:

State | Right Available | Notable Nuance
California (CCPA/CPRA) | Yes | ADMT regulations impose additional pre-use notice, access, opt-out, and human-review obligations
Virginia, Colorado, Connecticut | Yes | Standard framework
Colorado AI Act (SB205) | Yes | Parallel high-risk AI obligations; DPIAs interact with CPA DPIAs
Utah | No | UCPA omits the profiling opt-out entirely
Iowa | No | Similarly minimal
Oregon, Texas, Florida, Montana, Delaware, Nebraska, New Hampshire, New Jersey, Tennessee, Minnesota, Maryland, Indiana, Kentucky, Rhode Island | Yes | Standard framework
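
In code, the matrix reduces to a lookup that a DSAR intake flow can consult before routing a request. A hypothetical sketch; the constant and field names are illustrative, not from the Inori codebase, and most "standard framework" states are omitted for brevity:

    // Hypothetical encoding of Table 1 for DSAR routing.
    type ProfilingOptOut = { available: boolean; nuance?: string };

    const PROFILING_OPT_OUT: Record<string, ProfilingOptOut> = {
      CA: { available: true,  nuance: "ADMT: pre-use notice, access, opt-out, human review" },
      CO: { available: true,  nuance: "SB205 high-risk AI obligations layer on top" },
      VA: { available: true },
      UT: { available: false, nuance: "UCPA omits the right" },
      IA: { available: false },
    };

    function mustHonorProfilingOptOut(state: string): boolean {
      return PROFILING_OPT_OUT[state]?.available ?? false;
    }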

California's ADMT rules go further than any peer state: controllers must provide a pre-use notice before deploying ADMT, give consumers access to information about the system's logic, and offer a human-review alternative for consequential decisions.

Colorado layers its SB205 AI Act on top: for "high-risk AI systems" (a category that covers most employment, housing, and insurance automation), deployers must publish a statement about the system, provide consumer notice, and conduct algorithmic impact assessments in addition to the CPA DPIA.

How Inori Handles This

Inori uses automated processing in two narrow contexts: (i) AI-assisted extraction of coverage data from certificates via Claude Haiku, and (ii) automated compliance scoring of vendor records.
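
To make the second point concrete before the grounding list below: deterministic rule-matching means every rule is a fixed predicate over the record, so identical inputs always yield identical scores and no model predicts anything about a person. A hypothetical TypeScript sketch; the rules and thresholds are illustrative, not Inori's actual heuristic:

    // Hypothetical deterministic compliance scoring: fixed predicates
    // over certificate data, not probabilistic profiling of a person.
    interface CoverageRecord {
      generalLiabilityLimit: number; // USD
      expiresAt: Date;
      additionalInsured: boolean;
    }

    const RULES: Array<{ id: string; passes: (c: CoverageRecord) => boolean }> = [
      { id: "gl-limit",           passes: (c) => c.generalLiabilityLimit >= 1_000_000 },
      { id: "not-expired",        passes: (c) => c.expiresAt > new Date() },
      { id: "additional-insured", passes: (c) => c.additionalInsured },
    ];

    function complianceScore(c: CoverageRecord): { score: number; failed: string[] } {
      const failed = RULES.filter((r) => !r.passes(c)).map((r) => r.id);
      return { score: (RULES.length - failed.length) / RULES.length, failed };
    }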

Grounding in code:

  • Scope — the extraction step is pre-decision (a human reviewer confirms before action); the compliance score is deterministic rule-matching, not probabilistic profiling. Neither currently meets the "solely automated + consequential effect" threshold for profiling opt-out under any state law.
  • Pre-use notice — src/content/legal/privacy.mdx v1.2 discloses the use of AI extraction in the "Automated Processing" section, per California ADMT's anticipated notice requirement.
  • Human review — the review_status enum in certificates (migration 014_certificate_review.sql) plus /api/certificates/[id]/review enforce a human checkpoint before extracted data drives any downstream compliance decision (see the sketch after this list)
  • Opt-out channel — if a customer-controller delegates automated decisioning to Inori (future capability), the DSAR flow at src/app/api/dsar/ accepts profiling-opt-out requests and routes them on the statutory 45-day response timeline
  • Heuristic versioning — certificates.guard_version (SP17) provides the auditable record of which version of the compliance heuristic was applied to each decision.
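
A minimal sketch of the human-review checkpoint, assuming a Next.js route handler behind /api/certificates/[id]/review. Only the route path and the review_status enum name come from the list above; the enum values, payload shape, and data-layer helper are assumptions, not the actual implementation:

    // Hypothetical handler for /api/certificates/[id]/review. The db
    // stand-in below replaces the app's real persistence layer.
    import { NextResponse } from "next/server";

    type ReviewStatus = "pending_review" | "approved" | "rejected";

    declare const db: {
      certificates: {
        update(id: string, patch: Record<string, unknown>): Promise<void>;
      };
    };

    export async function POST(
      req: Request,
      { params }: { params: { id: string } }
    ) {
      const { decision, reviewerId } = (await req.json()) as {
        decision: Exclude<ReviewStatus, "pending_review">;
        reviewerId: string;
      };

      // Record the human decision; downstream compliance logic only
      // consumes certificates whose review_status is "approved".
      await db.certificates.update(params.id, {
        review_status: decision,
        reviewed_by: reviewerId,
        reviewed_at: new Date(),
      });

      return NextResponse.json({ id: params.id, review_status: decision });
    }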

Related Concepts

Profiling opt-out is frequently bundled with Sensitive Personal Information analysis — automated decisions using SPI are the highest-risk category. A DPIA is effectively mandatory before deploying any profiling system. California's ADMT regime sits inside the broader CCPA/CPRA framework.


Related Terms

Data Protection Impact Assessment (DPIA)

A documented risk analysis required before processing activities that present a heightened risk to consumers — such as profiling, targeted advertising, sale of personal data, or processing of sensitive categories.

Sensitive Personal Information (SPI)

Categories of personal data that receive heightened protection under state privacy laws — including race, health, biometric, genetic, precise geolocation, sexual orientation, immigration status, and children's data — typically requiring opt-in consent.

CCPA / CPRA (California Consumer Privacy Act / California Privacy Rights Act)

California's comprehensive consumer privacy laws giving residents the right to know, delete, correct, and opt out of the sale or sharing of their personal information. CPRA amended and expanded CCPA effective January 1, 2023.

Colorado Privacy Act (CPA)

Colorado's comprehensive privacy law — the third state after California and Virginia — notable for being the first to formally approve Global Privacy Control as a Universal Opt-Out Mechanism and for pairing with the Colorado AI Act.