
Colorado's AI Act Is Blocked by a Court Order — But the June 30 Deadline Hasn't Moved
Apr 30, 2026
On April 27, a Colorado magistrate judge blocked enforcement of Colorado's AI Act until the AG finishes rulemaking — but the law's June 30 effective date remains in place, and a legislative rewrite is moving in parallel. Here's what that means for developers and deployers right now.
Colorado's AI Act has a June 30 effective date, a court order blocking enforcement, a working group drafting a full rewrite, and a legislature that hasn't acted yet. If you build or deploy AI systems that affect Colorado consumers, you are navigating the most uncertain state AI compliance window in the country — and the decisions you make in the next 61 days will matter.
What Just Happened: The April 27 Court Order
On Monday, April 27, 2026, a Colorado magistrate judge ordered the Colorado Attorney General not to enforce SB 24-205 — Colorado's Artificial Intelligence Act — until the AG's office completes the rulemaking process implementing the law. The order means that even if the June 30 effective date arrives with the law technically in force, the AG cannot bring enforcement actions against developers or deployers for violations until final rules are adopted. That rulemaking is not yet complete, and the AG's office has not publicly committed to a completion date.
This is a significant development, but it is not a clean reprieve. The court order blocks AG enforcement — it does not suspend the law itself. The June 30 effective date still stands on the statute books. Obligations that attach on June 30 — impact assessments, risk management policies, consumer disclosures, developer documentation requirements — are still live legal obligations once that date passes, even if the AG cannot immediately sue for violations. And separately, a working group convened by the Governor's office is drafting a full repeal and reenactment of the law, with a proposed reset to January 1, 2027. That bill has not been introduced in the legislature. Until it is enacted, SB 24-205 remains the law of Colorado.
What Colorado's AI Act Actually Requires
Who it covers: developers and deployers of high-risk AI systems.
SB 24-205 draws a distinction between developers — companies that build and sell AI systems — and deployers — companies that integrate and use AI systems to make consequential decisions about Colorado consumers. A "high-risk artificial intelligence system" is one that makes, or is a substantial factor in making, a "consequential decision" about a consumer. Consequential decisions include employment decisions (hiring, termination, compensation), housing (rental or purchase access and terms), credit and financial services, health care, insurance, education enrollment, and access to legal services. If your product makes or materially influences these decisions about Colorado residents, you are in scope — regardless of where your company is incorporated or headquartered.
Developer obligations: documentation, disclosure, and 90-day AG notification.
Developers of high-risk AI systems must provide deployers with comprehensive documentation — model cards, dataset cards, training data summaries, known harmful uses, technical capabilities and limitations, and materials necessary for deployers to conduct their own impact assessments. Developers must maintain a publicly available statement or use-case inventory describing the types of high-risk systems they offer and how they manage algorithmic discrimination risks. If a developer discovers a risk of algorithmic discrimination in a deployed system, they must notify the Colorado AG and all known deployers within 90 days of discovery.
Deployer obligations: risk management programs, annual impact assessments, consumer rights.
Deployers must establish and maintain a risk management policy and program — aligned with the NIST AI Risk Management Framework (AI RMF) or an equivalent recognized framework — governing each high-risk AI system they deploy. They must complete an initial impact assessment and repeat it annually and within 90 days of any substantial modification to the system. They must disclose to consumers when an AI system has made an adverse consequential decision, provide a plain-language explanation of that decision, and offer the ability to appeal the decision through human review if technically feasible. Deployers must also maintain a public summary of the types of high-risk systems they deploy and how algorithmic discrimination risks are managed.
Penalties: up to $20,000 per violation.
Violations of SB 24-205 constitute deceptive trade practices under the Colorado Consumer Protection Act (CCPA), subject to civil penalties of up to $20,000 per violation. The AG has exclusive enforcement authority — meaning private lawsuits by consumers are not available, but AG enforcement is. Once the rulemaking is finalized and the court order no longer blocks enforcement, the AG can bring actions for any violations that occurred after June 30, even if enforcement was delayed.
The affirmative defense: NIST AI RMF compliance.
The law provides an affirmative defense for developers and deployers who discover and cure violations through their own compliance programs, provided they are also in material compliance with the NIST AI Risk Management Framework or another designated standard. This defense means that building a documented, NIST-aligned AI governance program is not just good practice — it is the specific legal shield the statute provides.
The Legislative Wild Card: The Proposed Rewrite
The Governor's working group — which includes legislators, industry representatives, consumer advocates, and school district stakeholders — has circulated a draft that would repeal and reenact SB 24-205 in a substantially revised form, refocusing the law on "automated decision-making technology" (ADMT) and resetting the effective date to January 1, 2027. The draft removes the standalone affirmative duty to "avoid algorithmic discrimination" that appeared in the original law, signals a narrower and more decision-focused compliance framework, and gives businesses additional time to implement.
But this bill has not been introduced, has not passed, and has no guaranteed timeline. Colorado's legislative session has limited floor time. Until the legislature acts, the current SB 24-205 — court order and all — is the operative law. Companies that assume the rewrite will pass and delay their compliance work are betting on a legislative outcome that is not certain.
What This Means for AI Founders, Deployers, and Colorado Businesses
If you are a startup building an AI product that touches employment decisions, credit decisions, healthcare decisions, or educational access — and you have any Colorado users — SB 24-205 applies to you. The court order does not change your substantive obligations; it only delays the AG's ability to enforce them. And that delay could be short. If the AG's rulemaking is completed before the legislative session ends, enforcement authority is restored immediately.
For larger companies deploying AI systems across multiple product lines — HR tools, underwriting algorithms, loan decisioning systems, clinical decision support — the compliance footprint under SB 24-205 is substantial. Annual impact assessments, risk management program documentation, consumer disclosure infrastructure, and 90-day AG notification protocols must all be designed and operational by June 30 under the current law, even if enforcement is temporarily suspended.
For AI developers selling into Colorado — platform companies, foundation model providers, enterprise AI vendors — the documentation obligations to your deployer customers are your most immediate action item. Deployers cannot complete their own impact assessments without the materials you are required to provide. If you have Colorado-based enterprise customers using your AI system for consequential decisions, their compliance depends in part on your documentation.
What Founders and Deployers Should Do Right Now
Audit your AI systems against the "high-risk" definition immediately. Does your product make, or substantially influence, consequential decisions about employment, housing, credit, healthcare, insurance, education, or legal services for Colorado consumers? If yes, you are in scope. If you are uncertain, the answer is almost certainly yes — the definition is written broadly and the AG has rule-making authority to expand it further.
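As a first-pass triage aid, the scoping question above can be sketched as a simple checklist. This is a hypothetical illustration, not legal advice: the category names paraphrase the statute's "consequential decision" list, and the function names are our own.

```python
# Hypothetical sketch of a first-pass SB 24-205 scoping check.
# Categories paraphrase the statute's "consequential decision" list;
# a real scoping analysis requires counsel, not a set intersection.
CONSEQUENTIAL_DECISION_CATEGORIES = {
    "employment",      # hiring, termination, compensation
    "housing",         # rental or purchase access and terms
    "credit",          # credit and financial services
    "healthcare",
    "insurance",
    "education",       # enrollment and access
    "legal_services",
}

def likely_in_scope(decision_categories: set[str],
                    serves_colorado_consumers: bool) -> bool:
    """Rough test: does the system make, or substantially influence,
    a consequential decision about Colorado consumers?"""
    if not serves_colorado_consumers:
        return False
    return bool(CONSEQUENTIAL_DECISION_CATEGORIES & decision_categories)

# A resume-screening tool with Colorado users is in scope:
print(likely_in_scope({"employment"}, serves_colorado_consumers=True))  # True
```

Note that a "no" from a checklist like this is not a safe harbor — the statutory definition is broad, and the AG can expand it by rule.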
Start building your NIST AI RMF-aligned risk management program now. This is both a compliance obligation and your affirmative defense. A documented, operationalized AI risk management program aligned to NIST's AI RMF is the specific legal shield the statute provides. It also positions you for compliance with California's 2027 requirements, EU AI Act obligations, and whatever the rewritten Colorado law ultimately requires. Build it to be durable, not jurisdiction-specific.
Developers: prepare your deployer documentation packages. Model cards, dataset documentation, known-use-case descriptions, algorithmic discrimination risk materials — these need to be ready and deliverable to your enterprise customers. If you cannot provide them, your customers cannot comply.
Deployers: begin your initial impact assessment now. The law requires completion within 90 days of the effective date — meaning by September 28, 2026 under current law, or sooner if you are using the system on June 30. The assessment requires information from your developer that may take time to obtain. Start the process now, not in September.
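As a quick sanity check on the date math above, the 90-day windows can be computed directly. The reassessment helper is our own sketch of how a compliance calendar might track the substantial-modification clock; it is not language from the statute.

```python
from datetime import date, timedelta

# SB 24-205 effective date under current law
EFFECTIVE_DATE = date(2026, 6, 30)

# Initial impact assessment: due within 90 days of the effective date
initial_assessment_deadline = EFFECTIVE_DATE + timedelta(days=90)
print(initial_assessment_deadline)  # 2026-09-28

# Sketch: a substantial modification restarts a 90-day assessment clock
def reassessment_deadline(modification_date: date) -> date:
    return modification_date + timedelta(days=90)
```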
Watch the legislative calendar closely. If the repeal-and-rewrite bill is introduced and advances in the Colorado legislature this session, it changes the compliance timeline materially. We are tracking the bill's status closely and will update clients the day it is introduced. Do not base your compliance planning on a legislative outcome that has not happened.
Strategic Takeaway
Opportunity → Colorado is the first U.S. state to enact comprehensive high-risk AI regulation. Companies that build genuine NIST-aligned AI governance programs now are not just complying with SB 24-205 — they are building the compliance infrastructure that federal AI regulation (when it comes) and other state laws will require. First movers on AI governance have a credibility advantage with enterprise customers, regulators, and investors that is becoming a real commercial differentiator in 2026.
Risk → The court order and legislative uncertainty are creating a false sense of reprieve for Colorado AI companies. The AG enforcement block is temporary and conditional. The law's obligations are real. And companies that wait for the rewrite — which may or may not pass, on an unknown timeline — may find themselves scrambling to comply with a new law that has a January 2027 effective date and no grandfather period for prior-practice systems.
What Comes Next
Watch three simultaneous tracks: (1) the AG's rulemaking process — once the rules are finalized, enforcement authority is restored and the court order becomes moot; (2) the legislative session — whether and when the repeal-and-rewrite bill is introduced and advances; and (3) the June 30 effective date itself — even with the court order in place, obligations attach on that date and retroactive enforcement is possible once the rulemaking is complete. At Launch Legal, we are tracking all three and advising Colorado AI companies on compliance architecture that works across all likely outcomes.
Bottom Line
Colorado's AI Act is in legal and legislative flux. The court order buys time on enforcement, not on obligations. The June 30 deadline is still on the books. The rewrite is still a draft. The right posture for any company deploying high-risk AI to Colorado consumers is to build toward compliance now — not because enforcement is certain, but because the alternative is designing your AI governance program under emergency conditions when the window closes.
Learn More
Colorado SB 24-205 — Full Bill Text (Colorado General Assembly)
Colorado's AI Law Coming Online: What Developers and Deployers Should Know — Brownstein
State AI Laws — Where Are They Now? (April 24, 2026) — Cooley
Colorado AI Act: Postponed Implementation History — Akin Gump
Colorado AI Act: Developer and Deployer Obligations — White & Case