On May 17, 2024, Colorado Governor Jared Polis signed Colorado’s historic artificial intelligence (AI) consumer protection bill, SB 24-205, colloquially known as “Colorado’s AI Act” (“CAIA”), into law.

As we noted at the time, CAIA aims to prevent algorithmic discrimination in AI decision-making that affects “consequential decisions”—including those with a material, legal, or similarly significant effect with respect to health care services and employment decision-making. The bill is scheduled to take effect February 1, 2026.

The same day he signed CAIA, however, Governor Polis addressed a “signing statement” letter to Colorado’s General Assembly articulating his reservations. He urged sponsors, stakeholders, industry leaders, and more to “fine tune” the measure over the next two years to sufficiently protect technology, competition, and innovation in the state.

As the local and national political climate steers toward a less restrictive AI policy, Governor Polis drafted another letter to the Colorado legislature. On May 5, 2025, Polis—along with Attorney General Phil Weiser, Denver Mayor Mike Johnston, and others—requested that CAIA’s effective date be delayed until January 2027.

“Over the past year, stakeholders and legislators together have worked to find the right path forward on Colorado’s first-in-the-nation artificial intelligence regulatory law,” the letter states, adding that the collaboration took “many months” and “brought many ideas, concerns, and priorities to the table from a wide range of communities.” Nevertheless, “it is clear that more time is needed to continue important stakeholder work to ensure that Colorado’s artificial intelligence regulatory law is effective and implementable.”

The letter came the same day that SB 25-318, a bill that would have amended CAIA, was postponed indefinitely by the state Senate and reportedly killed by its own sponsor. Colorado Senate Majority Leader Robert Rodriguez introduced SB 25-318, entitled “Artificial Intelligence Consumer Protections,” just one week earlier.

On May 6, 2025, the day before the legislative session in Colorado ended, House Democrats made an eleventh-hour attempt to postpone the effective date of CAIA by inserting the delay into another unrelated bill, but that attempt also failed.

Proponents of the delay are calling for a framework "that protects privacy and fairness without stifling innovation or driving business away from our state," as the Polis letter states. Technology groups have urged Governor Polis to call a special legislative session to delay implementation of CAIA.

SB 25-318 Key Provisions

Despite SB 25-318’s failure to pass, several of its provisions remain noteworthy and are likely to resurface in the ongoing policy debate. Viewed as “thoughtful amendments” by some commentators, the legislation would have modified CAIA’s consumer protections, which require developers and/or deployers of AI systems to implement a risk management program, conduct impact assessments, and provide notifications to consumers. Had it passed, SB 25-318 would have delayed many requirements from February 1, 2026, to January 1, 2027, and made the following adjustments:

Definitions. SB 25-318 attempted to redefine “algorithmic discrimination” to mean the use of an AI system that results in a violation of any applicable federal, state, or local discrimination law. It also would have created exemptions to the definition of “developer” of an AI system and exempted certain technologies, such as those performing a narrow procedural task, or cybersecurity and data security systems, from the definition of “high-risk AI systems.”

Reasonable Care. The bill would have eliminated the duty of developers and deployers of high-risk AI systems to use reasonable care to protect consumers from known or reasonably foreseeable risks of algorithmic discrimination. It also would have eliminated deployers’ duty to notify the attorney general when such risks arise from intended uses or when the system causes algorithmic discrimination.

Developer Disclosures. SB 25-318 sought to exempt developers from specified disclosure requirements if, for example, their systems make 10,000 or fewer consequential decisions per year for 2027-2028, decreasing to 2,500 or fewer for 2029-2030. Other contemplated exemptions covered developers that received less than $10,000 from investors, have annual revenues of less than $5,000,000, or have operated and generated revenue for less than five years. The bill also sought to broaden disclosure exemptions for deployers based on the number of full-time employees (500 instead of 50 for 2027-2028, decreasing to 100 in 2029), to exempt developers with respect to the use of AI in hiring, and to add an exemption where the AI system produces or consists of a score, model, algorithm, or similar output that is a consumer report subject to the Fair Credit Reporting Act.

Impact Assessments. SB 25-318 sought to amend the requirement that deployers, or third parties contracted by deployers, complete impact assessments within 90 days of a substantial modification; instead, these assessments would have been required before the first deployment or by January 1, 2027, whichever comes first, and annually thereafter. SB 25-318 also would have required deployers to state in an impact assessment whether the system poses any known or reasonably foreseeable risk of limiting accessibility for certain individuals, an unfair or deceptive trade practice, a violation of state or federal labor laws, or a violation of the Colorado Privacy Act.

Disclosures to Consumers. SB 25-318 sought to require deployers to provide additional information to consumers when a high-risk AI system makes, or is a substantial factor in making, a consequential decision. It also added a transparency requirement: consumer disclosures must explain whether and how consumers can exercise their rights.

Documentation Requirements. SB 25-318 would have required developers and deployers to maintain required documentation, disclosures, and other records for each high-risk AI system throughout the period during which the developer sells, markets, distributes, or makes available the system, and for at least three years following the last date on which it does so.

Takeaways

Because Colorado’s 2025 legislative session ended at midnight on Wednesday, May 7, the CAIA will go into effect as originally passed on February 1, 2026, unless Governor Polis calls a special session, or a new bill is introduced in time for the new legislative session on January 14. To the extent additional attempts to modify CAIA arise before February 1, 2026, we anticipate that they will revive certain issues addressed in SB 25-318 as part of such efforts.

Many outside of Colorado are also following this process closely, including other states that are using CAIA as a framework for their own laws and federal lawmakers whose efforts to pass comprehensive AI legislation through Congress have stalled. On Tuesday, May 13, the House Energy and Commerce Committee will mark up language for potential inclusion in the reconciliation package that would prevent states from passing and implementing such AI laws for 10 years, but this language may not pass.

As we noted last year, organizations should begin considering compliance issues, including policy development, impact assessments, engagement with AI auditors, contract language in AI vendor agreements to reflect responsibilities and coordination, and more. Impact assessments, in particular, take time and resources to design and conduct, and we therefore recommend that businesses using high-risk AI systems in Colorado begin preparing to conduct these impact assessments now, rather than waiting for a speculative change to the law. If properly designed, impact assessments are a useful tool for businesses to ensure that their AI systems are reliable and deliver expected outcomes while minimizing the risk of algorithmic discrimination.


The Epstein Becker Green team will be tracking developments and will provide further updates. If you have questions, please contact the authors of this post and/or the Epstein Becker Green attorneys with whom you normally work.


Epstein Becker Green Staff Attorney Ann W. Parks contributed to the preparation of this post.
