Artificial intelligence workers may get special protections in Colorado

The Colorado Capitol in January 2025

Even before Colorado has established regulations for artificial-intelligence development, legislators are advancing a proposal to give special whistleblower protection to sector workers — a plan raising significant concerns for business groups and Gov. Jared Polis.

It’s not the broad idea of safeguarding whistleblowers that bothers opponents of House Bill 1212 so much as several provisions in the bill and the idea that Colorado would be alone in offering such great protections to workers who report major risks to state authorities. A key official from the Democratic governor’s administration said that Polis worries this will incentivize companies to move AI development and highly paid workers outside of Colorado to states where they face lower levels of regulation.

Still, HB 1212, sponsored by Democratic Rep. Manny Rutinel of Commerce City and GOP Rep. Matt Soper of Delta, passed its first legislative test Tuesday, advancing out of the House Judiciary Committee on a 7-3 vote that garnered bipartisan support and opposition. Its backers promised to continue negotiating changes but said the threat of public harm from poorly developed AI systems is great enough that workers willing to go over the heads of their bosses to report problems must know the state will come to their defense.

“We certainly want to be able to encourage workers in the artificial-intelligence space that when they see vulnerabilities to our public safety or national security or financial systems to potential fraud and abuse that they say something to their managers — and that they are able to address those vulnerabilities and those weaknesses before they impact all of us in society,” Soper said.

Colorado state Rep. Matt Soper speaks in a House committee hearing in April 2024.

How the new bill fits into artificial intelligence conversation

AI development is advancing rapidly, as users employ such programs to write speeches and papers, to weed out job applicants and even to understand complex medical needs. Polis last year signed the nation’s most comprehensive AI regulatory bill, aimed at barring discrimination by these programs, but he immediately put together a task force to examine the issue and tweak the rules before they take effect in February 2026.

That task force has been meeting for more than six months and is deep in the throes of coming up with definitions around what kind of significant decisions AI must impact before a system’s developer and deployer are subject to the regulations. Business groups on that task force, including TechNet and the Colorado Chamber of Commerce, have asked that they be allowed to finish their work before legislators place another layer of rules and regulations onto the field.

Rutinel and Soper argued that while AI holds potential for improving sectors from health care to education, the fact that it also can be manipulated to expose private information, mount cybersecurity attacks or even launch weapons means it must have greater oversight. The best oversight, they said, will come from workers within the companies developing foundational AI programs who can raise red flags and, if not taken seriously by their superiors, can alert the Colorado Attorney General’s office or other state authorities.

Colorado state Reps. Brianna Titone and Manny Rutinel discuss their AI bill with the House in May. Rutinel co-sponsored last year’s AI regulatory bill and now is a co-sponsor of the whistleblower-protection bill.

What the bill would do

HB 1212, following some amendments Tuesday, is tailored narrowly at the 10 or so largest AI developers in the world — companies like OpenAI, Google and Amazon. Companies subject to the proposed law would have to spend at least $100 million on model training over the course of a 12-month period and have spent at least $20 million training a specific AI model, though the law would then apply to them for five years going forward.

It gives protection against job-based retaliation to employees who report either violations of the law or their belief that the developer’s activities pose a substantial risk to public safety that could result in death, bodily injury or damage to property. That risk to public safety does not have to involve violation of a law, just the worker’s belief that a foundational AI program could be used in such a way as to put people’s lives and security at risk.

While Colorado has whistleblower laws already, those protect disclosure of information on risks to health and safety within a workplace. HB 1212 transposes that to offer protection to employees who report findings that development of products within their workplaces could be harmful to the public at large.

“Whistleblowers act as an essential guardian to our democracy, and nowhere is that more true than in the tech industry,” said Jennifer Gibson, director of PSST, an organization that helps tech workers who want to speak out about irregularities. “It signals to workers inside AI labs that Colorado legislators have their backs … and it gives them the assurance that if they are not being listened to internally, they can speak their concerns externally.”

Pushback against “punitive” proposal

But opponents such as TechNet lobbyist Melanie Layton and Colorado Chamber senior vice president Meghan Dollar said that in addition to being premature, the bill creates opportunities for large lawsuits over violations like developers not providing monthly reports. It includes a private right of action allowing workers to sue for reinstatement, lost pay, punitive damages and attorneys’ fees, which Colorado Competitive Council Executive Director Rachel Beck called “a pretty punitive remedy for a bill that’s written so broadly.”

Gov. Jared Polis speaks to a Colorado Chamber of Commerce board meeting in 2023.

In addition, Polis opposes the bill because he fears it will chill growth of a tech sector that accounts for 300,000 jobs and $125 billion in generated revenue in Colorado, said Michael McReynolds, legislative director for the governor’s Office of Information Technology. The state already has whistleblower protections in place, McReynolds noted, so giving special legal abilities to AI workers could cause largely international companies to shift workforces to areas where workplace disagreements won’t end in punitive damages.

“While the proponents of the bill may be well-intentioned … this overreach could deter investment in a sector that is vital to Colorado’s economic future,” McReynolds told the committee. “HB 25-1212 presents a significant risk to Colorado’s technology sector.”

Soper is no stranger to efforts to grow the sector, having sponsored a Polis-backed 2024 law that created $148 million in incentives and loan-loss funding to try to make Colorado a mecca for quantum computing. That law helped the state to be named a national quantum sector hub, a designation that brought with it a $40.5 million federal grant that’s expected to help attract billions of dollars in private capital and create some 10,000 jobs.

More debate on artificial intelligence to come

Rep. Cecelia Espinosa, the Denver Democrat who sided with two committee Republicans in opposing HB 1212 on Tuesday, questioned whether the whistleblower proposal would undo the positive effects of the hub designation and cost Colorado a major employer.

Soper, who sided with six committee Democrats in advancing the bill, said he doesn’t believe it will and asserted: “Whatever we need to do here to make sure quantum will still develop, we will do that.”

The conversation shifts now to the full House, where bill supporters must contend not only with criticisms that HB 1212 is too regulatory but also with skepticism from some Democrats who feel that it doesn’t go far enough. Rep. Yara Zokaie, D-Fort Collins, said, for example, that the definition of perceived risks that can allow workers to blow the whistle on AI developers should expand to include potential financial harm and violations of privacy in addition to bodily injury or property damage.

Meanwhile, the Artificial Intelligence Impact Task Force continues to meet to try to craft balanced regulations on AI developers and deployers that the Legislature can consider before the end of this session. Business and consumer groups have approached the issue with different viewpoints and are trying to find some consensus.