

New Colorado Law Bans High-Tech Discrimination


What You Need to Know

  • Colorado's law could affect issuers of annuities, disability insurance and long-term care insurance, as well as of life insurance.
  • Insurers could use regulatory efforts and lawsuits to block implementation.
  • Isaac Asimov was, sort of, here.

Colorado state lawmakers are trying to make robots and computers behave themselves.

They recently passed SB 169, a bill that prohibits insurers from using algorithms, external data sources and predictive modeling systems in ways that appear, from the perspective of the lawmakers, to discriminate against people based on “race, color, national or ethnic origin, religion, sex, sexual orientation, disability, gender identity, or gender expression.”

The new law applies to life, disability and long-term care insurance issuers, and to annuity issuers, as well as to property and casualty insurers.

Gov. Jared Polis, a Democrat, signed the bill into law earlier this month.

Data Types

Lawmakers have tried to protect the ability of life, annuity, long-term care insurance and disability insurance issuers to use traditional underwriting factors, such as family history, medical test, occupational, disability and behavioral information, that, “based on actuarially sound principles, has a direct relationship to mortality, morbidity, or longevity risk.”

But the new law prohibits use of that kind of information, even when it has a direct relationship to mortality, morbidity or longevity risk, if the information comes from an algorithm or predictive model that uses “external consumer data and information sources” and its use results in unfair discrimination against protected classes of people.

An insurance underwriting “algorithm” is a set of rules that either a computer or a human can use to make decisions about whether to sell insurance to an applicant, and how much to charge the applicant for the insurance.

A “predictive modeling system” is a software program that helps a computer use data, rules about how the world works, and statistical methods to make forecasts.

The new law defines “external consumer data and information source” as “a data or an information source that is used by an insurer to supplement traditional underwriting or other insurance practices or to establish lifestyle indicators that are used in insurance practices.”

Some of those new types of data sources are “credit scores, social media habits, locations, purchasing habits, home ownership, educational attainment, occupation, licensures, civil judgments and court records.”


The new law puts Michael Conway, Colorado’s insurance commissioner, in charge of developing regulations that will show insurers what they have to do to demonstrate that use of algorithms, predictive models, and external data and information does not lead to unfair discrimination against protected classes of people.

Insurers and other parties will have a chance to react to the new law during a public comment period. The insurance commissioner’s review is supposed to include consideration of any solvency impacts of implementation of the rules.

The law is now set to take effect Jan. 1, 2023, at the earliest.

Insurers that believe that new rules are unworkable may be able to block implementation by persuading the insurance commissioner that the rules would hurt their solvency; by persuading lawmakers or the commissioner to put off the effective date; by persuading lawmakers or the commissioner to repeal or change the new law; or by opposing the new law in court.

The Consumer Federation of America’s View

The Consumer Federation of America and other consumer groups have been fighting for years to persuade state lawmakers and state insurance regulators to keep insurers from using automated systems and other high-tech analytical systems in ways that lead to unfair discrimination.

Douglas Heller, a federation representative, said in a comment welcoming the new Colorado law that the law “takes direct aim at insurance practices that have unfair and illegal outcomes, irrespective of the intention behind the practice.”

What It Means

Science fiction writer Isaac Asimov famously developed the Three Laws of Robotics at a time when writers showed far more interest in robots than in computers.

One version of Asimov’s first law states that, “A robot may not injure a human being under any conditions — and, as a corollary, must not permit a human being to be injured because of inaction on his part.”

Colorado’s new legislation, one of the earliest high-profile laws governing computerized entities, could be seen as prohibiting those entities from discriminating against protected classes of people who are shopping for insurance, even when the discrimination is not intentional.

