Un(in)sure(d) of Why I Pay So Much

Haleigh
Apr 21, 2021

They say math is in everything in the universe, and nowhere do we see that more than in the modern era. Mathematical algorithms can be used to determine nearly every facet of our lives: where we can live, what we can buy, what we drive, and even whether we can insure our lives or not. When it comes to obtaining insurance at reasonable rates, O’Neil exposes the capacity of mathematical models to manipulate data in order to categorize us by pre-selected traits — where a person lives, how far they travel, what areas they travel to, and even the person’s health itself. This aggregate data is gathered to build models that assess potential risk from the insurance company’s perspective.

According to the model, a person who routinely drives above the posted speed limit, or who happens to drive or live on a street lined with bars and clubs open late into the night, is flagged as a higher accident risk. The model fails to take into account that the person driving might be a barista who starts her shift at 5 a.m. or a youth pastor who travels that street frequently to interact with at-risk teens.

Similarly, when looking at health insurance, the models gather personal information such as cholesterol, BMI, and other red-flag data to determine whether a person is considered at high risk of health issues. More and more employers are taking the health of employees into account when hiring. Data such as the number of emergency room visits employees make each year or the number of prescriptions they fill may be factored into the employer’s premium. So, naturally, employers want to hire employees who aren’t considered “overweight” by the BMI scale, even though this is a form of prejudice.

“In a filing to the Wisconsin Department of Insurance, the CFA listed one hundred thousand microsegments in Allstate’s pricing schemes. These pricing tiers are based on how much each group can be expected to pay. Consequently, some receive discounts of up to 90 percent off the average rate, while others face an increase of 800 percent.” In short, this seems highly unfair. This model is not the same as the predatory marketing schemes that aim to give loans with terrible interest rates to the poor, but it does target the susceptible and charge them 800 percent more for a service. O’Neil says that credit has more to do with the current models than DUIs, which means an immigrant new to America will pay more for car insurance than a driver whose motor violations include intoxication. “The result — another feedback loop — is that poor drivers who can least afford outrageous premiums are squeezed for every penny they have.”
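That imbalance — credit mattering more than the driving record itself — can be sketched as a toy pricing function. Every weight and rate below is hypothetical, invented for illustration; it is not drawn from Allstate’s or any real insurer’s model:

```python
# Toy illustration of a pricing scheme in which a credit-based proxy
# outweighs the driving record itself. All numbers are hypothetical.

BASE_RATE = 100  # average monthly premium, arbitrary units

def premium(credit_score, num_duis):
    # Hypothetical multipliers: poor (or absent) credit inflates the
    # rate far more than a DUI does, mirroring the imbalance O'Neil
    # describes.
    credit_multiplier = 3.0 if credit_score < 600 else 1.0
    dui_multiplier = 1.0 + 0.25 * num_duis
    return BASE_RATE * credit_multiplier * dui_multiplier

# A newcomer with no credit history (scored here as poor credit) and a
# clean record pays more than an established driver with a DUI.
newcomer = premium(credit_score=0, num_duis=0)      # 300.0
dui_driver = premium(credit_score=750, num_duis=1)  # 125.0
print(newcomer > dui_driver)  # True
```

Under these made-up weights, the safe driver pays more than double what the intoxicated one does — the feedback loop in miniature.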

So how do we fix this? O’Neil explores various options, including the National Association of Insurance Commissioners’ ability to expose the situation to the public. “The underlying idea was that drivers should be judged by their records — their number of speeding tickets, or whether they’ve been in an accident — and not by their consumer patterns or those of their friends or neighbors.” A careful thirty-something driver shouldn’t be expected to pay more for car insurance just because they defaulted on a gym membership after graduating from college. Credit should not factor into how much someone is charged for insurance, as it is not an assessment of driving ability. That would be like charging people more for apples at the grocery store because they failed algebra in high school.

Teens and young adults are charged more for insurance because of their age. That is intuitive, because they lack experience. However, if a student’s actual driving record were assessed, careful, skilled teens could earn a discount on their insurance by demonstrating they are safe drivers. Only when they fail to drive safely, as indicated by tickets and accidents, would their rates go up.

Overall, the models penalize before true, relevant data is attached to the individual. Charging more because of a person’s race, where they live, or how old they are should be unethical and illegal. Yet depending on how they twist the data, companies can get away with stacking the deck against groups of people any way they choose in order to make more money. It’s time to stop letting these models charge me more for what other people do, and to hold companies accountable to my actual behaviors and characteristics instead.
