Collateral Damage

Andrew Gould
3 min read · Apr 16, 2021

In the eighth chapter of Cathy O’Neil’s “Weapons of Math Destruction,” she touches on the topic of financial assessments of individuals. Most people are well aware that their financial status and history are heavily reported on. Any time an individual applies for a loan, credit card, or other line of credit from a bank or other financial institution, their information is submitted to a credit reporting agency that returns the holy grail of their financial future: their credit score.

O’Neil details that credit scores are not actually WMDs; they are nearly exemplary uses of mathematical models for evaluating people. Compared to the process that came before, when a person would ask a banker for a loan in person, many human biases were inflicted upon the borrower. The banker would assess their financial documents, but subtle factors like their proclivity to attend church, their rebellious older sister, or their dad’s run-ins with the law were also factored into the application process, perhaps not purposefully, but subconsciously. The process of humans evaluating heavily numbers-based applications was riddled with faults, and it meant that loans were given out very cautiously due to subconscious biases that weren’t necessarily indicative of the borrower’s ability to repay the loan in a timely manner. Enter the credit score. The credit score was a boon for the lending markets because it opened up loans to a much wider demographic. No longer was it only the well-to-do man who had established himself in the community; now the relatively secluded single mother who does not have the time to socialize could receive a loan because she spent her time keeping herself in good financial standing.

Credit scores are an important part of our financial institutions today, but they are now being used well outside their necessary context. For example, employers performing a background check are likely to see delinquent accounts, unpaid bills, and overdrawn balances. On the surface, this seems relatively plausible: employers don’t want to employ people who are unable to maintain themselves financially. However, once you dig deeper into the ramifications of this type of system, it becomes apparent that this is a WMD. People who have been out of work for a long time, for one reason or another, would most likely not be in good financial standing at the moment, even if they were before losing their job. This type of system relies on a dangerous proxy to make assumptions about a person’s work ethic. Bad financial standing is not an adequate proxy for determining whether someone is suitable for employment.

These systems are examples of WMDs. Packaging up data that is useful for one purpose, like a credit score, is acceptable, since it directly evaluates historical performance to predict future performance in the same vein. The problem arises when proxies are used to make predictions about other parts of a person’s life. An unpaid loan, a domestic violence charge, or a lack of establishment in the community are not effective proxies for other dimensions of a person. People are not one-dimensional, and compressing their data down into one data set to extrapolate everything about them is a dangerous game that only ends in a nasty feedback loop. To solve this issue, companies should stop using mathematical models that assume that just because they have all the data, it is relevant to their system. Similarly, companies should be able to adjust their models when those models return a wrong result. Neither of these is currently the case, and that is what gives these types of models the designation of a WMD.
