Content Weapons Primer: Bias in A.I.

The artificial intelligence debate has been around for many years. For one reason or another, it keeps resurfacing, and we are in one of those moments now. This time, however, the main topic makes perfect sense: bias.

The reality is that no one can question the benefits of artificial intelligence. However, bias comes along with those benefits.

The truth is that there have been many concerns about artificial intelligence bias around gender and race in a wide variety of areas, including financial services, judicial sentencing, policing, and even hiring.

If we are going to put artificial intelligence to work, it is crucial that we solve the bias problem quickly to ensure that we can create a healthy artificial intelligence ecosystem.

Key Challenges In Artificial Intelligence

#1: Bias Built Into Data:

No matter where you look, you see data. And when it comes to artificial intelligence, this is a good thing. After all, artificial intelligence is fed with data, and one of its main advantages is that it can learn at an incredibly fast pace. However, as you can imagine, much of this data contains biases.

Just think about an algorithm that is trained on judges' sentencing decisions. While it is obvious that you should not add any race or gender criteria, you only need to look at our current criminal justice system to see that Black and brown Americans tend to be targeted by the police more than white Americans. According to the existing data, there are far more arrests of Black Americans, which can create a bias in the artificial intelligence system.

Therein lies the issue with data: what can you do when the data has been skewed for decades? Well, it's not the data itself per se; it's the human-run systems that generate the skewed data. Pulling a single statistic brings this to light: what is the incarceration rate for Black and brown Americans, and how is it so much higher than that of the racial group that makes up the majority of the population? Racial bias and socioeconomic positioning are the research-backed answers.
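To make the mechanism concrete, here is a minimal sketch with made-up numbers: both groups have an identical true offense rate, but one group is policed twice as heavily, so its offenses are twice as likely to become recorded arrests. Any system trained on those arrest records inherits the disparity. All rates and group names here are hypothetical, purely for illustration.

```python
import random

random.seed(0)

# Hypothetical setup: the true underlying offense rate is identical for
# both groups, but group "B" is policed twice as heavily, so an offense
# there is twice as likely to be recorded as an arrest.
TRUE_OFFENSE_RATE = 0.10
POLICING_INTENSITY = {"A": 0.3, "B": 0.6}  # chance an offense is recorded

def learned_risk(group, n=10_000):
    """Arrest rate per group, i.e. the 'risk score' a naive model learns."""
    arrests = 0
    for _ in range(n):
        offended = random.random() < TRUE_OFFENSE_RATE
        recorded = offended and random.random() < POLICING_INTENSITY[group]
        arrests += recorded
    return arrests / n

score_a = learned_risk("A")
score_b = learned_risk("B")
print(f"learned risk A: {score_a:.3f}, learned risk B: {score_b:.3f}")
# A model trained on arrest records rates group B roughly twice as
# "risky", even though the true offense rates were identical.
```

The skew never came from the people being measured; it came from the measurement process, which is exactly the point made above.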

#2: Artificial Intelligence Induced Bias:

Another big challenge when we are talking about artificial intelligence bias is related to the way algorithms evolve.

The reality is that artificial intelligence algorithms are not static. Instead, they keep learning and, therefore, changing over time.

So, let's say that, at the beginning, an algorithm makes a decision based on 7 different factors using a small number of data sources. As the system evolves, the amount of data it uses increases, and we can expect the processing of all that data to become more complex and sophisticated. However, it is important to notice that these changes are not made by humans; it is the algorithm itself, the machine, that adapts its behavior on its own. Bias will not always be introduced this way, but there will be cases where it is.
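The self-adapting part can be sketched in a few lines. Assume a toy online learner that nudges its estimate toward every new observation: if the incoming data stream later becomes skewed, the model drifts with it, and no human ever touched the code. The values and learning rate below are invented for illustration.

```python
# A toy online learner: it keeps a running estimate and updates itself
# on every new observation. No human edits the model; it adapts alone.
def online_update(estimate, observation, learning_rate=0.05):
    return estimate + learning_rate * (observation - estimate)

estimate = 0.5  # initial, unbiased estimate

# Phase 1: a balanced data stream (average value 0.5)
for x in [0.4, 0.6] * 50:
    estimate = online_update(estimate, x)
balanced_estimate = estimate

# Phase 2: the data sources change and the stream becomes skewed (avg 0.8)
for x in [0.7, 0.9] * 50:
    estimate = online_update(estimate, x)

print(f"after balanced data: {balanced_estimate:.2f}, "
      f"after skewed data: {estimate:.2f}")
# The model quietly drifted from ~0.5 to ~0.8 on its own.
```

Nothing in the update rule is "wrong"; the bias arrives through the data the evolving system happens to consume.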

Check out my latest books:

Content Weapons the book #contentweapons by Michael Stattelman

Learn how to lead “Next Practices” initiatives like this in Meta Leadership also by Michael Stattelman

#3: Teaching Artificial Intelligence Human Rules:

As human beings, we know that we live in a society where we need to follow certain rules. However, artificial intelligence doesn't know this, and that can end up creating bias.

Just think about the wage gap between many women and men, where pay differs simply because of gender. An artificial intelligence system will see this in the data and learn that it is the right way to do things.
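A tiny sketch shows how this happens. Suppose a naive salary model simply predicts the historical group average; trained on biased records, it encodes the gap as if it were the rule. Every record below is fabricated purely to illustrate the point.

```python
# Hypothetical historical salary records (same role, same experience);
# the numbers are made up purely for illustration.
records = [
    {"gender": "M", "salary": 62_000},
    {"gender": "M", "salary": 60_000},
    {"gender": "M", "salary": 64_000},
    {"gender": "F", "salary": 52_000},
    {"gender": "F", "salary": 50_000},
    {"gender": "F", "salary": 54_000},
]

def group_mean(gender):
    values = [r["salary"] for r in records if r["gender"] == gender]
    return sum(values) / len(values)

# A naive model that predicts the historical group average reproduces
# the pay gap instead of the human rule "equal work, equal pay".
predict = {"M": group_mean("M"), "F": group_mean("F")}
print(predict)  # the learned "fair salary" differs by gender alone
```

Unless someone explicitly teaches the system the rule, the historical pattern is the rule as far as the model is concerned.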

#4: Evaluating Cases Of Suspected Artificial Intelligence Bias:

Another major challenge that you should take into consideration is that just because bias is suspected in some cases, it doesn't mean it is actually there.

The reality is that making any kind of decision involves a lot of factors and not all of them can be seen through data.

Let's say that you want to hire someone for a specific position. You get several candidates and select two of them for the final interview. During this interview, you ask them whether they have children. While this may seem like a personal question, for some employers it helps gauge whether a candidate is ready to take on additional responsibilities, given that they have dependants at home. This is something that won't be reflected in any kind of data, so the artificial intelligence system won't be able to identify it. Once again, this poses an incredible challenge.
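The limitation can be stated in code. Assume two hypothetical finalists whose recorded attributes are identical; the factor the interviewer probed for never enters the data, so a data-driven model literally cannot tell them apart. The attributes below are invented for illustration.

```python
# Two hypothetical finalists: identical on every attribute the system
# can see. The off-record factor (home responsibilities) never makes
# it into the data.
candidate_1 = {"experience_years": 8, "degree": "BSc", "score": 91}
candidate_2 = {"experience_years": 8, "degree": "BSc", "score": 91}

def model_can_distinguish(a, b):
    # A data-driven model can only separate candidates whose recorded
    # features differ in some way.
    return a != b

print(model_can_distinguish(candidate_1, candidate_2))  # False
```

So an audit that flags the final decision as "biased" may simply be seeing a factor the data never captured, which is why suspected bias has to be evaluated case by case.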

#contentweapons

#metaleadership
