AI Bias in Healthcare: Discrimination and Regulatory Uncertainty

Ultimately, the discrimination is the result of bias in the AI algorithm.

HT: A recent survey suggested that over 36% of companies, not just in the healthcare industry, experienced challenges or direct business impact due to an occurrence of AI bias.

Kay Firth-Butterfield: Bias comes into algorithms in several ways. For example, if you want to build an algorithm that reflects a broad world view, yet your developers are all young men in their twenties, then when they are deciding what data to include and how to create the algorithm, they are not bringing that world view to it. The other way that bias gets into algorithms is when you use the wrong data.

Balancing the business impacts of AI bias

HT: What are some direct business impacts of AI bias in healthcare and other industries?

Kay Firth-Butterfield: For example, if your hiring algorithm is set up in the wrong way, then you are going to lose the opportunity to hire the people who will make your business succeed. It has even been shown that if you employ an algorithm to predict the likelihood of someone recommitting a crime after being booked into jail, the data in the United States is so biased against people of color that even if a Black person was originally charged with a lesser offense than a White person, the algorithm will suggest that the Black person is more likely to reoffend.
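The recidivism example is the kind of disparity a basic fairness audit can surface. Below is a minimal sketch, not from the interview, of comparing false positive rates across groups for a risk-score classifier; the dataset, column names, and decision threshold are illustrative assumptions.

```python
import pandas as pd

# Hypothetical audit data: one row per defendant, with a model risk score,
# whether they actually reoffended, and their group. (All values are
# illustrative assumptions, not a real dataset.)
df = pd.DataFrame({
    "group":      ["Black", "Black", "Black", "Black",
                   "White", "White", "White", "White"],
    "risk_score": [0.8, 0.7, 0.3, 0.6, 0.4, 0.2, 0.7, 0.3],
    "reoffended": [0, 1, 0, 0, 0, 0, 1, 0],
})

df["flagged_high_risk"] = df["risk_score"] >= 0.5  # assumed threshold

# False positive rate per group: the share of people who did NOT reoffend
# but were still flagged as high risk. Large gaps between groups are the
# disparity the published recidivism-algorithm analyses reported.
did_not_reoffend = df[df["reoffended"] == 0]
fpr_by_group = did_not_reoffend.groupby("group")["flagged_high_risk"].mean()
print(fpr_by_group)
```

On this toy data the flagged-but-did-not-reoffend rate differs sharply between groups even though the same threshold is applied to everyone, which is exactly why bias can persist in an algorithm that never sees race as an explicit input.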

The other example comes from the Algorithmic Justice League, an organization that aims to raise public awareness about the impacts of AI, about how algorithms have typically failed to identify Black women correctly.

Women and diversity's role in diminishing AI bias

Kay Firth-Butterfield: Only 22% of AI scientists are women, which inevitably causes a bias and a problem with the way that the algorithms are going to be created. There is also a paucity of Black AI scientists; the majority of people working in AI and creating algorithms are White males or males from the Indian subcontinent. One of the ways we have seen this being corrected is by bringing those diverse voices into the room when you start to think about the algorithms you are going to use. For instance, because women are underrepresented among AI scientists, we have to bring them in in a different way, such as through your social scientists, which is great because they often bring a societal perspective that aids the development of the algorithm anyway.
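The misidentification finding above is usually detected by disaggregating a model's accuracy by subgroup rather than reporting one overall number. Here is a minimal sketch of that audit step; the subgroups, counts, and results are hypothetical, not the Algorithmic Justice League's published figures.

```python
import pandas as pd

# Hypothetical evaluation log for a face-analysis model: each row records
# the subject's subgroup and whether the model's prediction was correct.
results = pd.DataFrame({
    "subgroup": ["lighter-skinned men"] * 4 + ["darker-skinned women"] * 4,
    "correct":  [1, 1, 1, 1,                   1, 0, 0, 1],
})

# A single aggregate accuracy can hide large subgroup gaps, which is why
# disaggregated evaluation is a standard bias-audit step.
print("overall accuracy:", results["correct"].mean())
print(results.groupby("subgroup")["correct"].mean())
```

In this toy log the model looks acceptable overall (75% accuracy) while performing far worse on one subgroup, the pattern the interview is describing.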

Evaluating AI algorithms

Kay Firth-Butterfield: There will be the European AI Act, predicted to come into effect in 2024, but other than that there is little to no regulation as of today. There is some legislation coming up in the US around the use of algorithms in hiring, because hiring has a huge impact on someone's life. As mentioned before, the US Equal Employment Opportunity Commission (EEOC) is looking at algorithms on the basis of the Civil Rights Act. That law exists to prevent discrimination against people by people, and I think we will soon see cases brought by the EEOC against algorithms that discriminate against protected classes. Once that happens, it will give us a baseline for how people and companies are going to create and use algorithms. If you are a company producing a hiring algorithm and you sell it to another company whose workforce then becomes biased because you got it wrong, you could be liable for the product that you created.
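The EEOC's existing disparate-impact guidance gives a concrete sense of how such hiring algorithms might be screened. Below is a minimal sketch, not from the interview, of the "four-fifths rule" from the EEOC's Uniform Guidelines: if any group's selection rate falls below 80% of the highest group's rate, the selection procedure may be flagged for adverse impact. The applicant and hire counts are hypothetical.

```python
# Minimal sketch of the EEOC "four-fifths rule" screen for adverse impact.
# Counts below are hypothetical, purely for illustration.
applicants = {"group_a": 100, "group_b": 100}   # applicants per group
hired      = {"group_a": 60,  "group_b": 30}    # hires per group

selection_rates = {g: hired[g] / applicants[g] for g in applicants}
highest_rate = max(selection_rates.values())

for group, rate in selection_rates.items():
    impact_ratio = rate / highest_rate
    # A ratio below 0.8 suggests adverse impact and would invite closer
    # scrutiny of the hiring algorithm under the four-fifths rule.
    flag = "potential adverse impact" if impact_ratio < 0.8 else "ok"
    print(f"{group}: selection rate {rate:.2f}, "
          f"impact ratio {impact_ratio:.2f} -> {flag}")
```

A screen like this only detects disparity; whether it amounts to unlawful discrimination is the legal question the EEOC cases described above would test.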



