AI is biased: Who's to blame?
- AI biases are rooted in data sets that reflect the humans behind them, according to Bloomberg. Because AI systems collect data and produce answers without explanation, researchers say resolving the bias problem will take years.
- Biases creep into AI in various ways, including through the unrepresentative training data from which the AI learns its skills, according to the report. Programmers oftentimes present AI applications with skewed samples, such as images of more female than male nurses.
- Efforts are being made to specifically focus on AI bias by reexamining "word embeddings," or data that's used as a "computer dictionary" for AI. Researchers are also exploring the idea of implementing different algorithms to examine different groups in a data set, rather than measuring unique groups with the same criteria.
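The "word embedding" audits described above can be sketched with a toy example: occupation vectors are projected onto a gender direction, and a nonzero projection flags gender information encoded in a word that should be gender-neutral. The four-dimensional vectors below are hypothetical; real embeddings (word2vec, GloVe) are learned from large text corpora and have hundreds of dimensions.

```python
import numpy as np

# Hypothetical toy embeddings. In a real model these would be learned
# from a text corpus; the bias shown here is an illustrative assumption.
embeddings = {
    "man":    np.array([ 1.0, 0.2, 0.1, 0.0]),
    "woman":  np.array([-1.0, 0.2, 0.1, 0.0]),
    "doctor": np.array([ 0.6, 0.9, 0.0, 0.1]),
    "nurse":  np.array([-0.7, 0.9, 0.0, 0.1]),
}

# A common probe: build a "gender direction" from a word pair, then
# project occupation vectors onto it. A projection far from zero
# suggests the embedding associates the occupation with one gender.
gender_direction = embeddings["man"] - embeddings["woman"]
gender_direction = gender_direction / np.linalg.norm(gender_direction)

biases = {
    word: float(np.dot(embeddings[word], gender_direction))
    for word in ("doctor", "nurse")
}

for word, bias in biases.items():
    print(f"{word}: gender projection = {bias:+.2f}")
```

Debiasing approaches along these lines subtract the component of neutral words that lies on the gender direction, so that "doctor" and "nurse" project to (near) zero while genuinely gendered words keep their meaning.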
Some experts fear an impending robot apocalypse, but biases programmed into AI are problems companies face now. Programmers are ultimately responsible for the social downfalls of AI, as the "mind" of an AI is a digital extension of its programmers.
If the data sets behind AI are biased, so are the conclusions it reaches. As AI develops further without resolutions to these biases, the decisions it makes will presumably also go unquestioned. The implications touch many parts of society, especially when AI is used for social efforts: minorities and women could face unfair treatment or categorization by AI, which could carry financial and legal ramifications.
About one-quarter of CIOs have already implemented AI solutions and are responsible for how the technology will best serve existing services. The market for AI is forecast to hit $5.05 billion by 2020, yet there are fewer than 10,000 experts in the world well trained in the field.
Companies are embracing the technology for task automation, data retrieval and analytics. But companies that struggle to acquire the funds and talent for AI and machine learning (ML) capabilities are turning to drag-and-drop frameworks and cloud-based AI solutions.
- Bloomberg: Researchers Combat Gender and Racial Bias in Artificial Intelligence
- MIT Technology Review: Biased Algorithms Are Everywhere, and No One Seems to Care
Follow Samantha Ann Schwartz on Twitter