Google looks to redesign the ways people interact with AI systems
- Google announced a new research initiative Monday designed to study and redesign the ways people interact with artificial intelligence systems. The People + AI Research Initiative (PAIR) aims to focus on the "human side" of AI and to make the technology broadly inclusive.
- Google also released two new open source visualization tools, Facets Overview and Facets Dive, for researchers and AI experts. The applications give engineers an easier way to view the datasets they use to train AI systems.
- "We believe AI can go much further — and be more useful to all of us — if we build systems with people in mind at the start of the process," Google researchers Martin Wattenberg and Fernanda Viégas wrote in a blog post. The researchers also said they are working with visiting academics from Harvard University and MIT.
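Facets Overview is built around per-feature summary statistics of a training set — counts, distributions, and missing values. A rough sense of what that kind of overview surfaces can be sketched with plain pandas (the column names and data below are made up for illustration; this is not the Facets API itself):

```python
import pandas as pd

# Hypothetical training data; in practice this would be the dataset
# an engineer is about to feed to a model.
train = pd.DataFrame({
    "age": [34, 29, 51, 42, 38],
    "income": [52000, 48000, 91000, 61000, None],
    "label": [0, 0, 1, 1, 0],
})

# Per-feature statistics of the kind an overview tool highlights:
# count, mean, spread, and (important for spotting data problems
# early) how many values are missing per feature.
stats = train.describe()
missing = train.isna().sum()

print(stats)
print(missing)
```

Spotting that `income` has a missing value before training is exactly the sort of early dataset inspection these tools are meant to make routine.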
AI holds huge potential, but if researchers don't train systems carefully, people could lose trust in them. There have already been examples of biases of various kinds built into the foundation of an AI program, which then carry through to the applications where that AI is used. For example, research released in April showed that AI programs can exhibit both racial and gender biases, The Guardian reports.
Google wants to examine these issues early so that the AI systems it builds are the best they can possibly be — and, ideally, to avoid introducing biases into those systems in the first place.
For Google, it all boils down to the bottom line. The company is betting big on AI and is striving to lead that space by developing new technologies based on it. Google's parent company Alphabet has made the most AI acquisitions so far, ahead of Microsoft, Apple, Intel and Salesforce, according to recent data from research firm Quid.
But Google also knows if it looks too much at the tech and not enough at the human side, it could come up short. Enterprises looking to invest in new AI-based tools will want to ensure the tech is sound before they write that check.