- The Department of Defense on Wednesday requested research proposals on the topic of Explainable Artificial Intelligence (XAI) technology.
- XAI technology would explain how it arrives at its conclusions, thereby reassuring users that the technology is delivering reasonable recommendations.
- The Pentagon's Defense Advanced Research Projects Agency (DARPA) wants users to better understand how AI works so that they trust it and have confidence in its application.
Artificial intelligence (AI) adoption is on the rise, with more than half of surveyed organizations reporting plans to deploy AI technologies by 2018, according to a recent report from Narrative Science and the National Business Research Institute.
For the DOD, the potential uses of AI are enormous. But users must trust AI for it to be useful in any application. The end user, DARPA said in the request, "needs to understand the rationale for the system’s decisions."
The agency also wants to measure whether a user’s trust in AI grows if the system is indeed able to explain itself.
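To make the idea concrete, here is a minimal sketch of what "explaining the rationale" could look like in practice: a toy scorer that returns not just a recommendation but the per-feature contributions behind it. The feature names, weights, and threshold are invented for illustration and do not come from DARPA's solicitation.

```python
# Hypothetical XAI-style sketch: a linear scorer that reports
# per-feature contributions alongside its recommendation, so the
# end user can see why the system decided what it decided.
# All names and numbers here are illustrative assumptions.

WEIGHTS = {"sensor_confidence": 0.6, "image_match": 0.3, "history": 0.1}
THRESHOLD = 0.5

def recommend(features):
    # Each contribution is weight * feature value; their sum is the score.
    contributions = {name: WEIGHTS[name] * features[name] for name in WEIGHTS}
    score = sum(contributions.values())
    decision = "flag" if score >= THRESHOLD else "ignore"
    # The explanation pairs the decision with the evidence behind it.
    return decision, contributions

decision, why = recommend(
    {"sensor_confidence": 0.9, "image_match": 0.7, "history": 0.2}
)
print(decision)   # the recommendation
print(why)        # the rationale: how much each input contributed
```

A real XAI system would of course involve far more complex models, but the principle is the same: the output carries its own justification, which is what DARPA hopes will let users calibrate their trust.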