Consumers and employees are growing accustomed to the benefits of AI-supported products, putting pressure on executives with purchasing power.
Employees want to find AI products in their work toolkit that can make workflows easier and amplify their efficiency. Consumers want the companies they patronize to be speedy and accurate in everything from customer service to e-commerce suggestion engines.
AI adoption is a clear trend among enterprises — 68% of companies say they have already adopted AI or plan to adopt it in the next two years, according to Spiceworks Ziff Davis' 2022 State of IT report. The technology is yielding positive impacts for nearly three-quarters of organizations using it, according to statistics presented Thursday at Forrester's Data Strategy & Insights virtual conference.
But the volume of available options from AI providers can be dizzying. It's a market set for further expansion as traditional software vendors load AI functions into their existing products in a number of verticals.
"There's literally hundreds if not thousands of enterprise use cases," said Mike Gualtieri, VP and principal analyst at Forrester, speaking Thursday at the event.
To understand whether a specific AI product is the right fit, here are five questions to ask vendors.
What's the business value?
While employees expect AI to have a positive impact on company culture, it's up to decision-makers to find the connection between an AI product and the business value it can deliver.
"Just because it's AI it doesn't necessarily mean that it has business value," said Gualtieri.
Evaluating the solution like any other software platform, executives should ask what positive business outcome adoption would deliver: better decisions, automated decisions, predictions or pattern identification, he said.
Is this actually a product?
With AI vendors proliferating, tech executives should determine whether they're being sold a custom build that will meet their needs or a pre-built product or platform.
"You really have to follow up to make sure that other customers are using this," said Gualtieri. It's up to decision-makers to understand whether there is an actual product capability being offered to them, or if vendors are "just knowledgeable and wishful about what they can do for you."
How sophisticated is the vendor's technology?
Once a proposal makes it past the first two questions, decision-makers have established that the offering can deliver business value and is an actual product, not a platform in disguise.
Then, there's the question of the actual technology powering the solution, and how mature it is.
"You have to make sure that this product is analyzing data to create a machine learning model," said Gualtieri. "If you're using some other technology than machine learning, there might be a different evaluation, but 90-plus percent are using machine learning, so it's all about data."
AI applications span a range of maturity levels: some products have AI features only on their roadmap, while others use a working model that has been in the market for four or more years. The latter represents the highest level of maturity, Gualtieri said.
What data trains the model?
The right data can make or break an AI model. If the conversation with a vendor has gotten this far, a smart next question regarding data would be: Mine, yours or ours?
"It's all about the data when it comes to the accuracy of the model," Gualtieri said. "And it's a huge prerequisite to machine learning success."
The best possible case is when the product uses a pre-trained model that relies on industry data collected by the vendor, but that can later be customized and made more accurate with the company's proprietary data, according to Gualtieri.
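For readers who want to picture that setup, here is a minimal sketch of the pattern Gualtieri describes: start from a model the vendor pre-trained on broad industry data, then refine it with the company's proprietary data. The vendor model, data and class labels below are hypothetical stand-ins, not any specific product's approach.

```python
# Sketch: customize a vendor's pre-trained model with proprietary data.
# All data here is synthetic and illustrative.
import numpy as np
from sklearn.linear_model import SGDClassifier

rng = np.random.default_rng(0)

# Stand-in for the vendor's pre-trained model: fit on broad "industry" data.
X_industry = rng.normal(size=(5_000, 20))
y_industry = (X_industry[:, 0] > 0).astype(int)
vendor_model = SGDClassifier(random_state=0)
vendor_model.fit(X_industry, y_industry)

# Customize with the company's own data via incremental training,
# so the model adapts without discarding what the vendor learned.
X_company = rng.normal(loc=0.3, size=(500, 20))
y_company = (X_company[:, 0] > 0.3).astype(int)
vendor_model.partial_fit(X_company, y_company)

print("Accuracy on company data:", vendor_model.score(X_company, y_company))
```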
How is the model monitored?
AI products use models trained on past data. Their outputs reflect historical patterns projected onto the future, which means models are imperfect by nature and must be updated as conditions change.
"A critical question to ask that vendor is how they're monitoring the model," said Gualtieri. "When a vendor delivers code, that code is going to always run as written, but the model is going to decay in performance."
Executives need to make sure vendors constantly evaluate models against the relevant KPIs and have some way of retraining them automatically.
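In practice, the monitoring a vendor describes often boils down to a loop like the sketch below, which assumes the KPI is accuracy on recently labeled data; the threshold and retraining hook are illustrative, not a specific vendor's implementation.

```python
# Sketch: watch a deployed model's KPI and trigger retraining when it decays.
from sklearn.metrics import accuracy_score

KPI_THRESHOLD = 0.85  # hypothetical minimum acceptable accuracy


def monitor_and_retrain(model, X_recent, y_recent, retrain_fn):
    """Evaluate the deployed model against the KPI; retrain if it has decayed."""
    accuracy = accuracy_score(y_recent, model.predict(X_recent))
    if accuracy < KPI_THRESHOLD:
        print(f"Model decayed to {accuracy:.2f}; retraining on fresh data.")
        model = retrain_fn(model, X_recent, y_recent)
    else:
        print(f"Model accuracy {accuracy:.2f} still meets the KPI.")
    return model
```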