- Two-thirds of IT decision-makers expect distributed cloud deployments to increase in 2024, according to a ClearPath Strategies survey of 425 IT leaders commissioned by cloud provider Akamai.
- Running analytics and AI workloads closer to data stores is driving adoption of edge solutions, according to the survey. Security and reliability were additional motivations, the report said.
- “Cloud’s next phase requires a shift in how developers and enterprises think about getting applications and data closer to their customers,” Dave McCarthy, VP of research at IDC, said in a blog post accompanying the report. “Workloads are no longer built for one place but are delivered across a wide spectrum of compute and geography.”
As enterprises push cautiously toward generative AI adoption, CIOs are grappling with the technical challenges of deploying the technology.
The need to maintain data security and low latency is shifting enterprise strategy, encouraging organizations that have migrated to cloud to bring some servers on-prem or direct certain workloads to smaller, regional data centers.
“The biggest change in attitude I’ve seen about cloud is a shift from thinking of it as a location — a remote data center managed by someone else — to an operating model that can be deployed anywhere,” McCarthy said in an email. “This is where you see cloud-native concepts applied to on-premises data centers and edge facilities.”
Redirecting specific workloads is a tactical enhancement to a broader cloud strategy, not a move away from public cloud, according to McCarthy. But it is a response to latency issues, data sovereignty and resilience concerns, and the potential cost of feeding data-hungry LLMs.
“As more data is created outside of the cloud, the cost of transmitting it to a cloud region along with long-term storage fees can be problematic,” McCarthy said.
DevOps professionals are favoring distributed services as an enhancement to public cloud, according to a separate Akamai survey of 700 cloud developers conducted by SlashData.
More than half of respondents said they were already using distributed cloud services, citing improved user experience, greater scaling flexibility and better performance for data-intensive applications.
“This will become more important as enterprises embrace AI and LLMs,” McCarthy said. “The idea of relying on a centralized cloud for both training and inference will become a bottleneck.”