The act took effect Aug. 1, but enforcement deadlines are spread out through 2027. The act could have a global influence similar to that of GDPR.
By: Lindsey Wilkinson • Published Aug. 1, 2024
The European Union has raised the bar for AI governance, security and risk management as the provisions in its AI Act go into force Aug. 1.
The AI Act’s rules and guardrails assign AI applications to four risk categories:
Unacceptable-risk applications are banned, such as social scoring, toys with voice assistance, facial recognition databases built through internet scraping and emotion inference in workplaces.
High-risk use cases carry extensive obligations, including when AI is used in non-banned biometrics, critical infrastructure, and recruitment and employment.
Limited-risk AI systems, such as chatbots, are subject to lighter transparency obligations.
The rest are viewed as minimal risk and largely left unregulated, including spam filters and video games.
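As a rough illustration, the tiering described above can be sketched as a lookup table. This is a hypothetical sketch: the tier names and example use cases are taken from this summary, and real classification under the Act requires legal analysis, not a dictionary.

```python
# Illustrative sketch of the AI Act's risk tiers as summarized above.
# Example use cases come from this article; actual classification
# under the Act is a legal determination, not a lookup.
RISK_TIERS = {
    "unacceptable": {  # banned outright
        "examples": ["social scoring", "scraped facial recognition databases",
                     "workplace emotion inference"],
        "obligation": "prohibited",
    },
    "high": {  # extensive compliance obligations
        "examples": ["non-banned biometrics", "critical infrastructure",
                     "recruitment and employment"],
        "obligation": "extensive obligations",
    },
    "limited": {  # lighter transparency obligations
        "examples": ["chatbots"],
        "obligation": "transparency obligations",
    },
    "minimal": {  # largely unregulated
        "examples": ["spam filters", "video games"],
        "obligation": "largely unregulated",
    },
}

def tier_for(use_case: str) -> str:
    """Return the risk tier whose example list contains the use case."""
    for tier, info in RISK_TIERS.items():
        if use_case in info["examples"]:
            return tier
    return "minimal"  # everything else is treated as minimal risk

print(tier_for("chatbots"))        # limited
print(tier_for("social scoring"))  # unacceptable
```

The fall-through to "minimal" mirrors the Act's structure: anything not explicitly banned, high-risk or transparency-regulated is left largely unregulated.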
Non-compliance can carry fines of up to $37.8 million (35 million euros). Regulators will take the nature, gravity and duration of the infringement into account. Supplying enforcers with incomplete or misleading information, for example, is subject to fines of up to $8.1 million (7.5 million euros).
The clock is ticking for providers and enterprises deploying AI systems to become compliant as they face monetary consequences and investigations by regulators. Analysts anticipate the EU rules will set the global standard for how businesses treat AI systems, comparing the impact to that of the General Data Protection Regulation.
In response, enterprise tech leaders have grown closer to their compliance, privacy and legal C-suite counterparts as priorities shifted to AI adoption. The partnership has become critical as enterprises try to find the best way forward.
Vigilance and scalable processes that can adapt to developing regulations are key, Chief Data Officer Andy Hill and Chief Privacy Officer Christine Lee said.
Ahead of the EU AI Act’s enforcement, CIO Dive broke down what to expect in the coming months and years:
When do the rules take effect?
While the AI Act takes effect Aug. 1, parts of the comprehensive AI regulation won't become enforceable until February 2025.
When will the EU begin enforcing the rules?
The European AI office of the European Commission will enforce the rules around general-purpose AI models, while other AI systems will fall to the national enforcement level, according to a Wilson Sonsini report. Each EU country has one year to identify enforcers after the AI Act becomes law.
Enforcers will take a phased approach over a two-year transitional period.
The AI Office and Board will share the codes of practice, including the level of detail needed for summaries.
August 2025
Providers of general-purpose AI models in the EU will need to publish a summary of the content used to train the model. EU officials are expected to release templates for the disclosure prior to the deadline. Technical documentation addressing testing processes and evaluations is also required.
General-purpose AI model providers will need to put in place a policy to comply with copyright and related rights.
Additional obligations are placed on providers of general-purpose AI models that fall under the systemic risk category.
General-purpose models already on the market by August 2025 will have until 2027 to become compliant.
Article top image credit: Johannes Simon via Getty Images
Top takeaways from the CrowdStrike outage for IT teams
“This may be the biggest stress test that I’ve ever seen for direct, first-line IT support teams,” Gartner’s Jon Amato said.
By: Roberto Torres • Published July 22, 2024 • Updated July 23, 2024
A faulty software update from cybersecurity provider CrowdStrike bricked millions of computers Friday, plunging global businesses using Microsoft systems into chaos.
Though Microsoft and CrowdStrike issued remediation strategies, recovery was still ongoing Monday for many businesses, including major airlines and hospital systems. For CIOs, the scenes of business disruption highlighted the importance of having plans in place for when an outage hits.
The CrowdStrike outage shone a bright light on the limitations in IT organizations, according to Jon Amato, senior director analyst at Gartner.
"This may be the biggest stress test that I've ever seen for direct, first-line IT support teams," Amato said Friday.
CIOs can take the CrowdStrike outage as an opportunity to reassess their company's preparedness against major outages. Testing out crisis scenarios and developing business continuity plans are some of the tools available to tech chiefs before the next major IT outage strikes.
"This has happened before, and we can expect something like this to happen again in the future," said Frank Trovato, principal advisor at Info-Tech Research Group.
Software monoliths
The scale of Friday's outage could be attributed to the number of enterprise machines running Windows and CrowdStrike's endpoint protection platform, Cybersecurity Dive reported. Customers using the Linux or Mac versions of the update were unaffected.
The disruption also put the focus on potential single points of failure in the IT ecosystem, according to Spencer Kimball, CEO at Cockroach Labs.
"The fact that something like this could happen should open people's minds to the real risks of technical monocultures," Kimball told CIO Dive.
However, reliance on single software providers is part of the reality in modern IT estate management. "If you commit to using Azure as your primary cloud environment and the various services they provide, you are vulnerable to an Azure outage," Trovato said.
Plan and prepare
Backup plans can give businesses a way to hedge against the cost of unforeseen outages.
Organizations lose an estimated $400 billion per year to IT failures and unplanned downtime, a Splunk report published in June found. Cybersecurity issues were the most common factor, with infrastructure and software issues the second-most common driver of outages.
CIOs hoping to avoid a crisis during the next major software outage will look more closely at software quality assurances from their main software providers, according to Amato. Analysts told CIO Dive business leaders must resist falling back into business as usual if they want to improve their standing.
"That really has to be part of the culture, and should be ingrained into purchasing and operational processes," Amato said.
Though CrowdStrike said it is taking steps to prevent a similar issue from happening again, competitors in the endpoint detection and response space stand to benefit from the reputational impact of the outage on the vendor, analysts said.
The vast majority of IT leaders say outages or disruptions degraded customer trust in their organizations, according to PagerDuty research.
As businesses recover and the urgency from the initial outage subsides, CIOs can increase preparedness by running simulation exercises on what the next IT crisis could look like — and develop an effective response.
"What you want to identify first is: when we imagine all the different things that can go wrong, which one is going to be the most disastrous for us," said Kimball. "Then you want to cross it with which is most likely."
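Kimball's two-step exercise (find the most disastrous scenarios, then cross them with likelihood) is essentially a classic risk matrix. A minimal sketch, with invented scenario names and scores:

```python
# Hypothetical risk-matrix sketch of the prioritization Kimball describes:
# score each crisis scenario by impact and likelihood, then rank by the
# product. Scenario names and 1-5 scores are invented for illustration.
scenarios = [
    {"name": "security vendor update failure", "impact": 5, "likelihood": 3},
    {"name": "regional cloud outage",          "impact": 4, "likelihood": 2},
    {"name": "single-server disk failure",     "impact": 1, "likelihood": 5},
]

def prioritize(items):
    """Rank scenarios by impact x likelihood, highest score first."""
    return sorted(items, key=lambda s: s["impact"] * s["likelihood"],
                  reverse=True)

for s in prioritize(scenarios):
    print(f'{s["name"]}: {s["impact"] * s["likelihood"]}')
# the vendor-update scenario ranks first with a score of 15
```

The top-ranked scenarios are the ones worth building simulation exercises and continuity plans around first.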
Article top image credit: Jack Taylor via Getty Images
CIOs resist vendor-led AI hype, seeking out transparency
As vendors fuel a sense of urgency on AI adoption, tech chiefs turn to risk mitigation frameworks to clear the smoke.
By: Lindsey Wilkinson • Published July 12, 2024
Amid vendor-led pressure to adopt generative AI, CIOs say they aren’t rushing to embed the technology into every inch of the tech stack. First, executives want to check the facts — no matter how hard providers push for speed.
Providers' desire for speedy enterprise adoption is palpable. Generative AI has been embedded in popular CRMs, ERPs, software development tools and IT support solutions.
“I can just feel from the vendor community right now enormous amounts of pressure,” Jason Strle, CIO at Discover Financial Services, told CIO Dive. “It’s now showing up not just in how they’re trying to engage with us, but also how generative AI is mentioned.”
While early trends indicate enterprise interest in AI is at an all-time high, CIOs can keep the upper hand in vendor conversations by developing a clear understanding of their organization’s risk appetite, auditing provider claims and leveling unnecessary hype.
Generative AI dominates industry conferences, update cycles and future plans. In the hunt for enterprise customers, some vendors have taken less-than-savory approaches, such as AI washing: overstating the technology’s capabilities or using the word AI as a marketing gimmick.
These tactics can open vendors up to regulatory and legal scrutiny. They also raise red flags for CIOs interested in modernization.
“A lot of vendors may think claiming that their product is leveraging some sort of AI capability is helpful, but it’s not always,” Strle said. “As soon as we believe that there may be a nondeterministic element to their solution, we then have to go through additional risk management steps based on our own policy.”
The hype can backfire, leading customers to further question vendor claims.
“When they’re throwing around these terms, they’re inadvertently signing up for a lot more scrutiny from us,” Strle said.
How tech leaders are pushing back
Executives must put the buzzing vendor field to the test, identifying the safest applications of the technology — and the ones that add more value.
“A lot of CIOs and organizations generally have gotten so enamored with what they believe AI is capable of doing that they have a false sense of security,” said Chris Novak, senior director of cybersecurity consulting at Verizon and advisory board member of the Cybersecurity and Infrastructure Security Agency.
The misconception lies in thinking AI automatically enhances processes.
“If you add hot sauce to a cake mix, it doesn’t necessarily make it better,” Novak said.
Finding the balance between keeping the business safe and staying innovative is a struggle CIOs know all too well. They know ignoring disruption is not a viable option either.
Risk tolerance varies among organizations, but having a clear process for evaluating tools, systems and their potential impacts is crucial.
Copyright indemnities, which offer conditional protection for customers using generative AI tools, became a popular addition to vendor contracts as questions around ownership persisted. Enterprises are split on their effectiveness.
“There are some that are saying, ‘Yeah, the indemnification clauses, they don’t really do anything and we’re accepting the risk that if there’s something going on, we will be able to back out, we will be aware of that, and take it into account when we’re doing our development,’” Andrew Cornwall, senior analyst at Forrester, said.
Others are looking for more assurances.
“All of this is very complex and certainly gives rise to a lot of just concern,” Thomas Humphreys, compliance expert and content manager at Prevalent, told CIO Dive.
Customers have their doubts, too. Three in 5 are wary of AI, and nearly three-quarters want companies implementing the technology to make it more trustworthy, according to a KPMG survey released in January.
Depending on AI vendors for transparency and protection isn’t enough, analysts say. Enterprises must rely on their own abilities to audit tools, reinforce governance standards, and mitigate risks.
The U.S. Army has worked to identify the downsides and opportunities that come with adopting AI as part of a 100-day plan laid out in March.
“In the [Department of Defense], because of national security, we want to verify and trust,” Young Bang, the U.S. Army’s principal deputy assistant secretary for acquisition, logistics and technology, said during an AWS summit last month in Washington, D.C. “We want to adopt third-party algorithms, but we want to understand the risks associated with certain things, and then we will make informed decisions.”
With a growing appetite for emerging capabilities, the Army will deploy a 500-day plan to operationalize the findings, using a layered risk mitigation framework along the way.
“We’re trying to overcome the things that are going to prevent us from adopting third-party generative AI, and we’re doing that now,” Bang said during the conference.
Highly regulated, risk-averse private sector organizations are taking a similar approach.
“We definitely have a very eager appetite for these capabilities, but we’re going to do it in a way where we can feel really, really confident that we’ve understood all the risks that are involved with it,” Strle said.
Article top image credit: JohnnyGreig via Getty Images
Enterprises near migration bottleneck as SAP deadlines approach
More than half of the ERP giant’s customers risk running out of maintenance support by 2027, according to an analysis by Basis CTO David Lees.
By: Matt Ashare • Published July 8, 2024
Only 57% of enterprises using SAP’s on-prem ERP Central Component solutions are on track to fully migrate to the vendor’s cloud-based S/4HANA system before mainstream maintenance support sunsets in 2027, according to research by Basis Technologies CTO David Lees.
More than half of customers running ECC and S/4HANA solutions risk missing the 2027 support termination deadline, and one-third are approaching a 2025 support cliff, the analysis found.
The adoption trend model produced by Lees cross-referenced SAP, Gartner and Basis customer data with deadlines, resources and support policies.
While SAP doesn’t publicly share customer data, the company confirmed via email that it has incentives and migration programs in place for existing customers in addition to direct support from SAP experts.
SAP is banking on moving its business to the cloud.
The ERP giant stepped up efforts to speed adoption of its service-based enterprise solutions earlier this year with a multibillion-dollar internal transformation, which encompassed significant investments in AI capabilities and the addition of a board-level cloud task force led by former CIO Thomas Saueressig.
Despite tying technical and financial migration incentives to the RISE and GROW with SAP programs, customers have been slow to initiate what is often a long, taxing process.
“The stark reality is that nearly three-quarters of SAP customers are yet to start or complete their migration across to S/4HANA,” Lees said.
Not all ERP migrations are equally arduous. Smaller companies can complete the process in as few as 12 months with a staff of roughly 10, said Lees, who finished an 11-year tenure at Procter & Gamble leading the consumer goods multinational’s SAP transformation in 2018, prior to joining Basis.
Customers who don’t yet have a plan are caught between looming deadlines and an often difficult business decision involving modernization investments and acclimating to a SaaS-based billing model.
“That business case is only going to get worse, because the cost of doing this is only going to increase,” Lees said. “If you've still got more than 23,000 companies that need to do an S4 transformation in the next five to seven years, it’s going to mean more scarcity and competition for technical resources.”
As migrations accelerate, Lees anticipates the enterprise backlog to trigger a tech talent crunch that will peak in 2027. “The resource requirements for all of these projects when stacked together just starts jumping,” he said. “Companies are making these plans within the context of a market that could see significant rate increases or unavailability of resources.”
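The stacking effect Lees describes can be sketched as a per-year sum of overlapping project staffing needs. The project start years, durations and team sizes below are invented for illustration; they are not Basis data.

```python
from collections import Counter

# Hypothetical sketch of the "stacking" effect Lees describes:
# overlapping migration projects sum into a per-year staffing peak
# near the 2027 support deadline. All figures are invented.
projects = [
    {"start": 2025, "years": 3, "staff": 40},  # large, multi-year migration
    {"start": 2026, "years": 2, "staff": 25},
    {"start": 2026, "years": 2, "staff": 10},  # small company, shorter effort
    {"start": 2027, "years": 1, "staff": 30},  # late starter hitting the deadline
]

def yearly_demand(projects):
    """Sum staffing needs across all projects active in each year."""
    demand = Counter()
    for p in projects:
        for year in range(p["start"], p["start"] + p["years"]):
            demand[year] += p["staff"]
    return dict(sorted(demand.items()))

print(yearly_demand(projects))  # → {2025: 40, 2026: 75, 2027: 105}
```

Even with flat per-project staffing, total demand jumps as late-starting projects pile onto ones still in flight, which is the scarcity dynamic Lees anticipates peaking in 2027.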
For most customers, the question is no longer whether to migrate but when to start. Further procrastination invites uncertainty.
“These companies have to [migrate] or risk running on software that’s not supported and not getting vendor updates,” Lees said. “That’s a very difficult position to be in when you’re a company listed on Wall Street.”
Article top image credit: Victor Golmer via Getty Images
Microsoft, playing the long game, invests billions globally to expand Azure empire
“The pressure is on — this is no longer the experimental phase of the cloud,” Alistair Speirs, director of Azure global infrastructure, said.
By: Matt Ashare • Published June 10, 2024
As cloud revenue growth regained momentum this year, the $10 billion Microsoft forked over to OpenAI for generative AI in January 2023 has started to seem like small potatoes.
Since firing the first salvo in the battle to capture enterprise AI ambitions, the tech giant has opened its war chest to expand its Azure infrastructure empire, committing tens of billions of dollars to amass the resource the technology craves — cloud storage and compute.
“The pressure is on,” Alistair Speirs, Microsoft’s director of Azure global infrastructure, told CIO Dive. “This is no longer the experimental phase of the cloud.”
Speirs directs data center strategy with a focus on what he called “the easy problems” — reliability, sustainability, resilience, security and data sovereignty.
Building, embedding and supporting enterprise-grade large language model technologies — the coding assistants, chatbots and summarization tools already in production — adds new challenges. To prepare for AI compute, Microsoft is building out capacity while installing servers outfitted with graphics processing units and other high-powered chips.
“AI is another workload that's coming into the data center,” Speirs said. “It has some unique requirements that change the types of servers you need and the power requirements as well but is essentially part of that continuum. It is giving us another data point to accelerate.”
The acceleration has gained momentum in recent months.
“We have been doing what is essentially capital allocation to be a leader in AI for multiple years now,” CEO Satya Nadella said during the company’s Q3 2024 earnings call in April.
A brief (recent) history of Microsoft global data center spending
Microsoft announced a three-year, 2.5 billion pounds ($3.2 billion) plan to expand AI data center infrastructure in the UK, including sites in London and Wales, as well as a potential expansion into Northern England.
Feb. 15
Microsoft pledged to double cloud and AI capacity in Germany by the end of 2025, committing 3.2 billion euros ($3.5 billion) to the project.
Microsoft said it will invest $2.2 billion over the next four years to advance new cloud and AI infrastructure in Malaysia — the single largest private investment in the company's 32-year history in the country.
Microsoft announced an investment of 4 billion euros ($4.3 billion) to bring up to 25,000 GPUs to data centers located in France by the end of 2025, expand data centers in Paris and Marseille and build a new facility in the Mulhouse Alsace region.
May 14
The Milwaukee Business Journal reported an additional $8.8 million land purchase by Microsoft at the Southeast Wisconsin data center site.
May 22
Microsoft and G42 announced a $1 billion investment package in Kenya, including an Azure data center in Olkaria run on geothermal energy and a new East Africa Cloud Region.
The AI leadership push is reflected in Microsoft’s spending patterns. The company reported $14 billion in capital expenditures for the three-month period ending March 31, including infrastructure and leases to support cloud demand. That’s nearly double the $7.8 billion in capital expenditures the company reported during the same period last year.
CFO Amy Hood said the spending went toward satisfying demand for cloud and was likely to increase as the company deepens AI infrastructure investments.
“We continue to bring capacity online as we scale our AI investments with growing demand,” Hood said, noting near-term demand surpassed available capacity.
Building compute
Data center facilities house the engines that drive cloud business. Optimizing servers for AI workloads is just the latest infrastructure challenge.
Microsoft learned a lot about ramping up capacity during the pandemic years, Speirs said. But data centers are, by definition, strategic investments. They can take years to build and require small armies to run.
“These are complex infrastructure projects,” Speirs said. “They require a lot of people and it’s not just the servers that go into them, but also the construction effort, the planning, the sustainable siting of the land, the permitting and renovating — all of that has to happen.”
The hyperscaler business model was built on supplying enterprise compute resources as efficiently as a public utility. Scalable, service-based infrastructure allowed customers to fuel digital transformation without adding costly and complex on-prem estates.
“These are the new railroads for the 21st century,” Gordon Dolven, director of Americas data center research at CBRE Group, said. “If you want to deliver a service, whether it's streaming or content or messaging on an application, it requires data centers.”
Infrastructure expansion was already in the cards, as hyperscalers built capacity in order to keep pace with cloud adoption across industries. Despite last year’s macroeconomic headwinds, enterprise cloud consumption drove an 18% year-over-year increase in global cloud spending, according to Gartner. This year, the analyst firm expects 20% growth, with LLM technologies helping to drive global cloud spend close to $700 billion.
As consumption of cloud resources and AI compute grows, efficiency remains a crucial feature.
“We have a nice strategic altruism alignment here of making sure that any electron used in our data center is billable,” Speirs said. “That is driving a lot of investment in how we get more efficient, more sustainable and reduce one of our largest operational expenses.”
Azure’s AI boost
The race to deploy generative AI capabilities has already augmented Microsoft’s cloud business.
Azure revenue jumped 31% year over year during the first three months of 2024. AI services provided a big assist, Hood said during the earnings call, adding seven percentage points of revenue growth to Azure and other cloud services.
The revenue bump gave Microsoft sufficient momentum to capture one-quarter of the global cloud market for the first time. The company’s two main rivals — AWS and Google Cloud — also saw revenue growth increase as they held onto 31% and 11% of the cloud market, respectively.
Chart: Microsoft remains the second-largest hyperscaler behind AWS (cloud market share, 2019-2024).
Microsoft’s gradual ascent has mostly been at the expense of smaller cloud providers, not AWS or Google Cloud, John Dinsdale, Synergy Research Group chief analyst and research director, said.
An infrastructure arms race
Microsoft isn’t operating in a vacuum. AWS and Google Cloud are also doubling down on infrastructure build outs to capture AI business and market share.
AWS has more or less matched Microsoft’s cadence with data center build outs. Amazon’s cloud division will spend $11 billion on cloud data centers in Indiana, the company announced in April, on top of the $10 billion the company committed to multiple complexes in Mississippi at the start of the year and $7.8 billion pledged to expand infrastructure in Ohio last summer.
Amazon’s capital expenditures for AWS were $14 billion during the first three months of the year. That number is expected to increase over the next three quarters, Amazon CFO Brian Olsavsky said during Amazon’s April earnings call.
While supply grew 26% year over year in the data center rental and colocation market, the vacancy rate was at a near-record low of 3.7%. As new capacity ramps up, an estimated 83% is already spoken for, CBRE said.
Big cloud providers, along with companies in the technology, financial services and healthcare sectors, are consuming the lion’s share of available capacity, Dolven said.
“Historically, these companies may have thought about pre-leasing six, nine or 12 months in advance,” Dolven said. “Now we’re seeing more and more interest in getting ahead of the curve.”
The enterprise bill comes due
The hyperscaler spending spree offers a glimpse into cloud’s AI-driven business model: build capacity, embed generative AI capabilities, cover the cost on the back end as enterprise consumption of infrastructure and services grows.
“Net new revenue is going to offset the cost of doing business,” Gartner VP Analyst Sid Nag said. “That premium that you're going to pay — that the CIO is going to pay — will essentially absorb the cost of these multibillion-dollar data centers.”
Just as LLM builders depend on cloud for compute resources, hyperscalers are counting on generative AI applications to consume and pay for the infrastructure the technology is reshaping.
“Compute and storage are the most expensive parts when it comes to cloud costs,” Tracy Woo, Forrester principal analyst, said. “Now customers are asking for supercharged compute, which can be really expensive.”
The inherent advantage of running generative AI in cloud is that enterprises don’t have to foot the bill for up-front infrastructure investments to work with the technology, Woo said. The hyperscalers are providing customers with the optimized infrastructure.
“That's pretty much been the value prop of cloud for the last seven to eight years,” Woo said.
The benefits of internal process efficiency gains, faster time to market with new products and services and overall business growth offset the downstream cost to the enterprise over time, the thinking goes.
Generative AI’s cost-per-usage expense is likely to come down, as well, as hyperscalers optimize data center operations, chip technologies become more energy efficient and enterprise-grade models get smaller and more contained.
As global cloud infrastructure capacity doubles over the next four years and more than 100 new facilities come online annually, smaller data centers are becoming more common, Dinsdale said, citing SRG market analysis.
A dispersal trend can help mitigate environmental costs, an additional boon to enterprise customers as emissions regulations tighten.
“Rather than just expand existing data centers, we're adding new regions,” Speirs said. That way, Azure can locate data centers closer to customers to satisfy data sovereignty and latency reduction requirements, and the company can situate new facilities near renewable energy sources.
As AI systems become more efficient, the build out strategy may need adjustment, David Linthicum, enterprise technology analyst at SiliconAngle and TheCube, cautioned.
“The hyperscalers may be overbuilding in anticipation of capacity that many AI systems, if architected properly, may not need,” Linthicum said. “Enterprises are going to get much better at optimizing any use of AI, for much less infrastructure than they thought they initially needed.”
Speirs isn’t worried.
“As we look forward, expect AI in every app and in every service,” Speirs said. “There will be a time relatively quickly when it will be kind of silly to have an AI strategy because it won't necessarily be something distinct or different.”
Article top image credit: Gerville via Getty Images