When the largest U.S. bank needs compute, there’s little margin for error.
Executives at JPMorgan Chase don’t want IT teams waiting in line to access resources needed to spin up an AI use case, upgrade a customer service portal, enhance market forecast analytics or support critical business applications.
“We're putting a lot of effort into improving the ability of our software engineers to be productive as they do development,” the company’s CFO, Jeremy Barnum, said during a January earnings call.
As technology drives business outcomes, infrastructure availability becomes a competitive advantage — and a priority.
“For our capacity, we like to use a just-in-time model, which usually means planning years before we need the capacity — as much as five to 10 years out,” Darrin Alves, CIO of infrastructure platforms at JPMorgan Chase, told CIO Dive. “If you wait until you're only a year or two away, you won't have the capacity available when you need it.”
Alves oversees the digital systems that connect more than 600 JPMorgan offices globally and over 5,000 individual branch locations to a complex hybrid ecosystem of private data centers, colocation facilities and hyperscaler deployments.
“We have everything from mainframe to quantum computers and blockchain to public cloud and generative AI,” Alves said.
Like cloud before it, generative AI is transforming banking processes, as coding assistants help unlock data in legacy COBOL applications, smarter chatbots improve customer experience and agentic solutions ease workflow processes.
JPMorgan Chase is an industry leader in AI adoption. The company widely rolled out its in-house generative AI assistant LLM Suite last September and ranked first for AI maturity in Evident Insights’ analysis of the sector’s top 50 financial institutions last year.
Modernization investments are integral to AI success going forward, said Alexandra Mousavizadeh, co-founder and CEO of Evident.
“Whether banks are building their own AI or buying third-party solutions, the end result will only be as good as the underlying infrastructure,” Mousavizadeh said. “As the drive toward organizationwide AI deployment ratchets up, we’ll start to see which institutions have placed the right bets.”
Computing capacity
Major transformations call for rigorous planning and a modernized infrastructure foundation. There are multiple challenges to surmount, especially in banking.
“Any time you go through an application, you first have to solve for legal and compliance,” Alves said. “It’s got to meet our security and risk and controls and then it's customer experience.”
Infrastructure and spending are also key considerations, though they are further down the decision tree.
“We have to decide where it’s going to fit into our architecture and the last step is to optimize for cost,” Alves said.
Finding the right compute environment and the capacity to run generative AI use cases became an enterprise pain point last year as hyperscalers raced to deploy GPU servers and build out data centers.
In August, CBRE analysts estimated that nearly 80% of data center space under construction had already been leased. Gartner forecast that 40% of existing AI data centers will be operationally constrained by power shortages in the next two years.
“AI expansion is putting constraints on the entire supply chain for data centers,” Alves said.
JPMorgan Chase moved proactively to sidestep those hardware hurdles.
“We’re a company that wants to make sure we can control our own destiny, for our own business reasons and for regulatory reasons,” Alves said. “So, we partner closely with the hardware manufacturers and the colocation and data center designers.”
The bank is also vigilant about refreshing its data center infrastructure regularly, as it expands capacity where needed and leverages public cloud.
“If we’re on a continuous refresh cycle, we can accommodate significant growth without having to expand,” Alves said. “The most efficient data center that you can build is the one that you don't [have to build].”
Hybrid equilibrium
While cloud remains a cornerstone of JPMorgan’s IT strategy, the bank maintains a diversified infrastructure portfolio and invests across a broad range of technologies.
“You use the right tool for the right job,” Alves said.
Firmwide, the company increased tech spend by $1.5 billion year over year to roughly $17 billion in 2024, Barnum said during a May presentation to investors. CEO Jamie Dimon said JPMorgan Chase would have three-quarters of its data and 70% of its applications migrated to the cloud by the end of last year.
“We were methodical in our approach to cloud, so we didn't go too quickly,” Alves said. “As we went, we learned that you need to be very thoughtful about which workloads make sense in public cloud.”
Certain banking applications align with on-premises capabilities, while others do not, depending on their function, data needs and scaling requirements.
In some cases, a modernized mainframe is the right call.
“We process more than 10 trillion dollars in payments a day on our systems,” Alves said. “While there are other systems that participate in that, the mainframes are the workhorse that do the bulk of the heavy lifting.”
Developing AI solutions called for a different strategy. Initiatives were launched in the cloud, where piloting potential use cases was relatively easy and affordable. Now, as trial-and-error leads to practical applications, Alves expects to adopt a hybrid mix of infrastructure to support AI.
“The cloud gives us a place to do that experimental learning with no regrets,” he said. “If it doesn't work, we can shut it down very quickly. You can't really do that on your own infrastructure if you're making a significant investment in what AI requires.”