Addressing the cost reduction challenge in banking - part 2

Embedding optimization strategies
  • Liam Mobsby, Jenni Reinikkala
  • 18 September 2025

Cost optimization remains a focus of attention at banks, given the macroeconomic environment and the increasing number of fintech competitors – but tangible and long-lasting results from optimization programs remain elusive. In the first part of this series, we looked at the shift in mindset required to engineer continuing success, noting key priorities around technology selection, scalability needs, building Centers of Excellence and reorganizing decisively. In this second article, we look at how banks can avoid siloed approaches to cost optimization, control costs in key areas such as data management, and build a fuller picture of both vendor value and the cost of serving customers. 

Simplifying the technology estate and decommissioning legacy applications is often a key lever for cost optimization, especially in larger, complex banks servicing multiple lines of business. Likewise, simplifying and standardizing processes is often the first step towards automating them to reduce costs and improve customer and client service. 

However, banks are complex businesses. Technology simplification programs and other cost optimization levers need to be based on a sophisticated view of the bank’s interconnected processes and customer journeys – and how to deliver value in the future. 

Otherwise, today’s cost cutting will set the scene for rising costs and a lack of scalability in the longer term.

In this context, we propose a further four priorities for sustainable cost optimization, to complete our eight key priorities for bank CEOs. 

 

Optimize horizontally and across journeys 

Traditionally, cost reduction initiatives align either to functions – mirroring cost center hierarchies and the way budgets are allocated – or to broad cost-saving buckets such as application rationalization and controls simplification. 

However, both approaches can create silos, with missed opportunities to address duplication of effort, apply technology solutions, and remove inefficient hand-offs and other costly friction points across end-to-end journeys. 

These journeys often begin with front-office customer interactions or transactions and extend into tasks conducted by middle- and back-office teams – who may then hand back to servicing teams if things go wrong. The journeys may cover multiple functions, with inherent wastage in the overlapping areas.  

Mapping costs across all functions and teams within a journey involves additional effort but reaps benefits in the longer term. It can be used to identify more cost-saving opportunities, alongside opportunities to deliver better customer and risk outcomes through more streamlined processes and controls. 

For example, one client found that bespoke customer requirements, agreed to by sales teams, resulted in significant operations effort downstream to build and maintain bespoke reports. Banks need to understand the extent of these costs so they can take rational decisions about how to cut or avoid them.

Process mining tooling can quickly surface inherent process inefficiencies across multiple functions, teams and systems, helping to target the highest-value journeys for optimization without a lengthy period of current-state mapping. 
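
To make this concrete, here is a minimal sketch of the kind of analysis process mining automates – computing journey cycle times, team hand-offs and the slowest steps from an event log. The column names, teams and data are illustrative assumptions, not a specific vendor tool.

```python
# Minimal sketch: spotting journey inefficiencies from an event log,
# in the spirit of process mining. Columns, teams and data are hypothetical.
import pandas as pd

events = pd.DataFrame({
    "case_id":   ["C1", "C1", "C1", "C2", "C2", "C2", "C2"],
    "activity":  ["capture", "review", "settle", "capture", "review", "rework", "settle"],
    "team":      ["front", "middle", "back", "front", "middle", "front", "back"],
    "timestamp": pd.to_datetime([
        "2025-01-01 09:00", "2025-01-01 11:00", "2025-01-02 10:00",
        "2025-01-01 09:30", "2025-01-02 14:00", "2025-01-03 09:00", "2025-01-04 16:00",
    ]),
}).sort_values(["case_id", "timestamp"])

# End-to-end cycle time per journey (case).
cycle_time = events.groupby("case_id")["timestamp"].agg(lambda s: s.max() - s.min())

# Hand-offs: how often a case moves between teams.
handoffs = events.groupby("case_id")["team"].agg(lambda s: (s != s.shift()).sum() - 1)

# Waiting time before each activity, to highlight the slowest steps.
events["wait"] = events.groupby("case_id")["timestamp"].diff()
slowest_steps = events.groupby("activity")["wait"].mean().sort_values(ascending=False)

print(cycle_time, handoffs, slowest_steps, sep="\n\n")
```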

 

Reduce your cost of data 

If data isn’t just the new oil but the new air, then it’s critical to make sure you’re not paying too much for it.  

Optimizing costs across the data value chain, from sourcing to processing, storage and distribution, is often an overlooked opportunity. This will only become more pronounced as the volume of data processed – and the need for high-quality data to feed AI models – grows exponentially. 

Examples of how to reduce data costs include: 

  • Optimizing data governance costs using AI tools
  • Rationalizing data controls
  • Shifting to self-service reporting through a standardized data distribution strategy 
  • Increasing data quality to reduce the costs of poor data, including expensive remediation activities and regulatory fines.  

The key to investing in the right strategy is first being able to accurately measure the true cost of data sourcing, management and reporting. These costs will be dispersed across many teams beyond the CDO’s own, especially where cottage industries of manual, duplicative reporting and reconciliation have sprung up across front, middle and back-office functions. 
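
As a minimal illustration of building that baseline, the sketch below rolls up estimated effort on data-related activities across teams into an indicative annual cost of data. The teams, activities, effort figures and rates are hypothetical assumptions.

```python
# Minimal sketch of rolling up a dispersed "cost of data" baseline.
# Team names, activities, FTE figures and rates are hypothetical.
import pandas as pd

activities = pd.DataFrame([
    # team, data activity, FTE-equivalent effort, fully loaded annual cost per FTE
    ("Front office",  "manual reconciliations",   3.0, 120_000),
    ("Middle office", "duplicative reporting",    5.5, 100_000),
    ("Back office",   "data remediation",         4.0,  90_000),
    ("CDO",           "governance and controls",  6.0, 110_000),
    ("Technology",    "storage and distribution", 2.5, 130_000),
], columns=["team", "activity", "fte", "cost_per_fte"])

activities["annual_cost"] = activities["fte"] * activities["cost_per_fte"]

# Cost of data by team and by activity, plus the overall baseline.
by_team = activities.groupby("team")["annual_cost"].sum().sort_values(ascending=False)
by_activity = activities.groupby("activity")["annual_cost"].sum().sort_values(ascending=False)
total = activities["annual_cost"].sum()

print(by_team, by_activity, f"Total cost-of-data baseline: {total:,.0f}", sep="\n\n")
```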

These inefficiencies often appear as a result of inadequate data controls and limitations in both self-service reporting and analytics capabilities around the bank’s ‘golden sources’ of data.

 

Take a strategic approach to vendor value

Vendor costs are often the go-to lever for rapid cost reduction. It makes sense to identify and eliminate waste and duplication and to aim for fewer, more strategic partnerships. 

However, the focus must be on value, not just cost. Introducing a race to the bottom can save costs in the short term at the expense of quality. 

This will ultimately result in increases in the ‘cost of poor quality’, for example longer and more costly testing cycles to identify and resolve defects, longer time to market and even, ironically, delays to the implementation of other cost reduction initiatives. 

While vendor rationalization should clearly have cost optimization as a target, a balanced scorecard based on other factors such as ‘delivery/resource quality’ and ‘time to onboard’ is required to prevent unintended consequences. 
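
A minimal sketch of such a balanced scorecard is shown below, weighting cost alongside quality, onboarding speed and strategic alignment. The criteria, weights, vendors and scores are illustrative assumptions rather than a recommended template.

```python
# Minimal sketch of a balanced vendor scorecard: cost is one factor among
# several, weighted to reflect value rather than price alone.
# Criteria, weights, vendors and scores are illustrative assumptions.

WEIGHTS = {
    "unit_cost": 0.35,          # lower cost still matters...
    "delivery_quality": 0.30,   # ...but not at the expense of quality
    "time_to_onboard": 0.15,
    "strategic_alignment": 0.20,
}

# Scores normalized to a 0-10 scale, where 10 is best for every criterion
# (so a cheaper vendor earns a higher unit_cost score).
vendors = {
    "Vendor A": {"unit_cost": 9, "delivery_quality": 5, "time_to_onboard": 6, "strategic_alignment": 4},
    "Vendor B": {"unit_cost": 6, "delivery_quality": 8, "time_to_onboard": 7, "strategic_alignment": 9},
}

def weighted_score(scores: dict[str, float]) -> float:
    """Combine criterion scores using the agreed weights."""
    return sum(WEIGHTS[criterion] * value for criterion, value in scores.items())

for name, scores in sorted(vendors.items(), key=lambda kv: weighted_score(kv[1]), reverse=True):
    print(f"{name}: {weighted_score(scores):.2f}")
```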

Additionally, aligning vendor commercial constructs to your organization’s goals, such as cost reduction and efficiency, will typically improve long-term results. But there must be true skin in the game for your partners.

This approach is generally better than engaging vendors on a simpler time-and-materials basis, which may appear cheaper at the outset but ultimately fails to account for the value delivered. 

 

Understand cost-to-serve

Catering for unique client requirements by offering bespoke products, solutions and servicing can generate revenue and help build strong and lasting relationships, especially in corporate and institutional banking. 

However, as with any investment of effort, the benefits should be weighed against the long-term costs. 

Many of our clients find it challenging to accurately map ongoing operations and servicing costs to customers and to understand the true cost-to-serve. The cost of bespoke services and processes often remains hidden and difficult to address. 

The first step is to accurately measure this cost-to-serve by allocating underlying operational activities and costs – such as query handling, bespoke report production and technology change requests – to specific clients through the right tooling, data and governance. This can then inform rational, data-driven decisions about the servicing and tiering of clients. 
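
The sketch below illustrates the basic allocation arithmetic: unit costs for servicing activities are multiplied by observed volumes per client and compared against revenue. The activities, unit costs, volumes and revenue figures are hypothetical.

```python
# Minimal sketch of allocating operational activity costs to clients to
# estimate cost-to-serve. Activities, rates, volumes and revenue are hypothetical.
import pandas as pd

# Unit costs per servicing activity (e.g. derived from time-and-motion analysis).
unit_cost = {"query": 25.0, "bespoke_report": 400.0, "tech_change_request": 2_500.0}

# Observed activity volumes per client over a period.
volumes = pd.DataFrame([
    ("Client X", "query", 1_200), ("Client X", "bespoke_report", 48), ("Client X", "tech_change_request", 6),
    ("Client Y", "query", 300),   ("Client Y", "bespoke_report", 2),  ("Client Y", "tech_change_request", 0),
], columns=["client", "activity", "volume"])

volumes["allocated_cost"] = volumes["activity"].map(unit_cost) * volumes["volume"]
cost_to_serve = volumes.groupby("client")["allocated_cost"].sum()

# Compare against revenue to spot clients whose servicing costs outweigh returns.
revenue = pd.Series({"Client X": 90_000.0, "Client Y": 60_000.0})
summary = pd.DataFrame({"cost_to_serve": cost_to_serve, "revenue": revenue})
summary["margin_after_servicing"] = summary["revenue"] - summary["cost_to_serve"]
print(summary)
```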

Strategies to address a mismatch between revenue and cost-to-serve don’t need to be as drastic as exiting clients or simplifying product offerings, and can include: 

  • Charging for certain bespoke services or products
  • Introducing additional servicing tiers
  • Educating clients on the self-service solutions available 
  • Investing in integration efforts to enable clients to use standard offerings.

 

Creating continuous optimization strategies 

If banks are to build optimization strategies that are successful in both the short and long term, they need to find better ways to track and measure their success. 

Too often, as programs are rolled out, we see an over-reliance on tracking lagging indicators such as FTE saves or direct cost reduction. These depend on lengthy HR and finance processes for attestation, which often means it is too late to course-correct by the time it becomes apparent that target benefits have not been delivered on time. 

Embedding the right quarterly targets and reporting on leading metrics is key. For example, banks can measure the reduction in the number of manual controls, the reduction in process turnaround times, or the increase in the productivity of operations teams, all of which act as precursors to cost reduction or avoidance. 
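
A minimal sketch of tracking such leading indicators against quarterly targets is shown below; the metric names, baselines, targets and on-track threshold are illustrative assumptions.

```python
# Minimal sketch of tracking leading indicators against quarterly targets,
# so course-correction happens before lagging cost figures confirm a miss.
# Metric names, baselines, targets and the threshold are illustrative.
from dataclasses import dataclass

@dataclass
class LeadingMetric:
    name: str
    baseline: float  # value at the start of the program
    target: float    # quarterly target
    actual: float    # latest observed value

    def progress(self) -> float:
        """Share of the planned baseline-to-target movement achieved so far."""
        planned = self.target - self.baseline
        achieved = self.actual - self.baseline
        return achieved / planned if planned else 1.0

# Illustrative leading indicators that precede cost reduction or avoidance.
metrics = [
    LeadingMetric("Manual controls in scope", baseline=420, target=380, actual=405),
    LeadingMetric("Average process turnaround (days)", baseline=5.0, target=3.5, actual=4.1),
    LeadingMetric("Cases handled per operations FTE", baseline=18, target=24, actual=23),
]

# Flag metrics achieving less than 75% of the planned quarterly movement
# (the threshold itself is an assumption and would be set per program).
for m in metrics:
    status = "on track" if m.progress() >= 0.75 else "course-correct"
    print(f"{m.name}: {m.progress():.0%} of planned movement achieved ({status})")
```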

As specific initiatives are completed, banks must continuously monitor costs horizontally and watch out for costs popping up again in unexpected areas, such as cost savings in operations that result in additional client servicing efforts. 

They must also watch for other unintended negative consequences, for example, by monitoring the impact on the client and employee experience, on sales growth and on the ease of innovation. 

Lasting success means treating cost reduction and containment as a continuous discipline and mindset, enabled through constant measurement. The bank must empower operations, technology and sales teams with the funding and tools to continuously improve processes and reduce costs in a sustainable way – even when these efforts are not part of a formal program.  

 

Contact Us

To find out more about working with Capco and how we can help you overcome any potential challenges, contact our experts via the form below.