This article was paid for by a contributing third party.

Expecting the unexpected

Lars Schröder, senior engagement manager at SkySparc, looks at what is driving shifts in central banks’ technology requirements.

It is 50 years since US president Richard Nixon moved to terminate the Bretton Woods era. Has the past decade been the most tumultuous in modern central banking? Very possibly. 

The unprecedented support provided by central banks to economies in the aftermath of the global financial crisis that began in 2007–08 continues to reshape central banking. The rapid expansion in the range, volume and credit quality of instruments held has revolutionised and disrupted many areas of central banking operations. When you’re used to trading AAA-rated sovereigns, modelling the cashflows from asset-backed securities is not a skill acquired overnight.

The strategic and operational implications of inflated balance sheets and extensive asset purchase programmes are continuing to unfold. But any thoughts of reaching a new equilibrium have been undermined by the varied and critical roles played by central banks in responding to the economic impacts of the Covid-19 pandemic. As governments issued more debt, and took steps to support industries, businesses and households, the formation and execution of policy responses relied heavily on central banks, which were also expected to handle new remote working conditions. 

Mandates will continue to change, broadening the remit of central banks and the range of considerations they must include. Since the first quarter of 2021, the Bank of England (BoE) has been charged with factoring climate change into its oversight of systemic risks. In the third quarter of 2021, the US Federal Reserve said it would take a broader approach to fulfilling its full employment mandate. Both these changes have significant implications for policy, but also for systems, given the increased data input requirements and the potential need for rapid responses. 

“There has been a significant change in the way we handle data,” European Central Bank (ECB) senior adviser Per Nymand-Andersen said on a recent Central Banking webinar, in which he highlighted the greater use of alternative datasets to monitor financial and economic activity.

Implications for operations and systems

In this context, it is hardly surprising that operational and technological priorities have evolved rapidly. Stability, security and resilience of IT systems and architecture are no less important, of course. But flexibility and agility have risen up the agenda to allow central banks to handle higher volumes and wider ranges of data, and to increase speed and responsiveness.  

To source, store, analyse, integrate and distribute a greater volume and variety of data, central banks cannot rely on existing systems alone. As well as needing access to the latest tools and functionality offered by their core platform providers, they also require the ability to integrate new solutions and technologies, connecting them with automated but customised routines. They also need to nurture and harness a wider range of data science skills among their staff. 

Skill sets are already beginning to overlap, with new recruits increasingly expected to be comfortable with quantitative analysis, Python programming and machine learning algorithms. From a systems perspective, this means a greater emphasis on self-service, with central banks needing to offer staff access to a multidimensional toolbox, to better leverage and optimise the new skills and datasets.
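By way of illustration only, the short Python sketch below shows the kind of self-service quantitative work this implies: fitting a simple machine learning model to a synthetic ‘alternative’ dataset to nowcast an activity indicator. All names and figures are invented for the example and do not reflect any particular central bank’s data or SkySparc tooling.

```python
# Illustrative only: a minimal example of the quantitative/machine learning
# skill set described above, using synthetic data rather than any real feed.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

# Synthetic "alternative dataset": weekly card spending, mobility and job
# postings indices, used to nowcast an economic activity proxy (all invented).
n_weeks = 200
X = rng.normal(size=(n_weeks, 3))                    # three weekly indicators
true_weights = np.array([0.6, 0.3, 0.1])             # assumed relationship
y = X @ true_weights + rng.normal(scale=0.2, size=n_weeks)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

# Fit a simple regularised regression and check out-of-sample accuracy.
model = Ridge(alpha=1.0).fit(X_train, y_train)
print("Mean absolute error on held-out weeks:",
      round(mean_absolute_error(y_test, model.predict(X_test)), 3))
print("Estimated indicator weights:", model.coef_.round(2))
```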

This is a significant departure from central banks’ traditional approaches, but far from the only one. The pace at which central banks need to operate is accelerating. The ‘fail fast’ mantra of rapid, iterative testing and incremental development may be synonymous with fintechs, but it is now increasingly common across the finance sector, with even central banks looking to improve agility through constant upgrades and enhancements. 

Similarly, the essential role data plays in informing mission-critical central banking decisions requires greater attention to the presentation, reporting and distribution of data. Tableau, Power BI and other business intelligence tools are increasingly being adopted and adapted to the needs of central banks, helping staff to visualise operational, market and credit risks, for example, or to develop new routines across departments. 
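As a purely illustrative sketch (not SkySparc’s or any central bank’s actual pipeline), the Python snippet below shows one simple way staff might shape exposure data and export it as a flat file for a business intelligence dashboard to refresh from; the column names and figures are invented.

```python
# Illustrative only: shaping risk data and exporting it in a form a business
# intelligence tool (Tableau, Power BI, etc.) can pick up on a schedule.
import pandas as pd

# Invented exposure records standing in for output from a core system.
exposures = pd.DataFrame({
    "counterparty": ["Bank A", "Bank B", "Sovereign C", "Agency D"],
    "asset_class": ["ABS", "Covered bond", "Government bond", "Agency bond"],
    "market_value_eur_m": [120.5, 310.0, 950.2, 410.7],
    "credit_rating": ["A", "AA", "AAA", "AA+"],
})

# Aggregate exposure by asset class: the sort of cut a risk dashboard might show.
by_asset_class = (
    exposures.groupby("asset_class", as_index=False)["market_value_eur_m"].sum()
)

# Write a flat file for the dashboard to refresh from.
by_asset_class.to_csv("exposures_by_asset_class.csv", index=False)
print(by_asset_class)
```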

How can third-party service providers help? 

Over two decades supplying technology solutions and services to central banks, SkySparc has never stood still. And, just like central banks, it has increasingly picked up the pace of innovation over the past decade to serve as a one-stop shop for institutions looking to improve performance by optimising their technology platforms. 

For example, we have overhauled our systems upgrade and implementation capabilities to introduce a more packaged offering – Patch Upgrade as-a-Service – which eliminates much of the time traditionally consumed by ‘once in a blue moon’ projects. This more programmatic and automated approach has enabled SkySparc’s central banking clients to install patches more frequently, even to complex core platforms. This offers more rapid access to new functionality, such as interbank offered rate (Ibor)-related fixes, while enhancing cyber security and ensuring compatibility and integration with other applications. 

Similarly, SkySparc’s OmniFi data management and process automation solution has long been core to its services, but is constantly renewed via a continuous release cycle. OmniFi leverages the cloud to connect to more applications in more locations, supporting innovative new routines that users can create and deploy on a self-service basis, putting power in their hands. Widely used by central banks to handle growing data warehousing needs, OmniFi recently evolved further with the addition of a dedicated Data Mart module. This takes clients’ data analysis capabilities to a new level, providing a simple, fast and customisable channel for feeding data from core systems into a new generation of data analytics tools, offering users valuable insights and efficiencies.

Furthermore, SkySparc has continually evolved its outsourced support proposition in response to client challenges. Recently, this has ensured it can support the growing need to work across multiple testing and production environments as development teams look to make smaller, but more frequent, changes to core systems. 

More change ahead

There was a time when accelerating processes at a central bank meant running up the stairs with a message, rather than waiting for the lift. Those days have gone the way of the gold standard.

With governments worldwide looking to achieve substantial progress by 2030 towards meeting their 2050 net-zero greenhouse gas emissions targets, this decade could see even more change than the one just past. Certainly, the range of datasets used by central banks will continue to grow apace, with the ECB already acknowledging the need to incorporate environmental, social and corporate governance factors into its asset purchase programmes.

SkySparc will continue to be alert to the evolving needs of central banking clients, and looks forward to the opportunity of supporting new innovations in the decades ahead. 

SkySparc was named Technology Adviser of the year in the Central Banking FinTech & RegTech Global Awards 2021
