The global financial crisis sparked a wide-ranging overhaul of the international architecture for financial regulation, coupled with deep reflection on the fitness for purpose and effectiveness of supervisory efforts, writes Anne Leslie-Bini.
In response to the financial crisis that began in 2007, achieving transparency within the financial sector accounted for nearly 20% of the 2009 Group of 20 action plan. Since then, hundreds of rules have been written and – with more than 30 new reporting regimes coming down the pipeline to 2020 and beyond – firms and regulators continue to invest huge amounts of resources to implement the ongoing change requests on top of managing existing obligations.
Justifying the means to an end
According to research by industry analysts at Medici, between 2008 and 2016 there was a 500% increase in regulatory changes in developed markets. This has led to 10–15% of financial institutions' total workforce working in compliance functions, and to an explosion of associated costs.
In the face of this regulatory tsunami, the question being asked ever more insistently is: exactly how resilient is the financial system as a result? The answer to this question depends largely on who is being asked and how the question is framed.
According to the Financial Stability Board, a decade of continuous effort has indeed served to fix the faultlines that caused the crisis, and the results are conclusive. The financial system is now reportedly safer, simpler and fairer than before, with large banks needing to hold 10 times more capital than previously to absorb unexpected losses, and G20 countries transforming shadow-banking activities into resilient, market-based finance.
However, one of the primary frailties exposed by the crisis was the inability of banks to efficiently source and collate risk data, which in turn severely hampered the efficacy of supervisory and regulatory actions by the authorities. In recognition of the gravity of the issue, the Basel Committee on Banking Supervision (BCBS) published its Principles for effective risk data aggregation and risk reporting in 2013, known as BCBS 239. This was a landmark development because the principles impose minimum standards for data gathering and management, as well as for IT infrastructure, all of which tend to require additional investment in technology and significant organisational restructuring.
Easier said than done
It is interesting to note that, during the recent risk data fire drill conducted by the European Central Bank (ECB) as part of its thematic review of banks' compliance with BCBS 239, it emerged that – a full decade after the financial crisis – many systemically important European banks are still unable to aggregate and report risk data in a stress or crisis scenario.
Despite the intense regulatory effort, some of Europe’s most significant institutions have not managed to put in place integrated platforms and automated processes that enable the accelerated generation of key risk metrics in periods of stress and crisis. One of the primary conclusions of the ECB’s thematic review was the enduring lack of accountability for the data ownership that is required for a fully compliant BCBS 239 programme. The scrutiny exercised by the ECB during the review also revealed how reliant banks are on individual judgement and tactical workarounds in their risk management practices, while still not having all of the information available to a standard that satisfies regulators.
Glass half full, glass half empty
While the financial system may indeed be sounder and more stable than it was 10 years ago, progress should be measured against both the starting point and the intended end-state, the latter being more akin to an elusive moving target than a known destination. This is not necessarily through fault or for lack of trying; nonetheless, it would appear premature to claim the battle won.
Over the past decade, the world’s banks have garnered almost $1 trillion in profits, despite having paid a total of $321 billion in fines for regulatory infringements and customer redress. Regulators and supervisors remain engaged in a seemingly Sisyphean task to tame international finance, and it comes as no surprise that the public perception of the financial services industry remains badly tarnished.
With constrained means, regulators and supervisors are striving to maintain stability in an incumbent financial system with which they are well acquainted, while grappling with the additional challenges presented by emerging financial technology, or ‘fintech’, business and operating models, not to mention the difficulties inherent in keeping a burgeoning cryptocurrency sector in check.
Existing supervisory policies, processes and resources are already showing signs of strained rigidity in the face of a rapidly changing external environment, and this mismatch of adaptive capacity is likely to be further exacerbated in the coming years.
Furthermore, many financial authorities have not only responded to demands for improved regulatory and supervisory effectiveness in their core jurisdiction but have also opted to extend the scope of their mandates to include responsibilities that were formerly considered to conflict with, or at least blur, the stability mandate (such as financial inclusion, competition and consumer protection). This adds to the already difficult task of allocating limited resources in a balanced manner.
The emergence of suptech
As more and more supervisors strive to position themselves as catalysts for innovation and business growth through the introduction of more effective and efficient systems of supervision, there is an ensuing shift in the traditional debate between risk-based and principles-based supervision towards a more agile approach, which is insight-driven and fuelled by data.
New and revised reporting frameworks are generating unprecedented volumes of increasingly granular data, which needs to be sourced, collected, validated, aggregated, stored and analysed. Currently, this process is riddled with friction and inefficiencies, causing significant financial overhead and human frustration at every stage of the regulatory data value chain.
A critical step in transforming financial supervision is therefore improving one of its most crucial inputs, namely data, and the automation of the processes that support the governance of this mission-critical resource. As a result, authorities worldwide are revisiting their data management infrastructures as the foundations for a fundamental rethink of their overall approach to supervision.
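To make the idea of automating data governance concrete, the sketch below shows what a single automated step in such a pipeline might look like: rule-based validation of an incoming regulatory data record. This is purely illustrative – the field names, rules and thresholds are assumptions for the example, not any authority's actual reporting schema.

```python
# Illustrative sketch of automated validation of regulatory data records,
# the kind of governance step suptech aims to automate. All field names
# and rules here are hypothetical, not a real supervisory standard.
from dataclasses import dataclass


@dataclass
class Finding:
    record_id: str
    rule: str
    message: str


# Hypothetical minimum set of fields every submitted record must carry.
REQUIRED_FIELDS = {"record_id", "reporting_date", "exposure_eur", "counterparty_lei"}


def validate_record(record: dict) -> list:
    """Run basic completeness, plausibility and format checks on one record."""
    findings = []
    rid = str(record.get("record_id", "<missing>"))

    # Completeness: every required field must be present.
    for field in sorted(REQUIRED_FIELDS - record.keys()):
        findings.append(Finding(rid, "completeness", f"missing field: {field}"))

    # Plausibility: an exposure amount should not be negative.
    exposure = record.get("exposure_eur")
    if isinstance(exposure, (int, float)) and exposure < 0:
        findings.append(Finding(rid, "plausibility", "negative exposure"))

    # Format: a Legal Entity Identifier is always 20 characters long.
    lei = record.get("counterparty_lei")
    if isinstance(lei, str) and len(lei) != 20:
        findings.append(Finding(rid, "format", "LEI must be 20 characters"))

    return findings
```

In a real suptech pipeline such checks would run automatically on every submission, routing findings back to the reporting institution rather than waiting for a supervisor to discover the gaps manually.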
This is where the newly dubbed ‘suptech’ – or supervisory technology – is entering the fray.
As defined by the Financial Stability Institute, suptech is the use of innovative technology by supervisory agencies to support supervision work. Suptech is currently found primarily in two areas: data collection and data analytics, and this shift to a data-centric focus is allowing authorities to adopt a more insight-driven approach in the exercise of their mandates.1
By helping supervisory agencies digitise reporting and regulatory processes, the aim is for more efficient reporting to reduce the burden on firms and more proactive monitoring of financial institutions’ risk profiles and overall compliance. Both individually and collaboratively, supervisors around the world – from the UK to Singapore, from Mexico to the Philippines and beyond – are actively exploring innovative ways to implement even more effective risk-based supervision.
Today, thanks to technological and scientific progress in the areas of machine learning, artificial intelligence, distributed ledgers and big data, supervisory authorities can envisage fundamentally overhauling their existing supervisory tools and developing considerably superior applications by leveraging the advances of suptech.
However, with great potential come great challenges, and authorities are grappling with issues that are eminently practical in nature, such as the availability and quality of data, and the availability of internal expertise. While technology can supplement and augment human capability, the role of human judgement and transparent oversight of the use of emerging tools and technology is more crucial than ever – to avoid the hazards of automated ‘black-box’ decision-making or the risk of hard-coded algorithmic biases in machine learning, for example.
Suptech and its close relative, regulatory technology or ‘regtech’, are being touted as the ‘next big thing’ to hit the financial services industry after fintech, serving primarily to reduce the cost and complexity of implementing and complying with the myriad regulations enacted since the financial crisis. However, is aiming only to do more of the same, faster and cheaper, perhaps missing the point?
Regulation and supervision do not exist in a vacuum, nor does technology: all evolve organically in a geopolitical, social and economic context, and the direction their development takes is conditioned by the policies and decisions made at every level that serve to shape society.
The global economy has never known so much wealth from a macro perspective, and yet the global prosperity gap has never been so enormous. When eight high-net-worth individuals possess riches equivalent to those held by 50% of the world’s population, when populism rises to be a palatable alternative to a flailing status quo, when democratic process and individual freedom are facing a real and present danger – then the time has come to acknowledge that the global order, and the economic and financial system that underpins it, needs significant adjustment to produce a more sustainable set of collective outcomes.
Clearly, much work remains to be done to reinforce stability and restore trust, both within and beyond the strict confines of the financial system. In a context of finite resources, this necessarily translates into a trade-off of priorities; success, however it is measured, lies in clearly defining our desired outcomes and aligning effective incentive mechanisms to achieve them.
In times of stress, contagion knows no geographical frontiers and yet regulation and supervision are still largely national efforts. While the primary mandate of supervisors and central banks remains financial stability, what if the misalignment between the observed pace of change in the industry and the adaptive capacity of regulators, coupled with a dogged commitment to maintaining an arguably anachronistic status quo, became the catalyst for an unanticipated source of systemic risk?
Suptech can equip financial authorities with 21st-century tools to bring the ease and efficiency of real-time digital monitoring to a realm that has largely remained analogue. In an era of data abundance, regulators and supervisory authorities are facing new, complex challenges because of expanded mandates and the rapid growth of digital products that cross sectors.
Real-time capabilities for real-world complexities
One of the most promising developments in suptech is its application in data analytics, where it is being used to turn risk and compliance monitoring from a backward-looking process into a powerful, real-time predictive and proactive endeavour. Building on this as a starting point for a new supervisory orthodoxy, what if suptech could jump-start innovative thought processes around broad-based collaborative efforts between industry stakeholders, encouraging new supervisory approaches that mutualise resources, crowdsource learning and eliminate redundant effort?
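A minimal sketch of the kind of real-time monitoring described above: flagging an incoming risk metric as anomalous when it deviates sharply from its own recent history. The rolling z-score rule, the window size and the threshold are illustrative assumptions for the example, not a supervisory standard.

```python
# Simple streaming outlier detector for a risk metric: flag a new value
# when it sits more than `threshold` standard deviations from the mean
# of the recent window. Parameters are illustrative assumptions.
from collections import deque
from statistics import mean, stdev


class MetricMonitor:
    def __init__(self, window: int = 30, threshold: float = 3.0):
        self.history = deque(maxlen=window)  # rolling window of past values
        self.threshold = threshold

    def observe(self, value: float) -> bool:
        """Record a new observation; return True if it is an outlier."""
        is_outlier = False
        # Only judge once we have enough history for a stable baseline.
        if len(self.history) >= 10:
            mu, sigma = mean(self.history), stdev(self.history)
            if sigma > 0 and abs(value - mu) / sigma > self.threshold:
                is_outlier = True
        self.history.append(value)
        return is_outlier
```

Fed with a stream of reported metrics, a monitor like this turns after-the-fact review into continuous alerting; in practice the same idea scales up to machine-learning models that combine many such signals, as the footnote on insight-based supervision describes.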
In this changing environment, authorities are finding themselves having to analyse new, dynamic datasets and collaborate for cross-border, cross-sector supervision. Getting value from these new datasets requires new capabilities – such as modelling and visualisation techniques, scenario analysis and stress testing – and, while this is challenging, it can generate significant benefits for consumers, providers and financial authorities by increasing access to financial services, creating an enabling environment for innovation and allowing for more targeted supervision.
If we approach suptech less as a specific set of applications and more as a holistic mindset shift, could perhaps the resulting technology stacks and process blueprints form the backbone of a shareable ‘soft infrastructure’ to support a new global model of supervision that is truly adapted to the realities of the times in which we live?
Across domains, across borders, the pace of change is unsettling and relentless. The only choice lies in electing how we respond: by resisting the pressure exerted by external forces, and risking irrelevance and obsolescence, or embracing the discomfort of instability and actively working across agencies and across borders to shape an augmented set of positive outcomes that are the fruit of vision, leadership and sound governance.
Anne Leslie-Bini began her career at BNP Paribas and ABN Amro, and today is a director and member of the Executive Leadership Team in the RegTech Solutions unit at BearingPoint. She also lectures part-time on the topic of blockchain at the NEOMA Business School and mentors at the BNP Paribas Plug and Play Fintech Accelerator in Paris. Particularly active in the global regulatory community since 2009, Leslie-Bini’s drive and commitment have allowed her to gain the peer-legitimacy to be successfully elected to a number of working groups that push industry collaboration around transformative agendas. Well known on social media for her engagement on such topics as regtech, fintech, blockchain and diversity, she is an accomplished conference speaker as well as a contributor to thought-leadership publications on the future of banking, compliance and the international regulatory environment.
1. An insight-based approach to supervision is an incremental improvement to existing risk-based supervisory frameworks achieved by bringing together a multitude of data points and combining them in new ways to form a more complete, and less obvious, picture of financial institutions, collectively and individually. As the quantity and quality of data increases, technology such as machine learning can detect weak signals and outlier results that might otherwise go unnoticed.