- Joanne Horgan, Chief Innovation Officer, Vizor Software
- Justin McCarthy, Chairman of the Global Board, Professional Risk Managers’ International Association
- Rabi Mishra, Chief General Manager, Reserve Bank of India
- Moderator: Daniel Hinge, News Editor, Central Banking Publications
Central Banking convened a panel of experts to discuss how central banks and other authorities are making use of new risk-based assessment techniques to remain ahead of the fintech curve.
Rapid developments in financial technology – commonly referred to as ‘fintech’ – have enhanced the need for high‑quality risk-based supervision as threats evolve and move to new areas of the financial system. Greater quantities of data present not only new opportunities for detecting and responding to risks, but also challenges.
Among the topics under discussion are the best methods and practices for collecting and combining qualitative and quantitative data, why central banks and financial regulators should invest in advanced regulatory and supervisory technology, how that technology can aid co‑operation between regulators and the regulated, and the application of expert judgement in setting an overall risk score for regulated entities. The panel also addresses issues around increased amounts of data and how central banks and other authorities are using new risk-based assessment techniques to stay ahead of the curve, and offers insight to those looking to harness risk reporting to better protect financial stability.
How do you define risk-based supervision, and how does it differ from other forms of supervision?
Justin McCarthy, Professional Risk Managers’ International Association: Traditionally, regulators would have taken much more of a compliance-driven approach – supervisory staff in regulatory bodies would go in with a set of rules, they would compare the firm to these rules and measure an organisation against them. Often there would be no consideration of the larger systemic risk to the economy from different organisations. Also, this approach would only look at individual organisations, do its supervision and present its findings. The risk-based supervision approach takes a view that certain firms – if they were to experience an unfortunate event such as a collapse – would do a particular level of damage to the national economy. With risk-based supervision, regulators can allocate limited resources to where there are larger risks in the economy.
Can you describe your experience with risk-based supervision in India?
Rabi Mishra, Reserve Bank of India (RBI): Since the global financial crisis, there has been a significant shift towards a risk-based framework; since 2012, the ‘Camels’ approach – whereby a firm’s capital adequacy, asset quality, management, earnings, liquidity and sensitivity are assessed – has been replaced with an elaborate risk-based approach to supervision. The Camels approach essentially uses a backward-looking methodology and transaction-testing model. It also has the drawback of being a ‘one-size-fits-all’ approach, and is behind the curve when it comes to keeping pace with industry, as it is static in nature. Moreover, in the compliance-based Camels approach, individual risks are examined in isolation, whereas in a risk-based framework it is the interaction between risks that is observed.
There are two main objectives in India for risk-based supervision. The first is ensuring the soundness of the individual banks, thereby protecting the interests of depositors. The interest of the depositors is the priority, and the second objective is to safeguard the stability of the financial system. The risk-based approach to supervision aims to achieve these objectives via a process of proactive assessment of the measured risks. The critical difference is that, under a risk-based approach, a more organised structure is in place to identify and quantify the activities of the bank that carry greater risk, and also to assess the risk management practices and controls in place to mitigate the risk.
How has risk-based supervision developed in recent years?
Justin McCarthy: Since the financial crisis, an understanding has been reached that we have to do things differently, that something failed in how we were supervising entities. Too often, a supervisory team would go into an entity and someone would look at the credit risk, someone would look at the market risk and then someone at operational risk, but no one was standing back and asking: ‘Is this a viable business model? Is this an organisation that can continue to function in future years?’ A big part of it has been saying that we now have different kinds of risk in the organisation.
In what ways does risk-based supervision require a balance between a judgement-based approach and a data-driven one?
Joanne Horgan, Vizor: I think you need both. Having data come in a timely manner directly from the firm with the assurance that it has been quality-checked is really important, but you also need to have that wider view and be able to see the interaction of those different risks. This is about the probability of failure and its impact across a lot of different risk categories. Collecting the data is essential but, equally, so is having a system where the supervisor can exercise judgement based on the key risk indicators (KRIs). This ideal system combines the data-driven approach producing automatic calculations with key indicators being flagged to the supervisor, and then a system where judgement can be exercised and recorded.
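Such a hybrid system might be sketched as follows. This is an illustrative sketch only – the class, the two KRIs and the thresholds below are hypothetical, not drawn from any supervisor’s actual framework – showing automatic KRI calculations that flag breaches, with a supervisor’s judgement recorded alongside and able to override the data-driven score:

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical thresholds: a KRI breaching its limit is flagged for review.
KRI_LIMITS = {"capital_ratio_min": 0.08, "npl_ratio_max": 0.05}

@dataclass
class RiskAssessment:
    firm: str
    capital_ratio: float
    npl_ratio: float
    judgement_score: Optional[float] = None  # set by the supervisor
    rationale: str = ""                      # judgement must be recorded

    def flags(self) -> list:
        """Automatic, data-driven checks on the key risk indicators."""
        out = []
        if self.capital_ratio < KRI_LIMITS["capital_ratio_min"]:
            out.append("capital ratio below minimum")
        if self.npl_ratio > KRI_LIMITS["npl_ratio_max"]:
            out.append("non-performing loans above limit")
        return out

    def overall_score(self) -> float:
        """Data-driven score (one point per flag), overridable by judgement."""
        if self.judgement_score is not None:
            return self.judgement_score
        return float(len(self.flags()))

a = RiskAssessment("Bank A", capital_ratio=0.07, npl_ratio=0.06)
print(a.flags())  # both KRIs are breached and flagged automatically
a.judgement_score, a.rationale = 3.0, "Weak governance observed on-site"
print(a.overall_score())  # the recorded judgement overrides the automatic score
```

The point of the design is that the override never discards the automatic flags: the data-driven result, the judgement and its rationale are all retained together.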
According to our poll, about 70% of people are actively using risk-based supervision and another 20% or so are planning to use it in the future – only about 12% say they are not yet using it.
Justin McCarthy: That is interesting because this is an approach that has been increasingly adopted over the past few years, so one might have expected the take-up to be closer to 100%. A big part of the risk-based supervision project I undertook in Ireland was organisational change. Technology is a huge enabler, but a major aspect is going into an organisation and telling them how we are going to carry out our supervision from now on.
Joanne Horgan: Technology is a really important factor in this as it is an enabler of risk-based supervision. It’s not the entire solution – but it does provide a solid foundation for a change in approach. You need the team to establish the goals and long-term direction, and then the technology needs to be able to adapt as the approach becomes more embedded. When an organisation is starting with the risk-based supervision approach, it often needs an existing framework to work with, and there are some common themes in terms of risk categories and KRIs that it might draw on; but each jurisdiction will have a particular way it wants to approach the overall risk framework, and you need technology that is capable of changing with that over time.
Rabi Mishra: One of the key ingredients in a risk-based supervision framework is technological data mining and analysis. Risk-based supervision is critically dependent on the robustness and the authenticity of data provided by regulated entities. Banks need to be encouraged to stay on board in developing IT systems on the same wavelength as the supervisor’s, so that online data transmission from each bank onto a single centralised platform at the supervisor’s end runs smoothly. For example, in India there is a central repository of information on large credits that collects, stores and disseminates data on all borrowers’ stressed credit exposures. The purpose of such a mechanism is to improve the transparency of credit information, which in turn enables banks to identify a borrower’s financial status to help recognise and resolve asset quality problems.
Is there such a thing as too much data?
Justin McCarthy: It’s wonderful to gather huge amounts of data, and there is a comfort in having as much data as you are able to. The challenge, however, lies in finding ways to use it. I worked at a senior level in a large bank last year – we were receiving requests from our regulators for vast amounts of data. You’re left wondering if they are using it all and, frustratingly, different departments of the regulator will request it at different times. There is an onus on the regulator to show the regulated entities that this data is actually being used.
Joanne Horgan: Where you have a regulator that perhaps has an advanced analytics system in place, with a very clear stream of data coming from the regulated entities and going into an analytics system, it’s wonderful. We sometimes find that it may take a while to get to that point, so a regulator may want to start more simply if you’re introducing risk-based supervision. It’s really about collecting the data that’s important, in a timely manner, making sure it’s quality-checked and then ensuring it is usable downstream with the right context and quality indicators.
How realistic is it to expect banks to do more to upgrade their IT systems to make it easier for data sharing?
Justin McCarthy: One of the biggest problems in banking at the moment is a cost challenge – as banks have to hold more capital, they’re encouraged to take less risk, and I know from banking clients that there is a huge challenge in getting any sizeable budget approved. The flipside would be if someone said: ‘We will reduce the cost of regulation if you put these systems in place, as it will allow us to gather data more easily.’
Joanne Horgan: I think there is a challenge to show that the cost of an IT spend is going to be paid back in some way. IT, and technology in general, definitely has the potential to reduce the cost of compliance, but there is an onus on regulators and banks to work more closely together and with the wider fintech or regulatory technology – known as ‘regtech’ – communities to really look at how these IT investments are going to generate a return. There is a challenge on both sides to work together to ensure any IT spend is justified and is delivering real business value.
How is data, for stress testing in particular, different? How can regulators and supervisors ensure that it is accurate?
Justin McCarthy: It’s a subset of the same problem, because you would hope the data you get is valid and correct. The banks have probably spent a huge amount of time gathering all that data from different systems within the organisation. One of the problems with stress testing is that it becomes quite public and political when the results are published.
Extensible Business Reporting Language (XBRL) is something we are hearing quite a lot about. How far has it spread, and is it the ‘gold standard’ in data collection?
Joanne Horgan: XBRL has been around for quite some time, and is used extensively in Europe and Asia. It is a very good format for data collection, but is not the only format. This goes back to some of the concerns from the industry about a format being mandated, which may be expensive to implement at regulated entities. The benefit of collecting the data in a standardised format, as in Europe, is that it enables the national competent authorities to send the data to the European supervisory authorities to undertake comparisons and analytics. It’s about making sure the structure suits the type of data you want to collect. For a risk-based approach, there is a significant amount of structured or quantitative data to collect to derive ratios or KRIs, but there are also going to be unstructured pieces of information and qualitative feedback that need to be captured, for example, when analysing the business model of an entity.
What formats does the RBI use, and how does the bank organise that data?
Rabi Mishra: We have developed a compendium of data point definitions to aid the standardisation of information flow across banks. The RBI has made significant progress in accessing information directly from transaction-level data. The change in data collection and compilation has predominantly occurred in two areas: (1) the standardisation of inputs, and (2) minimising manual intervention and data transformation in banks. This technology and risk-based supervision are risk- and bank-specific. Systemically important banks attract more supervisory attention, and so require more of this type of technological investment. Also, we are working widely in the area of fraud, taking steps to prevent hacking and cyber-related risk while simultaneously creating an infrastructure on banks’ information flow to the central bank and supervisors.
What can smaller central banks do to keep pace with this technological challenge?
Joanne Horgan: It’s a very good point that Dr Mishra makes – some of these advancements have been employed in the RBI, such as looking at transaction-level data and making more of a direct data flow between banks and the central bank. For some smaller central banks, that might seem light years away, but it’s not – it’s happening today in smaller jurisdictions. Just because some central banks may not have the capabilities today to collect XBRL or acquire this type of transactional data does not mean they cannot implement an effective risk-based supervision approach and technology system. It is important to begin leveraging technology – if it’s Excel for now, start using Excel, but start to standardise templates, put rules on the data, get it into a centralised supervisory system and look to move into more standardised XML or XBRL and more automated systems over time. It is important for smaller central banks to know there is technology already available that does not have huge implementation costs and timelines.
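‘Putting rules on the data’ can start very simply. The sketch below is hypothetical – a made-up return with just two fields and two rules; real supervisory templates and validation rule sets would be far richer – but it illustrates the basic mechanism of checking each submission against a standardised rule list on receipt:

```python
# Hypothetical validation rules applied when a standardised return is received.
RULES = [
    ("total_assets must be positive",
     lambda row: row["total_assets"] > 0),
    ("capital_ratio must lie between 0 and 1",
     lambda row: 0 <= row["capital_ratio"] <= 1),
]

def validate(row: dict) -> list:
    """Return the descriptions of every rule the submission breaks."""
    return [desc for desc, check in RULES if not check(row)]

good = {"total_assets": 1_000_000, "capital_ratio": 0.12}
bad = {"total_assets": -5, "capital_ratio": 1.7}
print(validate(good))  # no rules broken: an empty list
print(validate(bad))   # both rules fail and are reported
```

The same rule list can later be reused unchanged when the collection channel moves from spreadsheets to XML or XBRL, which is one reason to standardise the rules before automating the transport.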
We’ve spoken a lot about the quantitative data so far, but what is the best way of combining qualitative and quantitative data?
Justin McCarthy: Qualitative data is interesting because you might have large numbers of notes, board meetings and minutes being brought in. Traditionally, you would require somebody to attend board and committee meetings, and meet with the audit committee – and that’s very resource-intensive. We can take that approach and then have somebody perhaps assess the data and write something relatively unstructured into your supervision system.
Joanne Horgan: Technology has to be able to collect both structured and comparatively unstructured data. It is very important to make sure all of the qualitative data that comes from the firm is combined in one database, but also to look at how to use new technologies for the future. When you are talking about judgement, you need somebody with a lot of experience, because they have to understand what all the different sources of information are based on. There is huge potential for technology to assist here, even with a judgement-based approach. If technology can handle the quantitative data component, this frees up the specialist resource required to look at the qualitative data. Over time, machine learning will also become more frequently deployed in these scenarios.
Has the price of technological investment reduced sufficiently, and are there tools available to allow smaller supervisors to implement machine learning and other big data techniques?
Joanne Horgan: It’s still quite early. There are tools being added continually and there is published evidence of trial concepts coming from different regulators. The time it takes to implement and the cost of investment in technology are still quite high, but they are coming down. And the more engagement that regulators have with the fintech and regtech communities, the more we’re going to see that speed up.
We’re in a digital age now; it no longer takes three to five years for these changes to happen – they are happening very quickly now. There are a couple of examples of real-use cases in the regulatory space, but I think machine learning is still quite limited to the bigger regulators with more budget for experimentation. The important thing is that these technologies become more commoditised and more accessible over time.
What can supervisors do to be more forward-looking? How can we move ahead and look at emerging risks?
Justin McCarthy: That depends on your horizon for emerging risk, because some of the early-warning systems might be saying we have a bubble emerging in a certain part of the economy. Cyber is something we are hearing more about, and you’re left wondering how you can get a structure in place for your supervisory staff so they can perform an adequate review of something that is an emerging risk. You may not have staff on hand that are experts, so how can you put in place a controlled assessment and review of data that can flag potential issues with cyber in your sector? And we may have to help identify where one of our banks will have to bring in outside expertise to work on it.
Joanne Horgan: Whatever IT system you have, it is really important that it is flexible enough to collect the data you want, whenever you want it. It is also important that you have enough data and skills internally, and can forecast based on trends in the data. Spotting emerging risks has to be based on more than just the data that you collect directly from regulated entities – for example, there is market sentiment analysis. Perhaps ensuring you have a system in place that can handle many different sources of data and has the capabilities to combine them and look at potential emerging risks provides the best basis to be forward-looking and future-proof.
Rabi Mishra: Our discussion on risk-based supervision is, by definition, forward-looking, and there are areas in which a forward-looking approach can use larger data availability: for stress testing; for bubble detection and as an early-warning mechanism; for financial fraud detection; and for assessing the creditworthiness of borrowers and the realisable value of the collateral promised with the loan.
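One of the uses listed above, an early-warning mechanism, can be sketched crudely with a simple statistical trigger. The function and the credit-growth figures below are hypothetical illustrations, not any central bank’s actual methodology: the latest observation is flagged when it sits well above the historical mean of the series.

```python
import statistics

def early_warning(series: list, z_limit: float = 2.0) -> bool:
    """Flag the latest observation if it sits more than z_limit standard
    deviations above the historical mean - a crude early-warning signal."""
    history, latest = series[:-1], series[-1]
    mu = statistics.mean(history)
    sigma = statistics.stdev(history)
    return sigma > 0 and (latest - mu) / sigma > z_limit

# Hypothetical credit-growth readings: a stable history, then a sharp jump.
credit_growth = [2.1, 1.9, 2.0, 2.2, 1.8, 6.5]
print(early_warning(credit_growth))  # True - the jump triggers the warning
```

Real early-warning systems combine many such indicators and more sophisticated models, but the forward-looking principle is the same: the trigger fires before the risk crystallises, not after.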
According to our poll, in terms of the human element, top of the list of challenges faced by central banks, next to data, is resources and skills.
Rabi Mishra: Skills are an issue. The industry has been confronted with a tremendous requirement for skill, which has resulted in many turning to outsourcing – creating another of these interesting areas of outsourcing risk. I would summarise the future priority as being skill and ethics in manpower. Technology will come and go, but what remains is basic common sense and honest human beings. Without these, nothing will work.
Justin McCarthy: Use new technology, such as machine learning, to save resources and to support some of the interactions discussed. I have seen examples of small projects that can turn data into judgement, so try a small project too – start with something small and make it work.
Joanne Horgan: It’s a combination of technology and human judgement – we’re not at that stage where we’ve got machines making decisions for us, but I do think that anyone introducing risk-based supervision should invest in technology that is future-proof. It needs to be able to acquire additional data over time while maintaining data quality and context information so that this larger pool of data can be used for more machine-assisted decisions in the future.
This is a summary of the forum that was convened by Central Banking and moderated by Central Banking’s news editor, Dan Hinge. The commentary and responses to this forum are personal and do not necessarily reflect the views and opinions of the panellists’ respective organisations.
Copyright Infopro Digital Limited. All rights reserved.
You may share this content using our article tools. Printing this content is for the sole use of the Authorised User (named subscriber), as outlined in our terms and conditions - https://www.infopro-insight.com/terms-conditions/insight-subscriptions/
If you would like to purchase additional rights please email [email protected]