Building up the banks

Basel III will necessitate a careful examination of banks' IT systems to reveal how best to provide a single risk profile based on verifiable data, calculated in the right way and drawn from previously distinct silos.

The rules include an increase in capital requirements for common equity and Tier 1 capital, tighter limits on leverage, more stringent liquidity controls and a resolution mechanism for systemically important firms, with capital requirements for complex exposures, such as those created by securitisation, rising by as much as fourfold.

Estimates show that capital levels will now need to go up by as much as 25 per cent, and a July 2010 report by JP Morgan showed that the world's 18 largest banks would need US$95bn more in capital. But raising capital is not the only cost: reaching and maintaining compliance is expected to be expensive, requiring a thorough look at systems to ensure not only compliance itself, but also the analytical and reporting tools needed to track and highlight non-compliance.

Aggregation
Data aggregation to provide a holistic yet granular view of a firm's risk is perhaps the biggest task to be undertaken. Banks need to be able to see overall exposures on a firm-wide basis. To do that, data that is currently maintained on a desk-by-desk basis needs to be brought together and analysed.

PJ Di Giammarino, CEO at JWG, explains: "A key issue is identifying what data needs to be made available outside the silo of origin, and whether it is accurate and thus able to be added to the firm's overall risk-profile calculation. An additional problem is getting common data definitions so that the same data from various silos is being aggregated."

Louis Teunissen, principal consultant for banking at SAP, adds that by keeping data in distinct silos rather than aggregating it, banks do not see the trade-off between 'x' and 'y' risk. "By aligning all the financial and accounting data you get a better overall view of risk and you also spend less time reconciling data from various systems," he says.
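
To make the aggregation problem concrete, the sketch below (Python, with invented desk formats and field names rather than any particular bank's data model) shows the basic move Di Giammarino and Teunissen describe: mapping records from different silos onto a common definition before summing them into a firm-wide exposure figure.

```python
# Minimal sketch: normalising desk-level records to a common definition
# before aggregating firm-wide. Silo formats, field names and figures are
# invented for illustration; a real bank's data model would differ.
from collections import defaultdict

# Each silo reports the same facts under different field names and units.
rates_desk = [{"cpty": "Bank A", "notional_gbp_m": 120.0, "asof": "2011-03-31"}]
credit_desk = [{"counterparty": "BANK A", "exposure_gbp": 45_000_000, "date": "2011-03-31"}]

def normalise_rates(rec):
    # Common definition: counterparty name upper-cased, exposure in GBP units.
    return {"counterparty": rec["cpty"].upper(),
            "exposure_gbp": rec["notional_gbp_m"] * 1_000_000,
            "as_of": rec["asof"]}

def normalise_credit(rec):
    return {"counterparty": rec["counterparty"].upper(),
            "exposure_gbp": float(rec["exposure_gbp"]),
            "as_of": rec["date"]}

def aggregate(records):
    # Firm-wide exposure per counterparty, summed across silos.
    totals = defaultdict(float)
    for rec in records:
        totals[rec["counterparty"]] += rec["exposure_gbp"]
    return dict(totals)

normalised = [normalise_rates(r) for r in rates_desk] + \
             [normalise_credit(r) for r in credit_desk]
print(aggregate(normalised))  # {'BANK A': 165000000.0}
```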

Data quality and accuracy are the next issues to be addressed. There is little point in aggregating data to provide a 360-degree view if it is not correct or is out of date. Indeed, without good-quality data as input, the end result will always be a poor one – and because the information resulting from the calculation is used for critical decision making, the initial quality is of the utmost importance. It is used for stress testing, pricing and assessing counterparty risk, and is also leveraged for internal strategic purposes. It therefore needs to be credible.

Leigh Bates, head of financial services at SAS, comments: "There has been a lot of attention on new processes and now it is the turn of data quality and lifecycle. Currently there are too many systems with too many data points; the legacy of acquisitive growth where systems have been patched together. There now needs to be clarity on where data has come from, how it has been used and whether it is still timely or not. It is really hard to manage that lifecycle especially when data is housed in a number of distinct systems and areas."
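
Bates' point about lineage and timeliness can be pictured as a simple gate in front of the aggregation: each record must name its source system, carry the expected fields and be sufficiently fresh before it enters the firm-wide calculation. The sketch below is only indicative; the field names and the one-day freshness cut-off are assumptions, not a prescribed standard.

```python
# Illustrative quality/lineage gate: check provenance, completeness and
# freshness before a record is allowed into the risk aggregation.
# Thresholds and field names are assumed for the example.
from datetime import date, timedelta

REQUIRED_FIELDS = {"counterparty", "exposure_gbp", "as_of", "source_system"}
MAX_AGE = timedelta(days=1)  # assumed freshness cut-off

def quality_issues(record, today=None):
    today = today or date.today()
    issues = []
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        issues.append(f"missing fields: {sorted(missing)}")
    if "as_of" in record:
        age = today - date.fromisoformat(record["as_of"])
        if age > MAX_AGE:
            issues.append(f"stale by {age.days} days")
    if not record.get("source_system"):
        issues.append("no lineage: source system unknown")
    return issues

rec = {"counterparty": "BANK A", "exposure_gbp": 45e6,
       "as_of": "2011-03-28", "source_system": ""}
print(quality_issues(rec, today=date(2011, 3, 31)))
# ['stale by 3 days', 'no lineage: source system unknown']
```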

Journey
The journey of the data is key, as the Financial Services Authority (FSA) needs to see how a given calculation has been reached, and this, according to Bates, is one area that is ripe for streamlining. It also needs to incorporate other aspects such as risk management and metadata management (tracking). Doing this gives certainty that the data analysed has been based on the same capital rules and that the same framework has been applied to all data. Even the calculation itself is a huge system issue, according to David Kelly, managing director at Quantifi Solutions. "The calculation is run on various 'what if' scenarios to determine capital adequacy. If you are simulating various 'what if' scenarios over many variables then it becomes even more complex, and banks need big grid systems to calculate the liquidity, leverage ratio and securitisation figures in a timely manner," he says.
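
Kelly's 'what if' runs amount to repeating the same capital and leverage calculation over many stressed inputs, which is why grid computing comes into play. The toy example below shows the shape of that loop with three invented scenarios and illustrative balance-sheet figures; a real run would span far more variables and be distributed across a grid.

```python
# Illustrative 'what if' scenario loop: the same capital and leverage
# calculation repeated over stressed inputs. All figures and shock sizes
# are invented for the example.
capital = 60.0            # eligible capital, in bn (assumed)
rwa = 800.0               # risk-weighted assets, in bn (assumed)
total_exposure = 1500.0   # leverage-ratio exposure measure, in bn (assumed)

scenarios = {
    "base":          {"rwa_shock": 1.00, "capital_loss": 0.0},
    "credit_stress": {"rwa_shock": 1.25, "capital_loss": 8.0},
    "severe":        {"rwa_shock": 1.40, "capital_loss": 15.0},
}

for name, s in scenarios.items():
    stressed_capital = capital - s["capital_loss"]
    stressed_rwa = rwa * s["rwa_shock"]
    capital_ratio = stressed_capital / stressed_rwa
    leverage_ratio = stressed_capital / total_exposure
    print(f"{name:14s} capital ratio {capital_ratio:.1%}  "
          f"leverage ratio {leverage_ratio:.1%}")
```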

Unsurprisingly, given the complexity and variability of what is required, in some cases Basel III has also exposed a capability gap: systems are simply not able to bridge the silo divide, link up all of the systems and ensure that the end result is an accurate aggregation delivered in a timely manner.

Essentially this means that IT departments will need to work together to join up systems, including treasury and risk, to meet heavy demand for the right data at the right level of granularity, to deliver risk assessments of acceptable quality, consistency and timeliness, and then to push that data back out across the bank.

Di Giammarino comments: "The correct framework is not just essential in helping banks to achieve compliance, but is also crucial for them to maximise opportunities for growth, while minimising operational risk and tightening measurement systems for key operational metrics."

Opportunity?
Indeed, banks have the opportunity to use the technological changes forced by Basel III for strategic gain. But will they take it? Much depends on whether a firm has already addressed the data consolidation and data warehousing issues of Basel I and II, in which case Basel III will just be an additional block to place on top; the changes will be more of a recalibration exercise, even though the new data requirements, especially around the treasury and finance functions, are undoubtedly complex.

"This is probably not going to be a greenfield exercise – more a case of putting in information hubs to aggregate all the data coming in, and turn that around to provide the right output. The key is to have a good look at a firm's dashboard as well as what lies beneath to best access existing systems and aggregate and manipulate that data," adds Di Giammarino.

But Heather-Anne Hubbell, head of risk at Rule Financial, thinks Basel III presents a good opportunity to rethink technological operations: "The back office has usually been quite low priority, but now the front office decision makers would do well to recognise that it may be cheaper to invest in joined-up and capable IT systems now in order to make more effective future strategic decisions on the back of aggregated and granular data."

She thinks the timing, in terms of the broader marketplace, is also fairly good. "Banks have reduced headcounts, hived off non-core areas and cherry-picked and then streamlined the areas they do wish to work in. The next logical step is to have a look at operations and to move that forward. Basel III is a good excuse to make a start on that and build a multi-layer system that removes operational risk," she says.

But in the end, the opportunity is there, argues Cubillias Ding, research director in the Securities group at Celent. "This is a very real opportunity to reassess infrastructure – regulation is not going away and compliance can be taken further and used for strategic purposes. Clever banks know this."

The Basel III timeline:
The Basel Committee on Banking Supervision has set timeframes for the changes to take effect:
1 January 2015: The Common Equity requirement will rise from two per cent to 4.5 per cent, and the Tier 1 Capital requirement will increase from four per cent to six per cent.
1 January 2019: The additional common equity Capital Conservation Buffer will be fully phased in at 2.5 per cent, up from 0 per cent, as will the additional Countercyclical Buffer.
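
As a worked illustration of what those minimums mean in cash terms, the short calculation below applies the phased-in ratios to an invented risk-weighted asset figure.

```python
# Worked example of the minimums in the timeline above, applied to an
# illustrative balance sheet (the risk-weighted asset figure is invented).
rwa = 500.0  # risk-weighted assets, in bn (assumed)

# Minimum ratios from the timeline: common equity 4.5% and Tier 1 6% from
# 2015, plus a 2.5% capital conservation buffer of common equity from 2019.
cet1_2015 = 0.045 * rwa
tier1_2015 = 0.06 * rwa
cet1_2019 = (0.045 + 0.025) * rwa  # 7% of RWA including the buffer

print(f"2015: common equity >= {cet1_2015:.1f}bn, Tier 1 >= {tier1_2015:.1f}bn")
print(f"2019: common equity including conservation buffer >= {cet1_2019:.1f}bn")
```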
