Carrot and stick

Data quality has moved up financial institutions’ agendas ahead of new regulatory regimes. But good data management also offers several advantages, reports Graham Buck

Deadlines are an effective catalyst for focusing the mind, and financial institutions have two fast approaching on the horizon: Basel III, the global regulatory standard on capital adequacy for the banking sector, and Solvency II, its equivalent for the insurance industry. The latter is perhaps the more demanding. Solvency II, which takes effect from the start of 2013, also requires insurers to implement economic risk management standards and to fully understand the risks inherent in their business before allocating enough capital to cover them. Despite the fast-approaching deadline, the requirements of Solvency II were imprecise until fairly recently.

Both regimes have heightened financial institutions’ focus on data quality and good data management. The credit crunch that began in 2007 and the ensuing global financial crisis starkly exposed the fact that banks were less adept at both than many imagined. Solvency II requires insurers to demonstrate robust internal governance and reporting mechanisms; they also need a data management and analytics model that both complies with the new regulations and provides advanced risk management capabilities.

Some have already experienced a taste of what might be in store. “Financial service companies are already beginning to think more carefully about looking after their intellectual property,” reports Nik Whitfield, head of investment banking at Detica. “Take customer data, for instance; we’re seeing clients who’ve had an unexpected and pretty unpleasant visit from the Financial Services Authority’s ARROW (Advanced, Risk-Responsive Operating FrameWork) team and as a result are working with us to protect their customer data.”

Banks that seek to differentiate themselves on the basis of privacy have experienced the embarrassment of customer lists being stolen and sold on, with unwelcome attendant publicity. This often leads to large data leakage prevention (DLP) programmes to reduce the risk of customer data loss.
“New technology is playing its part here,” says Whitfield. “Clever search and text analysis tools help to find items of customer information lurking outside of the banks’ main systems - for example in spreadsheets, text files and documents.”
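
To illustrate the kind of search Whitfield describes, the sketch below scans a folder of plain-text files for patterns that resemble customer identifiers. It is an assumed, simplified illustration rather than any vendor’s actual tooling: the folder path, file types and patterns are invented, and commercial products go much further, parsing spreadsheets and documents and scoring their matches.

```python
# Illustrative sketch only: a crude pattern scan of the kind DLP-style
# search tools automate at scale. Paths and patterns are assumptions.
import re
from pathlib import Path

# Naive patterns resembling customer identifiers (assumed examples).
PATTERNS = {
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "uk_sort_code": re.compile(r"\b\d{2}-\d{2}-\d{2}\b"),
    "card_number": re.compile(r"\b(?:\d[ -]?){15,16}\b"),
}

def scan_for_customer_data(root: str):
    """Walk a directory tree and report files containing suspect patterns."""
    findings = []
    for path in Path(root).rglob("*"):
        # Real tools also parse spreadsheets and office documents.
        if not path.is_file() or path.suffix.lower() not in {".txt", ".csv", ".log"}:
            continue
        text = path.read_text(errors="ignore")
        for label, pattern in PATTERNS.items():
            if pattern.search(text):
                findings.append((str(path), label))
    return findings

if __name__ == "__main__":
    for file_path, label in scan_for_customer_data("./shared_drive"):
        print(f"{file_path}: possible {label}")
```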

Getting it right
For financial institutions to demonstrate the quality, availability and traceability of key data and comply with reporting requirements, a central store for risk and solvency modelling data is essential. Achieving this is easier said than done.
“The financial sector is coming under closer scrutiny,” says Glen Manchester, founder and chief executive of enterprise communications technology firm Thunderhead. “New regulation and compliance changes are regularly introduced, which is increasingly onerous for financial institutions to manage efficiently. It makes for a fluid and fast-paced environment that is progressively problematic and complex for organisations to navigate. This is a challenging situation for financial institutions that have much to lose if they don’t manage the complexity of these regulations correctly.”

One major obstacle exposed by the financial crisis is that many companies, particularly multi-disciplinary businesses, tend to be split into silos with a resulting “silo mentality”. Banks are typically more ‘silo-ed’ and tend to be larger entities, so their data management projects tend to be longer, more expensive and more complicated than those of other financial services organisations, suggests Daniel Simpson, chief executive at Cadis.

“They attempt to put in place large data warehouses, which require everyone’s agreement on which standard should apply,” he says. “This unfortunately never works, as everyone tends to have different demands. Basel III can therefore be regarded as a unifying theme.”

Silos within a financial services organisation also create conflicting ideas of what constitutes essential data and how it should be interpreted. As Daren Cox, managing director and chief executive at business consultancy Project Brokers, observes, investment banks include data managers, risk managers, front office, middle office and back office staff, all of whom may be using different numbers from one another. So the risk management team, for example, will take a diverging view from back office or front office staff.

“Front office typically operates on an intraday basis, so uses different systems from middle office and back office, which are more likely to work on a weekly, and sometimes even a monthly, basis,” he points out.

“The back office or compliance functions must report to the FSA or other regulator, so their requirements are different from those of the front office, with its focus on settlement,” adds John Mason, chief operating officer at financial data management specialist Netik.

“These very different views of the same set of data create a problem: several interpretations, with no agreement on which one is ‘right’. Each is perfectly correct taken in isolation, but all of these interpretations have to be brought together. There is, unfortunately, no ‘Esperanto’ in the data world.”

Fortunately, if this obstacle is recognised, banks can then decide how to cross-reference or matrix the data, adds Mason. The matrix approach is, he believes, probably better than trying to enforce a single code. “The difficulty of the task isn’t confined to the banking sector; the oil and gas industry is another of many that has also decided there’s no such thing as a single code.”
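
As a rough illustration of the cross-reference idea Mason describes, the sketch below keeps a mapping between the identifiers each silo already uses, rather than forcing a single master code on all of them. The system names and codes are invented purely for the example.

```python
# Illustrative cross-reference ("matrix") between silo identifiers.
# All system names and codes below are invented for this example.
CROSS_REFERENCE = {
    "counterparty-001": {
        "front_office": "FO-ACME-LDN",
        "back_office": "BO0004521",
        "risk": "CPTY-9931",
    },
}

def translate(entity: str, source_system: str, target_system: str) -> str:
    """Translate one silo's identifier for an entity into another silo's."""
    codes = CROSS_REFERENCE[entity]
    if source_system not in codes or target_system not in codes:
        raise KeyError(f"No mapping between {source_system} and {target_system}")
    return codes[target_system]

# The risk team can look up the code the front office uses, and vice versa.
print(translate("counterparty-001", "front_office", "risk"))  # -> CPTY-9931
```
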
According to Soren Mortenson, managing director at SunGard Global Services, the discipline of ‘data governance’ is now emerging in ‘silo-ed’ organisations and is steadily growing in importance. In the process, silos are dealt with by establishing ownership of data and bringing consistency across the entire organisation.

“In a number of the companies that we work with, a relationship management platform has been established,” reports Kannan Amaresh, who heads the banking domain competency group at Infosys Technologies. “All of the connected data on a specific issue is gathered together in one single place.”
Identifying a single, modular technology platform helps to ensure a consistent standard of data in every market, and expedite processes such as data cleansing, geocoding and address validation, comments Tim Spencer, UK and Ireland insurance practice leader at Pitney Bowes Business Insight. “Maintaining one platform reduces cost of ownership and can speed up system integration. A single interface also simplifies training and education, and makes it easier to gain the skills and capabilities needed to achieve competitive advantage.”
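
As a very small sketch of the cleansing and validation work Spencer refers to, the example below normalises UK postcodes and flags records that fail a basic format check. It is an assumed illustration only; real address validation runs against an authoritative address file rather than a regular expression.

```python
# Assumed sketch of a cleansing step: normalise a UK postcode and flag
# records that fail a basic format check.
import re

UK_POSTCODE = re.compile(r"^[A-Z]{1,2}\d[A-Z\d]? ?\d[A-Z]{2}$")

def cleanse_postcode(raw: str) -> str | None:
    """Return a normalised postcode, or None if it fails validation."""
    candidate = re.sub(r"\s+", "", raw.upper())
    # Reinsert the single space before the final three characters.
    candidate = candidate[:-3] + " " + candidate[-3:] if len(candidate) > 3 else candidate
    return candidate if UK_POSTCODE.match(candidate) else None

records = ["ec1a1bb", "SW1A 1AA", "not a postcode"]
for r in records:
    print(r, "->", cleanse_postcode(r))
```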

Grasping the challenge
Each area of the financial sector faces different challenges, suggests Thunderhead’s Manchester. Insurers have typically had many different types of system in place as well as multiple data silos, reducing their ability to unify customer communications. Banks tend to have fewer systems, but communicate with their customers more frequently and across a broader range of products and transaction types, adding greater complexity to managing customer data.
Companies may, unintentionally, have compounded their difficulties, says Cadis’ Simpson. As storage becomes cheaper, they have retained steadily increasing volumes of data, but in doing so have also increased the task of mining it for the most relevant information. “Many banks still don’t really know the precise extent of their exposure to counterparties,” he suggests.

So there is general agreement that the financial sector’s willingness to address these challenges is welcome - and overdue. “In recent years we’ve not only seen silos but also a temptation to ignore potential solutions because of their perceived cost,” says Rob Batters, technical director at IT consultancy and solutions provider Northdoor.

“There was also a general reluctance to go back to the very first principles and ensure there was a solid foundation established at the outset. Both the first and second pillars of Basel II re-established this base level for security, based on the premise that there was no point trying to ‘reinvent the wheel’. You can chuck technology at a problem - and niche technologies have been developed to respond to specific needs - but instilling basic good practice and adhering to ISO standards are also vital.”

Readiness to tackle the challenge is due in large part to regulation, but also to the fact that sponsorship for data management has been secured at executive level, says Simpson. “At this senior level, reputational risk is now driving the initiatives, which typically are very data-hungry. On the banking side we’ve seen chief risk officers, who are among the biggest consumers of data, with both the necessary budget and authority to exert influence. People are also realising that, when they get it right, data is a fantastic asset. It can, for example, free up significant capital, so there is a considerable upside.”

One of the main perils of inaccurate data is that insurers may decide they need to hold more liquid capital than is actually needed to comply with Solvency II, which in turn could also affect the cost of their reinsurance. “Incorrect information, missing or misfiled data, duplicate records and inconsistent standards lead to significant costs, delays and an incomplete understanding of the truth regarding levels of exposure,” says Spencer. “For example, in the insurance sector a lack of reliable geo-locational data makes it impossible to manage levels of exposure or ensure deals are being underwritten at the right prices. This could lead to all sorts of compliance issues and leave the company horribly over-exposed should a catastrophe strike, such as the recent earthquakes in New Zealand and Japan.”
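
To make the exposure point concrete, the sketch below aggregates sums insured by location. The figures, postcode areas and appetite limit are invented for illustration, but the concentrations such a roll-up reveals are exactly what unreliable geo-locational data hides.

```python
# Minimal sketch, with invented figures: aggregating sums insured by
# location reveals concentrations that a policy-by-policy view misses.
from collections import defaultdict

policies = [
    {"policy_id": "P1", "postcode_area": "CH", "sum_insured": 2_500_000},
    {"policy_id": "P2", "postcode_area": "CH", "sum_insured": 4_000_000},
    {"policy_id": "P3", "postcode_area": "EH", "sum_insured": 1_200_000},
]

exposure_by_area = defaultdict(int)
for p in policies:
    exposure_by_area[p["postcode_area"]] += p["sum_insured"]

# Flag areas whose aggregate exposure breaches an assumed appetite limit.
APPETITE_LIMIT = 5_000_000
for area, total in exposure_by_area.items():
    status = "OVER LIMIT" if total > APPETITE_LIMIT else "ok"
    print(f"{area}: {total:,} ({status})")
```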

Mortenson, who now works with insurers on Solvency II issues, agrees and says his key message to clients is not to limit data management initiatives to simply achieving regulatory compliance. “Good quality data and proper data management brings opportunities to lower capital requirements, while greater reliability of management information should improve planning and decision-making. Operational risk should also be reduced, with a lower incidence of errors. Good data management could improve the way in which insurers do business and give them greater competitive advantage.”

One example is that actuaries’ work in checking data and performing other tasks manually could quite easily be automated, freeing up more of their time for data analysis.

It might surprise some to see Ordnance Survey, an organisation still best known for its maps, working with the financial services sector. But Sarah Adams, its banking, finance and insurance sector manager, says that digital data now comprises most of its business. “The top 10 insurers use our digital data and mapping for both individual risk and aggregate assessment, with uses ranging from reviewing the risk of flood ‘surge’ events to countering fraud,” she reports.

So in addition to assisting clients with the problems of legacy systems and incomplete databases ahead of Solvency II, Ordnance Survey is also working with lenders on the issue of mortgage fraud, such as loans for non-existent properties. “The Council of Mortgage Lenders reports that fraud detection levels are steadily improving, but as areas of fraud are closed or reduced new ones keep opening up,” says Adams. “Fraud has long plagued the motor insurance sector, with an increase in ‘crash for cash’ incidents, but it is extending to other areas such as travel insurance.”

Various types of technology have developed in response, such as detailed analysis through geographic information systems (GIS) and web mapping services that can be accessed via handheld devices. And one suspects that Solvency II and Basel III aren’t the end of the story. Only a brave individual would bet against Solvency III and Basel IV - and a whole slew of new data management requirements - being on the agenda before the next decade arrives.
