Solid foundation

With the Solvency II deadline approaching, financial services organisations will be able to reap the benefits of a well-designed cross-functional technology platform. But challenges remain. Dave Howell reports

With the Solvency II deadline almost upon us, many companies within the insurance sector are reporting that their IT systems may not be ready to fully implement the directive. Insurers must also contend with additional frameworks, including IFRS 4 Phase II and Market Consistent Embedded Value (MCEV), which place further burdens on data integrity within their organisations.

The insurance industry as a whole is striving to meet the deadline imposed by the EU Directive. A recent survey from Deloitte indicated that over 95 per cent of respondents were ‘fully aware of and engaged in the regulatory responsibilities and opportunities arising from Solvency II’, up from 83 per cent in the previous year’s survey.

What Solvency II is illustrating is that many companies in the sector do not have robust data management. The new directive is tough and demanding, but the industry as a whole should benefit from the overhaul of its IT systems that Solvency II is driving forward. Companies can also expect savings and efficiency gains once it has been fully implemented. Steve Young, head of business development at Citisoft, says: “If firms take the correct approach and look to implement a true enterprise data management policy, that should drive savings and improved operational efficiencies.”

Data challenges

As insurers have begun to understand how their IT systems must evolve to meet the requirements of Solvency II, it has become clear in some quarters that data integrity must be maintained while the changes take place. “Solvency II presents several challenges: obtaining the broader data set of trusted data necessary to drive asset and liability models, changes to valuation and accounting standards, new reporting requirements, and the need to implement an auditable data governance process,” comments Steve Engdahl, SVP, product strategy, GoldenSource. “Each of these points to the need for a secure, scalable data management infrastructure which can cover the appropriate data sets - securities and related reference data, prices, rates and curves, customer and counterparty hierarchies, positions, and transactions.”

What many companies are finding is that, to implement Solvency II, they need to develop their IT systems to a level of granularity that gives them the depth of data integrity needed to make accurate predictions. The concern is that some companies may upgrade their IT platforms only to the minimum required by the directive, without offering their data users the granularity or flexibility they need.

Colin Rickard, business development director, EMEA at DataFlux also points out: “The technology exists to drill down into great detail, i.e. to the individual field level within policy documents and to understand the interrelationships between data. However, cracking the data dilemma isn’t a technology issue alone, it’s got a lot more to do with cultural challenges.”

For many companies, the development of more efficient data warehousing is the key to meeting the Solvency II deadline and ensuring that their model for managing data passes scrutiny. Duncan Ash, Solvency II strategy manager at SAS UK, comments: “The main challenge in achieving this is the traditionally siloed nature of many firms. This siloed structure has led to a disconnect in data usage and communication between business lines. While individual sectors may have a view of their risk, enterprise-wide risk is much harder for organisations to define. Firms must review their infrastructures, communication and data sharing between business lines to achieve Solvency II compliance.”

Has Solvency II prompted a rethinking of the way IT is used across the industry to deliver not only better services to customers, but also more in-depth data on which companies can base their decision-making?

“It would be nice to say yes, however insurance IT providers, both internal and external are still faced with many challenges and as such are probably not focused purely on building an infrastructure to support the Solvency II regime,” says Larry Jacobson, insurance consulting engineer EMEA, FICO. “Some have seen Solvency II data warehouses as a silver bullet. However these projects have struggled to deliver the value that was initially promised. The issue is not the creation of an arbitrary analytic model based on disparate meaningless data, but the harmonisation of people, processes and systems within an insurer across business areas (underwriting/claims), product lines and geographies - of course taking into account the many complex and varied parties involved with the sales and servicing of insurance products.”

IT systems

What insurers are finding is that the requirements of the directive have meant, in some cases, a complete overhaul of the data management tools currently in use. Rob Stavrou, director of consultancy at Northdoor, comments: “The data quality in the underlying systems is essential - but one of the key requirements of Solvency II is the ability to evidence the data lineage from the internal model through to the underlying systems. What the Solvency II requirements exposed in some companies was an excessive use of spreadsheets in the linkage between the internal model and the underlying systems. This leads to issues with proper control and governance.”

Another real and present danger is that companies will fail the Internal Model Approval Process and will have to revert to the FSA’s own, more restrictive model. Earlier this year, a first batch of insurers underwent the Internal Model Approval Process, and the FSA noted that: “Few firms provided sufficient evidence to show that data used in their internal model was accurate, complete and appropriate.” SAS UK’s Ash continues: “It’s clear that, with regard to Solvency II, not only will the FSA inspect the insurer’s model and the results produced by the model, but they also require evidence of data governance - that is, how data has arrived into the model and how the insurer can prove that data is validated.”
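The FSA’s three criteria - accurate, complete and appropriate - lend themselves to automated, evidenceable checks. As a minimal sketch (the field names, rules and thresholds below are hypothetical illustrations, not drawn from any insurer’s actual model), a validation pass over policy records that produces an auditable report might look like this:

```python
# Minimal sketch of automated data-quality checks on model inputs.
# Field names and rules are illustrative assumptions, not a real schema.

def validate_policy(record):
    """Return a list of data-quality failures for one policy record."""
    failures = []
    # Completeness: every required field must be present and non-empty.
    for field in ("policy_id", "sum_insured", "inception_date"):
        if not record.get(field):
            failures.append(f"missing field: {field}")
    # Accuracy: values must fall within plausible bounds.
    sum_insured = record.get("sum_insured")
    if sum_insured is not None and sum_insured <= 0:
        failures.append("sum_insured must be positive")
    return failures

# Collect failures per record so validation can be evidenced to a regulator.
records = [
    {"policy_id": "P001", "sum_insured": 250000, "inception_date": "2011-04-01"},
    {"policy_id": "P002", "sum_insured": -1, "inception_date": ""},
]
report = {r["policy_id"]: validate_policy(r) for r in records}
```

The point of the sketch is that each record carries its own failure list, so an insurer can show a regulator not just clean model inputs but the record of how every input was checked before it reached the internal model.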

Meeting deadlines


Mark Dunleavy, financial services manager at Informatica, comments: “With Basel II we saw the industry take on data quality in a new way and realise the potential that it can bring for financial organisations. However, the rethinking here took place as a result of the introduction of the standard. In the same way, Solvency II will prompt a rethinking of the way the industry uses IT to deliver better services to customers and how it uses its data, but it is likely, and therefore unfortunate, that this will happen afterwards rather than during the migration process.”

Jane Tweddle, financial services industry principal at SAP UK, advises: “Overall, those companies that take a positive view of Solvency II are already seeing the commercial benefit; others may be losing out. Those insurers looking at Solvency II in a positive light regard it as a potential competitive advantage, not simply a matter of complying with the rules. It is one of those regulations that can actually improve an insurer’s business - but only if the directive is approached in a wholehearted manner and the processes and disciplines it calls for are embraced.”

“If firms embrace the regulation and implement robust and effective data management solutions, it will be to their long-term benefit, and that of the overall industry,” concludes Citisoft’s Young. What is clear is that the IT challenges Solvency II presents are not simply a technology exercise; they also require a complete cultural shift within organisations to deliver data that is accurate and useful. If nothing else, once companies have overcome the hurdles of developing their IT systems for Solvency II, they will have a foundation of robust data on which to build their future businesses.
