Feeling the strain

Two-thirds of financial firms believe their analytics programmes and infrastructures are insufficient to handle increasing analytical complexity and data volume, recent research has shown. Christopher Andrews reports

Data is the oxygen of financial firms, but there is so much of it floating around these days that it’s beginning to make them lightheaded. The results of a report last year looking at data and analytical capacity in the financial sector, while somewhat worrying, were probably not particularly surprising. The IT systems of many firms on both the buy and sell side have been groaning under the sheer weight of data flowing into them, and the capacity to efficiently analyse all of that data is being seriously tested.

Indeed, that report, from Platform Computing, SAS and The TABB Group, found that two-thirds of financial firms believed their analytics programmes and infrastructures were insufficient to handle increasing analytical complexity and volumes of data. The report cited a lack of scalability, inflexible architectures and inefficient use of existing computing capacity as the major concerns, and to ask ‘what can be done about this’ is to question the very nature of IT systems, policies and procedures as they currently exist.

So why now? Financial services outfits have always been data-dependent, of course, but these concerns come down to the immense volume of data they are now having to deal with. This is, in large part, being generated by increasingly onerous regulation following the financial crisis, as well as the rise and rise of automated trading. Together, over the past few years, these have created a “data explosion”, according to Alex Foster, global head of sales at BT Radianz, and it’s no wonder financial firms are feeling the strain.

“Algorithmic trading is definitely a big driver here,” she says. “If you consider a company like Vodafone, for example, on average the price of its stock changes 1,700 times per second. So if you’re trying to work out best price for a client, or you’re trying to trade and you’ve got that amount of change, particularly in light of the volatility we’ve had recently, that’s just huge amounts of data.”
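
To put that figure in perspective, a rough back-of-the-envelope calculation shows how quickly such a feed mounts up. The 1,700 updates per second is Foster’s number; the bytes-per-tick and session-length figures below are illustrative assumptions, not anything quoted in the article:

```python
# Back-of-envelope: storage implied by 1,700 price updates per second
# for a single heavily traded stock.
UPDATES_PER_SEC = 1_700        # quoted for Vodafone
BYTES_PER_TICK = 64            # assumed: timestamp, price, size, venue, flags
SESSION_SECONDS = 8.5 * 3600   # assumed LSE-style trading day

daily_bytes = UPDATES_PER_SEC * BYTES_PER_TICK * SESSION_SECONDS
print(f"One stock, one day: {daily_bytes / 1e9:.1f} GB")          # ~3.3 GB
print(f"1,000 instruments, 250 trading days: "
      f"{daily_bytes * 1_000 * 250 / 1e12:.0f} TB")               # ~832 TB
```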

Nervous times

Add to this both existing and imminent regulatory burdens, and the nervousness highlighted in the above report makes sense. The various directives coming out of the UK and the EU, including Solvency II, MiFID II and Basel III, will, among other things, vastly increase the amount of information that firms will need to catalogue and retain. That nervousness is heightened by a number of unknowns, which could make those burdens even more arduous.

“There is talk with MiFID II, for example, that absolutely every keystroke of every decision of every algo may or may not have to be stored so that you can then go back and audit those,” says Foster. “That hasn’t been ratified yet, so it’s all eyes on 23 October when we get the next instalment.”

Those eyes might end up weeping and bloodshot, as the sheer volume of potential regulatory pressures coming in could mean that, under current conditions, many institutions will simply not have the compute power necessary to cope. “One global investment bank that I’ve spoken to has said that their grid infrastructure is around 10,000 nodes, and they have predicted that to meet their requirements for risk reporting under Basel III they’d need to move that to 100,000,” says Nik Whitfield, head of investment banking at Detica. “Not only is it obviously expensive to run 90,000 more machines, it’s also quite a complex task to try and get that many networked, virtualised machines working together. So that’s a key problem for them at the moment.”

A potential solution to that particular problem could be, rather than expanding the number of nodes, to invest in ‘hardware acceleration’: specialist machines using FPGAs (field programmable gate arrays) or GPUs (graphics processing units), for example, to run specific algorithms in hardware rather than software. These, however, are expensive to program, and IT departments used to Java or C++ environments may not have the know-how, or the impetus, to push for them.
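
For a flavour of the GPU half of that idea, the sketch below uses Numba, a Python compiler that can target CUDA hardware. The toy revaluation kernel is an invented illustration, not any firm’s actual workload, and it needs an NVIDIA GPU to run; FPGA development involves a different toolchain entirely:

```python
import numpy as np
from numba import cuda

@cuda.jit
def revalue(prices, shift, out):
    i = cuda.grid(1)                        # one GPU thread per position
    if i < prices.size:
        out[i] = prices[i] * (1.0 + shift)  # toy revaluation rule

prices = np.random.rand(1_000_000)
out = np.empty_like(prices)
threads_per_block = 256
blocks = (prices.size + threads_per_block - 1) // threads_per_block
revalue[blocks, threads_per_block](prices, 0.01, out)  # launches on the GPU
```

The appeal is that a million positions are revalued in parallel rather than in a Python loop; the cost, as Whitfield’s point suggests, is a skill set many Java or C++ shops don’t yet have.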

However that particular issue is dealt with, these increasing data and analytical requirements come on top of the manifold problems that have always existed for financial institutions, and that haven’t always been handled well. These include managing customer data, the long-winded process of on-boarding new clients and dealing with sweeping market changes, among myriad others.

As an example of the latter, Bill Meenaghan, global head of ALERT at Omgeo, which provides systems for the clearing and settlement of trades, points to January 2009, when the French, Belgian and Dutch markets changed their numbering systems and moved to the ESES platform (Euroclear Settlement of Euronext-zone Securities) for securities settlement and safekeeping services. Investment managers and brokers alike had huge system amendments to carry out as a result.

“And that’s important, because that’s the data that is used to settle the trades. If they get that data wrong it can cause trades to fail, and that means they have to use their resources to make sure those trades are re-booked, which usually involves agent fees, and there’s also the risk of interest claims on the back of the fail because the cash isn’t where it’s supposed to be,” says Meenaghan. “So there are a lot of data points that can fail, but we’ve hopefully added enough validations now that the users of our database have confidence that the data is correct.”
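
The kind of validation he means can be as simple as checking an instrument identifier’s check digit before the record ever enters the database. Below is a minimal sketch using the standard ISIN check-digit rule (a Luhn sum over the letter-expanded code); it is a generic illustration, not Omgeo’s actual validation logic:

```python
def isin_is_valid(isin: str) -> bool:
    """Validate an ISIN's check digit (Luhn over the letter-expanded code)."""
    if len(isin) != 12 or not isin[:2].isalpha() or not isin[2:].isalnum():
        return False
    # Expand characters to digits: '0'-'9' stay as-is, 'A'=10 ... 'Z'=35
    digits = "".join(str(int(c, 36)) for c in isin.upper())
    # Luhn: from the right, double every second digit and sum digit-wise
    total = 0
    for pos, ch in enumerate(reversed(digits)):
        d = int(ch)
        if pos % 2 == 1:
            d *= 2
        total += d // 10 + d % 10
    return total % 10 == 0

print(isin_is_valid("FR0000120271"))  # a valid French ISIN: True
print(isin_is_valid("FR0000120272"))  # one digit wrong: False
```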

The obvious point is that failing to get systems right could result in a lot of lost money, as well as damage to reputation, and that is true on the client on-boarding side as well. While this process is of obvious importance, Whitfield says that many financial institutions are struggling with it, again because of data volumes and complexity.

“There are examples where a client, say a hedge fund, approached a sell side investment bank saying, ‘we would like to start trading these different types of assets’, and two months down the line they’re still twiddling their thumbs, unable to trade with the bank; three months down the line they are still unable to trade and decide to take their business elsewhere,” says Whitfield. “So you can see the problem there. There’s a lot of frustration on the part of the customer, and the hedge fund community is a pretty small community, so that word is going to spread.”

This really highlights the general difficulties of data management and analytics within financial firms. In the on-boarding case, various parts of the business have to bring disparate data together for compliance, legal and credit purposes, among others, which is a complex process. If systems aren’t joined up, or data isn’t being managed effectively, the process is made all the more difficult. Whitfield says that there are new techniques to help with this, including ‘social network analysis’, which pulls the necessary information together from the vast swathes of data available both within the system and from outside sources.
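
A minimal sketch of that graph-based idea, using the networkx library: treat clients, directors and addresses as nodes, shared attributes as edges, then read off everything connected to a prospective client. The entities here are invented:

```python
import networkx as nx

G = nx.Graph()
G.add_edges_from([
    ("Acme Capital LLP", "J. Smith (director)"),
    ("J. Smith (director)", "Acme Holdings Ltd"),
    ("Acme Holdings Ltd", "12 King St, London (address)"),
    ("Beta Fund", "14 Queen St, Leeds (address)"),   # unrelated record
])

# Every record linked to the prospect, however indirectly:
print(nx.node_connected_component(G, "Acme Capital LLP"))
```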

Along similar lines, Graeme Condie, head of sales for EMEA at DST Global Solutions, says: “It’s a middle office data crunch. So there is lots of information floating around, and lots of stakeholders that require parts of that information, and it all seems to be gathering in the middle office of the asset manager.”

Condie thinks there are certainly ways to alleviate this crunch, starting with automating the gathering of data from different systems and across geographies. “But then we think the key is that once you’ve got that info, you’ve got to be able to store it intelligently rather than just storing it for the sake of storing it. So you’ve got to have a database or model that’s suitable for the type of data it’s receiving. Because then you can start to add value.”
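
In miniature, that means giving each kind of data a model that fits it rather than one generic store. A hedged sketch using SQLite, with an invented schema: slow-moving reference data and high-volume time series live in separate, appropriately indexed tables:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE instrument (          -- slow-moving reference data
        isin TEXT PRIMARY KEY,
        name TEXT NOT NULL
    );
    CREATE TABLE price_tick (          -- high-volume time series
        isin TEXT REFERENCES instrument(isin),
        ts   TEXT NOT NULL,            -- ISO-8601 timestamp
        px   REAL NOT NULL
    );
    CREATE INDEX idx_tick ON price_tick (isin, ts);
""")
```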

As one example, Condie says a fund’s measurement and risk data can be linked to the firm’s finance data, allowing the finance director to compare his budget forecasting with the performance or risk of individual funds. “That process is probably done in Excel at the moment, but you need an intelligent tool and a consolidation tool which allows you to do that with the database and within the structure.”
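
The consolidation he describes might, in code, look something like the following pandas sketch, joining per-fund risk and performance figures to the finance director’s forecasts on a shared fund identifier. The fund names, columns and figures are invented for illustration:

```python
import pandas as pd

risk = pd.DataFrame({"fund": ["Alpha", "Beta"],
                     "var_95": [1.2e6, 0.8e6],      # 95% value-at-risk
                     "return_pct": [4.1, 2.7]})
budget = pd.DataFrame({"fund": ["Alpha", "Beta"],
                       "forecast_pct": [3.5, 3.0]})

merged = risk.merge(budget, on="fund")              # the Excel lookup, in code
merged["vs_forecast"] = merged["return_pct"] - merged["forecast_pct"]
print(merged)
```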

Omgeo’s Meenaghan furthers the point about intelligent data handling, saying that tools which allow for data suppression to avoid overload, as well as the ability to cross reference data, are proving useful. “Those aren’t new systems,” he says, “but over the past two or three years we have worked more with those systems to try and help clients deal with volumes, rather than having to go through major infrastructure changes.”
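
Suppression in that sense can be as simple as forwarding only records that have actually changed, so downstream users aren’t flooded with repeats. A minimal sketch, with an invented record layout:

```python
def suppress_unchanged(updates, last_seen):
    """Yield only updates that differ from the last version sent."""
    for rec in updates:
        key, payload = rec["id"], rec["fields"]
        if last_seen.get(key) != payload:
            last_seen[key] = payload
            yield rec

last = {}
batch = [{"id": "ACC1", "fields": {"bic": "ABCDGB2L"}},
         {"id": "ACC1", "fields": {"bic": "ABCDGB2L"}}]  # exact repeat
print(list(suppress_unchanged(batch, last)))  # second record suppressed
```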

As the influx of data is only set to grow apace, coping with it will require a combination of more intelligent data management, infrastructure upgrades and, increasingly, the use of cloud services to provide computing power on demand. There is no panacea, however, and it comes down to the individual firm to decide how best to keep all of that oxygen flowing; they can’t simply turn off the valve.
