Data & Feeds Team
Artificial intelligence is already boosting productivity in financial services, helping businesses make more accurate, efficient and – hopefully – more profitable decisions. As big data gets ever bigger, automation is letting firms act more swiftly and more intelligently. But what guardrails need to be put in place to ensure these technologies don’t go wildly off track?
In a 2023 survey of financial services businesses by UK recruiter Harvey Nash, more than 60 per cent said they were actively considering, piloting or implementing AI.
The sector is using predictive AI, for instance, in areas such as fraud detection and risk management. It has not yet adopted generative AI to the same degree but is currently testing it in operations and risk management.
In investment management, businesses are looking at ways to use AI and machine learning to generate greater alpha, improve operational efficiency, tailor products and interactions for clients, and strengthen compliance and risk management functions.
Helen Zhu, Managing Director and Chief Investment Officer at NF Trinity, says that big data analysis gives the Hong Kong-based investment firm the confidence to explore opportunities and scale up investments. “This big data foundation also bolsters our ongoing portfolio management and monitoring,” she says. “It can provide higher-frequency cross checks to test whether our theses are still intact and are progressing as expected.”
1. Flawed guardrails = flawed outputs
To realise the benefits of machine learning, financial services businesses need to take care in designing the frameworks that shape data extraction and analysis. This means putting the right control parameters in place – governing how the algorithms operate and how employees will use them – before pulling in a broad spectrum of data sources for contextual insight.
The importance of human expertise in this process is a recurring theme in discussions about AI and machine learning. “Humans should be the parties setting the boundaries of AI and machine learning,” says Chris Ainscough, Portfolio Manager and Director of Asset Management at Charles Stanley, “because if you let a quantitative system set the parameters, they can be oversimplistic.”
“Never take AI answers as a given,” says Ainscough. “There needs to be a huge amount of governance for AI to be implemented successfully within financial services. Qualitative judgement from humans must always validate AI outputs.”
This is particularly true for the more structured and automated areas, such as risk assessment and back-office administration, where processes are more repetitive. Here, even a small error in the original set-up can taint the entire output.
2. Establishing a clear line of sight
Just as precedent is everything in law, a key test of data quality is its origin. Organisations that can collect, check and maintain the accuracy of their data in a timely way will gain a competitive advantage. And because data has quickly become intrinsic to the entire financial services sector, a small incremental gain in each department can steadily build into a substantial overall edge.
This puts a premium on data lineage, or traceability – tracking the origin, history and movement of data through each phase of its use – to ensure transparency and accountability. In a heavily regulated sector such as financial services, a clear record of the ownership and use of data is central to managerial oversight and governance.
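To make the idea concrete, a lineage record can be as simple as an append-only log of where each dataset came from, what was done to it and who owns it. The Python sketch below is a minimal, hypothetical illustration – the dataset and field names are assumptions for the example, not a standard schema.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass(frozen=True)
class LineageEvent:
    """One step in a dataset's history: origin, transformation and owner."""
    dataset: str          # hypothetical dataset name
    source: str           # upstream system or feed the data came from
    transformation: str   # what was done at this step
    owner: str            # the accountable person or team
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

# A dataset's lineage is the ordered list of such events, giving managers
# and auditors a clear record of origin, history and ownership.
lineage = [
    LineageEvent("equity_prices_eod", "vendor_feed_a", "ingest", "data-eng"),
    LineageEvent("equity_prices_eod", "vendor_feed_a", "dedupe", "data-eng"),
    LineageEvent("equity_prices_eod", "curated_store", "fx_normalise", "research"),
]

for event in lineage:
    print(event)
```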
More data implies more automation and, by extension, less human input. Yet when that growth is combined with technology developing at a speed few of us can fully comprehend, the human influence over data authenticity and usage will need to increase, not decrease. Recent innovations such as large language models (LLMs) – familiar to most of us through tools like ChatGPT – are fascinating and intimidating in equal measure. But like most things, they have their place when used under the correct conditions.
“LLMs are good at certain things, but awful at others – generative AI will always hallucinate in one way, shape or form, so humans need to know how to constantly verify its output,” says Sarah Gadd, Chief Data Officer and Head of Data and Process Engineering at Swiss private bank Julius Baer. “Organisations have to educate their workforces on how to construct trustworthy questions and verify answers in a traceable, user-friendly way.”
The challenge of managing a surge in automation may carry its own solution. For a start, automation is faster, and while an algorithm checking another algorithm may seem nonsensical, once humans have set the base parameters, the mechanical and repeatable nature of the process makes it transparent, explainable and ripe for automation. It can, for instance, swiftly identify risk factors or trends that fall outside the norm – a useful tool to support employees responsible for governance.
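As a hypothetical sketch of that “algorithm checking another algorithm” idea, the Python example below applies a simple z-score rule to flag values from an upstream model that fall outside human-set bounds; the threshold and the sample scores are illustrative assumptions, not a production control.

```python
import statistics

def flag_outliers(values: list[float], z_threshold: float = 2.5) -> list[int]:
    """Return indices of values more than z_threshold standard deviations
    from the mean – the human-set parameter defining 'outside the norm'."""
    mean = statistics.fmean(values)
    stdev = statistics.stdev(values)
    if stdev == 0:
        return []
    return [i for i, v in enumerate(values) if abs(v - mean) / stdev > z_threshold]

# Hypothetical daily risk scores from an upstream model: the spike at
# index 6 is flagged for human review rather than acted on automatically.
daily_risk_scores = [0.9, 1.1, 1.0, 0.95, 1.05, 1.0, 9.8, 1.02, 0.98, 1.0]
print(flag_outliers(daily_risk_scores))  # -> [6]
```

The machine only applies the rule at speed and scale; the parameter that defines “normal” remains set, and owned, by a human.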
“Using machine learning techniques can enhance data governance efforts by automating tasks such as data classification and data quality assessment,” says Jeremy Hunt, Global Head of Data, Analytics, AI and CRM at investment management multinational Schroders. “Data ownership by the right people and in the right place is required, so data and governance awareness will become a fundamental requirement for financial services professionals across various roles.”
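As a worked illustration of automated data-quality assessment – with the caveat that the rules below are human-defined assumptions for a toy price record, not any firm’s actual controls – the sketch shows the machine’s role: applying agreed rules consistently and routing failures back to the accountable data owner.

```python
from typing import Any, Callable

# Human-defined quality rules: each field must be present and valid.
# Field names and bounds are illustrative assumptions, not a standard.
RULES: dict[str, Callable[[Any], bool]] = {
    "price": lambda v: isinstance(v, (int, float)) and 0 < v < 1_000_000,
    "currency": lambda v: v in {"GBP", "USD", "EUR", "HKD"},
    "timestamp": lambda v: isinstance(v, str) and len(v) >= 10,
}

def quality_issues(record: dict[str, Any]) -> list[str]:
    """Return the fields that fail their rule, so the record can be routed
    to its data owner rather than flowing silently downstream."""
    return [name for name, ok in RULES.items() if not ok(record.get(name))]

clean = {"price": 101.5, "currency": "GBP", "timestamp": "2024-01-15T16:30:00Z"}
dirty = {"price": -3.0, "currency": "XYZ", "timestamp": "2024-01-15T16:30:00Z"}
print(quality_issues(clean))  # -> []
print(quality_issues(dirty))  # -> ['price', 'currency']
```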
3. Decision-making is still our responsibility
Many financial services companies have been quick to recognise the benefits of big data and sophisticated algorithms and are keen to use them. But others are adopting a “watch and wait” approach, taking a more cautious and circumspect view of data provenance, lineage and integrity. Evidently, although stronger analysis of more information should lead to better decision-making, humans need to control where the data comes from and how it is managed and treated. The newest inventions may have words like “intelligence” and “learning” in their names, but decision-making and accountability still lie with us.