
Leading From The Front

As White House advisers work on a Bill of Artificial Intelligence Rights, the SEC is taking a proactive approach to the AI regulation conversation, including how to retain the human factor, avoid replicating bias, and still reap the benefits of this developing, innovative technology

06 July 2022 | 6 min read
by Alex Viall

This article was featured in Issue 5 of Orbit TRC Magazine, Global Relay’s exclusive publication focusing on Technology, Risk, and Compliance.

With artificial intelligence (AI) seeping into our everyday lives, in most cases in a disturbingly invisible way, parts of society are questioning why this new application of technology has been deployed before it is formally regulated. Automation can be used to determine outcomes that have a significant impact on personal life. People want to know more: when an algorithm is driving a decision, and what model sits behind that algorithm.

Nowhere in the world is more advanced in the academic and commercial application of AI than the US. Advisers to the White House are working on a Bill of Artificial Intelligence Rights. Eric Lander, who served as chief science adviser to the Biden administration, has talked about needing this Bill of Rights “to guard against the powerful technologies we have created”.

Taking the lead

In the field of finance, regulators have been quick to acknowledge the modern use of machine learning, which has already become part and parcel of customer engagement and analysis.

Leading the charge is the ever-energetic Chair of the Securities and Exchange Commission (SEC), Gary Gensler. Gensler’s background in finance and his role as a professor of global economics and management make him well qualified to assess the intersection of technology, finance, and regulation.

Since being appointed to the SEC in April 2021, Gensler has been extremely busy on any number of fronts related to technological change and market structure. But, significantly, he said this in a speech in January last year:

“To me, the most dramatic change to our markets is the use of predictive data analytics and artificial intelligence.” He then qualified this as any public regulator should:

“When new technologies come along and change the face of finance, how do we continue to achieve our core public policy goals? While these developments can increase access and choice, they also raise important public policy considerations, including conflicts of interest, bias, and systemic risks.”

Learning from the past

Gensler is a keen proponent of learning from history. He harks back to the securities laws of the 1930s and accepts that investors can decide what risk they want to take as long as they receive full and fair disclosure before transacting.

He is clearly excited by progress and innovation, concluding: “No regulation can be static in a dynamic society.” And it does not get much more dynamic right now than the digital engagement practices (DEPs) being deployed by asset managers and brokers to tailor products and marketing to individual investors, with predictive data analytics under the hood of the digital engine that drives this engagement.

The modern features within DEPs offer more than game-like elements or ‘gamification’. They encompass the underlying predictive data analytics, as well as a variety of differential marketing practices, pricing, and behavioral prompts.

Gensler questions whether robo-advisers have been trained with the best interests of the investor in mind, or if the natural bias towards corporate profit that has characterized human advice is baked into the models deployed.

He commented in August 2021 that, despite all the positives, rules written in another era might not be a good fit for today’s market, and that the application of AI might require a totally different approach to user interface, user engagement, fairness, and bias.

Gaming the system


Gamified engagement enabled by disruptive fintech providers like Robinhood is blurring the boundary between what is deemed marketing and what might actually be viewed as investment advice. Could an automated push through a trading app be construed as a recommendation, triggering a higher duty of care that requires the kind of subjective assessment a bot cannot provide?

Critics who contend that the SEC is getting ahead of itself here are asking what empirical evidence the regulator has to prove that digital engagement practices do directly affect investor decision-making.

The financial institutions most disadvantaged by such a judgment probably have the best insight on this question but zero incentive to reveal it.

Research into gambling and gaming suggests that audio, vibrant color, and sensory triggers simulating live experiences can encourage activity.

More risk concerns – is discrimination lurking?

Early analysis of AI’s application in other sectors raises further concerns about potential discrimination based on profiling and on the training data used to package and price products offered to customers. How can we safeguard against the perhaps inevitable outcome that the data feeding these machine learning and deep learning analytics simply replicates society’s existing biases?

The SEC’s concerns extend beyond the bot-to-customer or robo-adviser-to-customer relationship. Its requests for comment made in September last year probe the risks of using predictive data analytics that might trigger systemic issues in the capital markets. Greater concentration of data sources, interconnectedness (of credit rating models, for example), and herding (into certain datasets, providers, or investments) might result in unprecedented risk. Subprime mortgages pre-2008, the 1980s savings and loan crisis, and the dotcom bubble at the turn of the millennium are all good examples of this.

Predictive digital analytics represent a significant change compared to previous advances in data analytics. Gensler describes them as “increasingly complex, non-linear, and hyper-dimensional; they are less explainable”, and he fears that “existing regulations are likely to fall short when it comes to the broad adoption of new forms of predictive digital analytics in finance”.

He also predicts that modern data analytics may bring more uniformity and network interconnectedness, and could expose gaps in regulations developed in an earlier era. Financial fragility could emerge from a different source, such as a critical data aggregator, or from particular model designs.

Gary Gensler’s obvious enthusiasm for technological innovation is tempered by his principal role as chief public protector of investors. He notes that finance platforms must comply with investor protections through specific duties that include fiduciary duty, duty of care, duty of loyalty, best execution, and best interest.

But his concern, and this is shared by other regulators, is that some or all might conflict with the platform goal to optimize revenue. It will be fascinating to see how regulators approach this tricky new field as they attempt to protect investors without stifling innovation.

Orbit TRC offers a unique blend of perspectives for corporates and regulated entities on the latest developments that impact technology, risk, and compliance.

Alex Viall is Director of Regulatory Intelligence.
