The Rapid Evolution of AI. The field of artificial intelligence (“AI”) is exploding. As the headlines proclaim, the race is on in the private sector to be the most successful AI company, with huge amounts of money fueling the rush to lead the field. The push to incorporate AI into banking and other financial markets is equally intense. Financial services companies now spend more on AI than any other industry, even tech. Wall Street megabanks are the drivers of this growth. For example, the five largest investment banks filed 94 percent of AI-related patents between 2017 and 2021, published two-thirds of the AI research papers, and accounted for half of AI investments. Experts expect that financial institutions’ spending on AI will continue to expand, doubling from 2023 to 2027 and topping $400 billion.
Potential Benefits with Certain Risks. As AI technologies are refined and new ones are developed, AI advocates highlight the potential benefits, including greater efficiency in financial services, lower costs, and better financial outcomes for clients. While some of these benefits are real, AI applications in finance also present serious risks to markets and financial stability by exacerbating existing channels of instability and creating new ones. They are also powerful tools for investor exploitation, fraud, and other illegal conduct.
The Regulatory Challenge. Regulators have begun taking a few initial steps to address the use of AI in finance, largely in the form of policy statements, guidance, and consumer advisories (as described in the appendix below). In a few areas, including the SEC’s proposal on predictive data analytics, substantive standards are emerging. But much more must be done, much more quickly, to keep pace with AI’s evolution and to protect investors; all financial markets, from securities to banking and derivatives; and overall financial stability. AI’s growth trajectory and penetration into every corner of the financial industry demand a new approach to regulation, one that effectively incorporates agile and forward-looking regulatory frameworks and a focus on consumer protection, ethics, transparency, accountability, and financial stability. Specifically, we will need affirmative regulatory standards that go beyond mere disclosure, enhanced enforcement, and substantially more resources and expertise for regulators so they can keep pace with a well-funded and highly motivated private sector developing ever more advanced AI systems.
Key Better Markets Materials on AI
Fact Sheet: Regulators Must Carefully Consider Benefits and Risks of AI in the Financial Markets (3/21/24)
Selected Better Markets AI Media
Bloomberg: Wall Street Braces for More SEC Scrutiny of AI, Private Funds (12/28/23)
“Supporters of the rule, including the Washington-based group Better Markets, which generally advocates for tougher financial rules, said the proposal is necessary to keep pace with technological innovations and make sure firms don’t use technology to prioritize their own interests over investors’ interests.”
Politico Morning Money: Ideas on AI (3/22/24)
“Better Markets, which advocates for tougher financial regulation, is out with a new fact sheet on AI oversight that calls for enhanced enforcement and regulatory standards that go beyond disclosure.
“’AI demands a new approach to financial regulation, one that effectively incorporates agile and forward-looking regulatory frameworks and a focus on consumer protection, ethics, transparency, accountability, and financial stability,’ Better Markets legal director Stephen Hall said.”
AI in the News
CoinDesk: The SEC’s Shot Across the Bow on ‘AI Washing’
The SEC has trained its sights on “AI washing”: when companies lie about using artificial intelligence. Last week, SEC Chair Gary Gensler posted a video to X warning that investment advisors might falsely claim to use AI models to get their clients a better return, and that public companies might falsely tout their AI technology to boost stock prices.
AI and the Policymakers
In March 2024, as directed by an October 2023 Executive Order, the U.S. Treasury issued a report titled Managing Artificial Intelligence-Specific Cybersecurity Risks in the Financial Services Sector. The report recognizes the significant opportunities and challenges that AI presents to the security and resiliency of the financial services sector. It also outlines ten action items that Treasury and other federal agencies should take to address immediate AI-related operational risk, cybersecurity, and fraud challenges.