(Bloomberg) -- Wall Street’s main regulator is unveiling proposed restrictions for brokerages and money managers that use artificial intelligence to interact with clients.
The US Securities and Exchange Commission approved a plan on Wednesday to root out what Chair Gary Gensler has said are conflicts of interest that can arise when financial firms adopt the technologies. The agency also adopted final rules requiring companies to disclose cybersecurity incidents within four business days after determining they're material.
The AI proposal is the latest salvo from Washington regulators concerned about the technologies’ power to influence everything from credit decisions to financial stability. Companies would need to assess whether their use of predictive data analytics or AI poses conflicts of interest, and then eliminate those conflicts, according to an SEC release. They would also have to beef up written policies to make sure they stay in compliance with the rule.
“These rules would help protect investors from conflicts of interest and require that regardless of the technology used, firms meet their obligations” to put clients first, Gensler said during the meeting. “This is more than just disclosure. It’s about whether there’s built into these predictive data analytics something that’s optimizing in our interest or something that’s optimizing” to benefit financial firms, he said.
Banks and brokerages have been using AI for fraud detection and market surveillance for years. More recently, the focus has shifted to trading recommendations, asset management and lending. The SEC wants to make sure that companies don’t put their interests before those of clients when recommending trades or products.
The proposal is broader than existing requirements for brokers to act in their clients’ best interests when making recommendations, an agency staffer said on background during a press briefing on Tuesday.
The plan will be open for public comments, which the agency will review before bringing a final version to a vote, likely sometime in 2024. The rule would require approval from a majority of the five-member commission to be finalized.
The commission’s two Republicans criticized the rule for being over-broad in requiring companies to assess their use of too many types of technologies for potential conflicts.
For example, “a myriad of commonly used tools could qualify such as a simple electronic calculator, or an application that analyzes an investor’s future retirement assets based on, for example, changing the broad asset allocation mix among stocks, bonds and cash,” Commissioner Mark Uyeda said. The “vagueness” of the proposal and the compliance challenges “may cause firms to avoid innovation,” he said.
In recent weeks, regulators have made it clear that they’re stepping up oversight of artificial intelligence.
Rohit Chopra, director of the Consumer Financial Protection Bureau, signaled that new restrictions are coming on the use of AI in lending. Michael Barr, the Federal Reserve’s vice chair for supervision, said lenders need to ensure that such tools don’t extend biases and discrimination in credit decisions.
The Federal Trade Commission has already opened an investigation into Microsoft Corp.-backed OpenAI Inc., the maker of ChatGPT, to examine whether the chatbot poses risks to consumers’ reputations and data. The probe was first reported by the Washington Post.
President Joe Biden said July 21 that his administration would take new executive actions in the coming weeks to set a framework for “responsible innovation” with the technology.
Since taking the helm of the SEC in 2021, Gensler has raised concerns about AI's potential to draw on reams of data to target individual investors and nudge them to alter their behavior when it comes to trading, investing or opening financial accounts.
Last week, he called the tools “the most transformative technology of our time” but warned that concentration of the technology among just a few firms, or a few foundational data sets, poses risks that could lead to future instability in financial markets.
On Wednesday, the SEC also approved a plan requiring companies to disclose significant cybersecurity breaches.
The final rule keeps the proposed version’s requirement to publicly disclose breaches within four business days after determining that they’re “material” to a company’s operations or financial condition. However, it adds an option for delaying disclosure if the US attorney general determines that making the incident public would pose risks to public safety or national security.
Industry groups like the Business Roundtable have cautioned that a four-day timeline would give valuable information about company operations to bad actors.
Another proposal on the SEC’s agenda would allow investment advisers operating solely online to register with the commission. The agency estimates the current exemption affects about 200 investment advisers.
--With assistance from Rick Green.
(Updates with conflict rule vote, Gensler quote beginning in second paragraph)
©2023 Bloomberg L.P.