SEC Proposes Stringent New Conflict-of-Interest Rules Regarding Broker-Dealers’ and Investment Advisers’ Use of AI
Sep 01, 2023

Summary
On July 26, 2023, a majority of the Commissioners of the U.S. Securities and Exchange Commission (the “SEC”) voted to propose new rules under the Securities Exchange Act of 1934 and the Investment Advisers Act of 1940 to regulate the use of artificial intelligence technologies (“AI”) by broker-dealers and investment advisers registered, or required to be registered, with the SEC (collectively, “Firms”), with the aim of preventing Firms from using such technologies in ways that advance their own interests ahead of those of their investor-clients. Interested parties have until October 10, 2023, to submit comments to shape any final rules that may be promulgated. The proposals implicate two central themes common to the debate over emerging AI regulation: (1) what constitutes “AI” or other covered technology for purposes of such regulation, and (2) whether existing law effectively addresses the perceived harms or risks to be regulated.
Covered Technologies
The SEC’s definition of covered AI under the proposed rules includes technologies with “an analytical, technological, or computational function, algorithm, model, correlation matrix, or similar method or process that optimizes for, predicts, guides, forecasts, or directs investment-related behaviors or outcomes.” Such methods and processes would likely include, but would not be limited to, (1) technologies that analyze investors’ behaviors to provide curated research reports on particular investment products; (2) algorithm-based tools that provide tailored investment recommendations; (3) use of a conditional auto-encoder model to predict stock returns; and (4) third-party, off-the-shelf AI technology used by Firms to draft or revise advertisements guiding or directing investors or prospective investors to use their services. The SEC specifically notes that this definition is designed to capture machine learning algorithms, neural networks, natural language processing, and large language models (including generative pre-trained transformers, commonly referred to as “GPT”).
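To make the breadth of this definition concrete, the following is a minimal, purely illustrative sketch (not drawn from the proposal) of the kind of algorithm-based recommendation tool that would likely qualify as a covered technology; the product attributes, weights, and function names are hypothetical assumptions.

```python
# Hypothetical sketch of an algorithm-based tool that ranks investment products
# for an investor -- i.e., a method or process that "guides or directs
# investment-related behaviors or outcomes" under the proposed definition.
from dataclasses import dataclass


@dataclass
class Product:
    name: str
    expected_return: float     # model-estimated annual return
    risk_score: float          # 0.0 (low risk) to 1.0 (high risk)
    firm_revenue_share: float  # revenue the Firm earns per dollar invested


def score(product: Product, investor_risk_tolerance: float) -> float:
    """Weighted score; higher means 'more strongly recommended'.

    The firm_revenue_share term is the kind of Firm-favorable input that the
    proposed rules would treat as a potential conflict of interest.
    """
    risk_mismatch = abs(product.risk_score - investor_risk_tolerance)
    return (
        1.0 * product.expected_return
        - 0.5 * risk_mismatch
        + 0.8 * product.firm_revenue_share  # Firm-favorable factor
    )


def recommend(products: list[Product], investor_risk_tolerance: float) -> list[Product]:
    """Return products ordered from most to least recommended for this investor."""
    return sorted(products, key=lambda p: score(p, investor_risk_tolerance), reverse=True)
```

Even simple ranking logic of this kind would “guide or direct” investment-related behavior, which underscores that the proposal’s reach extends well beyond generative AI and large language models.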
Based on this definition, the SEC’s proposed rules appear intended to regulate each of the major categories of AI, which are often described as reactive (classifying data and completing pattern-recognition tasks), limited-memory (using historical data to predict outcomes), and theory-of-mind (at present a more aspirational type of AI that would deliver personalized outputs based on algorithms mimicking human emotional responses).
A Statement of Perceived Risk
Proponents of the proposed rules have hailed them as a way to create multiple checks and balances in both the development and deployment of AI to prevent it from subordinating investors’ interests to those of Firms. According to the SEC, though AI has the capability to decrease costs and maximize value for investors, the proposed rules are necessary to ameliorate the potential harms of Firms’ use of AI, including “prompt[ing] investors to enroll in products or services that financially benefit [Firms] but may not be consistent with [investors’] investment goals or risk tolerance, [encouraging] investors to enter into more frequent trades or employ riskier trading strategies (e.g., margin trading) that will increase [Firms’] profit at the investors’ expense, or inappropriately steer[ing] investors toward complex and risky securities products inconsistent with investors’ investment objectives or risk profiles that result in harm to investors but that financially benefit the [Firms].”
Covered Investor Interactions and Basic Requirements
According to the proposed rules, the SEC seeks to eliminate or neutralize AI-related conflict-of-interest risk by requiring Firms to (1) identify conflicts of interest arising from their use of AI in “investor interactions” and eliminate, or neutralize the effect of, any such conflict that puts their own interests ahead of those of their investors; (2) implement policies and procedures reasonably designed for this purpose; and (3) maintain associated compliance books and records.
The core requirements of the proposed rules would apply slightly differently to broker-dealers than they would to investment advisers. In particular, for broker-dealers, the rules would apply only to certain interactions with “retail” investors – that is, interactions with a natural person, or the legal representative of such a person, who seeks to receive or receives services primarily for personal, family, or household purposes. For investment advisers, on the other hand, the rules would apply to covered interactions with any client or prospective client, whether retail/consumer or institutional in nature, including investors in a pooled investment vehicle advised by the investment adviser, although the SEC specifically solicits comment on whether this difference is appropriate.
Whether as applied to broker-dealers or investment advisers, the proposed rules would not cover a Firm’s interactions with investors solely for purposes of meeting legal or regulatory obligations, including for anti-money laundering purposes (such as where AI is used to identify potentially suspicious activity), or for providing other clerical, ministerial, or general administrative support (such as “back office” functions like the routing of investors’ instructions). The SEC notes, however, that covered interactions would include those that have generally been viewed as outside the scope of investment “recommendations” for broker-dealers. They would also include an investment adviser’s exercise of discretion in managing a client’s account.
The SEC’s proposals also include a limited exception for conflicts of interest that exist solely because a Firm seeks to open a new client account, recognizing the attraction of new investors as “essential to the business” of any Firm. However, the SEC explicitly notes that technologies that otherwise take into consideration the profits or revenues of the Firm could present the type of conflict of interest targeted by the proposed rules. It is easy to see how a wide variety of AI uses by Firms could satisfy this test. The SEC concedes this and explains that the focus of the Firm’s analysis and monitoring should instead be on whether such a “conflict of interest” places the interests of the Firm ahead of those of investors, such that it must be eliminated or its effect neutralized under these rules.
Eliminating or Neutralizing the Effects of Covered Conflicts of Interest
As an example of how a Firm might eliminate such a conflict, the SEC cites a Firm’s ending its receipt of the revenue-sharing payments that give rise to the conflict. The SEC says effective “neutralization” could include, for example, adjusting the relevant AI model to apply a “counterweight” (requiring the consideration of additional investor-favorable information that would not otherwise have been considered, in order to counteract consideration of a Firm-favorable factor) or changing how the AI model’s inputs are analyzed or weighted in order to systematically avoid a Firm-biased outcome.
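Continuing in the same illustrative vein, the sketch below shows one way such re-weighting could look in code; the feature names, the specific weights, and the choice of counterweight (the product’s total cost to the investor) are hypothetical assumptions, not an approach prescribed by the SEC.

```python
# Hypothetical sketch of "neutralizing" a conflict by re-weighting model inputs:
# the Firm-favorable factor is dropped (or offset by an investor-favorable
# counterweight) so the output no longer tilts toward the Firm's interest.


def score(features: dict[str, float], weights: dict[str, float]) -> float:
    """Linear score over named inputs; higher means 'more strongly recommended'."""
    return sum(weights.get(name, 0.0) * value for name, value in features.items())


# Conflicted weighting: the Firm's revenue share influences the recommendation.
conflicted_weights = {
    "expected_return": 1.0,
    "risk_mismatch": -0.5,
    "firm_revenue_share": 0.8,  # Firm-favorable factor
}

# Neutralized weighting: the conflict-creating input carries no weight, and an
# investor-favorable counterweight (total cost to the investor) is considered instead.
neutralized_weights = {
    "expected_return": 1.0,
    "risk_mismatch": -0.5,
    "firm_revenue_share": 0.0,
    "investor_total_cost": -0.7,
}

product = {
    "expected_return": 0.06,
    "risk_mismatch": 0.10,
    "firm_revenue_share": 0.02,
    "investor_total_cost": 0.015,
}

print(score(product, conflicted_weights))   # score influenced by the Firm-favorable factor
print(score(product, neutralized_weights))  # conflict-creating input eliminated/counterweighted
```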
Identifying and addressing conflicts of interest in this way would require expertise in the covered technology, such as the applicable programming language or code as well as any input data and associated model training and weighting.
Criticism and Defense of the Proposed Rules
Opponents have noted that the proposed rules could stymie innovation in Firms’ use of AI in delivery of new investor-demanded technologies. This criticism echoes concerns raised by some in connection with how the SEC has approached cryptocurrency-related matters in recent years.
While the SEC says it intends the proposed rules to be technology-neutral, it recognizes that its proposals may restrict the use of, or require the adoption of back-end controls in connection with, third-party AI that may effectively be a “black box” to Firms. This illustrates the concerns that have fueled interest in establishing broader “explainability” standards for such technology. The SEC also recognizes that its proposed rules may pose significant barriers to entry for smaller Firms looking to enter the market or adopt new technologies (in terms of necessary compliance resources) but suggests that competition among third-party providers should lower these barriers over time.
And while the SEC also acknowledges that existing law, such as fiduciary obligations applicable to investment advisers and the relatively new Regulation Best Interest applicable to broker-dealers, already requires Firms to identify, disclose, and/or mitigate or eliminate conflicts of interest, the SEC posits that AI “could rapidly and exponentially scale the transmission of any conflicts of interest associated with such technologies” in relation to Firms’ investor-clients in a way that existing regulation does not adequately address.
With respect to the concern raised by opponents that the proposed approach places too little reliance on disclosure, the SEC explains its position: “[D]isclosure may be ineffective in light of . . . the rate of investor interactions [through AI technologies], the size of the [AI] datasets, the complexity of the algorithms on which the [AI] is based, and the ability of the [AI] to learn investor preferences or behavior, which could entail providing disclosure that is lengthy, highly technical, and variable, which could cause investors difficulty in understanding the disclosure.” The SEC offers that some AI uses “may expose investors to unique and opaque conflicts of interest for which disclosure may not be possible or sufficient and which may not otherwise be sufficiently addressed by the existing legal framework.”
Conclusion
In connection with these proposals, the SEC draws parallels between AI and older technologies, such as computer punch cards and telex machines, that Firms have historically introduced and overseen responsibly, resulting in better pricing, greater product choice, and improved user experience for their clients without compromising investor protection, capital formation, or fair, orderly, and efficient markets. This strikes a hopeful tone, but the complexities of the SEC’s proposed rules and the subject matter they purport to regulate will require a great deal of further attention.
With these complexities in mind, it remains to be seen exactly how and when Firms and other financial institutions (and their service providers) will be permitted to incorporate AI into the products and services that they offer. The SEC’s proposal highlights recent technology-oriented enforcement matters under less tailored existing rules—regulation by enforcement—in a pattern that is often the alternative or precursor to such a complex regulatory proposal. As a result, we expect that any AI technologies successfully brought to the market will ultimately need to be delivered and managed in ways that meaningfully address actual and apparent conflicts of interest, along with data security and other risks.
BCLP will continue to monitor these and other AI regulatory developments. Keep current on U.S. state-by-state initiatives of this type with our AI Tracker and take advantage of our other AI resources.
Related Practice Areas

- Corporate
- Finance
- Regulation, Compliance & Advisory