Artificial Intelligence (AI) is set to reshape the compliance function across financial services.
Based on the second annual ACA and NSCP AI Benchmarking Survey, firms are shifting from AI exploration to AI adoption, with 71% of respondents now formally using AI tools and technologies.
Most firms appear to have begun integrating AI into their internal operations, reflecting both a broader recognition of its value in enhancing risk management and operational efficiency and a desire to remain in step with regulators, who are themselves adopting AI.
Companies are now piloting AI compliance tools that support research compliance, policy development and compliance workflows, employee compliance, trade and electronic communications surveillance, marketing review, AML/KYC, and compliance training. These use cases align with the strengths of generative AI, which can accelerate drafting, automate monitoring, and support decision-making.
Many say they are moving more cautiously with external applications, including client-facing tools and chatbots, due to heightened regulatory expectations and scrutiny.
Compliance and AI
As firms make the transition to AI, compliance officers are advocating for AI oversight policies that require a “human in the loop” and ensure security, explainability and good governance, risk management, and regulatory compliance. Human oversight of AI tools is particularly important for use cases that require judgment and contextual analysis.
For example, an initial AI-generated draft of a compliance policy should be reviewed by an experienced compliance officer who knows the firm's operations and the regulations well enough to tailor a generic policy to the particular needs of that business and to spot gaps the AI output may have missed.
The momentum toward AI adoption in the compliance realm is being driven by leadership within compliance teams and reinforced by expectations from boards, senior management, investors, and regulators. AI is increasingly viewed as a strategic capability, one that supports firms in meeting regulatory obligations while improving effectiveness, efficiency, competitiveness, and scale.
We believe compliance officers are beginning to see the benefits of AI innovation; 53% of firms surveyed report that AI is having a positive impact on their compliance program, including improved efficiency, reduced false positives, and expanded risk coverage. These gains, in turn, allow compliance officers to improve accountability at their firms, and they will accelerate as compliance practitioners become more familiar with and comfortable using AI.
Of course, compliance also has a role with respect to AI tools that support other business functions, including decision-making, operations, and marketing. To meet the expectations of regulators across a firm's many AI use cases, compliance officers are formalizing AI authorized-use policies, oversight structures, use case inventories, model testing and validation, cybersecurity and privacy controls, and service provider oversight. To the extent that firms market their AI capabilities, compliance officers also work to ensure that they can substantiate their claims.
Room for Improvement
The survey revealed, however, that there is room for improvement. As firms integrate AI tools into their workflows, cybersecurity becomes an ever more critical concern. While most firms that responded to our survey (71%) have incorporated AI into their cybersecurity training programs, far fewer have implemented technical safeguards such as penetration testing (27%) or network segmentation (13%). Strengthening these controls helps identify potential avenues of cyberattack, protect sensitive data, and maintain operational integrity in the event of an incident.
Oversight of third-party AI tools is another area requiring attention. Despite growing reliance on external providers, only 24% of respondents have established formal policies to govern third-party AI use. As vendors embed AI into their platforms, firms must confirm that those vendors have taken reasonable steps to ensure these technologies meet compliance standards.
Firms will likely be reassured by the White House's recent direction to regulators to avoid placing undue burdens on responsible AI innovation. Firms are still expected to demonstrate governance, transparency, and control in their use of AI technologies, and to ensure that claims about their AI capabilities can withstand scrutiny. However, we expect the SEC staff to focus on whether companies have taken reasonable steps to identify and address potential weaknesses, rather than requiring them to act as insurers against every issue that may arise during implementation.
Regulatory Response
Regulators are also evolving their own approach to AI. The Office of Management and Budget has called on federal agencies to lessen bureaucratic restrictions on the use of AI within their agencies and to develop AI strategies that enhance agency efficiency and increase the quality of agency services.
The SEC has responded by appointing an internal AI Task Force that will collaborate with other federal agencies to accelerate the agency’s use of AI and establish and maintain appropriate AI governance.
The SEC anticipates that AI tools will augment the staff’s capacity, accelerate innovation, and enhance efficiency and accuracy. Among the use cases the SEC has identified are tools to identify manipulative trading activity, tools to test AI algorithms and models used by firms, and machine learning tools to identify firms with certain risk characteristics. There will be more.
The integration of AI into compliance programs marks a strategic evolution. It demands thoughtful governance and adaptation, along with technical rigor. To keep pace with evolving expectations, firms must remain proactive, adapting governance, refining oversight, and preparing for regulators who are increasingly tech-enabled themselves.
The path forward is not without challenges, but the direction is clear. AI is becoming embedded in the fabric of compliance, offering new capabilities to manage complexity and deliver value. Success will depend on how well firms align technology with purpose, risk appetite, and regulatory expectations.
Carlo di Florio is president of ACA Group, a provider of compliance, risk, and technology solutions for financial services firms.