
How Advisors Can Manage 6 Big AI Compliance Challenges


Artificial intelligence (AI) has been a buzzword with an ambiguous meaning for some time. As the number of financial professionals considering AI within their practices has steadily increased, especially since the introduction of ChatGPT in November 2022, we're seeing a surge of inquiries about AI in the financial services industry.

Applications of generative AI in the industry are diverse, with an emphasis on financial professionals using it for supplemental tasks like creating marketing materials or analyzing client lists. As its capabilities continue to evolve, we are seeing new ways to leverage the OpenAI framework, including streamlining analyses, transcribing notes, and facilitating compliance reviews of video and audio material.

As this technology continues to gain popularity across the industry, it’s increasingly important to have a comprehensive understanding of what AI is and the risks it poses. Below are key compliance challenges to be aware of when leveraging generative AI to support activities within a financial advisory firm.

1. Tracking a Moving Target

It's nearly impossible to pin down something that is constantly evolving, which is exactly the challenge financial professionals face with generative AI. Consider the industry events we regularly attend. Last year at a conference there were very few, if any, AI vendors in attendance; this year, nearly 10% of all vendors are offering products built on OpenAI frameworks and designed to streamline daily back-office tasks for financial professionals.

In the realm of compliance, AI is proving most useful for streamlining tasks through various plug-ins, such as monitoring written communications, video, and audio. Within financial services more broadly, we're seeing increased interest in adaptations of the technology as millions of individuals imagine new ways to leverage AI, from modeling portfolios to running them outright. This can create compliance problems when it comes to protecting client information.
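To make the communications-monitoring use case above concrete, here is a minimal sketch of the kind of keyword screening a compliance plug-in might perform on written communications before escalating them for human review. The flagged phrases and the review_message function are hypothetical illustrations, not any vendor's actual product; real AI-based tools go well beyond simple keyword lists.

```python
import re

# Hypothetical phrases a compliance team might flag in advisor communications.
FLAGGED_PHRASES = [
    r"guaranteed returns?",
    r"risk[- ]free",
    r"can't lose",
    r"act now",
]

def review_message(text: str) -> list[str]:
    """Return the flagged phrase patterns found in a written communication."""
    hits = []
    for pattern in FLAGGED_PHRASES:
        if re.search(pattern, text, flags=re.IGNORECASE):
            hits.append(pattern)
    return hits

if __name__ == "__main__":
    sample = "This strategy offers guaranteed returns with a risk-free structure."
    print(review_message(sample))  # ['guaranteed returns?', 'risk[- ]free']
```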

Key takeaway: The best way to stay on top of this changing landscape is to dedicate time to tracking the newest advancements in AI and how they can fit into different areas of work. This can help firms establish utilization policies and keep them updated so users leverage the technology appropriately.

2. Adhering to Regulation S-P

Known as the “safeguards rule,” Regulation S-P requires financial advisory firms to create and adhere to strict policies protecting client records and information, including safeguards against anticipated threats and hazards to the security of those records. In its present state, AI operates much like an advanced search engine, but the uncertainty of its future direction calls for caution before fully trusting it with sensitive data.

Key takeaway: Incorporating AI into everyday tasks can ease time-intensive projects, but to remain compliant with Regulation S-P, financial professionals should refrain from sharing sensitive client information in any platform leveraging the OpenAI framework.
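As an illustration only, and not a compliance-approved control, the sketch below shows the kind of basic redaction step a firm's policy might require before any text is pasted into a generative AI tool. The patterns and the redact_client_data function are hypothetical and would not catch every identifier.

```python
import re

# Hypothetical patterns for obvious identifiers; a real policy would go further.
REDACTION_PATTERNS = {
    "SSN": r"\b\d{3}-\d{2}-\d{4}\b",
    "ACCOUNT": r"\b[A-Z]{2}\d{8}\b",        # assumed account-number format
    "EMAIL": r"\b[\w.+-]+@[\w-]+\.[\w.]+\b",
    "PHONE": r"\b\d{3}[-.]\d{3}[-.]\d{4}\b",
}

def redact_client_data(text: str) -> str:
    """Replace obvious client identifiers with placeholders before the text
    leaves the firm's environment."""
    for label, pattern in REDACTION_PATTERNS.items():
        text = re.sub(pattern, f"[{label} REDACTED]", text)
    return text

if __name__ == "__main__":
    note = "Jane Doe, SSN 123-45-6789, account AB12345678, asked about rebalancing."
    print(redact_client_data(note))
```

Note that a simple pattern filter like this misses plenty (the client's name, for instance), which is why the safest policy remains not sharing sensitive client information with these tools at all.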

3. Safeguarding Business Information

Preserving the confidentiality of client information is a top fiduciary obligation, but equally important is safeguarding a business's proprietary interests. Given the unknowns of generative AI, it's important to avoid sharing any proprietary business information when using this new technology. Should an outsider or competitor gain access to inside information, it could give them an edge by letting them exploit those strategies and insights on their own platforms.

Key takeaway: Financial professionals should hold business information to the same standards of confidentiality as client information. In other words, avoid sharing proprietary business information when leveraging AI.

4. Remaining Compliant With Cybersecurity Protocols

Cybersecurity remains a primary concern for financial professionals as they navigate the integration of AI. Many firms unfamiliar with AI express concern about potential breaches of their technology infrastructure.

At present, widely known generative AI tools like ChatGPT function much like web search engines and lack the access needed to infiltrate a firm's systems. However, given the dynamic nature of cybersecurity threats, attacks that leverage AI frameworks are likely to emerge over time.

Key takeaway: In general, a firm's cybersecurity infrastructure should be strong enough to ward off an attack. Enabling multi-factor authentication, requiring complex passwords, and ensuring passwords are updated regularly should be standard practices that help deter threats. But it will be important to keep an eye on risk-related developments as AI technology continues to evolve.

5. Diligently Documenting Advice

Financial professionals must thoroughly document the rationale behind any advice they provide. Generative AI, however advanced, should not draft client notes independently or run portfolios, as that removes the trained professional from the equation and creates a disconnect between rationale and next steps. AI is only as knowledgeable as the algorithm it relies on, which can introduce algorithmic bias that is not necessarily in the best interest of the client. In addition, its output can be inaccurate, unreliable, or blind to a client's full financial picture, especially since some forms of generative AI draw on older training data rather than real-time information.

Key takeaway: There are ways to leverage the technology in compliance with the rules, including using it to compare funds or create generic outlines, but it’s imperative to take the framework for these recommendations and create personalized advice that’s unique to the client’s situation, with a meticulously documented thought process.
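One way to make that documentation discipline concrete is to record what the AI produced, what the advisor changed, and why. The sketch below, built around a hypothetical AdviceRecord structure, illustrates the kind of audit-trail entry a firm might keep; it is not a prescribed format.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class AdviceRecord:
    """Hypothetical audit-trail entry for AI-assisted advice."""
    client_id: str
    ai_tool: str                 # which tool produced the generic output
    ai_output_summary: str       # the generic outline or comparison it returned
    advisor_rationale: str       # the personalized reasoning behind the advice
    data_shared_with_ai: str     # confirm no sensitive client data was shared
    timestamp: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

record = AdviceRecord(
    client_id="C-1042",
    ai_tool="generative AI assistant",
    ai_output_summary="Generic side-by-side outline comparing two index funds.",
    advisor_rationale=("Recommended Fund A given the client's 15-year horizon, "
                       "moderate risk tolerance, and existing bond allocation."),
    data_shared_with_ai="None; only anonymized fund criteria.",
)
print(record)
```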

6. Preparing for Future Legislation

The regulatory landscape for generative AI is still developing. Over the next few years, as the tool continues to progress, we can expect clearer definitions of how financial professionals can and cannot leverage the technology. The SEC has already proposed a rule addressing conflicts of interest that arise when firms use predictive data analytics in ways that put the firm's interest ahead of their clients' (https://www.sec.gov/news/press-release/2023-140). This rule is likely the first of many to follow.

Key takeaway: The ability to stay nimble, navigating changes and updating policies as the regulatory environment takes shape, will be important. Demonstrating a proactive approach to AI compliance will help ensure smooth and secure use of the technology.

Conclusion

When taking a 10,000-foot view of the use of AI in financial services, our industry’s awareness of AI and its compliance challenges will continue to unfold as it becomes more widespread. What started as technology to write brief snippets of content has evolved for our industry into a tool that can be used to create formulas, review assets and investments, and research portfolio options.

With the exciting innovations generative AI provides, the biggest challenge for financial professionals navigating the tool is using it appropriately. As long as AI is properly understood and used safely with outputs vetted by humans, it can be a tool that will benefit financial professionals as they serve their clients.

The best way to ensure compliance with generative AI is to put in place a clear policy that covers the tool’s dos and don’ts while staying on top of – and knowledgeable about – changes to the technology as it evolves.


Mike Pedlow is executive vice president, chief compliance officer at Kestra Financial.

