AI—To use or not to use?

Artificial intelligence (AI) is rapidly emerging as a powerful catalyst for growth, enabling organizations to innovate faster, operate more efficiently and unlock new opportunities across almost every part of business. As the technology evolves, more credit professionals are looking to adopt it, or at least explore what it is capable of.

Why it matters: While highly efficient, AI has limitations in both its applications and the extent to which it can be relied upon in credit management. Understanding its advantages and risks, and establishing clear protocols, allows credit professionals to leverage AI to its fullest potential—efficiently, responsibly and ethically.

A Force of Efficiency
In business credit, AI is being used to streamline a range of processes, such as data gathering, collections, contract review and workflow management. It can facilitate internal and external communication by helping professionals draft clear, efficient emails or messages.

“Using an EDGAR report, I can ask AI to generate a short summary, a detailed report or even a PowerPoint presentation suitable for executives,” said Eve Sahnow, CCE, director of credit at OrePac Building Products (Wilsonville, OR). “By combining my own judgment with external sources and AI-generated insights, I’m able to make more informed decisions.”

AI can significantly improve collections by automating routine communication such as payment reminders, follow-ups and status updates, allowing credit teams to focus on higher-risk accounts. “We have leveraged machine learning capabilities such as real-time monitoring, built-in ERP controls and third-party risk indicators to help detect fraud,” said Sahnow. “These tools allow us to identify unusual payment activity, changes in banking behavior and inconsistencies in customer information or anything that falls outside of normal patterns. The greatest benefit is scale, as it enables us to quickly focus on exceptions and respond in real time.”

Freed from these time-consuming tasks, credit professionals can focus their attention on developing key skills, strengthening best practices, building relationships and making strategic decisions. Generative AI tools like OpenAI’s ChatGPT and Microsoft’s Copilot are commonly used to research customer information. Their ability to interpret, summarize and generate reports from large volumes of data has led to more accurate risk scoring and improved decision-making.

Best Practices for AI Use
Some credit professionals are reluctant to use artificial intelligence because of privacy concerns: confidential information entered into public tools may be stored and shared beyond the company’s control. Generative AI models are also prone to hallucinations, the production of incorrect, nonsensical or fabricated information presented as factual, a byproduct of pattern-based learning on vast datasets. For these reasons, AI is best used as a support tool rather than a sole source when making credit decisions. To fully realize its benefits, credit professionals must use it thoughtfully, ensuring it complements human judgment rather than replaces it.

“We’re experimenting with it for financial analysis, but we’re keeping in mind that it does not replace experience and intuition,” said Staci Cima, CCE, director of credit at Echo Electric (Saint Louis, MO). “I advise credit professionals to verify the information it generates against their own research and insights.”

Without clear guidelines on when and how employees can use AI tools, organizations increase their risk exposure. Well-defined policies and procedures help mitigate that risk. For example, in the event of an audit or legal claim, the absence of documentation supporting AI usage can create significant compliance and liability issues.

“We strongly encourage organizations to develop clear usage policies and maintain strong records so that, if their use of the application is ever challenged, there is a transparent and justifiable basis for how it is applied,” said Kathleen McGee, partner at Lowenstein Sandler LLP (New York, NY). “Too often, we see AI used in a purely supportive role without adequate oversight, leading to carelessness and, ultimately, improper use. Routinely reviewing policies, procedures and contract language governing AI use for internal and external-facing departments is critical.”

ChatGPT may be used at an enterprise level with built-in safeguards to ensure that any information entered remains private to the company. “Our use of AI is very limited, as my company takes a conscientious approach to its adoption, ensuring that we do not input private customer information,” said Eleanor Hartman, CCE, credit manager at Autodesk, Inc. (Portland, OR). “We also have proper AI training that outlines what it can and cannot do, as well as appropriate use cases.”

Companies also have the option to use private AI, which keeps data and models within a secure, controlled environment, preventing exposure to public networks and third parties. This isolation significantly reduces the risk of data breaches, unauthorized access and misuse, allowing businesses to leverage powerful AI capabilities while protecting proprietary information and meeting strict regulations. “Our parent company has a policy in place where we can only use one AI tool that’s on our servers so that data, especially customer data, is protected,” said Cima. “We’re not allowed to use any tool other than the one we’re approved to use. However, we can use other engines if we’re gathering information from publicly traded companies.”

The bottom line: AI offers meaningful efficiency gains in credit management, but its value depends on how responsibly it is used. Clear policies, secure systems and human judgment remain essential to ensuring AI supports credit decisions rather than undermining them.

Jamilex Gotay, senior editorial associate

Jamilex Gotay, a Towson University alum, holds a B.S. in English. Her creative writing background fuels her success as a writer, journalist and award-winning poet. Fluent in English and Spanish, with intermediate French skills, she’s passionate about travel and forging connections. When not crafting her latest B2B credit story, she enjoys quality time with loved ones, outdoor pursuits and creative activities.