Welcome to the Research Assistant Weekly Newsletter - a subscriber-only resource for insight into emerging compliance challenges, details on peer calls, and links to new Research Assistant reports, documents, tools, and more.
Like many industries, the ARM industry is looking to include chatbots as a consumer communication option. Consumers seem to prefer chatbots as a self-serve option instead of speaking directly with a collector, so why not add this communication method to your company’s offerings? Like most things in the ARM industry, it’s not so simple.
First, what exactly is a chatbot? A chatbot is a software application that aims to mimic human conversation through text or voice interactions. For our purposes, the term refers to online text interactions. Not all chatbots use artificial intelligence (AI). For years, rule-based chatbots have used keywords and other language identifiers to trigger pre-written responses or actions. These are not built on conversational AI technology.
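To make the distinction concrete, here is a minimal sketch of how a rule-based chatbot works: keywords in the consumer's message trigger pre-written responses, with no machine learning involved. The keywords and replies below are illustrative placeholders, not from any real collection system.

```python
# Rule-based chatbot sketch: each keyword maps to a canned reply.
# No AI/ML involved -- just string matching against the message.
# Keywords and responses are hypothetical examples.
RULES = {
    "balance": "Your current balance is available in the account portal.",
    "payment": "You can make a payment online or by phone.",
    "dispute": "To dispute this debt, please submit a written request.",
}

DEFAULT_REPLY = "I'm sorry, I didn't understand. Let me connect you with an agent."

def respond(message: str) -> str:
    """Return the first matching canned reply, or a fallback handoff."""
    text = message.lower()
    for keyword, reply in RULES.items():
        if keyword in text:
            return reply
    return DEFAULT_REPLY
```

Because every response is pre-written, a rule-based bot can never say anything its operator did not approve in advance, which is exactly why it also breaks down on questions its rules do not anticipate.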
In the era of ChatGPT, however, chatbots have evolved: many are now built on large language models (LLMs), a type of AI system trained on large amounts of text data to understand natural language and produce human-like responses. These systems use advanced machine learning (ML) algorithms and continually learn from additional data and interactions.
Unsurprisingly, the CFPB has entered the conversation.
Last week, the CFPB weighed in by sharing some of its concerns regarding chatbots in the financial services industry. In the CFPB's view, though chatbots can be helpful with simple questions, they can degrade the customer service experience when they cannot process more complex situations and questions. This can leave the consumer caught in a communication loop, having wasted time while still having unanswered questions, or having been given misinformation. Per the CFPB, chatbots are also less likely than human agents to negotiate settlements.
The CFPB also reminds us that chatbots must comply with all applicable federal consumer financial laws, and entities may be liable for violating those laws when their chatbots fail to do so. This includes proper consumer verification. Additionally, chatbots are imperfect at recognizing that a consumer is raising a dispute or struggling to understand the language being used.
The CFPB warned, “Deficient chatbots that prevent access to live, human support can lead to law violations, diminished service, and other harms. The shift away from relationship banking and toward algorithmic banking will have a number of long-term implications that the CFPB will continue to monitor closely.”
Organizations using or considering chatbots should monitor for the problems outlined by the CFPB. Make sure your organization has strong testing and auditing controls for your chatbot so you are not creating the issues the CFPB has warned against or causing consumer frustration or unintended UDAAPs. When a consumer's questions get too complicated or include specific phrases, a trigger should be in place to escalate the conversation to a live agent.
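One way to implement that kind of control is a simple escalation check run on each consumer message. The sketch below is a hypothetical example: the trigger phrases and the unresolved-turn threshold are assumptions for illustration, not regulatory requirements, and a production system would tune them with compliance review.

```python
# Illustrative escalation trigger: hand the chat to a live agent when
# the consumer uses dispute- or distress-related phrases, or when the
# bot has failed to resolve the question after several turns.
# Phrase list and threshold are assumed values for this sketch.
ESCALATION_PHRASES = [
    "dispute",
    "attorney",
    "this is not my debt",
    "stop contacting",
    "speak to a person",
    "human",
]
MAX_UNRESOLVED_TURNS = 3  # assumed threshold, not a CFPB figure

def should_escalate(message: str, unresolved_turns: int) -> bool:
    """Return True if this conversation should be referred to an agent."""
    text = message.lower()
    if any(phrase in text for phrase in ESCALATION_PHRASES):
        return True
    # Avoid the "communication loop" the CFPB warns about.
    return unresolved_turns >= MAX_UNRESOLVED_TURNS
```

Logging each escalation decision also gives your auditing program a record showing the trigger worked as designed.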
During this week’s Research Assistant call, one member said he has been testing a chatbot as a “live chatter” for about eight months and has interacted with only 228 consumers. He added that half of those 228 consumers were asking for their account number because they had lost it and needed it to make an online payment.
Are chatbots worth the time and effort it will take to vet them? I guess only time will tell.
Documents and Crowdsourced Materials:
Top Reads:
Upcoming Webinars/ Other Announcements: