CBA ChatGPT error links customers to wrong business


A Commonwealth Bank (CBA) employee used a public version of ChatGPT to help a customer find a business phone number, but instead gave out the contact details of a different company. The incident has raised questions about CBA’s governance and staff training around the use of generative AI tools in customer service.

The number the employee provided connected customers seeking gaming chair manufacturer SecretLab to Tasmanian game development studio Secret Lab instead.


Studio co-founder Paris Buttfield-Addison told SmartCompany the calls have been occurring for months and number more than 60 so far, forcing the business to plan a phone number change. 

“At first we just assumed it was confusion over similar names… but the volume kept increasing,” Buttfield-Addison said.

After asking a caller, the studio discovered that CBA had given out the number.


According to Buttfield-Addison, CBA’s response to Secret Lab’s subsequent complaint was that there was no privacy breach.

“[They said] that it was fine because they (the Retail Complaints Specialist) was also able to get her personal, unauthenticated ChatGPT to provide our phone number as if it was the chair company. She even sent me a screenshot of her phone doing it from her CBA email address,” Buttfield-Addison said.


CBA confirmed the AI tool was used in error and was not part of its process, saying that no confidential information was shared. 

The bank has also said this was not a privacy breach due to Secret Lab’s business number being publicly available online.


“We are aware that a CBA employee who was trying to help a customer locate a contact number for a merchant, incorrectly used a publicly available AI tool to locate that information,” a spokesperson said. 

“We can confirm that no customer or confidential information from CBA systems was shared with third parties or with the publicly available large language model. 

“The use of this AI tool to search for public information to assist a customer was a mistake and not part of our process. We have taken steps to address it.

“We take our privacy obligations seriously, including having secure tools to help protect customer data, and processes and procedures in place for our teams.”

SmartCompany understands the employee used an unauthenticated, public version of ChatGPT rather than a CBA-issued AI product.

While CBA trains customer service staff on Microsoft’s Copilot AI, which is used internally, the use of public generative AI platforms such as ChatGPT was not part of sanctioned workflows at the time, and no specific training for such tools was in place before this incident.

According to Secret Lab, it’s an issue that CBA gave out customer contact information to a third party.

“It doesn’t matter where they got the information from, and arguably getting it from ChatGPT, potentially on a personal device, makes it worse,” Buttfield-Addison said.

Timing: CBA announces OpenAI partnership


The disclosure of this incident comes in the same week that CBA announced a multi-year strategic partnership with OpenAI to roll out ChatGPT Enterprise across its workforce.

The bank says the enterprise version offers enhanced security, privacy, and administrative controls, and that no customer data will be used to train OpenAI’s models.

CBA has framed the partnership as a way to “equip our people with the most advanced AI tools and capability”, with comprehensive training and upskilling programs promised as part of the rollout.

While ChatGPT Enterprise will be introduced progressively, CBA has not confirmed whether customer-facing staff, such as those in contact centres, will have access. It’s also unclear how the tool might be used in day-to-day customer service in the future. 

The bank did not say whether the new training under its OpenAI partnership would address the risks highlighted by the recent unauthorised ChatGPT use.

CBA maintains the phone number was publicly available and that this was an isolated incident. However, the case highlights the governance challenges large organisations face as generative AI tools become more accessible.
