CBA makes AI abuse detection model available to all


The Commonwealth Bank of Australia (CBA) has announced it will make its artificial intelligence and machine learning model, designed to detect “harassing, threatening or offensive messages” in payment transactions, available for free to all banks.

CBA said the model and source code will be available this week on GitHub, a popular platform for hosting open source code.

The model was built by CBA, with the source code developed alongside the bank’s partner, an open source generative AI and machine learning platform provider, with the pair having established an exclusive partnership back in late 2021.

According to the developers, the approach combines machine learning (ML), natural language processing (NLP), large language models (LLMs) pre-trained on public data, text analysis, and graph-based techniques to identify abusive relationships.
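To make the shape of such a pipeline concrete, the following is a minimal sketch of how text scoring and relationship-level aggregation could fit together. The term list, threshold, and grouping logic are illustrative assumptions for this article, not CBA's published model, which uses trained NLP/LLM components rather than keyword matching.

```python
from collections import defaultdict

# Illustrative stand-in for a trained abuse classifier; CBA's model
# uses NLP and pre-trained LLMs, not a fixed keyword list.
ABUSIVE_TERMS = {"threat", "watching", "hide", "regret"}

def text_score(description: str) -> int:
    # Count abusive terms in a payment description (text-analysis step).
    words = description.lower().split()
    return sum(1 for w in words if w.strip(".,!?") in ABUSIVE_TERMS)

def high_risk_pairs(transactions, min_hits: int = 3):
    # Graph-style step: aggregate abusive messages per sender->recipient
    # edge, flagging relationships with repeated abusive contact.
    hits = defaultdict(int)
    for sender, recipient, description in transactions:
        if text_score(description) > 0:
            hits[(sender, recipient)] += 1
    return {pair for pair, count in hits.items() if count >= min_hits}
```

The key idea is that a single flagged message carries little signal, while a pattern of abusive descriptions between the same two accounts is what marks a relationship as high risk.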

The model was trained using labelled data of verified cases of high-risk abuse from CommBank.

CBA group customer advocate Angela MacMillan said the model can be used to “scan unusual transactional activity and identify patterns and instances deemed to be high risk”, enabling the bank to “investigate… and take action.”

Since its implementation in 2021, the model has detected around 1,500 high-risk cases of technology-facilitated abuse.

“By sharing our source code and model with any bank in the world, it will help financial institutions have better visibility of technology-facilitated abuse,” MacMillan said. “This can help to inform action the bank may choose to take to help protect customers.”

“We developed this technology because we noticed that some customers were using transaction descriptions as a way to harass or threaten others.”

Technology-facilitated abuse is commonly defined as the use of technology, such as mobile, online or other digital channels, as a tool to engage in behaviours including coercive control, intimidation, stalking, monitoring, psychological and emotional abuse, persistent harassment and unwanted contact, and sexual harassment, causing harm and distress to the recipient.

The introduction of Australia’s real-time payments network, the New Payments Platform (NPP), in 2018 created a new avenue for abusers to target victims: the payment description field, which was expanded to up to 280 UTF-8 characters, alongside an additional 35 printable ASCII characters made available in the payment reference, provided another method for perpetrators to contact and harass their victims.

MacMillan said financial abuse remains “one of the most powerful ways to keep someone trapped in an abusive relationship”.

“Sadly we see that perpetrators use all kinds of ways to circumvent existing measures such as using the messaging field to send offensive or threatening messages when making a digital transaction,” she said.

CBA said its AI model complements its automatic block filter introduced in 2020 across its digital banking channels, which is designed to stop transaction descriptions that include threatening, harassing or abusive language.

CBA’s Community Awareness survey, released in July this year, found that more than two out of five Australians (42 per cent) have experienced financial abuse themselves or know someone who has.