Love, Lies, and Algorithms: how AI has industrialized pig-butchering scams

An apropos term, “pig-butchering” originates from the Chinese phrase shāzhūpán and refers to
romance scams, investment schemes, and various forms of cryptocurrency fraud (Han and Button, 2025). The proliferation of AI-driven technologies has enabled cybercriminals and fraudsters to foster relationships with their victims, metaphorically fattening them up before they are “butchered” and financially exploited (Bai, 2020; Ye and Duan, 2020). In December 2025, we were asked to present on this phenomenon at the Global Initiative Against Transnational Organized Crime’s annual OC24 conference, exploring how AI has been weaponized by criminals and what role public safety and community care are poised to play in mitigating the impacts of “pig-butchering” on the elderly and most vulnerable. While some policymakers have temporized on how best to tackle these scams and support discussions of victimology, we present an interdisciplinary approach drawing on the sociological theory of symbolic interactionism and the lenses of law enforcement, social service work, and developmental service work, with a particular focus on educating people about the use of AI and the social engineering methods criminals employ.

Although a comprehensive history of the origins of “pig-butchering” is beyond the scope of this note, we would be remiss if we did not highlight the role the COVID-19 pandemic played in accelerating these scams throughout Southeast Asia, in particular Myanmar and Cambodia. Jensen (2025) notes that criminal networks operating in the latter, for example, were severely impacted by the adoption of new gambling regulations in 2019, along with the worldwide pandemic. Pronounced declines in tourism created a vacuum for crime organizations, which quickly pivoted to online scams and the exploitation of forced labor, often using casinos, hotels, office buildings, or residential areas as fronts for their illegal operations. Referred to as “Scambodia”, Cambodia features many scam centres controlled by international, organized criminal networks specializing in online fraud, generating between US$12.5 billion and US$19 billion annually (Mosqueda, 2025). In 2022, the FBI conducted an investigation of a Chinese money-laundering network in New York, which led to the identification of 259 U.S. victims who collectively lost US$18 million to cyber fraud connected to Cambodian compounds (U.S. Department of the Treasury, 2025). In a related vein, COVID-19 and the February 2021 coup d’état in Myanmar created another opportunity for a sophisticated cyber-scam industry spanning the Mekong region, one that has exacted annual losses of up to US$37 billion in East and Southeast Asia (International Institute for Strategic Studies, 2025). These scams are cross-border operations run by structured organizations based in Southeast Asia that exploit weak governance and law enforcement gaps. The victims are usually based in North America, Europe, or Australia, complicating investigations and the prosecution of those behind these operations.

Automated Intimacy

What is of particular interest to us is the use of AI in “pig-butchering” scams and how this challenges our understanding of symbolic interactionism, especially when we consider cybercommunities as (somewhat) recent cultural phenomena. A core theoretical framework in the discipline of sociology, symbolic interactionism is a micro-level theory that explores the meanings attached to human interaction and how communication allows people to make sense of their social worlds (Cooley, 1902; Mead, 1934; Goffman, 1959; Blumer, 1969). Cooley’s contention that we “see” ourselves when we interact with others is intriguing when we look at AI-driven “pig-butchering” because the victim’s sense of self grows out of socially engineered and deceptive interactions with cybercriminals – in other words, we must theorize how scammers and victims co-construct meaning through their interactions. Indeed, cyberspace forces us to ponder these online social relations, as criminals grift thousands of dollars from their victims through extended interactional sequences and repeated exchanges (Kassem and Carter, 2024). Carter (2021) uses symbolic interactionism to research the various language strategies scammers use to manipulate victims, such as setting up scenarios, drip-feeding information, eliciting visceral responses, and isolating the victim. As scammers cultivate trust over time, the victim’s gradual commitment deepens, building an interpersonal relationship that is ultimately transformed into a channel for financial “guidance” and extraction. The cadence of conversation usually unfolds as follows: casual exchanges establish a sense of normalcy between the two parties; the scammer then constructs a false situation of friendship or romantic interest; and financial requests follow, framed as a natural extension of the relationship that has been established (Buil-Gil and Zeng, 2022). The patterns detected in “pig-butchering” scams are thus the techniques fraudsters use to distort the victim’s sense of reality, leading them to comply with the sense of urgency created by the “butcher” (Carter, 2021).

It is the construction of trust and identity that must, perforce, be fleshed out when we consider the prevalence of AI-driven scams: trust is symbolically negotiated through interactions based on consistent language patterns and emotional manipulation, which makes it harder for the victim to withdraw from the relationship and reveals how language carries emotional weight. Aborisade et al. (2024) highlight that scammers are skilled manipulators who construct narratives that adapt to victims’ responses, enhancing the emotional effectiveness of their deception through a false sense of commitment, symbolic attentiveness, and relational investment. Intimacy and emotional consistency, then, undermine the norms of authentic interaction and exchange, becoming a strategic performance by the criminal.

According to Merkle Science (2025), a cryptocurrency risk and intelligence platform, “pig-butchering” scams are being rapidly optimized by large language models (LLMs) alongside deepfakes, voice cloning, and AI chatbots like ChatGPT. In what is referred to as deepfake-driven grooming, criminals employ synthetic media, deepfake videos, and face-swapping tools to conduct real-time video calls, posing as attractive, successful romantic partners. Deepfakes, presented as AI-generated audio, video, or images, are used to impersonate real people. According to CoinCover, a crypto security service and digital asset protection company, 87 deepfake scam rings across Asia were dismantled during the first quarter of 2025. When targeting organizations, the scammer impersonates a trusted role, such as a support, compliance, or security team, then claims there is a threat to the account and provides “steps” that transfer control to the criminal (CoinCover, 2025). These operations hinge on getting the victim to override their normal caution and treat a forbidden action as a security procedure, and cybercriminals are now using the same techniques against the elderly and vulnerable.

One such case involved Barry May, a divorced and retired insurance adjuster living in Mississippi, who was contacted over social media by a woman named Anna. May started to receive explicit photos from Anna and, not long thereafter, an online relationship began. According to Anna, her aunt was holding $3 million of her money, and she needed May to invest in cryptocurrency so that her money could be released, allowing her to travel to Mississippi so the two of them could get married. Anna promised May huge returns, and he ended up selling his property, liquidating his 401(k), and sending Anna more than US$500,000. Unfortunately, Anna was a mere construction of generative AI and deepfake technology, and May lost all of his life savings (Dilanian, 2024). The use of voice cloning has also garnered the attention of law enforcement: in Ontario, Canada, a senior named Marilyn Crawford received a phone call in 2021 from a scammer impersonating her grandson, Ian. Crawford was told that her grandson had been arrested for stealing a car and that $9,000 had to be sent to police for his release. Within 30 minutes, the scammer had arranged for a taxi to pick her up and drive her to a nearby CIBC branch in Oshawa, Ontario. Fortunately, a customer service agent flagged the transaction as fraudulent (Vellani and Common, 2025). According to the Canadian Anti-Fraud Centre, Canadians reported losing nearly $3 million to this type of scam in 2024 as the technology has become more sophisticated. For example, retired CIA officer Peter Warmka notes that:

“You need three to five seconds of a [voice] sample. You
can get it from a social media post. You can get it from a phone
call. And the scammers can make off with five [thousand],
10 [thousand], 15 [thousand], $100,000, $200,000 …
because [the victim] believes it’s somebody that it’s not.”

(quoted in Vellani and Common, 2025, para. 9)

AI-generated videos and images are also being used to lure people into cryptocurrency scams, according to W5, a multi-platform investigative reporting unit of CTV News. One woman from Markham, Ontario named Denise lost $1.7 million in a cryptocurrency scam after viewing an AI-generated video featuring a deepfake Elon Musk, who claimed that by following his instructions, Canadians could get rich fast. The video asked viewers to e-transfer $250 to Musk’s platform, promising a profit of US$30 within 48 hours. Denise complied and sent the money, receiving “verification” that her investments were growing. Naturally, she was encouraged to send more, eventually taking out a second mortgage on her home and transferring $300,000 to $350,000 at a time. Inevitably, when she tried to access her profits, she was informed that the videos were fake and part of an elaborate “pig-butchering” scam (Foran, 2025).

A recent report by the United Nations Office on Drugs and Crime (UNODC) (2024) suggests that deepfake-related services utilized by criminal groups rose by 600% between February and July 2023. Illustratively, the use of face-swap software surged by 704% in the second half of 2023. With new and emerging generative AI, criminals can create online personas with a plethora of curated profile pictures, fabricated careers, and backstories. It is AI’s ability to adapt tone, language, and emotional cues to each victim that is especially concerning, because LLMs have enhanced language fluency across countries, producing idiomatic, culturally fluent messages for victims. The use of auto-generated replies is also on the rise with new generative AI tools like WormGPT and FraudGPT, both of which are AI chatbots marketed on the dark web as having no ethical safeguards or content filters, unlike their mainstream counterpart ChatGPT (Merkle Science, 2025).

Care as Infrastructure for Crime Prevention

While a strong care economy promotes social networks and community ties, how can we create an infrastructure that prevents scammers from exploiting the most vulnerable? We contend that the “care economy” is poised to play an instrumental role in mitigating “pig-butchering” scams against the elderly and most vulnerable in our communities, precisely because this sector emphasizes relationships, well-being, and mutual support. Chief among the strategies is empowering those with limited digital literacy, educating select groups on how to distinguish between legitimate and fraudulent investment platforms. Targeted education by social service workers and developmental service workers, for example, can tailor outreach to older adults, other marginalized groups, or recently divorced individuals who face compounded vulnerabilities.

According to the Canadian Association of Social Workers’ (CASW) Code of Ethics, social service workers must provide competent professional practice, which includes engaging in continuing professional development to foster knowledge, skills, and abilities, as well as collaborating with professional colleagues and other disciplines to promote and expand ideas, knowledge, theory, and skills (CASW, 2024). In the context of “pig-butchering” scams, this includes researching, understanding, and educating others about the manipulation tactics, risk factors, and warning signs of these types of scams. There is ample opportunity to collaborate with law enforcement for knowledge mobilization, as evidenced by a 2024 study in which Acharya and Holz documented five distinct emotional and psychological manipulation techniques that scammers performed on victims: love bombing, guilt tripping, urgency and pressure, isolation, and future promises (Acharya & Holz, 2024). These concepts may already be familiar to many, but social service workers and developmental service workers can share this data with law enforcement, taking great care to exercise trauma-informed practices when dealing with victims of these scams. These principles also work to prevent future instances of “pig-butchering” by reducing victim shame and self-blame, and by creating a safe space that encourages disclosure and help-seeking. This can be extremely important to victims of “pig-butchering” scams, as they may struggle with trust after being scammed.

We practitioners of the “care economy” must be vigilant about new threats to our community members, collaborating more with law enforcement to break down silos and adopt a more holistic approach to care. Examples include Elder Abuse Prevention Ontario (EAPO), a provincial organization leading elder abuse prevention, which has partnered with the Canadian Anti-Fraud Centre to co-host educational webinars and promote the reporting of scams targeting seniors. South of the border, the American Association of Retired Persons’ (AARP) ElderWatch program offers peer-to-peer counseling for residents with questions about their credit reports or finances in general, or who may have recently been the victim of a scam. Such initiatives open limitless vistas of inquiry and action into mitigation efforts against “pig-butchering” scams.

References

Aborisade, R. A., Ocheja, A., & Okuneye, B. A. (2024). Emotional and financial costs of
online dating scam: A phenomenological narrative of the experiences of victims of
Nigerian romance fraudsters.

Acharya, B., & Holz, T. (2024, December 19). An explorative study of pig butchering scams.
arXiv.org.

Bai, A. (2020). Research on countermeasures for investigating “the pig butchering scam”
telecom network fraud in the context of big data. Cyber Space Security, 10, 1–5.

Blumer, H. (1969). Symbolic Interactionism: Perspective and Method.

Buil-Gil, D., & Zeng, Y. (2022). Meeting you was a fake: Investigating the increase in
romance fraud during COVID-19. Journal of Financial Crime, 29(2), 460–475.

Carter, E. (2021). Distort, extort, deceive and exploit: Exploring the inner workings of a
romance fraud. The British Journal of Criminology.

Canadian Association of Social Workers. (2024). CASW Code of Ethics, Values and Guiding
Principles.

CoinCover. (2025, December 16). Deepfakes and pig-butchering scams 2025: Inside
crypto’s evolution.

Cooley, C. (1902). Human Nature and the Social Order.

Foran, P. (2025, December 19). “I was heartbroken”: Two Canadians lose $2.3 million to
crypto scams. CTVNews.

Goffman, E. (1959). The Presentation of Self in Everyday Life.

Han, B., & Button, M. (2025). An Anatomy of “Pig Butchering” Scams: Chinese Victims’ and
Police Officers’ Perspectives.

Merkle Science. (2025, May 28). How AI is supercharging pig butchering crypto scams. The
Predictive Crypto Risk & Intelligence Platform.

International Institute for Strategic Studies. (2025). The Geopolitics of Southeast Asia’s
Cyber-scamming Crisis. Regional Spotlight: Asia. In the Armed Conflict Survey.

Jensen, S. (2025, December). Border Casinos Caught in Thailand-Cambodia Crossfire.

Kassem, R., & Carter, E. (2024). Mapping romance fraud research – a systematic review.
Journal of Financial Crime.

Mead, G. (1934). Mind, Self, and Society.

Mosqueda, S. (2025, October). Outrage Over Cambodian Scam Centers Grows, Prompting
Action from Global Leaders.

United Nations Office on Drugs and Crime. (2024, October). UNODC publications. UNODC
Regional Office for Southeast Asia and the Pacific.

U.S. Department of the Treasury. (2025, October). U.S. and U.K. Take Largest Action Ever
Targeting Cybercriminal Networks in Southeast Asia (Press Release No. SB0278).

Vellani, N., & Common, D. (2025, March 20). How con artists are using AI voice cloning to
upgrade the grandparent scam. CBC News.

Ye, H., & Duan, M. (2020). Case analysis and simulation experiment of “pig butchering scam”
online fraud behaviour. Journal of China People’s Public Security University (Social
Sciences Edition), 5, 10–16.