Security researchers are warning of surging threat actor interest in voice cloning-as-a-service (VCaaS) offerings on the dark web, designed to streamline deepfake-based fraud.
Recorded Future’s latest report, I Have No Mouth and I Must Do Crime, is based on threat intelligence analysis of chatter on the cybercrime underground.

Deepfake audio technology can mimic the voice of a target to bypass multi-factor authentication, spread mis- and disinformation and enhance the effectiveness of social engineering in business email compromise (BEC)-style attacks, among other things.
Read more on deepfakes: FBI: Beware Deepfakes Used to Apply for Remote Work.
Recorded Future warned that increasingly, out-of-the-box voice cloning platforms are readily available on the dark web, lowering the barrier to entry for cyber-criminals. Some are free to use with a registered account, while others cost little more than $5 per month, the vendor claimed.
Among the chatter observed by Recorded Future, impersonation, callback scams and voice phishing are frequently mentioned in the context of such tools.
In some cases, cyber-criminals are abusing legitimate applications such as those intended for use in audiobook voiceovers, film and television dubbing, voice acting and advertising.
One apparently popular option is ElevenLabs’ Prime Voice AI software, a browser-based text-to-speech tool that allows users to upload custom voice samples for a premium fee.
However, by restricting use of the tool to paying customers, the vendor has encouraged further dark web innovation, according to the report.
“It has led to an increase in references to threat actors selling paid accounts to ElevenLabs – as well as advertising VCaaS offerings. These new restrictions have opened the door for a new type of commodified cybercrime that needs to be tackled in a multi-layered way,” the report continued.
Fortunately, many current deepfake voice technologies are limited to generating one-time samples that cannot be used in real-time, extended conversations. However, an industry-wide approach is needed to tackle the threat before it escalates, Recorded Future argued.
“Risk mitigation strategies need to be multidisciplinary, addressing the root causes of social engineering, phishing and vishing, disinformation, and more. Voice cloning technology is still leveraged by people with specific intentions – it does not conduct attacks on its own,” the report concluded.
“Therefore, adopting a framework that educates employees, users, and customers about the threats it poses will be more effective in the short term than preventing abuse of the technology itself – which should be a long-term strategic goal.”
Infosecurity approached Recorded Future for further comment, but it was unwilling to offer anything beyond the report.
Some parts of this article are sourced from:
www.infosecurity-magazine.com