
1.6 Million Voices Stolen: Your Voice Could Be Next


Sep 10, 2025
eSecurity Planet content and product recommendations are editorially independent. We may make money when you click on links to our partners.

A cybersecurity researcher’s recent discovery should make every gym member’s blood run cold.

Jeremiah Fowler uncovered something that defies belief: 1,605,345 audio recordings sitting completely exposed online. No password. No encryption. No protection whatsoever.

These were not random files. They were five years of personal phone calls and voicemails from gym members spanning 2020 to 2025, managed by the Minnesota-based Hello Gym communication platform. With AI voice cloning now requiring just three seconds of audio to create convincing replicas, this breach turns every exposed voice into a weapon criminals can use for targeted phishing attacks.

Personal security details freely available to anyone with internet access

In a Sept. 9 blog post, Fowler noted that every recording sat as an easily accessible .mp3 file in what appeared to be a VoIP storage repository intended for internal use only. And the voicemails contained far more than simple gym inquiries: they included names, phone numbers, and specific reasons for calling, giving attackers a ready-made victim profile.

It gets worse. Some recordings captured gym staff sharing their names, gym numbers, and personal passwords during routine calls.

One recording revealed a gym manager calling a security monitoring service to disable alarms for testing, freely providing their name, location, and password credentials. That single slip essentially handed criminals keys to multiple security systems, plus the authentic voices needed to exploit them.

Voice theft meets artificial intelligence in a perfect storm

The timing could not be worse for the 1.6 million affected individuals. Microsoft researchers announced back in 2023 that their AI tool VALL-E can achieve convincing voice cloning with just three seconds of audio, and real-world attacks are already hitting victims.

Financial institutions have been receiving waves of voice clones trying to access customer accounts. Parents are getting ransom calls with their children’s cloned voices demanding immediate payment. Last year, criminals in Hong Kong used deepfake audio and video of corporate executives to steal $25 million. Earlier this year, a Florida woman lost $15,000 after scammers cloned her daughter’s voice using social media videos.

The sophistication of these attacks keeps rising. Since most cyberattacks rely on human interaction and social engineering, authentic voice samples make scams far more convincing.

Your voice cannot be changed like a password

This breach is more dangerous than a typical data exposure.

Unlike passwords or credit cards, you cannot swap out your voice once it is compromised. Voice data can qualify as biometric information under various US state laws, which means this breach may have exposed something as permanent as a fingerprint. Chilling, right?

Cybercriminals can now build detailed victim profiles by cross-referencing this audio data with information from the dark web or other breaches. The fitness industry has already shown its soft spots: Total Fitness fell victim to a cyberattack in January 2021 that exposed bank account information and membership agreements dating back to June 2018.

Organizations are being advised to implement encryption, conduct vulnerability testing, segment data, and adopt third-party risk management (TPRM) to prevent similar exposures. For individuals, the guidance is straightforward. Assume your voice may already be compromised and prepare accordingly. Monitor financial accounts closely, verify phone calls through alternate channels, and warn family members about voice cloning scams that could use your authentic vocal patterns.
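The encryption step recommended above can be illustrated with a minimal sketch. This example uses the third-party Python `cryptography` package; the package choice, function names, and key handling are illustrative assumptions, not details from the report, which does not name any specific tooling.

```python
# Minimal sketch: encrypting call recordings at rest so that a leaked
# storage repository yields unreadable blobs instead of playable .mp3 files.
# Assumes "pip install cryptography"; names here are hypothetical.
from cryptography.fernet import Fernet

def encrypt_recording(audio_bytes: bytes, key: bytes) -> bytes:
    """Encrypt raw audio; without the key, the stored file is useless."""
    return Fernet(key).encrypt(audio_bytes)

def decrypt_recording(token: bytes, key: bytes) -> bytes:
    """Decrypt for authorized playback; raises if the key or data is wrong."""
    return Fernet(key).decrypt(token)

if __name__ == "__main__":
    # In practice the key lives in a secrets manager, never beside the files.
    key = Fernet.generate_key()
    plaintext = b"fake-mp3-bytes"  # stand-in for a voicemail recording
    blob = encrypt_recording(plaintext, key)
    assert blob != plaintext
    assert decrypt_recording(blob, key) == plaintext
```

Encryption at rest would not have stopped the repository from being exposed, but it would have turned 1.6 million playable voice samples into ciphertext of no use to attackers.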

The era of trusting voices over the phone just ended, and 1.6 million people discovered their most personal biometric identifier is now in the hands of the bad guys.

