That Gig Might Be a Trap: Protecting Your Voice from AI Harvesting Scams
- Tom Dheere
- Apr 9
We know our voice is our unique instrument, our craft, and our business. We spend years honing it, investing in equipment, and building relationships. But there's a growing threat lurking in seemingly legitimate job offers: scams designed not for a real project, but to steal your voice for AI cloning.
It sounds like science fiction, but it's happening right now. Artificial intelligence technology can create synthetic voices – sometimes startlingly realistic ones – from audio data. While this tech has legitimate uses, bad actors are exploiting it. They're tricking voice actors into recording lengthy scripts under false pretenses, harvesting that audio to train AI models that replicate their voices without consent, control, or compensation.
Imagine finding your voice used in ads, audiobooks, or worse – scams and deepfakes – without your permission and without earning a cent. It's a violation of your identity and your livelihood.
How AI Harvesting Scams Work: A Real-World Example
These aren't always obvious scams. Recently, I encountered a script presented as a potential voiceover job from a company called Giglumin. Here's what fellow voice actors Brigid Reale and Lynn Norris of VO for VA and I discovered...
Giglumin is a recently established online platform designed to connect freelancers with clients, facilitating various gig-based services. The website emphasizes features such as 24/7 assistance, budget-friendly options, satisfaction-guaranteed payments, and swift delivery of high-quality work.
However, it's important to exercise caution when engaging with Giglumin. The platform has been assigned a low trust score by ScamAdviser, indicating potential risks. Factors contributing to this assessment include the website's recent creation (registered on March 31, 2025), the use of services to conceal the owner's identity, and its association with a registrar known for hosting a high percentage of spam and fraudulent sites.
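One of those warning signs – a very recent registration date – is something you can check yourself with a WHOIS lookup (for example, via your registrar's WHOIS page or a tool like the third-party `python-whois` package). As a rough sketch, once you have the creation date, flagging a young domain is trivial; the 180-day threshold below is an illustrative assumption, not an industry standard:

```python
from datetime import date

# Hypothetical threshold: domains registered within the last ~6 months
# deserve extra scrutiny before you agree to record anything.
SUSPICION_WINDOW_DAYS = 180

def is_recently_registered(created: date, today: date,
                           window_days: int = SUSPICION_WINDOW_DAYS) -> bool:
    """Return True if the domain was registered within the suspicion window."""
    return (today - created).days < window_days

# Giglumin's registration date, per ScamAdviser.
giglumin_created = date(2025, 3, 31)
print(is_recently_registered(giglumin_created, date(2025, 4, 9)))
```

A young domain isn't proof of a scam on its own, but combined with a hidden owner and a spam-heavy registrar, it's a strong signal to walk away.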
The topic? Child malnutrition – a totally plausible subject for an NGO documentary or educational content. The script was well written, structured, and detailed. (If you received a similar email, please share your experience in the comments below!)
Here's why it was also perfect for voice harvesting:
It Was LONG: Introduction, seven detailed sections, conclusion – pages and pages of text. Recording it would easily yield 30-60+ minutes of clean audio. That's a goldmine for AI training data.
It Used Natural, Flowing Language: It wasn't just a list of words. It had complex sentences, varied vocabulary, transitions, and structure. This allows the AI to learn not just your sound, but your pacing, intonation, and prosody – the unique rhythm of how you speak.
It Covered Diverse Sounds: The text included technical terms, statistics, geographical locations, and general informational language, ensuring a wide range of phonetic sounds were captured.
The scammer's goal is simple: get you into your booth reading their text for as long as possible, capturing the raw data they need to create a digital replica of you.
Red Flags: How to Spot a Potential Voice Harvesting Scam
Protecting yourself starts with vigilance. Be wary if you encounter:
Vague Client Information: The "client" has no professional website, social media presence (like LinkedIn), or verifiable history. Communication might be solely through generic email addresses or messaging apps.
Unusually High Pay for Simple Work: If the offer seems too good to be true, it probably is.
Excessively Long "Audition" or "Test" Scripts: Asking for more than a few paragraphs for an initial audition is suspicious. Never provide extensive custom reads for free.
Focus on Quantity Over Quality (of Direction): They want a clean, raw recording of a long script but provide little to no actual direction regarding performance, tone, or usage. The only goal seems to be getting the audio recorded.
Pressure and Urgency: They push you to accept and record quickly, often bypassing standard procedures like contracts or detailed usage agreements.
Refusal to Sign Contracts or Detail Usage: A legitimate client will have a contract detailing exactly how and where your voice will be used, for how long, and the compensation. Scammers avoid this.
Odd Payment Methods: Requests for payment via unusual platforms, gift cards, or overpayment scams (sending too much and asking for some back) are huge red flags.
Refusal of Video Calls: A legitimate client looking to collaborate often won't mind a quick video call to discuss the project. Scammers prefer to hide.
Protecting Your Voice: Taking Action
Vet Your Clients: Do your homework. Google them. Check their website, LinkedIn profile, and any reviews. If they claim to be from a known company, verify independently.
Demand Clear Contracts: Always use a contract that explicitly states the scope of use (medium, territory, duration). Consider adding a clause specifically prohibiting the use of your voice recording for AI synthesis, training, or cloning.
Show Them the NAVA AI Rider: See how they react. Their response will tell you everything you need to know!
Limit Free Samples: Provide short, targeted demos or audition reads (e.g., 30-60 seconds) relevant to the specific job. Don't record lengthy custom scripts for free.
Trust Your Gut: If something feels off – unprofessional communication, vague details, pressure tactics – walk away. It's better to lose a potentially fake gig than your vocal identity.
Use Reputable Platforms (Wisely): While some online casting platforms offer some protection, scammers can still operate there. Apply the same vetting principles.
Talk to Your Peers: Share suspicious requests or experiences in voice actor communities online. Collective awareness is powerful.
Consider Watermarking (Subtly): Audio watermarks are unlikely to prevent AI training outright, but they may offer some traceability if your recordings are misused. This is a complex area, so focus more on prevention through vetting and contracts.
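To make the watermarking idea concrete, here is a toy sketch – not a production scheme, and real systems are far more robust against filtering and re-encoding. It mixes a very quiet near-ultrasonic tone into a recording and later checks for it by correlation. Every name and parameter here is illustrative:

```python
import numpy as np

SR = 44100           # sample rate in Hz
MARK_FREQ = 19000.0  # near-ultrasonic tone, hard to hear at low amplitude
MARK_AMP = 0.002     # very quiet relative to full-scale audio in [-1, 1]

def embed_watermark(audio: np.ndarray) -> np.ndarray:
    """Mix a faint sine tone into the signal."""
    t = np.arange(len(audio)) / SR
    return audio + MARK_AMP * np.sin(2 * np.pi * MARK_FREQ * t)

def watermark_score(audio: np.ndarray) -> float:
    """Correlate against the expected tone; marked audio scores higher."""
    t = np.arange(len(audio)) / SR
    probe = np.sin(2 * np.pi * MARK_FREQ * t)
    return abs(float(np.dot(audio, probe))) / len(audio)

# One second of quiet noise stands in for room tone in a real recording.
rng = np.random.default_rng(0)
clean = 0.02 * rng.standard_normal(SR)
marked = embed_watermark(clean)
print(watermark_score(marked), watermark_score(clean))
```

The marked signal scores noticeably higher than the clean one. In practice, a determined harvester can strip a naive tone like this, which is exactly why the article's advice to lean on vetting and contracts comes first.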
The Bottom Line
AI voice technology isn't going away, and it has incredible potential for good. But like any powerful tool, it can be misused. As voice actors, we must adapt and become fiercely protective of our most valuable asset. Be informed, be cautious, and don't let anyone steal your voice.
Stay safe out there, and keep using your amazing voices for legitimate, respectful projects!

Through VO Strategist, Tom has provided voiceover business and marketing coaching since 2011. He's also a voice actor with over 25 years of experience who has narrated just about every type of voiceover you can think of. When not voicing or talking about voicing, Tom produces the sci-fi comic book Agent 1.22.