Several popular voice cloning tools on the market don't have "meaningful" safeguards to prevent fraud or abuse, according to a new study from Consumer Reports.
Consumer Reports probed voice cloning products from six companies (Descript, ElevenLabs, Lovo, PlayHT, Resemble AI, and Speechify) for mechanisms that might make it harder for malicious users to clone someone's voice without their permission. The publication found that only two, Descript and Resemble AI, took steps to combat misuse. The others required only that users check a box confirming that they had the legal right to clone a voice, or make a similar self-attestation.
Grace Gedye, a policy analyst at Consumer Reports, said that AI voice cloning tools have the potential to "supercharge" impersonation scams if adequate safety measures aren't put in place.
"Our assessment shows that there are basic steps companies can take to make it harder to clone someone's voice without their knowledge, but some companies aren't taking them," Gedye said in a statement.