Changpeng Zhao (CZ), CEO of crypto exchange giant Binance, has issued a warning about AI-enabled deepfake videos, which pose significant risks to the crypto industry.
Referring to an AI-powered video post featuring Joshua Xu, CEO of HeyGen, CZ said that such videos are “pretty scary from a video verification perspective.”
“Don't send people coins even if they send you a video,” he wrote.
The tweet from Xu included a 100% AI-generated video of his own avatar and a clone of his voice. “We've made massive enhancements to our life-style avatar's video quality and fine-tuned our voice technology to mimic my unique accent and speech patterns perfectly,” the post added.
The HeyGen CEO noted that the feature will soon be deployed to production for everyone to try, and that the AI video generator takes only “two minutes” to create a realistic-looking digital avatar.
According to the Binance website, users must submit video evidence to verify their identity before withdrawing funds from the exchange. The know-your-customer (KYC) process mandates video verification along with a picture of the user’s ID card or passport.
“Please do not put watermarks on your videos and do not edit your videos,” the policy reads.
In May, Jimmy Su, Binance’s chief security officer, said that AI technology is so “advanced” that deepfakes may soon become undetectable by a human verifier.
DeepFakeAI, an AI project in the crypto industry, gained traction among the community for its superimposed videos of Tesla and Twitter CEO Elon Musk, SEC Chair Gary Gensler, and Ethereum co-founder Vitalik Buterin.
The platform allows users to create and customize AI-powered videos through deepfake technology, and its services can be accessed via a native bot.
Read more on cryptonews.com