
Sachin Tendulkar Falls Victim to Deepfake: Voice Manipulated in Game Promotion Video He Calls Fake and Inappropriate

Former cricketer Sachin Tendulkar has also fallen prey to deepfake technology. A video of him apparently promoting the gaming app ‘Skyward Aviator Quest’ has gone viral on social media. Sachin has posted on social media stating that the video is fake and was created to deceive people. In his post, he tagged the Indian government, Minister of State for Electronics and IT Rajeev Chandrasekhar, and the Maharashtra Cyber Police.

Mention of Sachin’s Daughter in the Video

In the fake video, “Sachin” claims that his daughter earns a substantial amount from the game every day and expresses astonishment at how easy it has become to make good money these days.

Rashmika Mandanna’s Deepfake Video in November

In November, a deepfake video of actress Rashmika Mandanna went viral on social media. The video used AI technology to superimpose Rashmika’s face, with realistic expressions, onto the body of an influencer. The expressions were convincing enough that thousands of social media users mistook the fake video for a real one.

Not Rashmika but Zara Patel

Interestingly, the woman in the original video was not Rashmika Mandanna but an influencer named Zara Patel, whose face had been digitally replaced with Rashmika’s. After the deepfake went viral, an Alt News journalist traced it back to the original clip and exposed the manipulation.

What is Deepfake and How is it Created?

The term “deepfake” was first used in 2017 and gained popularity on the social news aggregator Reddit, where manipulated videos of many celebrities were posted. Deepfake refers to the use of machine learning and artificial intelligence to superimpose someone else’s face, voice, or persona onto real videos, photos, or audio. The goal is to make the fake content appear genuine.

The process relies on machine learning models, typically trained on large collections of images or voice recordings of the target person, and on software that stitches the generated output back into a video or audio clip. Done well, a deepfake can alter a person’s appearance and voice so convincingly that it seems real.
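To make the idea concrete, below is a minimal, illustrative sketch in Python (PyTorch) of the shared-encoder, two-decoder setup behind early face-swap deepfakes: one encoder learns features common to both faces, each decoder learns to rebuild one specific person, and the “swap” comes from decoding person A’s frame with person B’s decoder. The layer sizes, image size, and random stand-in data here are assumptions for illustration only, not any real tool’s pipeline.

```python
# Illustrative sketch of the shared-encoder / two-decoder face-swap idea.
# Layer sizes, the 64x64 image assumption, and the random "datasets" are
# placeholders; a real system trains on thousands of aligned face crops.
import torch
import torch.nn as nn

IMG = 64 * 64 * 3  # a flattened 64x64 RGB face crop

def make_decoder() -> nn.Sequential:
    return nn.Sequential(nn.Linear(256, 1024), nn.ReLU(),
                         nn.Linear(1024, IMG), nn.Sigmoid())

encoder = nn.Sequential(nn.Linear(IMG, 1024), nn.ReLU(), nn.Linear(1024, 256))
decoder_a = make_decoder()  # learns to reconstruct person A's faces
decoder_b = make_decoder()  # learns to reconstruct person B's faces

params = (list(encoder.parameters()) + list(decoder_a.parameters())
          + list(decoder_b.parameters()))
opt = torch.optim.Adam(params, lr=1e-3)
loss_fn = nn.MSELoss()

# Stand-in data for the two people (random tensors instead of real photos).
faces_a = torch.rand(32, IMG)
faces_b = torch.rand(32, IMG)

for step in range(100):
    opt.zero_grad()
    # Each decoder is trained only on its own person, but both share the encoder,
    # which forces the encoding to capture pose and expression generically.
    loss = (loss_fn(decoder_a(encoder(faces_a)), faces_a)
            + loss_fn(decoder_b(encoder(faces_b)), faces_b))
    loss.backward()
    opt.step()

# The "swap": encode a frame of person A, then decode it with B's decoder,
# so the output keeps A's pose and expression rendered with B's identity.
with torch.no_grad():
    swapped = decoder_b(encoder(faces_a[:1]))
```

The same encode-with-one-identity, decode-with-another trick is what lets a viral clip keep the original person’s movements while showing someone else’s face, and analogous models trained on speech recordings are used for voice cloning.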

AI and cybersecurity experts such as Punit Pandey say the technology for creating deepfakes is now ready to use and widely available: anyone with access to it can produce one. Voice cloning in particular has improved sharply and has become a significant concern.

In conclusion, deepfake technology poses a significant risk, and its misuse can have severe consequences. Rapid advances, particularly in voice cloning, only heighten those concerns.

Rasesh Nageshwar

Hi there! I'm Rasesh Nageshwar, and I'm passionate about writing on entertainment, movies, web series, and sports. As a writer, I love sharing my insights and opinions on the latest trends, news, and events in these exciting fields.