“I Am a Legend, Gill Can’t Be Like Me”: Kohli Praises Himself, Gets Trolled After Deepfake Video Circulates on Social Media
Renowned cricketer Virat Kohli has once again become the target of a deepfake video, which is going viral and causing a stir among his fans, with many on social media criticizing him over the fabricated clip.
The clip uses footage from an old interview of Kohli, in which he appears to criticize India’s top batter, Shubman Gill, and even compare himself to Sachin Tendulkar. In reality, Kohli never made such statements in the interview.
In this deepfake video, Kohli’s voice and facial expressions are convincingly mimicked, making it easy for viewers to believe that Kohli actually questioned Gill’s potential to become a cricket legend. The video is so realistically crafted that it could easily deceive the general public.
What is Kohli Saying in the Video?
In the video, a fake voice mimicking Kohli’s can be heard saying, “I have been closely observing Gill. There’s no doubt that he is talented, but there’s a big difference between showcasing talent and becoming a star. Gill’s technique is excellent, but we shouldn’t get ahead of ourselves.
“People see him as the next Virat Kohli, but let me make it clear that there is only one Virat Kohli. The dangerous bowlers I’ve faced and the situations in which I’ve scored runs cannot be measured by one of Gill’s innings. It will take him time to achieve that.”
“There is Only One Virat Kohli”
The video goes even further, putting words in Kohli’s mouth that assert his unique position in Indian cricket. In it, Kohli is depicted saying, “People talk about the next Virat Kohli, but there is only one Virat Kohli. I have faced top bowlers and performed under intense pressure, and I’ve been doing this for over a decade. This cannot be replicated with just a few good innings.”
This is the Second Time It Has Happened
This incident marks the second time this year that Kohli has fallen victim to a deepfake video. Earlier, another fake video surfaced where he was seen promoting a betting app and comparing himself to Sachin Tendulkar.
These deepfake videos have sparked outrage among fans and renewed concerns about the misuse of artificial intelligence.
Tendulkar and His Daughter Sara Also Targeted
Former cricketer Sachin Tendulkar has also fallen victim to a deepfake. A video went viral showing him promoting a gaming app called ‘Skyward Aviator Quest.’
Sachin himself posted a clarification, stating that the video was fake and created to mislead people. He condemned the misuse of technology, calling it completely wrong. Along with this message, he tagged the Indian government, Minister of State for Electronics and Information Technology Rajeev Chandrasekhar, and Maharashtra Cyber Police.
In the fake video, Tendulkar was shown saying that his daughter Sara withdraws a lot of money from this game every day. He was depicted as saying, “I am amazed at how easy it has become for Sara to earn money now.”
What is Deepfake and How is it Created?
The term “deepfake” was first used in 2017. It gained attention when videos of several celebrities, including actresses Emma Watson, Gal Gadot, and Scarlett Johansson, were posted on the American social news aggregator Reddit by an account named “deepfakes.” These videos were of a pornographic nature.
Deepfake refers to the technique of superimposing someone else’s face, voice, and expressions onto a real video, photo, or audio, making it appear incredibly realistic. The fabricated content can be so convincing that people might believe it to be genuine.
The technique relies on machine learning and artificial intelligence (AI): deep neural networks, typically autoencoders or generative adversarial networks (GANs), are trained on large amounts of footage of the target person, after which the model can map that person’s face or voice onto new video or audio.
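For readers curious about the mechanics, here is a minimal, illustrative PyTorch sketch of the shared-encoder/dual-decoder autoencoder idea behind classic face-swap deepfakes. Every layer size and training detail below is a simplified assumption chosen for illustration, not the implementation of any real tool; production systems add face alignment, masking, GAN losses, and far more data.

```python
# Conceptual sketch of the shared-encoder / dual-decoder autoencoder
# used in classic face-swap deepfakes. Illustrative only: sizes and
# training details are simplified assumptions, not a real tool's code.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Compress a 64x64 RGB face crop into a latent code."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),    # 64 -> 32
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),   # 32 -> 16
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(),  # 16 -> 8
            nn.Flatten(),
            nn.Linear(128 * 8 * 8, latent_dim),
        )

    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    """Reconstruct a face from the latent code; one decoder per identity."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.fc = nn.Linear(latent_dim, 128 * 8 * 8)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),   # 8 -> 16
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),    # 16 -> 32
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(),  # 32 -> 64
        )

    def forward(self, z):
        return self.net(self.fc(z).view(-1, 128, 8, 8))

# One shared encoder learns pose and expression; two decoders learn identities.
encoder = Encoder()
decoder_a, decoder_b = Decoder(), Decoder()

# Training step: each decoder learns to reconstruct its own person's faces.
faces_a = torch.rand(8, 3, 64, 64)  # stand-in batch for person A's face crops
recon_a = decoder_a(encoder(faces_a))
loss = nn.functional.mse_loss(recon_a, faces_a)  # minimized over many epochs

# The "swap": route person A's expression through person B's decoder,
# producing B's face performing A's expression.
swapped = decoder_b(encoder(faces_a))
```

The key design choice is that the single shared encoder is forced to capture identity-independent features such as pose, lighting, and expression, which is why routing one person’s encoding through the other person’s decoder produces a convincing swap.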
AI and cybersecurity expert Puneet Pandey explains that the technology has advanced significantly and that ready-to-use packages are now available, making it accessible to almost anyone. Advances have also improved voice replication, making voice cloning especially dangerous.