Deepfake AI has raised concerns over the privacy of ordinary people and celebrities alike. Several celebrities have fallen prey to the technology, and Nora Fatehi is the newest on the list. She shared a story on her Instagram account revealing that a deepfake video of her is going viral online, and she called the video out, writing that it was not her. The video, featuring the actor-performer promoting a brand, was steadily gaining traction. Calling it out, Nora shared a screenshot of the video on her Instagram Stories.
Nora Fatehi calls out deepfake video
Amid the concerning rise in deepfake cases in India, Nora Fatehi is one of the most recent celebrities to fall victim to the ongoing deepfake menace. In the viral video, the actor is seen endorsing a fashion brand. The video is done so flawlessly that it would be difficult to tell it is not her; it captures everything about Nora, including her voice and body language, almost perfectly. Nora called out the video on her Instagram Stories, writing, “Shocking!!! I’m not this person.”
What is the scam?
A deepfake video is a type of synthetic media created using machine learning and artificial intelligence (AI) techniques. The technology is used to alter videos, usually by replacing a person’s face or voice in an existing video with the likeness of someone else.
This manipulation produces an illusion, giving the impression that the person in the video is saying or doing things that never actually happened. The advent of deepfake technology has raised concerns about its potential for misuse in producing fabricated or misleading content, which threatens the legitimacy and accuracy of visual media.
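To give a rough idea of how such face swaps work under the hood, here is a minimal sketch, assuming the classic shared-encoder, per-identity-decoder setup often associated with early deepfake tools (the model sizes, layer choices, and variable names below are illustrative assumptions, not any specific product):

```python
# Minimal sketch (assumed architecture, for illustration only) of the
# shared-encoder / per-identity-decoder idea behind classic face-swap deepfakes:
# one encoder learns a common face representation, and swapping decoders at
# inference time renders person A's pose and expression with person B's likeness.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),   # 64x64 -> 32x32
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),  # 32x32 -> 16x16
            nn.Flatten(),
            nn.Linear(64 * 16 * 16, 256),                          # shared latent code
        )

    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(256, 64 * 16 * 16)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),    # 16 -> 32
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(),  # 32 -> 64
        )

    def forward(self, z):
        x = self.fc(z).view(-1, 64, 16, 16)
        return self.net(x)

encoder = Encoder()
decoder_a = Decoder()   # would be trained only on person A's face crops
decoder_b = Decoder()   # would be trained only on person B's face crops

# At "swap" time, a frame of person A is encoded with the shared encoder
# but decoded with person B's decoder, yielding B's likeness in A's pose.
frame_of_a = torch.rand(1, 3, 64, 64)       # dummy 64x64 RGB face crop
swapped = decoder_b(encoder(frame_of_a))
print(swapped.shape)                         # torch.Size([1, 3, 64, 64])
```

Because the encoder is shared while each decoder only ever learns one identity, the swapped output keeps the original motion and expressions but wears the other person’s face, which is why such videos can look convincing frame by frame.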