The original picture showed the Bollywood actor and a Hollywood stuntwoman, both wrapped in towels, in a staged fight sequence. A morphed version that went viral instead showed the actor in a low-cut white dress in place of the towel. The fake image was created with machine-learning tools that can alter and replace a person's appearance in pictures, a capability that often fuels the spread of false or misleading information.
A user on X (formerly Twitter) then shared the morphed picture of Katrina Kaif, claiming she had gone naked for the shoot. Other users spoke out against him, but the episode shows how far this technology has gone.
(UmairSandu/X)
When the picture went viral on social media, the reaction was strong. "Katrina Kaif's towel scene from Tiger 3 gets morphed," wrote one user, adding that it was a shame the deepfake image was drawing so much attention and that, while AI is a useful tool, using it to morph women in this way is illegal. "Deepfake is really scary!" said another.
In the case of Rashmika Mandanna, a video surfaced showing a woman dressed in black entering an elevator, her face digitally swapped to resemble the actor. Many on social media pointed out that the video was not real, which raised concerns about how quickly fabricated content can spread online. Bollywood veteran Amitabh Bachchan said the matter should be taken to court, while Rashmika Mandanna called it "very scary."
The viral spread of the deepfake video heightened concerns about how AI can be used to spread false information. In response, the government reportedly reminded social media platforms of the rules and laws already in place.