New Delhi:
Union Minister Ashwini Vaishnaw has called a meeting with representatives of social media platforms on the issue of deepfakes on Thursday, November 23, according to a source.
The move comes amid concerns over the misuse of technology and the government's firm resolve to push digital platforms to crack down on deepfakes.
Recently, several 'deepfake' videos targeting leading actors went viral, sparking outrage and raising apprehensions over the misuse of technology and tools to create fake content and narratives.
On Friday, Prime Minister Narendra Modi cautioned that deepfakes created by artificial intelligence can lead to a big crisis and stoke discontent in society, and urged the media to raise awareness about their misuse and educate people.
Vaishnaw has warned that the safe harbour immunity clause will not apply if platforms do not take adequate steps to remove deepfakes.
The government had recently issued a notice to companies on the issue, and while the platforms responded, the minister made it clear that firms would have to be more aggressive in acting on such content.
Speaking to reporters late last week, Vaishnaw had said, “They are taking steps… but we think that many more steps will have to be taken. And we are very soon going to have a meeting of all the platforms… maybe in the next 3-4 days, we’ll call them for brainstorming on that and make sure that platforms make adequate efforts for preventing it (deepfakes), and cleaning up their system.” Asked if big platforms like Meta and Google would be called for the meeting, the minister had replied in the affirmative.
“The safe harbour clause, which most social media platforms have been enjoying… that does not apply if they do not take adequate steps for removing deepfakes from their platforms,” Vaishnaw had said.
(Except for the headline, this story has not been edited by NDTV staff and is published from a syndicated feed.)