ECI's AI Warning: How Deepfakes Threaten India's Election Integrity

The Election Commission has issued a crucial advisory about AI-generated content threatening election fairness. Political parties must now ensure all synthetic media used in campaigning carries clear disclosure labels. They're also required to report fake accounts and unlawful content directly to social media platforms. This move aims to protect voter trust and maintain electoral integrity amid rising deepfake concerns.

Key Points: ECI Issues Advisory on AI-Generated Content in Elections

  • ECI warns hyper-realistic synthetic content contaminates electoral level-playing field
  • Political parties must report fake accounts and unlawful content to platforms
  • All AI-generated campaign material requires clear disclosure labels
  • Advisory aims to protect voter trust and ensure electoral transparency
  • Parties can escalate unresolved issues to Grievance Appellate Committee


Election Commission warns political parties about deepfake misuse, mandates disclosure labels for AI content and reporting of fake accounts to protect electoral fairness.


New Delhi, October 25

The Election Commission of India on Friday issued an advisory to all national and state-recognised political parties regarding the usage of Artificial Intelligence-generated and synthetic content during elections.

As per the advisory issued by the ECI, the misuse of synthetically generated or AI-altered content, like fake videos or deepfakes of political leaders, is harming the fairness and integrity of elections.

"I am directed to state that it has been brought to the notice of the Election Commission of India (ECI) that the misuse of hyper-realistic synthetically generated information, including depicting political leaders making electorally sensitive messages, is contaminating the level-playing field in the electoral arena, disrupting fair and equal conditions for all political participants, which is a sine qua non for preserving the integrity of the political campaigning during elections," the press release said.

"The use of technology for creating, generating, modifying and altering information and publishing and transmitting synthetically generated information is a deep threat and challenge because of its ability to masquerade as the truth and unwittingly trap political stakeholders into incorrect conclusions and therefore, ECI finds it particularly imperative to ensure that transparency and accountability is maintained to preserve electoral integrity and voter trust," the release added.

According to the ECI, such content can appear real, mislead voters, and distort the electoral process, posing a serious challenge to transparency and trust. The electoral body had earlier issued guidelines on the ethical use of social media on May 6, 2024, and an advisory on labelling synthetic/AI-generated content on January 16.

The advisory also states that, under Article 324 of the Constitution, all parties must strictly follow the IT Rules, 2021, ensuring due diligence and responsible content use.

Any AI-generated or altered image, audio, or video used in campaigning must have a clear and visible disclosure label. Political parties, candidates, and campaign teams are accountable for ensuring compliance, the advisory said.

The advisory aims to protect electoral integrity, ensure a level playing field, and uphold voter trust in the democratic process.

As per the advisory, the political parties are required to report any unlawful or fake content, as well as fraudulent user accounts, to the respective social media platforms. In cases where such concerns remain unresolved, they must be escalated to the Grievance Appellate Committee in accordance with Rule 3A of the IT (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021.

- ANI


Reader Comments

Rohit P: Good move by ECI but implementation will be the real challenge. Political parties themselves create these deepfakes, so expecting them to report their own content seems unrealistic. Need stronger monitoring mechanisms.

Arjun K: As a tech professional, I appreciate this advisory. Deepfakes are becoming so advanced that even educated people can't tell the difference. Mandatory disclosure labels are a good first step to protect our democracy.

Sarah B: This is crucial for maintaining election integrity. In my country we've seen how AI misinformation can sway voters. India taking proactive measures shows commitment to democratic values. Hope other nations follow suit.

Kavya N: My elderly parents keep forwarding these fake videos thinking they're real. The government should also run awareness campaigns to educate common citizens about identifying deepfakes. Jai Hind! 🇮🇳

Michael C: While I support the intent, I'm concerned about enforcement. India's digital ecosystem is massive and monitoring every piece of content will be incredibly challenging. The advisory needs teeth - strict penalties for violations.

Vikram M: Better late than never! We've already seen how deepfakes can create unnecessary controversies. Hope all political parties follow this advisory in true spirit and not just on paper. Democracy needs this protection.
