New Delhi: Fake news has been described as a “serious threat” to public order and democracy by a Parliamentary Standing Committee, which has called for tougher penalties, mandatory fact-checking, and greater accountability across media platforms.
In its draft report adopted on Tuesday, the Committee on Communications and Information Technology, chaired by BJP MP Nishikant Dubey, unanimously recommended a raft of measures to curb the spread of misinformation. The report is expected to be tabled in Parliament during the next session.
The committee urged the Ministry of Information and Broadcasting to make it mandatory for all print, digital, and electronic media outlets to have both a fact-checking unit and an internal ombudsman. It further suggested that penal provisions be amended to impose higher fines and assign responsibility at multiple levels: editors for editorial oversight, publishers for institutional lapses, and digital platforms for circulating false content.
Calling for inter-ministerial and international cooperation, the panel also proposed a dedicated task force to counter cross-border misinformation. It cited examples such as France’s law on election misinformation as models worth considering.
Artificial intelligence emerged as a key focus of the report. The committee warned about AI-generated fake content, particularly content targeting women and children, and recommended licensing requirements for AI content creators, mandatory labeling of AI-generated media, and the use of AI tools with human oversight to track offenders.
In a post on X, Nishikant Dubey stressed that misinformation poses a danger comparable to instability in neighboring countries, vowing that anti-national propaganda would be curbed.
The panel also highlighted the role of algorithms in amplifying sensational or fake stories on social media platforms and urged the government to consider legal mechanisms to address this bias. It further recommended a media literacy curriculum in schools and training for teachers, alongside public awareness campaigns to encourage critical thinking.
Stakeholders raised concerns over the “safe harbour” clause under Section 79 of the IT Act, which exempts platforms from liability for third-party content. The report underlined that this loophole, coupled with revenue models that reward virality, has intensified the spread of misinformation.
The draft concluded that unchecked fake news not only distorts democratic debate but also risks public safety, individual reputations, financial markets, and the credibility of the media.