Courts Crack Down on AI Misuse in Legal Work, Demand Human Accountability

Indian courts are raising alarms over the unchecked use of artificial intelligence in legal practice, noting instances of incorrect or fabricated citations in petitions. The Supreme Court and several High Courts have begun regulating its use, cautioning against employing AI for writing judgments or unrestricted legal research. Legal experts emphasize that while AI can boost efficiency for drafting and research, human lawyers must rigorously verify all output and retain ultimate accountability. The consensus is that AI must remain a supervised tool to preserve the integrity of the justice system.

Key Points

  • Courts flag AI-generated errors in petitions
  • Judiciary restricts AI in judgments and research
  • Experts stress verification and human accountability
  • AI seen as efficiency tool, not ethical replacement

"AI is a tool, not a substitute for professional responsibility. - Sanjeev K Kapoor"

New Delhi, April 15

The increasing use of artificial intelligence in legal documentation is triggering a wider debate within India's legal ecosystem, with courts and experts raising concerns over ethics, accuracy and professional accountability.

AI tools are being rapidly adopted by legal practitioners for drafting petitions, summarising case laws and conducting research due to their ability to process large volumes of information quickly. However, recent judicial observations indicate that unchecked use may undermine the integrity of legal proceedings.

The Supreme Court has taken note of the growing reliance on AI by lawyers and flagged instances where petitions contained incorrect or fabricated citations generated by such tools. Courts have termed such lapses misconduct, stressing that accountability ultimately rests with the advocate.

Several High Courts have also moved to regulate AI usage. The Punjab and Haryana High Court has cautioned judicial officers against using AI for writing judgments or conducting legal research. The Gujarat High Court has restricted its use in judicial decision-making, allowing it only for limited administrative purposes.

In another development, the Haryana Real Estate Regulatory Authority (HRERA) relied on an AI-generated overview of local property prices while directing a developer to pay enhanced compensation to a homebuyer.

Legal experts have emphasised the need for a balanced approach. Sanjeev K Kapoor, Senior Partner at Khaitan & Co, said, "The concern is not with AI per se, but with its indiscriminate and unverified use. There is nothing inherently unethical about using AI to generate a first draft of a petition or to assist with research, provided the lawyer exercises independent judgment and rigorously verifies the output before relying on it. Ultimately, the duty remains unchanged: lawyers must apply their minds and must verify every word and citation submitted to the court. AI is a tool, not a substitute for professional responsibility."

CV Raghu, President and Founding Member of the General Counsel's Association of India, said, "AI is a powerful ally for efficiency, but it lacks the ethical compass and critical judgment essential to the practice of law. While we embrace innovation, the ultimate accountability for every citation and argument must remain firmly with the human advocate to ensure the integrity of our justice system."

Industry voices echoed similar concerns. Jagdish Mitra, Founder and CEO of Humanizetech.ai, said, "Use of AI needs to move beyond being passive assistants to autonomous agents that drive real-world outcomes. In critical fields like legal, this is not just about automation, but augmented accountability, where the speed and precision of AI are guided by context, judgment and human acumen in doing the right and fair thing."

As adoption grows, experts stressed that AI must remain a supervised tool, with strong human oversight to ensure that technological convenience does not come at the cost of justice and due process.

- ANI

Reader Comments

Rajesh Q
Finally! Courts are waking up. I've heard of cases where lazy lawyers just copy-paste from ChatGPT and it has wrong laws. They charge lakhs in fees but can't do basic verification? Shameful. The human lawyer must be held responsible, no excuses. 👏
Aditya G
Balance is key. AI can help manage the massive backlog in our courts by speeding up research and admin work. But for judgments? Never. The wisdom, context, and mercy required in a verdict can only come from a human judge. Good move by the High Courts to restrict its use in decision-making.
Sarah B
Interesting to see HRERA using it for property prices. If the data is accurate and verified, it could bring more transparency. But the risk of "garbage in, garbage out" is huge. We need clear guidelines and maybe even certified AI tools for specific legal use cases in India.
Karthik V
I respectfully disagree with the blanket caution. For small-town lawyers or those handling many pro-bono cases, AI is a godsend. It levels the playing field against big law firms with huge libraries. The focus should be on training lawyers to use it *responsibly*, not scaring them away from it.
Meera T
The Bar Council of India should step in and create a mandatory module on ethical AI use for all practicing lawyers. Technology is here to stay. "Augmented accountability" as the expert said is the perfect term. Use the tool, but your signature on the document means you stand by every word.
