The Supreme Court of Israel is sending a clear message to the legal community: the use of artificial intelligence in legal submissions must be handled with extreme caution, or those responsible will face serious consequences. Last week, the court dismissed a petition after discovering it cited non-existent legal precedents created by AI, and ordered the petitioner to pay 7,000 shekels (approximately $1,900) in legal costs.

This incident marks the second time in a week that the court has addressed the misuse of AI in legal proceedings. Justice Noam Sohlberg, who wrote the ruling, acknowledged the potential of AI while emphasizing the need for vigilance. “This tool holds great promise, but its pitfalls must be avoided. We must embrace its essence and discard its shell. While the court has been patient, given the novelty of the issue, that patience is not unlimited,” Sohlberg stated. “From now on, legal professionals will be expected to exercise full caution, and judicial responses will adjust accordingly.”
The petition, lodged in January by an Israeli animal rights NGO against the Agriculture Ministry, challenged the ministry’s decision to extend a temporary regulation permitting the euthanasia of stray dogs. The extension followed a surge of stray dogs entering Israel from Gaza through breaches in the border fence caused by the October 7 attack. The petitioner argued that the extension was disproportionate and violated Israel’s animal welfare laws, advocating for alternative solutions. During proceedings, however, it became apparent that the petitioner had referenced judicial precedents that did not exist and could not be found in official Israeli court records.
“My own attempt to locate these so-called ‘precedents’ yielded nothing. The petitioner failed to provide an explanation or the requested documents,” Justice Sohlberg stated. The organization also asked the court to waive the legal costs, citing financial difficulties, but the request was denied. Sohlberg explained, “The petitioner offered no legal sources to support its case, leaving only one conclusion… Courts cannot accept legal filings containing false claims of any kind, including references to nonexistent legal sources. The rise of AI tools…also carries risks. Those who fail to use these tools with proper caution may unknowingly submit filings containing serious legal flaws.”

Sohlberg rejected any notion that AI-generated misinformation should be treated differently from other forms of deception. He wrote, “Petitioners are required to come before this court with clean hands and honest intent. Presenting fictitious references that could mislead the court is unacceptable and warrants outright dismissal.” While acknowledging the association’s commitment to animal welfare, he ruled that “as important as its cause may be, no party is exempt from the fundamental legal obligations required in court proceedings.” The three-judge panel, which also included Justices David Mintz and Yosef Elron, unanimously dismissed the petition and ordered the association to cover the Agriculture Ministry’s legal costs, a decision that sends a strong message of accountability.
Earlier this week, the Supreme Court addressed a similar case involving an attorney who cited fabricated AI-generated rulings in a divorce dispute. Justice Gila Canfy-Steinitz noted that while the attorney did not disclose the source of the incorrect citations, the inconsistencies strongly suggested the use of AI. Although the court reviewed the case on its merits to prevent harm to the petitioner, it dismissed the claims due to reliance on unreliable legal references. Canfy-Steinitz commented, “The AI-generated response seemed so convincing to the attorney that she did not bother verifying its accuracy.”
Attorney Ariel Dubinsky, a specialist in both intellectual property and AI law (who was not involved in either of the cases), indicated that these rulings represent a substantial shift. “Unlike the previous case, where the court refrained from imposing financial penalties, this time the Supreme Court ordered legal costs of 7,000 shekels, sending a clear warning to the legal community,” Dubinsky explained. “The message is clear: the grace period for AI-related mistakes is over. The judicial system is moving from an educational approach to full accountability. Lawyers must thoroughly verify their legal sources, as the consequences of relying on unverified AI-generated content are becoming increasingly tangible.”
