Law Reform: New Supreme Court Order Could Change How Legal Professionals Use AI
The courtroom is no place for AI experiments. The Arkansas Supreme Court issues a serious warning.
The Arkansas Supreme Court just released a draft order that could redefine how AI is used in the justice system, prompted by growing concern over how private court data is being shared with AI tools like ChatGPT. This newsletter breaks down what the new order says, who it affects, and why it matters to anyone who cares about digital privacy, legal ethics, or the future of justice.
🏛️ Court: Supreme Court of Arkansas
🗓️ Date: 5 June 2025
🗂️ Number: 2025 Ark. 117
The Arkansas Supreme Court Is Taking AI Seriously
The Supreme Court of Arkansas has decided it is time to get very specific about how artificial intelligence interacts with the court system.
This development reflects a deliberate and growing awareness that AI is being used more frequently in both professional legal settings and everyday life, often in ways that go unnoticed or are misunderstood.

The Court recognized that people working in and around the judicial process are already relying on AI tools, whether for convenience, speed, or research assistance, and that some of that use carries hidden risks.
The Court’s opinion is clear that while technology itself is not inherently dangerous, the way confidential court data may be processed by generative AI tools poses a serious problem.
When someone copies and pastes sensitive information about a case into an AI system, that data does not always disappear after use. In fact, many tools are designed to retain user input in order to improve their own performance.
What that means in practice is that sensitive case details, protected under multiple state laws and court rules, could easily end up in databases controlled by private companies with no legal obligation to maintain the same level of confidentiality as a court clerk.
Arkansas is not waiting around to find out what might happen next. Instead, the Court has issued a proposed new administrative order that is already attracting public comment.
This proposal sets out a clear warning: be cautious, be informed, and do not upload confidential or sealed court data into artificial intelligence platforms unless you fully understand the consequences.
The warning applies across the board. It speaks not only to attorneys and judges, but also to clerks, assistants, and any other personnel who might be tempted to ask an AI platform for help drafting a memo or analyzing a court order.
The proposed administrative order prohibits court staff from submitting any internal court data to AI systems under any circumstances.
That includes court clerks and administrative staff who have direct access to protected data.
The prohibition is not optional.
It is designed to draw a hard boundary where legal ethics, privacy law, and modern technology intersect.
What makes this development especially interesting is the timing.
The Arkansas Supreme Court has not declared a final position but is inviting the public to share written comments through August 1. This creates a short but significant window for feedback.
The comment period shall end on 1 August 2025. Comments should be submitted in writing to: Kyle E. Burton, Clerk of the Arkansas Supreme Court, Attention: The Creation of Administrative Order No. 25, Justice Building, 625 Marshall Street, Little Rock, AR 72201, or by email: rulescomments@arcourts.gov. The court will consider any submitted comments and make subsequent changes if necessary.
The administrative order also makes clear that the Court is not treating this as a temporary experiment, but rather, it is preparing to treat AI use as a long-term feature of legal practice that must be managed with care, attention, and formal oversight.
Suffice it to say, the Court has identified a very specific risk: the misuse of confidential judicial data inside systems that were never built with courtroom confidentiality in mind.
The new order is an attempt to address that risk.
Confidential Court Data and AI Tools
The Arkansas Supreme Court has made it crystal clear that one of the most pressing issues surrounding AI is the way it handles sensitive data, and in particular, the kind of data courts are responsible for safeguarding.
There is no confusion about the risk.
When people working in or around the justice system feed confidential court material into tools that produce automated responses, they are often passing that information into systems that learn from it, store it, and possibly reuse it.
This is where the real problem begins.
Most generative AI platforms are trained on massive amounts of data, and once something is entered, it can become part of that system's memory bank.
That might sound efficient from a tech perspective, but for the legal system, it is a direct invitation to disaster.
If someone pastes a sealed court document or privileged client communication into one of these platforms, there is a good chance that data will not remain private.
The tool might save it, analyze it, and build on it for future use, possibly even making it retrievable under the right conditions (prompts).
It is like whispering a secret into a live microphone and then wondering why someone else heard it.
The Court has identified real legal consequences that could follow such disclosures. If someone enters private court data into an AI system without fully understanding how that data is stored and reused, they could be in violation of long-standing rules that are meant to protect the legal process.
That includes state administrative orders, multiple statutes in the Arkansas Code, professional conduct standards for lawyers, judicial ethics rules, and even internal rules for court reporters and clerks.
The list of applicable rules is long, and the consequences for violating them are serious.
The new administrative order singles this issue out because the potential for harm is wide-ranging and very real.
The Court wants everyone interacting with the justice system to stop and think before using AI to handle something sensitive.
Whether a person is an attorney under pressure, a court clerk trying to finish paperwork, or a staff member working late, the temptation to submit something into an AI prompt now comes with a clear administrative warning.
You may think you are just asking for help from AI, but what you might be doing is giving away protected court data to a system that does not know how to forget.
Do you have thoughts on how justice systems should interact with AI? Hit the comment button and let us know what you think about Arkansas's proposed order.