Civil litigation seems to be entering a new phase: there is a quiet but growing trend of litigants in person using artificial intelligence (AI) tools to draft their own claims. This is creating an increasingly unpredictable, time-consuming and costly problem for defendants.
AI has made it easier than ever to file a claim, but the legal system is still built on expertise, not algorithms. Procedural fairness should cut both ways. However, it feels as though a significant burden is placed on represented defendants when litigants in person are heavily assisted by AI.
Self-represented opponents have always consumed more time. Every letter must be precise and courteous; hearings require greater court intervention; and there is often a difficult balance between being helpful and blurring the lines of representation. In practice, that can feel like a subtle tilt against represented parties, particularly where the court must consider correcting procedural errors or granting leniency. AI-generated pleadings may require additional consideration (and therefore time) from an already stretched judiciary.
Of course, self-representation is nothing new. But there is a noticeable evolution in the pleadings from litigants in person. What seems different is how those litigants are preparing their cases, and the availability of AI tools appears to be encouraging them to go it alone.
Free online tools and document generators promise to ‘write your claim’ in minutes. The result is an influx of AI-generated pleadings. These are documents that look superficially professional but can be inaccurate, incoherent and procedurally flawed. Litigants in person may misinterpret procedural steps or take unreasonable positions based on misunderstood online advice.
From a claimant’s perspective, the appeal is obvious:
- It is free or low-cost.
- It feels empowering: no need for a solicitor, no waiting for advice.
- It offers instant results: ready-made particulars of claim or a witness statement, complete with citations and legal terminology.
- It gives at least the impression of accessibility, making claims easier and more likely to be pursued.
Arguably, this represents access to litigation that was previously unaffordable to many. But the apparent capability can be illusory. AI tools lack legal understanding, knowledge of evidential thresholds and awareness of procedural rules. They can generate confident-sounding but legally meaningless text.
AI-drafted pleadings can misapply legal tests or misidentify causes of action. For defendants, this can require extensive requests for clarification or strike-out applications, increasing costs and causing delay. Even when the defence succeeds, recovering costs from a litigant in person is difficult and rare, and enforcement can be time-consuming and often futile.
From a defendant's perspective, AI-generated claims are not quick or simple to deal with. They can be harder and more expensive for clients to defend than claims handled by qualified solicitors. There is no professional opponent to liaise with; correspondence can be erratic, extremely fast-paced and sometimes hostile.
It’s not too difficult to recognise certain traits of AI-generated documents, such as:
- Overly formal, repetitive phrasing.
- The use of American legal terminology (‘plaintiff’ rather than ‘claimant’).
- Inconsistencies in dates or facts that change between paragraphs.
- Pleadings that mimic textbook examples but ignore case-specific details.
- Excessive use of jargon or unexplained legal theory.
Managing the AI-assisted litigant in person: practical steps for defendants
- Identify AI-generated pleadings at the outset. If they are incoherent or defective, seek clarification or further information, or consider applying to strike out where appropriate.
- Insist on compliance with the Civil Procedure Rules and resist pressure to fill in the gaps in a claimant’s case.
- Maintain professional, plain-English correspondence. AI-assisted litigants may misunderstand procedural language.
- Explain to clients at the outset that these claims can take more time, not less, to manage their expectations on costs and claim life cycles.
The cost of free legal drafting
While AI may appear to facilitate access to litigation for litigants in person, in practice it risks misleading claimants into believing that legal expertise is replaceable. The result is futile or exaggerated claims based on flawed AI reasoning, while defendants incur unnecessary legal spend responding to defective pleadings.
Courts are already struggling to keep up with the demand for justice. If AI-generated claims continue to increase, court lists could become increasingly clogged with poorly prepared cases requiring extra court time for procedural correction.
While AI can be helpful when used correctly, it is not a substitute for legal training. The rise of the AI-assisted litigant in person is becoming a significant challenge for defendant practitioners; it demands vigilance, procedural accuracy and strategic communication.