Your AI Conflicts Check Said "Clear." Your Bar Complaint Said Otherwise.
7 min read · April 12, 2026


Conflicts of Interest · NJ Ethics · AI Workflow

The promise is seductive: a new client walks in, you run their name through your AI-powered practice management system, a green checkmark appears in under three seconds, and you open the engagement. Clean, fast, defensible.

Except it isn't always defensible. And in New Jersey, where RPC 1.7, 1.9, and 1.10 create a layered web of current and former client conflicts obligations, a false negative from an automated conflicts tool isn't just an operational hiccup — it's the kind of mistake that generates grievances, fee disputes, and forced withdrawals at the worst possible moment.

This post is about where the algorithm actually breaks, why solo attorneys are especially exposed, and what a smarter hybrid process looks like in practice.


How "AI Conflicts Checking" Actually Works (And Why That Matters)

Most AI-assisted conflicts tools inside modern practice management platforms — Clio, MyCase, and their competitors — rely on one of two mechanisms: keyword/fuzzy string matching or, in more sophisticated implementations, semantic embeddings.

Fuzzy matching catches obvious variants. It will flag "Jon Smith" when you search "John Smith." But it operates on surface-level text similarity. Embeddings go deeper — they convert names and matter descriptions into numerical vectors and identify conceptual proximity. On paper, this sounds more powerful. In practice, both approaches share a fundamental structural limitation:
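To make the gap concrete, here is a minimal sketch of the fuzzy-matching half using Python's standard-library `difflib`. Commercial platforms use proprietary matchers and thresholds, so treat this as illustrative only:

```python
from difflib import SequenceMatcher

def fuzzy_score(a: str, b: str) -> float:
    """Surface-level text similarity between two names (0.0 to 1.0).

    This is the kind of comparison a keyword/fuzzy conflicts
    screen performs -- it knows nothing about the people or
    entities behind the strings.
    """
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

# A near-identical misspelling scores high and gets flagged...
print(fuzzy_score("John Smith", "Jon Smith"))

# ...but a rebranded entity shares little surface text and
# slips under any plausible alert threshold.
print(fuzzy_score("Tech Solutions LLC", "TechSol Inc."))
```

The misspelled pair scores well above 0.9; the rebranded pair falls well below a typical cutoff. The algorithm is doing exactly what it was built to do. The problem is what it was built to do.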

They can only find what was entered into your system in the first place.

This seems obvious until you trace the downstream consequences.


Failure Mode #1: Name Variants and Cultural Name Structures

Consider a former client you represented in a landlord-tenant dispute three years ago. She retained you under her married name: "Maria Chen-Vasquez." She has since divorced and reverted to "Maria Vasquez." She now appears as an adverse party in a new client's breach of contract matter — under her maiden name only.

Your AI screen runs "Maria Vasquez." Your file is indexed under "Maria Chen-Vasquez." No match. Green light.

This isn't a hypothetical edge case. It's a predictable failure mode rooted in how hyphenated names, transliterated names (common across South Asian, East Asian, Eastern European, and Arabic naming conventions), and legally changed names are stored and searched. A solo practitioner in a diverse New Jersey market — Essex County, Hudson County, Union County — faces this constantly.
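The fix is procedural, not algorithmic: expand every intake name into its plausible variants before you search. Here is a rough sketch — the `name_variants` helper and its splitting rules are hypothetical, and alias lists (maiden names, legal name changes, transliterations) still have to come from a human:

```python
def name_variants(full_name, known_aliases=()):
    """Expand one name into the variants worth screening separately.

    Hypothetical helper: handles only the hyphenated-surname case
    mechanically; everything else (DBAs, prior legal names) must be
    supplied by the person doing intake via known_aliases.
    """
    variants = {full_name}
    parts = full_name.split()
    first, last = parts[0], parts[-1]
    if "-" in last:
        # Hyphenated surname: screen each half on its own,
        # plus the unhyphenated form.
        for piece in last.split("-"):
            variants.add(f"{first} {piece}")
        variants.add(f"{first} {last.replace('-', ' ')}")
    variants.update(known_aliases)
    return variants

print(sorted(name_variants("Maria Chen-Vasquez")))
```

Running each of those strings through the conflicts module is the thirty seconds of lateral thinking the algorithm will never do on its own.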

The same fragility applies to business entities. "Tech Solutions LLC" filed as the opposing party in a 2022 matter may now be operating as "TechSol Inc." after a rebranding. An embedding model trained on legal text has no reliable way to associate those two strings unless someone has recorded the rebranding in your matter data.


Failure Mode #2: Corporate Parent and Subsidiary Relationships

This is where solo attorneys get caught most badly.

You represented "Greenway Logistics NJ LLC" in a contract dispute two years ago. A new client now asks you to sue "Meridian Supply Chain Partners" — which, as it turns out, is a wholly owned subsidiary of Greenway's parent holding company.

Your AI tool has no way to know that relationship exists unless you manually entered a corporate family tree into your matter records. It almost certainly didn't. No conflicts management tool on the market autonomously ingests and maps live corporate ownership structures from Secretary of State filings or EDGAR without significant custom configuration.
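If you want your system to catch this at all, the corporate family has to live in your data as a human-maintained map. Here is a sketch — the entity names, including the assumed parent "Greenway Holdings Corp.", are invented for illustration:

```python
# Hypothetical, manually maintained corporate family map.
# No conflicts tool builds this for you; a human must.
CORPORATE_FAMILIES = {
    "Greenway Holdings Corp.": [        # assumed parent name
        "Greenway Logistics NJ LLC",
        "Meridian Supply Chain Partners",
    ],
}

def corporate_family(entity):
    """Return the entity plus every known parent and sibling,
    so all of them get run through the conflicts screen."""
    for parent, subsidiaries in CORPORATE_FAMILIES.items():
        if entity == parent or entity in subsidiaries:
            return {parent, *subsidiaries}
    return {entity}

# Screening the new adverse party now surfaces the former client.
print(sorted(corporate_family("Meridian Supply Chain Partners")))
```

The map itself is the work product. Ten minutes with the NJ business registry per entity client, updated at each engagement, is what makes this check possible.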

Under RPC 1.7 and the concept of organizational client loyalty — reinforced by the Restatement (Third) of the Law Governing Lawyers — representing a party adverse to a client's corporate affiliate can constitute a conflict. The AI doesn't know the org chart. You need a human who does.


Failure Mode #3: Matter-Level Nuance That Resists Categorization

Conflicts aren't just about names. They're about interests. An AI system that flags "Smith v. Jones" as a potential conflict because "Jones" appears in two files is doing pattern matching, not conflict analysis. The real question — whether your duties to a former client create a material limitation on your representation of a new client under RPC 1.9 — requires legal judgment that no current commercial tool provides.

Two files involving the same individual in unrelated matters may not constitute a conflict. Two files involving different individuals in the same underlying transaction very well might. Embedding-based systems are structurally blind to this distinction unless matter descriptions are written with extraordinary precision and consistency — which, in a busy solo practice, they almost never are.


A Hybrid Workflow That Actually Holds Up

Here's what a defensible conflicts process looks like for NJ solo and small firm attorneys using AI tools:

Step 1 — AI Pre-Screen (automated, every intake)
Run all new parties — client, adverse party, affiliated entities, key individuals — through your practice management system's conflicts module. This catches the obvious hits quickly and creates a timestamped audit trail that you ran something.

Step 2 — Name Variant Expansion (human-assisted)
Before accepting the AI result, manually run at least three to five name variants: common misspellings, hyphenated/unhyphenated versions, maiden vs. married names, and any known DBAs. Thirty seconds of lateral thinking here closes the single largest gap in automated screening.

Step 3 — Entity Relationship Check (human-directed)
For any business client or adverse business party, spend two minutes on the NJ Division of Revenue and Enterprise Services business search and, for larger entities, a quick OpenCorporates or EDGAR lookup. Map one level up (parent) and one level down (subsidiaries). Add those entity names to your conflicts search.

Step 4 — Substantive Interest Analysis (attorney judgment)
Review any flagged matches — or any matter involving a former client in a related practice area — and apply the RPC 1.9 "substantially related matter" standard yourself. Document your reasoning in a conflicts memo, even a two-paragraph one. This memo is your protection if a grievance is ever filed.

Step 5 — Log the Clearance
Record the date, parties screened, variants checked, result, and your sign-off in your matter file. "The AI ran a check" is not documentation. "Attorney reviewed conflicts on [date], no disqualifying relationship identified for the following reasons" is documentation.
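The five steps above reduce to a simple record schema. Here is a sketch of what one clearance log entry might capture — the field names are illustrative, not a prescribed format:

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class ConflictsClearance:
    """One record per intake: the documentation that an attorney
    reviewed conflicts, not merely that the software ran."""
    matter: str
    parties_screened: list   # everyone run through the module
    variants_checked: list   # Step 2 name variants actually searched
    result: str              # e.g. "no disqualifying relationship"
    reasoning: str           # the short conflicts memo, or a pointer to it
    reviewed_by: str
    reviewed_on: date = field(default_factory=date.today)

entry = ConflictsClearance(
    matter="New intake — breach of contract",
    parties_screened=["Maria Vasquez"],
    variants_checked=["Maria Chen-Vasquez", "Maria Chen Vasquez"],
    result="potential former-client conflict flagged",
    reasoning="Former client under married name; RPC 1.9 analysis required.",
    reviewed_by="A. Attorney",
)
print(entry.reviewed_on.isoformat(), "-", entry.result)
```

Whether you keep this as a custom field set in your practice management system or a memo template in the matter file matters far less than that every field gets filled in, every time.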


The Bottom Line for NJ Solos

AI conflicts tools are a legitimate efficiency gain. They are not a compliance program. The distinction matters enormously in a state where the Office of Attorney Ethics investigates conflicts complaints with genuine rigor.

The attorney who says "my software cleared it" has not demonstrated competence under RPC 1.1. She has demonstrated that she delegated a legal judgment to a pattern-matching algorithm and called it due diligence.

Build the hybrid. Run the AI, then run your brain. The three minutes of human review is the difference between a green checkmark and a green engagement letter you can actually stand behind.


Adam Elias is the founder of Elias Advisory LLC, where he helps solo attorneys and small law firms in New Jersey adopt AI tools responsibly and build practice systems that hold up under scrutiny. Questions about your conflicts workflow? Get in touch.
