After the ABA's Formal Opinion 512, What Does NJ's Competence Duty Actually Demand From Solo Attorneys Using AI?
6 min read · April 28, 2026

NJ RPC 1.1 · ABA Formal Opinion 512 · legal AI competence

The ABA released Formal Opinion 512 in July 2024, and the legal tech community treated it like a thunderclap. Headlines called it the definitive word on lawyers and generative AI. But if you practice in New Jersey, reading Opinion 512 and calling it a day is the wrong move — and potentially a disciplinary one.

New Jersey operates under its own Rules of Professional Conduct, with its own interpretive history, its own ethics opinions, and an Advisory Committee on Professional Ethics (ACPE) that does not simply rubber-stamp ABA guidance. RPC 1.1 in New Jersey requires competence: the legal knowledge, skill, thoroughness, and preparation reasonably necessary for the representation. The ABA's opinion helpfully unpacks how that duty applies to AI in general. But how it applies in your practice is still a question only you can answer, concretely rather than theoretically.

What ABA Opinion 512 Actually Says (And Doesn't)

Opinion 512 establishes five core duties when a lawyer uses generative AI tools:

  1. Competence — understand the tool well enough to use it responsibly
  2. Confidentiality — know where client data goes when you hit enter
  3. Communication — consider whether and when to tell clients
  4. Supervision — review AI output as you would a first-year's work
  5. Candor — never submit AI-generated content you haven't verified

Nothing in that list is surprising. What's significant is that the ABA explicitly rejected a bright-line rule requiring disclosure in every matter. That's a national default. New Jersey may land differently — the ACPE has historically taken a more granular view of informed consent obligations, and Opinion 742 (covering cloud computing) suggests NJ expects lawyers to investigate, not just assume.

The NJ RPC 1.1 Competence Gap Most Solos Are Missing

Here is the specific gap I see most often when working with NJ solo attorneys: they adopt an AI tool for a category of work — say, drafting demand letters or summarizing deposition transcripts — but they never develop what I'd call a task-specific verification discipline.

ABA 512 says you must understand the technology's benefits and risks. But understanding isn't a one-time event. Generative AI models are updated, retrained, and changed by vendors, often without notice. The tool that cited New Jersey statutes accurately in January may behave differently by September. A competent NJ attorney, under RPC 1.1's thoroughness prong, has to build in periodic re-evaluation, not just trust that the tool "still works fine."

Practically, this means three things:

1. Maintain a short AI task log. For each recurring workflow where you use AI (drafting, research, summarization, intake), document what tool you're using, what prompt structure you rely on, and when you last verified its accuracy against a control sample. This takes ten minutes to set up in a spreadsheet and becomes your evidence of competence if a grievance ever arises.

2. Test against known NJ-specific outputs. NJ civil practice has enough jurisdiction-specific nuances — service rules, the Ferreira conference requirement in malpractice matters, the MTCA notice requirements — that a generic legal AI tool can and does produce outputs that are technically plausible but procedurally wrong for New Jersey. Build a short checklist of five to ten NJ-specific accuracy checkpoints for each practice area you use AI in. Run a spot-check quarterly.

3. Read your vendor's model update notices. Most attorneys agree to AI vendor terms and never look again. Set a calendar reminder every 90 days to check the vendor's changelog or release notes. If the vendor has updated the underlying model or changed how citations are sourced, that's a material change to your workflow tool — and RPC 1.1 asks whether you've adapted accordingly.
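If a spreadsheet feels too loose, the three steps above can be sketched as a tiny script: append one row per AI workflow, then ask which workflows are overdue for a spot-check. This is a minimal illustration only — the file name, column headings, and 90-day interval are my own assumptions, not anything Opinion 512 or the RPCs prescribe.

```python
import csv
from datetime import date, timedelta
from pathlib import Path

# Hypothetical log file and columns -- adapt to your own practice areas.
LOG_PATH = Path("ai_task_log.csv")
FIELDS = ["workflow", "tool", "prompt_structure", "last_verified", "notes"]
REVIEW_INTERVAL = timedelta(days=90)  # the quarterly re-check cadence suggested above

def log_entry(workflow, tool, prompt_structure, notes=""):
    """Append one row to the AI task log, stamping today as the verification date."""
    new_file = not LOG_PATH.exists()
    with LOG_PATH.open("a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if new_file:
            writer.writeheader()
        writer.writerow({
            "workflow": workflow,
            "tool": tool,
            "prompt_structure": prompt_structure,
            "last_verified": date.today().isoformat(),
            "notes": notes,
        })

def overdue_checks(today=None):
    """Return the workflows whose last spot-check is more than 90 days old."""
    today = today or date.today()
    overdue = []
    with LOG_PATH.open(newline="") as f:
        for row in csv.DictReader(f):
            last = date.fromisoformat(row["last_verified"])
            if today - last > REVIEW_INTERVAL:
                overdue.append(row["workflow"])
    return overdue
```

Run `overdue_checks()` at the start of each quarter; anything it returns goes back through your NJ-specific checkpoint list before you rely on that workflow again. The point isn't the tooling — it's that the log exists, is dated, and can be produced if a grievance ever asks what your verification discipline was.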

The Supervision Angle Is NJ-Specific Too

RPC 5.1 in New Jersey places supervisory responsibility on partners and managing attorneys. If you're a solo with a part-time paralegal or virtual assistant who is now using AI tools as part of their work — a reality in 2025 — the competence obligation doesn't stop at your own use. You are responsible for implementing reasonable measures to ensure their AI-assisted outputs comply with the RPCs.

That's not just an RPC 5.3 issue (non-lawyer supervision — a topic for another day). It connects directly back to 1.1: you can't demonstrate competence in the final work product if you haven't established a review protocol for work that flowed through an AI tool you don't fully understand yourself.

The Practical Baseline for NJ Solos

If you use generative AI in any part of your practice today, here is a defensible minimum posture under NJ RPC 1.1 in a post-Opinion 512 world:

  • Document your tools — name, version, use case, and date adopted
  • Test NJ-specific outputs — don't rely on national accuracy benchmarks
  • Build a review protocol — treat AI output as a first draft, not a final product
  • Schedule re-evaluation — mark your calendar for quarterly verification
  • Train anyone who touches the output — including staff

Opinion 512 gave us the vocabulary. NJ RPC 1.1 sets the floor. The space between those two things is where your actual practice lives — and where the ACPE will look first if something goes wrong.
