Binance founder Changpeng Zhao, popularly known online as CZ, floated a striking idea on X this weekend: train an AI on written laws and past cases, then use it to generate judgement recommendations for public cases. He called it an “AI Judge Companion”.
The idea is simple and bold. It promises speed, consistency and a way to tame human error. But poster-board simplicity does not translate into easy courtroom practice. Nigeria’s justice system has real problems. So does legal AI. The question is whether the two can be married without breaking what matters most: fairness and the rule of law.
Nigeria’s courts are hungry for solutions. There are long backlogs, chronic delays and widespread complaints about access to timely justice. Courts move slowly. Files gather dust. Citizens wait years for outcomes.
These are not abstract problems for Nigerians; they translate into lost livelihoods, stalled businesses and weakened trust in institutions. The National Judicial Institute and recent studies alike point to delay and case backlog as structural constraints that technology could help address, if applied thoughtfully.

At first glance, AI feels tailor-made for the job. An AI trained on statutes, precedent and court transcripts can surface consistent reasoning. It can flag past cases, summarise evidence, and suggest legal pathways in seconds.
For routine, high-volume matters such as small claims, traffic disputes and administrative reviews, a well-designed assistant could cut weeks, even months, from the calendar. That efficiency is the core sales pitch behind CZ’s idea. It is also why Estonia and other jurisdictions have piloted robot-judge projects for small claims and administrative tasks. Those pilots show promise for narrow uses.
But the road from prototype to courtroom is full of potholes. First, data. AI needs clean, comprehensive and representative datasets. Nigeria’s court records are unevenly digitised. Many judgements and hearings are not machine-readable. Where records exist, they may reflect years of human bias, local practices and gaps that an AI will simply absorb and magnify. Training on flawed archives risks producing a system that replicates historical injustice at scale.
Second, bias and explainability. Legal decisions are not just pattern-matching exercises. They are moral and constitutional acts. An opaque “black box” that supplies a recommendation without an auditable rationale threatens defendants’ rights and the public’s trust.
Scholars and judicial commentators caution that algorithmic tools must be transparent, validated and subject to oversight before they touch core adjudicative functions. The wrong governance model can compromise judicial independence and human rights.
Third, institutional capacity. Nigerian courts would need reliable digital infrastructure, secure data storage, and staff trained to use and critique AI outputs.


The OECD and other international bodies warn that AI in justice must be accompanied by robust checks and balances: safeguards that prevent misuse of the tools, protect privacy, and ensure human accountability for final decisions. Without those guardrails, the risk is substituting efficiency for legitimacy.
Fourth, politics and public perception. In a system where citizens already doubt impartiality, introducing an AI companion could be read as outsourcing judgement to tech elites. Who builds the models? Who funds them?
CZ’s open offer to support funding changes the political calculus, but it also raises questions about influence, vendor lock-in and the geopolitics of legal tech. Any appearance of private or opaque control will make adoption harder.
So what would responsible adoption look like?
Start small. Use AI where the stakes are lower and outcomes are well-defined. Build transparent datasets and publish model logic. Train judges and clerks, and create an independent audit mechanism.
Make the AI’s role advisory, not determinative. The judge remains the decision-maker, accountable in law. Pair technical pilots with legal reform: digitise court records, standardise data, and pass rules that require disclosure of algorithmic reasoning when used in a case.
There is also a social lens. Justice in Nigeria is not only a technical problem. It is a political and cultural one. Corruption, uneven legal representation, and localised power dynamics shape outcomes as much as delays do.


An AI that only speeds up an unfair process will not deliver justice; it will simply make unfairness faster. Any meaningful solution must couple automation with investments in legal aid, public legal education and stronger judicial ethics.
In short, CZ’s “AI Judge Companion” is a provocative proposition with genuine potential for efficiency gains. But it is far from a plug-and-play cure for Nigeria’s justice woes.
The right path is incremental, auditable and rooted in local realities. The line between assistance and abdication must remain clear. If Nigeria, and indeed any other jurisdiction, is to try this, policymakers should insist on transparency, open data, human oversight and legal safeguards before a single recommendation makes its way into a final judgement.
For a country that needs faster, fairer courts, technology is part of the answer. But it cannot be the whole answer.
The hope is not a machine that decides for us, but one that helps judges decide better, while people and institutions keep the final responsibility. That balance will determine whether an “AI Judge Companion” is a breakthrough for Nigerian justice or just a high-tech detour.