France Bans Judges’ Decision Analytics, 5 Years In Prison For Rule Breakers

ER Editor: See also this article by Jurist.org titled New France law bans use of analytics to determine judge behavior.
The story below was published on June 4th. Follow-ups have been published this week on the same site, titled France’s Controversial Judge Data Ban – The Reaction and The Judge Statistical Data Ban – My Story – Michaël Benesty. Paris lawyer and machine learning (ML) expert Michaël Benesty, the developer at the centre of this story, originally built a non-commercial search tool for judges’ rulings. He reveals that judges want to keep their decision-making anonymous, in part because analysis of it could show how far it deviates from Civil Law norms (note that France operates under the Civil Law system, whereas Anglo-Saxon countries such as the US and UK use the Common Law system). The argument is about maintaining judicial independence but, as Benesty notes, it also renders judges unaccountable to the lawyers and citizens the law is supposed to serve. Of note:

The basic issue was that some judges had a very high asylum rejection ratio (close to 100%, with hundreds of cases per year), while others from the same court had a very low ratio – and in France, cases are randomly distributed among judges within the same court (there is no judge specialised in Moroccan asylum cases and another in Chinese ones, for instance).

Basically, we believed there was no reasonable explanation for such discrepancies, which were stable year after year.

The tool was transparent: it gave measures of bias for each judge, plus the related legal case texts to back the numbers, so any measure could be manually checked.

We got plenty of e-mails from judges from courts all around France. Basically, two thirds were angry that we published names and claimed there was an error somewhere (though they were unable to tell us where; I always answered that the tool was designed to show the cases behind the numbers so anyone could check them manually, and that if they found a bug they could report it).

Those same judges reminded me of the risk to their independence of reasoning if we kept the website running. The other third recognised there was an issue, but were not OK with publishing the names.
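
ER Editor: for readers curious about the mechanics, here is a minimal sketch – ours, not Benesty’s actual code – of the kind of statistic described above: a per-judge rejection ratio computed from published decisions, with the case identifiers kept alongside so that every figure can be traced back to the underlying rulings. All field names here are illustrative assumptions, not the real tool’s schema.

    from collections import defaultdict

    def rejection_ratios(cases):
        # cases: records such as
        # {"judge": "Judge A", "case_id": "2017-001", "outcome": "rejected"}
        # (hypothetical fields for illustration only).
        by_judge = defaultdict(list)
        for case in cases:
            by_judge[case["judge"]].append(case)
        stats = {}
        for judge, judged in by_judge.items():
            rejected = sum(1 for c in judged if c["outcome"] == "rejected")
            stats[judge] = {
                "ratio": rejected / len(judged),  # share of claims rejected
                "n_cases": len(judged),
                "case_ids": [c["case_id"] for c in judged],  # for manual checking
            }
        return stats

    decisions = [
        {"judge": "Judge A", "case_id": "2017-001", "outcome": "rejected"},
        {"judge": "Judge A", "case_id": "2017-002", "outcome": "granted"},
        {"judge": "Judge B", "case_id": "2017-003", "outcome": "rejected"},
    ]
    for judge, s in rejection_ratios(decisions).items():
        print(judge, f"{s['ratio']:.0%} of {s['n_cases']} cases rejected")

Because cases are randomly assigned within a court, large and stable gaps in these ratios between colleagues are precisely the discrepancies Benesty says had no reasonable explanation.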

********

France Bans Judge Analytics, 5 Years In Prison For Rule Breakers

In a startling intervention that seeks to limit the emerging litigation analytics and prediction sector, the French Government has banned the publication of statistical information about judges’ decisions – with a five-year prison sentence set as the maximum punishment for anyone who breaks the new law.

Owners of legal tech companies focused on litigation analytics are the most likely to suffer from this new measure.

The new law, encoded in Article 33 of the Justice Reform Act, is aimed at preventing anyone – but especially legal tech companies focused on litigation prediction and analytics – from publicly revealing the pattern of judges’ behaviour in relation to court decisions.

A key passage of the new law states:

‘The identity data of magistrates and members of the judiciary cannot be reused with the purpose or effect of evaluating, analysing, comparing or predicting their actual or alleged professional practices.’ *

As far as Artificial Lawyer understands, this is the very first example of such a ban anywhere in the world.

Insiders in France told Artificial Lawyer that the new law is a direct result of an earlier effort to make all case law easily accessible to the general public – something that was seen at the time as improving access to justice and as a big step forward for transparency in the justice sector.

However, judges in France had not reckoned on NLP and machine learning companies taking the public data and using it to model how certain judges behave in relation to particular types of legal matter or argument, or how they compare to other judges.

In short, they didn’t like that the pattern of their decisions – now relatively easy to model – was potentially open for all to see.

Unlike in the US and the UK, where judges appear to have accepted the fait accompli of legal AI companies analysing their decisions in extreme detail and then creating models as to how they may behave in the future, French judges have decided to stamp it out.

Various reasons for this move have been shared on the Paris legal tech grapevine, ranging from the general need for anonymity, to the fear among judges that their decisions may reveal too great a variance from expected Civil Law norms.

One legal tech expert in France, who wished to remain anonymous, told Artificial Lawyer: ‘In the past few years there has been a growing debate in France about whether the names of judges should be removed from the decisions when those decisions are published online. The proponents of this view obtained this [new law] as a compromise from the Government, i.e. that judges’ names shouldn’t be redacted (with some exceptions to be determined) but that they cannot be used for statistical purposes.’

Whatever the reason, the law is now in effect and legal tech experts in Paris have told Artificial Lawyer that, as far as they interpret the regulations, anyone breaking the new rule can face up to five years in prison – which has to be the harshest example of legal tech regulation on the planet right now.

Forbidden knowledge…

That said, French case law publishers and AI litigation prediction companies, such as Prédictice, appear to be ‘doing OK’ without this specific information being made available. This is perhaps because, even if you take the judges out of the equation, there is still enough information in the rest of the case law material to be of use.

Moreover, it’s unclear whether a law firm, if asked by a client, could manually – or using an NLP system – collect data on a judge’s behaviour over many previous cases and build a statistical model for that client’s use, so long as it never published the results to any third party. Then again, it’s not clear that even this would be permitted. And with five years in prison hanging over your head, would anyone want to take the risk?

But the point remains: a government and its justice system have decided to make it a crime to reveal, through statistical and comparative analysis, how its judges think about certain legal issues.

Some of the French legal experts Artificial Lawyer talked to this week asked what this site’s perspective was. Well, if you really want to know, it’s this:

  • If a case is already in the public domain, then anyone who wants to should have the right to conduct statistical analysis on the data stemming from it, in order to show or reveal anything they wish. After all, how can a society dictate how its citizens may use and interpret data that has already been placed in public view by a public body, such as a court?
  • This seems like giving someone access to a public library, but banning them from reading certain books that are sitting right there on the shelf for all to see. It is a sort of coercive censorship, and of the most bizarre kind – the censorship of justice’s own output.

Clearly there are limits to the information tech companies should be allowed to gather on private individuals. But the decisions of judges in open court do seem to be ‘public data’ and hence de facto beyond any censorship or control.

However, as one contact in Paris added, the old offence of ‘Scandalising the Judiciary’ was only recently abolished in England & Wales, which shows that judges over here have not always liked to be scrutinised too closely, either.

Clearly this is a hot potato – what do you think?

Is it right to wall off the decisions of named judges from statistical analysis?

Part of the French text covering the new law is below:

‘Les données d’identité des magistrats et des membres du greffe ne peuvent faire l’objet d’une réutilisation ayant pour objet ou pour effet d’évaluer, d’analyser, de comparer ou de prédire leurs pratiques professionnelles réelles ou supposées.

La violation de cette interdiction est punie des peines prévues aux articles 226-18, 226-24 et 226-31 du code pénal, sans préjudice des mesures et sanctions prévues par la loi n° 78-17 du 6 janvier 1978 relative à l’informatique, aux fichiers et aux libertés.’

(* First paragraph translated above, via Google. The second paragraph states that violation of this prohibition is punishable by the penalties provided for in Articles 226-18, 226-24 and 226-31 of the Penal Code, without prejudice to the measures and sanctions provided for by Law No. 78-17 of 6 January 1978 on data processing, data files and individual liberties.)

************

Original article