3 June 2004 CEPEJ (2004) 15
European Commission for the Efficiency of Justice (CEPEJ)
PILOT SCHEME
FOR
THE EVALUATION OF JUDICIAL SYSTEMS
Information note
prepared by the Secretariat
Objectives of the Scheme
1. The "Scheme for evaluating judicial systems" was adopted by the CEPEJ at its 2nd plenary meeting on 3-5 December 2003 and by the Ministers' Deputies on 4 February 2004, at their 870th meeting.
2. Its objective is to enable member States to compare the functioning of (key elements of) their judicial systems with that of other States, and to provide indicators for assessing this functioning, based on the principles derived from the European Convention on Human Rights and the case law of the Strasbourg Court.
3. The Scheme is also intended to encourage States to collect data in areas where such data are not yet available.
4. The Scheme contains both qualitative and quantitative indicators for evaluating the performance of each judicial system. It is not intended to provide an exhaustive list of indicators or to give rise to an academic or scientific study. It contains those indicators which, across the various aspects of the functioning of a judicial system, have been considered genuinely necessary to enable States to better understand the functioning of their own systems.
Use of the Pilot Scheme by the CEPEJ
1. Collection and processing of the data
2. Publication of the results
3. Follow-up to the conclusions drawn from the processing of the data
4. Adjustment and sustainability of the evaluation Scheme
13. The value of the exercise undertaken lies in its being pursued in depth over time.
14. Responses to the scheme should be provided at regular intervals (the deadlines remain to be defined). This regularity will make it possible to identify trends and, accordingly, priority areas of work.
15. In order to fulfil as well as possible its role as a tool serving the efficiency of judicial systems, the scheme will be re-evaluated regularly.
The CEPEJ bureau has called an extra meeting of the working party involved with the evaluation scheme. I hereby send some input for that meeting. I'll give my view on the work to be done and on the issues that we think should be addressed at the extra meeting.
In general, regarding the evaluation scheme there are two tasks ahead. The first is preparing a good report. The second is revising the scheme. So far, the idea has been that the evaluation scheme is distributed and reported on every year. Keeping that schedule means that the current working party will have to do all necessary preparations for a new round next spring.
When scheduling our effort (that of the research team, I mean) in December last year, we aimed at producing a complete report for the second meeting of the working party (in September). That way, the third meeting could have the revision of the scheme as its main focus.
At the extra meeting now called, I hope we will reach a consensus on which activities related to the scheme will be carried out this year. That could prevent unexpected ‘surprises’ from popping up at a (too) late stage.
If we have the time, we could also start discussing some of the problems in measurement. I will prepare a hand-out with some preliminary results and problems that occurred during the data-entry.
1. The Report
1.1 Time plan for the report (abbreviated version of the original)
This schedule was made shortly after the plenary meeting in December 2003. It was intended for internal use (planning capacity, holidays, and facilities).
2004 | Activities | Output | Deadline (approx.)
Activities before May 15th | | |
 | Improve the lay-out of the scheme | Scheme, ready for distribution; design of report, tables, charts | May
 | Pre-design of report: outline of paragraphs, design of tables and charts; design of work files (SPSS, Excel, Word) | |
Activities from May 15th on | | |
Week 21-23 | Data entry | |
Week 23-24 | Rough scores; appendix on comments | Appendix with rough scores & comments | June 4th
Week 24-25 | Identifying response groups on key items; contacting respondents with only a few key data missing | Analysis of response | June 11th
Week 25 | Producing key tables and charts | 10 key tables & charts | June 18th
Week 26-29 | Analysis | |
Week 30-33 | X (holidays) | |
Week 34-35 | Writing | Analysis; appendix on problems in measurement |
Week 36 | Lay-out | |
Week 37 | | |
Week 38 | Report to GT | Concept version of report | September 13th
Week 39 | 2nd meeting | | September 22-24
Week 40-42 | Extras & changes wanted by GT | |
Week 43 | Writing, lay-out | 2nd version of report |
Week 44 | Report to GT | | October 25th
Week 45 | 3rd meeting | | November 3-5
Week 46 | Processing final comments, lay-out | |
Week 47 | | Final report ready |
Week 48 | | | November 26th
1.2 Contents of the Report
The outline of the report, so far, looks as follows. We aim at producing a compact main text and putting much of the detail in appendices. Methodological issues will find a place in the appendices as well. Appendices with rough scores and focus reports on each of the contributing countries may be part of the report.
Chapter 1 Introduction
The first chapter describes the aim of the study, the development of the scheme, and the collection and analysis of the data, and contains a general text on comparing judicial systems.
Chapter 2 Public expenditure on justice
The first of three chapters on the ‘content’. This chapter focuses on costs and budgets. It also includes the length of procedures and general country information.
Chapter 3 The Judiciary and the courts
This chapter focuses on the core of the judicial system: courts and judges. It also covers the ‘safeguards’ in the system (appeals, complaints) and matters of access (legal aid, court fees).
Chapter 4 The legal professions
This chapter focuses on public prosecutors, lawyers, enforcement agents and mediators.
Chapter 5 Summary
Appendix a: working group & contributors
Appendix b: response statistics, overall and per item; (possibly) information on non-response (a possible tabulation is sketched below, after this list)
Appendix c: country information: studies, internet addresses, and the currency exchange rates used.
Appendix d: methodology, problems in measurement and comparison. Includes comments by respondents, per item.
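As a purely illustrative sketch of what the response statistics in appendix b could look like, the Python fragment below tabulates per-item and overall response rates from a data-entry file; the file name and column layout are assumptions made for this example, not a description of the actual work files (SPSS, Excel, Word).

import pandas as pd

# Hypothetical data-entry file: one row per country, one column per scheme item;
# an empty cell means the item was not answered. File name and layout are assumed.
answers = pd.read_csv("scheme_answers.csv", index_col="country")

answered = answers.notna()
per_item = pd.DataFrame({
    "n_responses": answered.sum(),              # countries answering each item
    "response_rate": answered.mean().round(2),  # share of countries answering each item
})
overall_rate = answered.to_numpy().mean()       # overall share of answered cells

print(per_item)
print(f"Overall response rate: {overall_rate:.0%}")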
2. Improvement of the Scheme
First of all, the working party will have to develop some idea of the future of the scheme. Will it become an annual activity? In what form will it be distributed? The answers to these questions set the agenda for the further development of the scheme.
The other ‘input’ for further development is the set of problems experienced during data collection and analysis: which problems have occurred, and how can they be solved for the next round?
2.1 Issues that deserve attention
On the future use of the scheme:
- Should we (the CEPEJ) ‘do’ the scheme every year?
- Should every question be repeated every year?
- Should the survey develop in the direction of highly defined measures (like the European Sourcebook)?
- Would it be desirable to develop a (more) analytical framework regarding the ‘efficiency of justice’?
- Should the survey be distributed in another form (for instance, an automated questionnaire or an internet questionnaire)?
2.2 ‘Evaluating’ the scheme
The research team will report on all problems of measurement: which questions led to comments, which were not understood, and in which cases unforeseen responses occurred. One of the ‘products’ to be reported to the working party is a list of measurement problems and a list of all comments made, per item.
It would be desirable for the CEPEJ bureau to report on all activities undertaken regarding the collection of data.
An extra activity that might be very helpful would be an e-mail inquiry among the non-respondents, mainly aimed at finding out why they did not respond. (Needless to say, such an inquiry can only be very short and simple.) An inquiry among those who did respond could be interesting as well: what would they like to see done differently, and why did some use older versions of the scheme instead of the one distributed?
3. Other issues
A few other points that I would suggest for the agenda:
- Communication between the working party, the CEPEJ bureau, and the research team
- Preliminary data and draft (‘concept’) reports: should these be public?
- Scheduled date of formal publication by the CEPEJ
Roland Eshuis, June 3rd 2004