Commission’s own internal review condemned CleanIT’s incoherence and cost

By EDRi · January 9, 2013

The CleanIT project has attracted a huge amount of criticism from outside the EU institutions.

But imagine if the Commission had been alerted to the incoherence of the planned project. Imagine if, before investing 325,796 euro in CleanIT, the European Commission had been warned that the project lacked methodology and did not represent value for money. Imagine if the Commission’s independent checks of the initial proposal had given the project a “value for money” rating substantially less than half the minimum average score necessary.

Remarkably, this is exactly what happened. Documents (pdf) made available to EDRi by the European Commission reveal a damning analysis of the project in the independent evaluations carried out before the Commission awarded the money.

The evaluations confirm what EDRi has previously reported about the project itself and, more generally, the underlying problem of outsourcing law enforcement to private entities.

According to the Commission’s call for proposals, projects funded under this framework are intended to contribute:

• To the development of instruments (at EU level), strategies and activities/measures in the field of the effective protection of critical infrastructure (at both EU and MS levels) and/or;
• To the development of a common framework for the effective protection of critical infrastructure at EU level.

In an attempt to understand why this project received a grant in the first place, we requested access to the initial project proposal. The Commission sent it to us, along with the three independent evaluations it carried out as part of the grant-giving process. So how and why was such a weak and directionless project selected for funding by the European Commission?

The first evaluator gave the methodology of the proposed project an impressively low score of one out of five and explained:

The proposal does not clearly explain how the objective is to be reached. (…) Therefore I have substantial doubts if it is possible to achieve the desired objective this way. (…) In my opinion there is a high risk of low value of the guidelines and not achieving the goal. The applicant identifies the risk of loosing (sic) trust from the private sector and difficulties in cooperation between the public and the private sector.

Regarding the expected outcome, the evaluator stated that:

(t)he outputs are general principles for countering illegal use of the Internet with support from public and private parties and guidelines for implementation of best practices. The results are improvement of prevention and fight of illegal use of the internet. I have serious doubts if the applicant will be able to deliver the results due to lack of methodology.

Regarding the planned costs, all evaluators considered the proposal disproportionate. The CleanIT project estimated total costs of 661,927 euro and requested a contribution of 529,541 euro from the Commission, with staff costs of 407,720 euro. The evaluator was

convinced that the planned costs do not represent the most economic and efficient solution and the best value for money. Especially the daily rates are very high : 752 for the project manager, 484 for the assistant, 1000 for the consultants. There are no additional information as to calculation of the costs.

This criticism was echoed by the second evaluator, who considered that
the amount of the grant requested does not seem to be proportionate with regard to the expected results.

Amazingly enough, the project proposal was accepted in the end and received a total of 325,796.71 euro from the European Commission, despite clear warnings that the project had no methodology and was very likely to fail.

Key figures from the three reviewers’ evaluations:

Minimum overall percentage necessary for approval: 65%

“Value for money” score:
Evaluator 1: 5 / 20 (25%)
Evaluator 2: 4 / 20 (20%)
Evaluator 3: 7 / 20 (35%)
Average: 5.33 / 20 (26%)

Total score:
Evaluator 1: 48%
Evaluator 2: 70%
Evaluator 3: 66%
Average: 61.33%
Funding threshold: 65%
Average after elimination of lowest score: 68%
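
A note on the arithmetic (our own calculation from the figures above): averaged together, the three total scores fall short of the funding threshold, and the proposal only clears it once the lowest score is discarded, which presumably explains how a proposal scoring below the threshold was nevertheless funded:

\[
\frac{48\% + 70\% + 66\%}{3} \approx 61.33\% < 65\%,
\qquad
\frac{70\% + 66\%}{2} = 68\% > 65\%
\]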