Blogs

Creating a safer Internet for children – some solid progress

By EDRi · June 6, 2012

This article is also available in:
Deutsch: [Ein sicheres Internet für Kinder – erste Fortschritte | https://www.unwatched.org/EDRigram_10.11_Ein_sicheres_Internet_fuer_Kinder_erste_Fortschritte?pk_campaign=edri&pk_kwd=20120606]

The European Commission hosted a meeting of the “CEO Coalition” on
Friday of last week. This is a project where Commissioner Kroes invited
industry to produce measures to make the Internet a safer place for
children. The dangers of such an approach are clear – industry will be
tempted to choose the easiest and most public relations-friendly
measures rather than evidence-based measures that will make a positive
difference and, potentially, to favour “solutions” which give them a
competitive advantage. On the other hand, the Commission is faced with
the temptation to take whatever is on offer from industry, particularly
due to the very short time-frame (end of 2012) that it gave industry to
come up with proposals.

There appears, however, to be a new awareness in the Commission that
child protection must be taken more seriously, based on credible
research. One indication of this is the distribution on Friday of the
latest round of results from the EU Kids Online research project. This project provides
a comprehensive overview of the experience of European children in the
online environment, providing an evidence base for future
decision-making. To date, however, the EU institutions have referred to
this work disappointingly infrequently, although this may change as the
research becomes better known.

The discussions on Friday, despite the less than promising framework for
the project, displayed some of the first significant evolution in
thinking on this subject for several years. The quality and progress of
the work of the CEO Coalition vary considerably, however, depending on
the subject being discussed and on which companies are taking the lead.
There are four working groups, WG 1 on “reporting tools”, WG 2 on
“age-appropriate privacy settings”, WG 3 on “notice and takedown” and WG
4 on “parental controls”.

The most disappointing work is being done in working groups 1 and 2.
Working Group 1 on “reporting tools” appears to believe that any
reporting button of any description will be a good thing. Many of the
proposals are measures that have already been tried, with varying
degrees of success and failure. Instead of learning from past mistakes
and building on past achievements, the working group appears to prefer
to start from scratch. Despite the recent revelations on the Gawker
website and in the Daily Telegraph, no particular attention is being
devoted to the treatment of reports that are filed through reporting
buttons.

Working group 2 on “age-appropriate privacy settings”, led by Facebook,
offers the least progress. The task at hand appears to be to “spin”
existing policies and laws, which should be applicable to both children
and adults, as a special service for children. This approach is the one
already followed in the existing “Safer Social Networking Principles
for the EU”. The presentation made on Friday, for example,
suggested that “an application or service that is directed at children
or adolescents should ensure that the collection, access and use of
personal information is appropriate in all given circumstances and
compatible with national law.”

In Working Group 3 on “notice and takedown”, there are worrying
discussions on the roll-out of upload filters. Hardly coincidentally,
Microsoft both leads the Notice and Takedown working group and developed
the PhotoDNA software that is used by, for example, Facebook UK, to
check images being uploaded to its service against a blacklist of known
child abuse material (the general pattern is illustrated in rough form
below). There was no discussion of possible abuse of the software, of
unintended consequences, or of the positives and negatives that can be
drawn from experience with its use. On the other
hand, however, there was an awareness among participants that the focus
on industry actions such as notice and takedown leaves much of the
problem unaddressed and even neglected. There has been far too much
attention placed on removing the symptoms of the crime, often outside
the rule of law, and too little on the important problems that
intermediaries cannot solve – victim identification, investigation,
prosecution, etc. This was the first meeting we are aware of at which
this realisation was expressed by organisations other than EDRi.
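
The filtering pattern referred to above can be sketched in a rough,
simplified way. The sketch below shows only the general idea: an
uploaded file is reduced to a fingerprint and compared against a list
of fingerprints of already identified material. It is not PhotoDNA
itself, which uses a proprietary perceptual hash designed to match
re-encoded or slightly altered copies rather than the exact
cryptographic hash used here, and all names in the sketch are invented
for illustration.

    import hashlib

    # Illustrative blacklist: fingerprints (hashes) of already identified
    # images. In a real deployment such a list would be supplied by
    # hotlines or law enforcement; the single entry here is the SHA-256
    # of the bytes b"known-image" and exists only to make the sketch run.
    BLACKLIST = {
        hashlib.sha256(b"known-image").hexdigest(),
    }

    def is_blacklisted(upload: bytes) -> bool:
        """Return True if the uploaded file matches a blacklisted fingerprint."""
        return hashlib.sha256(upload).hexdigest() in BLACKLIST

    def handle_upload(upload: bytes) -> str:
        # The filter blocks matching files before publication and lets
        # everything else through unexamined.
        return "rejected" if is_blacklisted(upload) else "accepted"

    if __name__ == "__main__":
        print(handle_upload(b"known-image"))    # rejected
        print(handle_upload(b"holiday-photo"))  # accepted

Even in this simplified form, the questions left undiscussed at the
meeting are visible: who compiles and audits the blacklist, what
happens when a file is wrongly matched, and what follow-up, if any, a
match actually triggers.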

Working group 4 on “parental controls”, which is led by Nokia, showed an
impressive amount of expertise and serious reflection on how to create
real added value for child protection online. The paper presented by
Nokia stressed the importance of continual research in order to ensure
that the measures being implemented are actually achieving their
intended goals, that parental controls should be at the edge of the
network rather than in the network itself, and that measures which
undermine the privacy of the child should not be supported by any such
software.

The biggest danger now is that the rapidly approaching deadline of the
end of 2012 will lead to proposals being made and approved without due
care for unintended consequences for child protection, for fundamental
rights, for online competition and for the open Internet.

EU Kids online project
http://www2.lse.ac.uk/media@lse/research/EUKidsOnline/Home.aspx

Inside Facebook’s Outsourced Anti-Porn and Gore Brigade, Where ‘Camel
Toes’ are More Offensive Than ‘Crushed Heads’ (16.02.2012)
http://gawker.com/5885714/

The dark side of Facebook (2.03.2012)
http://www.telegraph.co.uk/technology/facebook/9118778/The-dark-side-of-Facebook.html

Commissioner Kroes speech on “delivering a better Internet for kids”
(1.12.2011)
http://blogs.ec.europa.eu/neelie-kroes/better-internet-kids/

(Contribution by Joe McNamee – EDRi)