Protection of minors on the Internet: how to regulate? Etienne Wéry, Partner, ULYS – www.ulys.net. Attorney-at-law (Brussels and Paris Bars). Senior lecturer at University Paris I (Sorbonne). Cyprus IT Law Group, 17th of November 2006.
Part I : Let’s agree on words! « To regulate »… What does it mean?
There are a lot of very smart definitions by very smart persons… but one should prefer a neutral approach based on common sense. Forget about the Internet. Think of a traffic jam in London on a rainy Monday morning in November: the authorities use many different means to “regulate” the traffic:
– law;
– promotion of alternative means of transport (subway, bicycle, etc.);
– physical initiatives (parking, 2 or 3 lanes on the roads, etc.);
– payment, etc.
To regulate is “to bring order into the chaos”. Very often, good regulation needs various means. None of them is sufficient on its own: the interaction between them brings the result.
Regulation? A map of the means:
– Legislation: by the State; by other public bodies.
– Self-regulation (self-legislation): unilateral or multilateral; contracts; model laws (UNCITRAL, ICC); labels; alternative dispute resolution; state of the art; netiquette; good usage.
– Technical measures: access restriction; watermarking; ERMS.
Part II : Let’s agree on the scope! « To regulate »… against which threats?
Three different situations:
– Illegal content;
– Content that is illegal only when it is online;
– Harmful content: legal (both offline and online), but liable to harm minors by impairing their physical and/or mental development.
Each situation needs a specific approach.
Apparently, it is simple: illegal content is just… illegal. No conditions are needed to make it illegal: the content is, as such, illegal.
In reality, it is slightly more difficult: illegal content in country A might well be legal in country B. The easiest solution is international harmonization. No better example than child pornography: what is a « child »? What is pornography? Due to historical and cultural differences, this harmonization is possible only in very few matters. No better example than the negation of the Nazis’ crimes: is the negation of the Nazis’ crimes illegal? When is freedom of speech a safe harbour for illegal content?
An example of international harmonization: child pornography. A Recommendation of the Council of Europe, signed by the members and by associated countries; a working group of the U.N.; a working group in the G7/G8; an EU framework decision. Objective: to introduce at European level joint framework provisions to address certain issues such as the creation of offences, penalties, aggravating circumstances, jurisdiction and extradition.
Former definition of a “minor” — [table comparing, country by country (Australia, Austria, Belgium, Denmark, Finland, France, Germany, Greece, Iceland, Ireland, Italy, Luxembourg, The Netherlands, Portugal, Spain, Sweden, United Kingdom), the age of sexual majority (ranging from 13 to 17, with exceptions) with the age retained for the purpose of child pornography (ranging from 14 to 18)].
Towards a harmonization of « what is a child? » for the purpose of child pornography. In the framework decision, a “child shall mean any person below the age of 18 years”. Thanks to the harmonization work in the UN, the CoE and the G7/G8, the age is now the same in the USA, Australia, Japan, most European countries (including non-EU ones) and many others.
The challenges. The “regulatory” challenge with illegal content is to create common standards worldwide. The “legal” challenge with illegal content is to determine which law should apply. The country of origin, as in the e-commerce directive? The country of reception, as in traditional national criminal law? Or a new, specifically targeted criterion, as in recent international and European law?
The example of casinos and lotteries: in many countries (Belgium, France, the UK, etc.), the casino licence is limited to « bricks and mortar » establishments. In some of them, online lotteries are forbidden to minors under 18, whereas the offline limit is, in that case, 16. The protection of the minor is very often the justification of this « online restriction »: it is easier to detect the presence of a minor in a real casino than on the website of the national lottery.
The challenges. The first challenge is the (possible) discrimination between offline and online activities. The second challenge is similar to the problem of harmful content (see hereafter).
"Harmful content" is : –a material which is authorized but subject to distribution restrictions (“adults only” for example); –or material which some users might find offensive even if, on the grounds of freedom of speech, there are no restrictions on publication. Examples : adult pornography ; tobacco ; alcohol ; medicines and drugs ; racist content ; some political ideas ; some films and books (violent content); etc.
The challenges. What is the efficiency of the measures? How can we reconcile the protection of minors with freedom of speech? Can we really protect minors in a world where so many cultures co-exist and must respect each other? This raises the harmonization problem.
First challenge : freedom of speech The fundamental democratic principles of freedom of expression and respect for privacy, enshrined in Articles 8 and 10 of the European Convention on Human Rights, must be observed, and any measure restricting these freedoms must be legitimate, necessary for the aim pursued, and strictly proportionate in the limitations it imposes.
The risk: virtual paradises. January–June 2003: 1,276 reports received.
The second challenge: lack of harmonization. Not only between the EU and the USA (violent content), not only between European countries (termination of pregnancy), not only within the same country (Catholic parody), not only within the same community (Danish parody of Allah), but… what is harmful for me might well be totally accepted by my twin brother.
The answer provided for, in Europe, by the e-commerce directive “Each Member State shall ensure that the information society services provided by a service provider established on its territory comply with the national provisions applicable in the Member State in question which fall within the coordinated field”. “Member States may not, for reasons falling within the coordinated field, restrict the freedom to provide information society services from another Member State”. The first French application for “medical CDs” sold by a Belgian website
The third challenge: the efficiency of protective measures. The current approach: I organize regulation so as to create a legal framework for the people and companies over which I have control (because I can identify them or reach them physically). I don’t care whether those people and companies are the originators of the harmful content, the destination point of the information, or mere intermediaries. For example: attempts to ban access to websites through Internet providers; attempts to make search engines and robots liable; attempts to solve IP issues through P2P operators; new legal frameworks for connection data; etc.
The third challenge: the efficiency of protective measures. The adequate approach would be to organize regulation so that it is actually efficient. And, as a matter of fact, experience shows: low efficiency when applied at the intermediaries’ level; medium efficiency when applied at the origin of the harmful content; good efficiency when applied at the destination of the harmful content.
Low efficiency when applied at the intermediaries’ level. Filtering software is far from bulletproof:
– it leaves access to some harmful content;
– it bans access to some non-harmful content.
The cost falls on persons and companies that are not parties to the communication (except for technical reasons). The intermediary is a very inappropriate judge of « what is harmful? »: precisely because freedom of speech is at stake, only a judge should order such measures, and he will limit their scope to a specific content.
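The two failure modes listed above can be illustrated with a minimal sketch of a naive keyword-based filter of the kind deployed by intermediaries; the blocklist words and sample texts are hypothetical, chosen only to show both over- and under-blocking.

```python
# Hypothetical naive keyword filter, as criticised in the slide above.
BLOCKLIST = {"porn", "xxx"}

def is_blocked(text: str) -> bool:
    """Block a page if any blocklisted word appears in its text."""
    words = text.lower().split()
    return any(w in BLOCKLIST for w in words)

# Under-blocking: harmful content that avoids the exact keywords passes.
print(is_blocked("adult material, no trigger words here"))  # False
# Over-blocking: a news report merely mentioning a keyword is banned.
print(is_blocked("report on xxx domain regulation"))        # True
```

Both errors are structural: the filter sees only surface strings, never the legal question of whether the content is actually harmful.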
Medium efficiency when applied at the origin of the harmful content. The principle: each and every piece of information has an origin; if it is marked (rated) at the origin, it is simple to control access to it. But… everything depends on the « good faith » of the originator(s) of the content. Will he rate? If he rates, will he do it properly? As soon as one person does not rate, the system loses a big part of its efficiency. And the system strongly needs self-regulation as well as technical tools.
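Origin-side rating only works if destination-side software can read the mark. As a sketch, here is how a client could detect an adults-only self-rating label embedded in a page's HTML; the "RTA" label string is the publicly used adults-only marker, but treat the exact value and the sample page as assumptions of this example.

```python
# Sketch: destination-side check for an origin-side self-rating label.
from html.parser import HTMLParser

RTA_LABEL = "RTA-5042-1996-1400-1577-RTA"  # adults-only self-label

class RatingParser(HTMLParser):
    """Scan a page's <meta name="rating"> tags for the adults-only label."""
    def __init__(self):
        super().__init__()
        self.adults_only = False

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "rating":
            if a.get("content") == RTA_LABEL:
                self.adults_only = True

# Hypothetical self-rated page:
page = '<html><head><meta name="rating" content="RTA-5042-1996-1400-1577-RTA"></head></html>'
p = RatingParser()
p.feed(page)
print(p.adults_only)  # True
```

The weakness the slide points to is visible here: an unrated page yields `adults_only = False`, so a single non-rating originator defeats the scheme.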
Example : US 2257 notice for pornographic content In compliance with United States Code, Title 18, Section 2257, all models, actors, actresses and other persons appearing in any visual depiction of content whether actual sexually explicit conduct, simulated sexual content or otherwise, displayed on the Playboy.com PlayboyTV webpages (the "PlayboyTV Webpages") were at least eighteen (18) years of age at the time such depictions were created. All other visual depictions displayed on the PlayboyTV Webpages are exempt from the provision of 18 U.S.C. Section 2257 and 28 C.F.R. 75 because such visual depictions do not consist of depictions of conduct as specifically listed in 18 U.S.C Section 2256 (2) (A) through (D), but are merely depictions of non-sexually explicit nudity, or are depictions of simulated sexual conduct, or are otherwise exempt because the visual depictions were created prior to July 3, 1995. Records required to be maintained for such materials pursuant to 18 U.S.C. 2257 and 28 C.F.R. 75 are kept by our Custodian of Records: [address]
Each and every piece of information has an origin: if it is marked at the origin, it is simple to protect minors. Section 2257 has a broad scope of application. But… everything depends on the « good faith » of the originator(s) of the content. Compliance with Section 2257 is high in the USA, but the situation is very different in other countries where the same content is available (80% of child pornographic content originates from Eastern countries and Asia). And the system strongly needs self-regulation as well as technical tools: software can detect 2257-compliant sites and limit access to those.
Good efficiency when applied at the destination of the harmful content:
– Self-restriction (if I don’t like it, I don’t watch it);
– Filtering and parental-guidance software (a legal obligation in some countries);
– Education, the key to success: education of minors, education of parents;
– Better Internet governance: technical choices are crucial when it comes to regulating the network (the .xxx gTLD example); rethinking the « ownership » of the Internet and the way it operates.
Questions & comments. Etienne Wéry, Partner, ULYS – www.ulys.net. Attorney-at-law (Brussels and Paris Bars). Senior lecturer at University Paris I (Sorbonne).