
Legal instruments to combat racism on the internet

III. THE RESPONSIBILITY OF THE VARIOUS PERSONS INVOLVED IN THE INTERNET




Introduction: Statement of the problem

3.1. Liability of the author

3.2. Different interveners have different responsibilities

3.3. Legislative solutions and measures in the process of preparation

3.4. Laws on the press/criminal responsibility

Introduction: Statement of the problem 

In connection with the fight against racism, and in spite of the abundance of racist and revisionist sites, few courts have ruled on the questions of liability, whether that of the author of the unlawful contents or that of the technical intermediaries. Several actors come into play: the author of the statement complained of, then the relays, whatever they may be (forum moderators, persons running electronic mailboxes, creators of links), and finally the technical intermediaries, access providers or hosts.

To study the responsibility of these parties is to attempt to determine what law applies in respect of what offences. It must be emphasised here that the problem of liability arises primarily in criminal terms; however, civil actions to have the sites in question closed down, or access to them blocked, are possible.

3.1. Liability of the author 

3.1.1. The limits to criminal responsibility: difficulties in identifying the author

As stated in the introduction, the majority of European countries have criminal laws against racist propaganda and there is no doubt that the authors of racist statements must be criminally liable in respect of such statements on the Internet. On the other hand, unlike other means of expression (the press, radio, television), the Internet does not allow the author of a message or a site (what we mean here is the person who made the racist statement, hereinafter “the author”) to be clearly identified. In order to make this identification easier, some countries (France, for example) require a declaration1 to be made prior to opening a website. In the absence of such a means of identification, however, how is the author of the offending conduct to be found?

A significant point is that the technical intermediaries are able to keep “log” files, or records of connections, which are of help in identifying the authors of statements circulating on the Internet. Are they obliged to store them, however, and if so how and to whom are these log files to be communicated? The communication of these files must be subject to certain basic conditions and in accordance with well-defined procedures (court proceedings, for example) so that the confidentiality of the information received can be preserved.
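By way of illustration only, the following minimal sketch (in Python) shows how a connection log of the kind referred to above might be searched in order to match an IP address observed in a posting to a subscriber account. The log format, file name and subscriber table are hypothetical assumptions made for the example, not a description of any provider's actual systems.

```python
# Hypothetical illustration: matching an IP address seen in a posting
# to a subscriber account via a provider's connection ("log") file.
# The log layout and subscriber table are invented for this sketch.

import csv
from datetime import datetime

def find_subscriber(log_path, subscribers, ip, posted_at):
    """Return the subscriber who held `ip` at the time `posted_at`.

    `log_path` points to a CSV file with columns:
    subscriber_id, ip_address, session_start, session_end (ISO timestamps).
    `subscribers` maps subscriber_id -> contact details.
    """
    with open(log_path, newline="") as f:
        for row in csv.DictReader(f):
            if row["ip_address"] != ip:
                continue
            start = datetime.fromisoformat(row["session_start"])
            end = datetime.fromisoformat(row["session_end"])
            if start <= posted_at <= end:
                return subscribers.get(row["subscriber_id"])
    return None  # no match: log purged, incomplete, or the address was disguised

# Example use (all values invented):
# subscriber = find_subscriber("connections.csv",
#                              {"42": "subscriber on file"},
#                              "192.0.2.17",
#                              datetime(1999, 8, 27, 21, 30))
```

The sketch also makes the legal point concrete: such a search is only possible if the log has been retained, which is precisely why the questions of storage obligations and of the procedure for communicating these files arise.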

By way of illustration, in a French case2 before a district court the author of racist statements was traced through technical channels. This was the first time that a French court had ruled on the dissemination of racist statements on the Internet. In this case the author was identified because the access provider lifted his anonymity.

France
On 27 August 1999 a surfer was convicted by the Strasbourg Regional Court (Tribunal de grande instance) of incitement to racial hatred and fined FF 10,000, half of this fine being suspended. This individual had expressed racist views on an Infonie discussion group. The management of the access provider had been informed by the person responsible for moderating the discussion groups that a number of unacceptable messages had been posted. The management identified the subscriber corresponding to the IP address of the machine which sent the message and informed the BCRCI (Central Brigade for the Prevention of Computer Crime), which very quickly investigated the matter. Infonie then filed a complaint against a person unknown and agreed to reveal the identity of the subscriber, who admitted the facts.

Another case provides a good illustration of the limits imposed by judicial unfamiliarity with computers: the court did not order a more thorough technical investigation, which it rather hastily considered would have been pointless.

France
On 13 November 1998 the Paris Regional Court acquitted Professor Faurisson, who had been charged with placing on line a number of documents entitled Horned visions of the holocaust, on the ground that there was insufficient proof of the ownership of the site in question.

Although Professor Faurisson’s name appeared on the articles in question, he denied that he was the author or that he had placed them on line. The court observed that the name could have been put there by anyone and that to make comparisons with other documents previously written by Professor Faurisson would be to rely on assumptions rather than on facts.

In substance, the court considered that “since no investigations had been carried out into the operating conditions of the ‘AAARGH’ site, its relations with the ABBC.Com server and the technical constraints on access to the information, and on altering and disseminating it, for reasons which, moreover, were set out by the prosecutor in his written submissions, it cannot be established that this site is the accused’s property and that he alone can use it”.

In our view this decision, which states categorically that it is not technically possible to go back to the source, is out of date. A court cannot now rely on the absence of technical investigations to discharge an accused. It is frequently feasible, unless the author has deliberately covered his trail by employing caches or mirrors, to go back to the source of the information disseminated to discover its authors.

Belgium
The Belgian court circumvented the technical difficulties of identification and was satisfied with “a convergence of presumptions” (the accused was known for his racist views) to convict a surfer (a police officer) who had made racist statements in a discussion group.3 No technical investigation had been carried out by the prosecution for the purpose of identifying the TCP/IP address allocated to the computer used by the accused. This address could have been identified with the cooperation of the technical operator, but that was not necessary.

On 22 December 1999 the Brussels Criminal Court (Tribunal correctionnel) imposed a suspended sentence of six months’ imprisonment on a police officer and former candidate on the lists of the Vlaams Blok in Brussels-Ville for making racist statements in various discussion groups (contrary to the Law of 30 July 1981 on the prevention of certain acts inspired by racism or xenophobia, as amended by the Law of 12 April 1993).

The accused was also ordered to pay damages of FB 100,000 to the civil party, the Centre for Equal Opportunities and the Fight against Racism.

This problem in identifying the authors of the illegal contents is a matter of concern for certain national parliaments, which have suggested that legislative measures be adopted to enable the person committing an offence to be identified in criminal cases.

Belgium
The Belgian Council of Ministers adopted a Bill on computer crime, which was laid before the Chamber of Representatives in October 1999. The Bill provides, inter alia, that access providers will be required to identify their subscribers, to trace their communications via their TCP/IP numbers and to keep this information for a period to be determined by a decree.

Switzerland

On 21 December 1999 the National Council (the Lower Chamber) proposed, in the context of the revision of the Law on the Surveillance of Postal Correspondence and Telecommunications, an Article 12 paragraph 3 bis:

 

“where a punishable offence is committed by means of the Internet, the access provider shall be required to provide the competent authority with any information which will enable the author to be identified”.

This provision is currently being examined by the Council of States (the Upper Chamber).4

France

On 20, 21 and 22 March 2000 the National Assembly considered the Bill amending Law No 86-1067 of 30 September 1986 on freedom of communication, and on 22 March 2000 it adopted, after its second reading, Article 43-6-4 on the obligation to identify subscribers disseminating contents on the Internet, which concerns not only personal pages but also mailing lists, discussion groups and chat rooms. Article 43-6-4 therefore obliges surfers subscribing to a French service-provider to identify themselves to the service-provider and on their sites.

The Law provides that a subscriber who falsely declares his identity is liable to a penalty of six months’ imprisonment and a fine of FF 50,000. The same penalty is provided for a host who is unable to reply to a request from the judicial authorities.5

Before it can become law, this Bill must still be adopted by the Senate and given a third reading by the National Assembly. It has therefore not been definitively adopted.

3.1.2. Civil responsibility of the “author”

A civil action depends on injury to the individual interests of a person who is able to plead direct harm. The injured party may then request the civil court to bring the offending conduct to an end quickly, notably by using a procedure available in urgent cases, such as the French civil-law interim relief procedure (“référé”) provided for in Article 809 of the New Code of Civil Procedure. Such a provisional measure is perfectly suited to the world of networks and provides the courts with a flexible and rapid procedure, but it is difficult to implement where racism on the Internet is concerned, because it is not always a simple matter to determine the author of a hateful statement, the victim and the existence of an interference with individual rights.

Italy
The Law on Immigration of 1998 introduced, in Article 35, a new means of combating racism. It makes provision for a civil action against racism, so that any victim of a racist or merely discriminatory act may request the civil court to adopt any measure necessary to redress it. The court may order that the racist or discriminatory conduct be brought to an end or adopt any measure to put an end to it or to provide compensation for the harm.

In some respects this action resembles the provisional and urgent measures which may be sought to counter actual or potential harm in accordance with the Code of Civil Procedure.

According to the information available, there have not yet been any decisions in which this article has been applied to the Internet.

3.2. Different interveners have different responsibilities 

Owing to the difficulties in identifying the authors, and to the procedural obstacles associated with the fact that these authors take refuge abroad, other possible ways of holding others liable for the dissemination of illicit material have been investigated.

3.2.1. The responsibility of the relayers

It will be recalled that by “relayers” we mean a privileged intermediary who facilitates access to offending contents by a link6, by operating a discussion group or an electronic mailbox. Although he does not control the content, he may make it easier for the surfer to locate sites and guide him in the immense store of information on the web. Does this intermediary risk being held liable, since the creation of the link is ultimately due solely to his initiative? Can he be regarded as appropriating the site or the information associated therewith? The case-law will be described below according to its nature (criminal or civil); the liability of the relayers is based on the failure to cancel the link.

It should be pointed out at the outset that this heterogeneous and sometimes contradictory case-law does not reveal a clear trend in respect of the responsibility of these intermediaries.

In criminal law there are few examples relating to racist links. The hosts very frequently cut off the links when they are reported to them, in order to avoid prosecution7.

Switzerland/link
Recently (in March 2000) the personal home page of an assistant lecturer at the Federal Ecole Polytechnique in Zurich was closed down by an internal decision taken by the authorities of the Ecole Polytechnique, acting on their own initiative, because it contained links to racist sites. The matter is currently the subject of an internal administrative and criminal investigation.

The same problem exists in the case of electronic mailboxes. Can those running them be held criminally liable for the messages circulated through them? It seems that they can.

Switzerland/mailbox
In a decision of 7 December 19988 the Obergericht of the Canton of Zurich held that the operator of an electronic mailbox in which pornographic information was circulated was guilty of a punishable offence within the meaning of Article 197 of the Swiss Criminal Code, since he could have cut off access to that information and had not done so, so that users, in particular minors, could consult it.

A more sensitive issue, on the other hand, is the liability of a participant in an electronic mailbox who merely redistributes information provided by others without “adopting” it.

Germany/box
A German court9 held that a surfer was not criminally liable on the ground that the unlawful content concerned was not his and that he had not “appropriated” it.

The accused found on the internet, in an anonymous mailbox, a file called the "Terrorist's handbook", which contained instructions for building weapons. He filed the handbook in the mailbox of another person. This mailbox was accessible to the more than 800 users of an internet club. The accused claimed that he had found the file by coincidence and admitted that he knew only vaguely what it contained. At first instance, he was convicted of giving instructions for the manufacture of weapons, an offence punishable under Art. 53 of the Weapons Act. The Superior Regional Court of Bavaria (Bayerisches Oberstes Landesgericht) acquitted the accused at second instance. According to the Superior Court, it was not clear whether the accused had appropriated the instructions in the handbook as his own; the mere distribution of the instructions is not enough to establish such appropriation.

A question which sometimes arises is that of the criminal liability of the various interveners in that special category of sites, chat rooms and discussion groups, owing to their essentially private nature. Antiracist regulations generally specify the precondition that the material in question must be communicated to the public, and it is doubtful whether this condition is satisfied in the case of electronic mailboxes or newsgroups. These might at first sight be considered to constitute private correspondence, but the case-law tends to reject that restrictive interpretation of electronic mailboxes.

Germany/public nature of electronic mailboxes
In a case involving an electronic game with Nazi symbols, a German court10 recognised that a circle of surfers linked to an electronic mailbox was of a public, not a private, nature, even though the circle was restricted.

The accused operated a mailbox in which he had filed a computer game containing Nazi symbols. A small circle of users had access to his mailbox, and access could be gained anonymously by logging in with a guest login. The accused was convicted of the public use of forbidden symbols (Art. 86a of the German Penal Code). The court made clear that the mere fact of making something visually available constitutes use; there is no need for the physical supply of the symbol. As for the requirement that the use be "public", it is sufficient that contact with the mailbox can be made without identifying the person gaining access. Owing to the anonymity of that contact, the group using the mailbox is not a circle of private friends but a "public" circle. The place where the content is filed does not itself have to be public.

Belgium
In a decision of 22 December 1999 (see 3.1.1) the Brussels Criminal Court considered that “newsgroups” or discussion groups “are places which are not public but open to a certain number of persons”. Consequently, they satisfy “the statutory conditions of publicity”.

In civil law, maintaining the link complained of constitutes an interference in respect of which an injunction may be issued or an award of damages made.

The Netherlands/link
In a counterfeiting case the District Court, The Hague, held on 9 June 1999 that an access provider was liable for having maintained a link which connected to a site containing counterfeit material11 :

 

"Declares it to be the law that by having a link on their computer systems which when activated brings about a reproduction of the works that CST (the plaintiff) has the copyright to on the screen of the user, without the consent of the plaintiffs, the Service Providers are acting unlawfully if and insofar that they have been notified of this, and moreover the correctness of the notification of this fact cannot be reasonably doubted, and the Service Providers have then not proceeded to remove this link from their computer system at the earliest opportunity."

Belgium/link
On 2 November 1999 a Belgian district court held that a technical intermediary was liable for having failed to cut off the offending links and convicted it of aiding and abetting on the basis of the following facts12:

Skynet hosts the “somnus” and “freemusic” sites, which offer hyperlinks to sites which allow music recordings to be made in MP3 format. The non-profit-making association IFPI and its member Polygram warned Belgacom Skynet SA on two occasions to cut these links. When Belgacom Skynet SA failed to comply with this warning they commenced proceedings for an injunction, claiming that this conduct was contrary to fair commercial practice. The court held that Belgacom Skynet SA could be considered liable since it did not cut the links although it had been informed of suspicious activities. The links in question were conscious links to known pirate websites; Belgacom Skynet SA was therefore guilty of aiding and abetting the offence of making reproductions of music files available to the Belgian public.

Belgacom Skynet sa was therefore responsible for the illegal use of copyright material in Belgium and unlawful conduct. In the operative part of the judgment the court ordered Belgacom Skynet to put an end to the practices and to pay a fine in default and ordered that a summary of the judgment be published on the home page of Belgacom Skynet’s site and in five newspapers.

Germany : information archiving
In Germany, on the other hand, an archive operator was held not to have civil liability, on the ground that compiling an archive does not constitute adopting a personal position on the content of the information disseminated13.

The German section of the Church of Unification lodged a civil claim (for a cease and desist order) against a civil rights institution which published on its homepage government documents containing statements capable of discrediting the Church. The regional court held that maintaining an archive constitutes distribution only in a technical sense and cannot be regarded as an independent contribution to the potentially wrongful act. Participation in establishing a marketplace of opinions is not a sufficient ground for civil responsibility, and notification by the claimant of potentially discrediting content in the defendant’s archive does not create civil responsibility.

Italy/newsgroup
On 4 July 1999 the Rome Court dismissed an application for an order for the removal of an advertising message with an allegedly defamatory content which had been published in an “unmoderated” discussion group.

The court held that the forum operator could not be considered personally liable for his activity as operator of the news-server Pantheon srl. Nor did a claim lie against Pantheon (the Internet provider), since the latter merely made available to the users the virtual space necessary to host the forum, and since in this case, which concerned an unmoderated discussion group, it had no power to control or monitor the messages placed there14.

United Kingdom/news group
The Defamation Act was enacted in the United Kingdom in 1996 to protect service providers against unwarranted requests to cut links. The Defamation Act provides that in the case of defamation the technical intermediary will not be liable if it is not the author or publisher of the content in question, if it has taken appropriate measures and if it was not aware of the content in question.

 

An English citizen complained to Demon (the service-provider) about a message posted in a newsgroup in the United States which defamed him. Since Demon was not the author of the message or the operator of the server of origin it acknowledged the complaint but did not cancel the message. Proceedings were therefore initiated against Demon. In a decision of 26 March 2000 the court of first instance, applying the Defamation Act, found that Demon was liable for disseminating defamatory messages in a discussion group and ordered the Internet service-provider Demon to pay damages to the complainant in respect of a content of which Demon was completely unaware.

We conclude this section on relayers by describing a Swedish law, which is significant because Sweden is the only country to have enacted legislation in this sphere and to have clarified liability in an area in which, as we have just seen, the case-law seems to be rather imprecise. This impression of vagueness is accentuated by the sometimes contradictory nature of decisions and the absence of relevant decisions at last instance.

Sweden
This law on electronic mailboxes (original title: Lag (1998:112) om ansvar för elektroniska anslagstavlor) originated in 1998. It represents the legislature’s response to a line of decisions of the Supreme Court which had exempted the moderator of news and chat rooms from any criminal liability15. The law imposes on the operator of an electronic mailbox an obligation to exercise diligence under pain of being held criminally liable:

however, the law applies only to news rooms and chat rooms, in other words to electronic mailing services which allow users to post messages for other users or to see other users’ messages; it is not aimed at web sites16. Nor does it apply to traditional electronic mail (Article 2 (4)), in other words to messages sent to a specific addressee.

3.2.2. The liability of the host

The question is whether this technical intermediary can be held liable where illicit contents are accommodated; whether he will be criminally liable, for example, for aiding and abetting the dissemination of unlawful statements, or whether the rules of civil law will apply, so that liability will be based on failure to observe the code of conduct: failure to prevent the dissemination of such statements.

3.2.2.1. The host: aiding and abetting for the purposes of the criminal law?

Does the fact of providing space for the storage of unlawful information constitute active participation in the offence?

The host is deemed not to be aware of the content of the information stored and should therefore not be held liable for aiding and abetting. The fact of concluding a simple contract with a customer and making space available for a website or an electronic mailbox should not be treated as conscious participation in offences committed by that customer. The host merely rents space to the customer or grants a sort of lease in a strictly commercial context.

As soon as the host becomes aware that a content is unlawful, however, he could be found guilty of aiding and abetting the offence18 if he does not take immediate action to prevent its dissemination. Must he therefore assume the role of censor and moral guardian by preventing the dissemination of statements which he deems criminal?

France
In a more or less comparable situation19 (it did not concern the Internet, but Minitel), the Court of Cassation stated that it appeared impossible to imagine that the director of a server centre hosting a telematic service – which often accommodates a great many services – “is in any way liable for the content of the messages”. The Court of Appeal had not convicted the director of aiding and abetting and the Court of Cassation (the highest French court) did not adjudicate on this charge.

Switzerland
In Switzerland, on the other hand, a PTT director was convicted of aiding and abetting20 the publication of obscene material because of the sex chatlines operated by individuals via the telephone networks (and hence accessible by minors). The Federal Court observed that the Attorney General’s department had on several occasions drawn the PTT’s attention to the possibility that children might listen to or participate in pornographic conversations, and made it quite clear that an operator which provides the instruments necessary for the operation of a criminal service and which, despite being made aware of the criminal conduct, does nothing to stop it is guilty of aiding and abetting the offence21.

United Kingdom
Part III of the Public Order Act 1986 is drafted in such terms as to conceptually cover the activities of persons who "host" racist material, in the sense of providing the technical platform to allow the author to make it available on the internet. In particular, such a host could be said to be publishing or distributing written racist material under subsec. 19(1), distributing, showing or playing recordings of racist material under subsec. 21(1) and/or to be in possession of racist material under subsec. 23(1). The last-mentioned provision is particularly relevant, in that it suffices if the material is stored with a view to its being displayed or played later by another person and in that the material need only be shown to be objectively likely to stir up racial hatred in the circumstances, not that it was intended by the ISP to be used for that purpose. On the other hand, it is a defence to each of the offences "for an accused ... to prove that he was not aware of the content of the written material or recording, and did not suspect, and had no reason to suspect, that it was threatening, abusive or insulting".

After learning of the decision of the German Landgericht München on the criminal liability of ISPs as accomplices (see below 3.2.3.1), British ISPs asked for clarification of the position under British law. The authorities take the view that, although the Public Order Act was introduced before the proliferation of the internet, and although the inclusion of ISPs within its scope is therefore completely fortuitous, ISPs can nevertheless be prosecuted under Part III if they actually know that they are hosting racist material (i.e. it has been drawn to their attention) and they take no action to remove that material22. Therefore, no ISP will be prosecuted for unconscious transmission of racist material.

None of these provisions or interpretations thereof have yet been tested in court.

3.2.2.2. Civil liability based on the host’s misconduct

In addition to the rules on criminal liability, persons involved with the Internet may incur civil responsibility in negligence or for breach of contract.

A number of decisions23 seek to recognise that a host is liable in negligence where he has failed to exercise vigilance in respect of the contents which he hosts, in particular where he hosts anonymous sites. This vigilance need only be such as to exclude sites which are obviously illicit. "Excluding" means closing down the site without delay, where possible after consulting the author of the pages complained of (which is difficult in the case of anonymity).

France
On 10 February 1999 the Paris Court of Appeal held, on an appeal from an interlocutory decision of 9 June 1998, that a host who allows anonymous persons to create web pages is liable for their content24.

In the interlocutory order of 9 June 1998 the President of the court had held that “the host is required to ensure that those to whom he provides services observe proper moral standards ... and that they comply with the law and regulations and respect the rights of third parties”. Then, “in order to discharge his responsibility, [the host] will therefore have to show that he fulfilled his special obligations to inform the customer of the obligation to respect personality rights, copyright and trade mark rights, that he did in fact carry out checks, if need be on the basis of samples, and that when a breach of the rights of third parties was revealed he acted diligently to put an end to that breach…”

The Court of Appeal confirmed the viewpoint of the judge of first instance and maintained the responsibility of the host:

 

“... by hosting anonymously on the site altern.org which he has created and which he runs any person who, under any name whatsoever, requests space for the purposes of making available to the public, or to categories thereof, signs or signals, words, images, sounds or messages of any kind which are not in the nature of private correspondence, Valentin Lacambre manifestly exceeds the technical role of a mere conveyor of information and must clearly assume, as against the third parties whose rights are infringed in such circumstances, the consequences of an activity which he has deliberately undertaken to carry out in the conditions referred to above and which, contrary to what he alleges, is profitable and on a scale which he himself claims”.

The judgment therefore states that hosting is an activity which goes beyond the mere transmission of data, since it entails the dissemination of the site (a fortiori where that hosting is provided on an anonymous basis) for a fee. Considered implicitly as a director of publication, therefore, the host must assume a certain responsibility where his activity, carried out without prior checks, helps infringe the rights of others.

In this case the liability was civil liability based essentially on wrongful conduct. The judgment refers to a “breach of the right to an image and of the intimacy of private life”, and in France the solution is based on a breach of Article 9 of the Civil Code.

This decision was confirmed in a decision of the Nanterre Regional Court of 8 December 1999 which, on the basis of Articles 9 and 1382 of the Civil Code, upheld an action against the host of a website where photographs representing a nude model were displayed.

The court stated on this occasion that a host is under a general obligation to exercise prudence and diligence. He is therefore required to take the necessary precautions to avoid infringing the rights of others. For that purpose, the host must take reasonable steps to provide information. The court considered that the fact that the host drew customers' attention to certain essential obligations when the service contract was concluded, and that there was a “charter” informing customers of the need to respect the rights of others, constituted sufficient diligence.

The host must then show vigilance. This vigilance does not mean the “detailed and thorough monitoring of the content of the hosted sites”, but need only be of such a kind as to exclude sites whose “unlawful nature” is “obvious”. Finally, the host must ensure that he has the facilities to close down dubious sites immediately and ensure that they are not reopened. For the remainder, the judgment contains an important dictum. The fact that the host is unable to provide the identity of the person who created the site in issue does not in any way exempt him from liability. The Regional Court considered that the activity of a host “by virtue of its nature and the conditions in which it is carried out ... gives rise to liability”.

The liability of the host has also recently been established in a case concerning domain names25. What should be particularly emphasised in that judgment is the joint and several liability of the actors: the person registering the domain names, the person organising their sale and the host.

Thus the auctioning on the Internet of domain names reproducing well-known brands (les-3suisses.com, la-redoute.net) constituted an act of counterfeiting and “reveale[d] a parasitical intent”. In the court’s view the person who had registered the domain names, the person who had organised the sale and also the host of the site on which the sale took place were all liable.

As regards racism, the Nanterre Regional Court is shortly due to determine a case involving Nazi sites26 :

 

On 18 February 2000 Multimania (a host) removed a Nazi site from its servers at the request of the UEJF (Union of Jewish Students of France). Called “nsdap”, like the Nazi party, the site posted pages glorifying the Third Reich, contrary to the Multimania users' charter. Even though Multimania had removed the Nazi site, the UEJF decided to bring civil proceedings in negligence against it.

 

The UEJF relied on Article 1383 of the Civil Code, which provides that everyone is responsible for his own actions, his omissions and his imprudence. Its counsel maintained that Multimania had been negligent in failing to monitor the content of the site in question and in delaying removing it from its servers. The UEJF claimed damages of one franc and an order that the defendant should set up a security procedure to be followed when new accounts were opened. It is apparent that access to the Multimania host service is not subject to any condition relating to identity.

 

The UEJF is not seeking police-type control, but its counsel argued that “a host cannot be satisfied with the identity provided by subscribers, but must endeavour to know with whom he is contracting. Some hosts require at least the e-mail address of the person, which proves that he has at least registered with an access provider”.

 

Counsel for the UEJF stated that: “Today we ask Multimania to establish a security procedure and to carry out a minimum control of the content of its sites. It could, for example, develop a search procedure based on simple key words, which would enable a considerable number of items to be detected. The intention is that Multimania should be under an obligation to produce results. Furthermore, I [counsel for the UEJF] am in contact with counsel for the other side and am prepared not to proceed with the complaint before the court if Multimania establishes measures in the meantime”.
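Purely by way of illustration of the kind of “search procedure based on simple key words” suggested by counsel, the sketch below scans hosted pages for a watch-list of terms and flags them for human review. The word list, directory layout and function name are assumptions made for the example, and, as the court later observed (see below), such a crude filter can only ever be a first step.

```python
# Hypothetical sketch of a keyword-based flagging procedure for a host.
# The watch-list and directory layout are invented; matches are only
# flagged for human review, since keywords alone prove nothing.

from pathlib import Path

WATCH_LIST = {"nsdap", "third reich"}  # illustrative terms only

def flag_suspect_pages(root_dir):
    """Yield (page, matched terms) for hosted pages containing watch-list terms."""
    for page in Path(root_dir).rglob("*.html"):
        text = page.read_text(errors="ignore").lower()
        hits = {term for term in WATCH_LIST if term in text}
        if hits:
            yield page, hits

# Example use (path invented):
# for page, hits in flag_suspect_pages("/var/www/hosted_sites"):
#     print(f"review {page}: matched {sorted(hits)}")
```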

The UEJF has also lodged a criminal complaint against the authors of the Nazi site, even though their identity is unknown.

On 24 May 2000, the Nanterre High Court handed down its judgment in this case. The anonymous author had been identified through the application of the usual rules of judicial procedure, and the Court did not consider that Multimania was in any way to be held responsible, taking the view that the host provider had respected its general obligation to exercise due caution and diligence.

The Court considered that carrying out a search for precise key words could require “a specialised culture which the host provider cannot be held responsible for not possessing” and, recognising that the profession was also subject to human limitations, called for a sharing of knowledge and experience between organisations devoted to combating incitement to racial hatred and Internet providers.

Likewise, in its judgment of 8 June 2000, the Court of Appeal of Versailles set aside the first-instance judgment delivered by the Nanterre High Court on 8 December 1999 in the case between Mrs Lacoste and the Multimania company. The Court of Appeal noted that “the obligation incumbent on the host provider to exercise due caution and vigilance as regards the sites which it hosts is an ‘obligation of means’” and “does not imply a general and systematic examination of the contents of the sites which it hosts”.

Italy
A judge27 found that an author was liable in respect of defamatory statements and made an interlocutory order that the interference be brought to an end. This decision leaves open the question of any liability on the part of the technical intermediary, although it points out that in this case liability was excluded by the contract.

According to another decision, however, a technical intermediary who merely provides access to the network and space on its server for the publication of information services by the provider of information is not liable for any breach of copyright by the latter28.

This part on the liability of the host may be summarised as follows:

1) The host is not automatically and systematically liable in respect of the illicit contents hosted.

2) However, judicial decisions tend to find liability (civil or criminal) on the part of the host where the latter is aware of the contents in issue.

3) In addition, certain countries (France, in particular) intend to impose an obligation on the host to exercise diligence which requires him to show that he censors the information which he accommodates.

3.2.3. The liability of the access provider

3.2.3.1. Liability of the access provider for aiding and abetting offences

As a simple intermediary between user and host, the access provider is in principle unable to check the millions of items of information which circulate on the network and are frequently altered. He should therefore not be held criminally liable unless the mental element of the offence can be established, since he merely provides a simple connection service.

In order to be guilty of aiding and abetting, the access provider must therefore have actually participated in the criminal act and there must be a link of causality between the activity of the accomplice and the commission of the offence by its author. It should also be shown that the access provider intended to participate in the offence.

A number of theoretical questions arise: can the provision of access to the Internet be seen as actual participation in the offence such as to render the access provider liable for aiding and abetting? Can the access provider’s intent to participate be established merely from the fact that he disseminates documents of whose unlawful character he is not aware? The access provider cannot be expected to examine all the information which he disseminates and determine whether it is lawful. The essential issue is whether an access provider who becomes aware that illicit information is circulating by means of the facilities which he provides has the technical and legal resources actually to prevent the unlawful information from being received on the local network which he controls. He has two options: he can either block access to the information or filter the information to ensure that it cannot be consulted, and both of these operations are aimed solely at the surfers on the network which he controls.
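As a purely illustrative sketch of the “blocking” option described above, the fragment below shows how an access provider might refuse its own subscribers’ requests for hosts on a blocklist. The blocklist, host name and function are assumptions made for the example, not a description of any provider’s actual practice.

```python
# Hypothetical sketch of blocklist-based filtering at an access provider:
# requests for listed hosts are refused for the provider's own subscribers
# only; the remote server itself is left untouched.

BLOCKED_HOSTS = {"example-racist-site.invalid"}  # illustrative entry only

def is_request_allowed(requested_host):
    """Return False if the requested host is on the provider's blocklist."""
    return requested_host.lower() not in BLOCKED_HOSTS

# Example use (host name invented):
# if not is_request_allowed("example-racist-site.invalid"):
#     print("403: access to this site has been blocked by your provider")
```

The sketch also illustrates the limit noted in the next paragraph: the measure affects only surfers on the provider’s own network and has no effect on a server situated abroad.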

It must be emphasised that the access provider has no means of taking action in respect of a server situated abroad which hosts the illicit contents.

A German decision29 answered some of these questions by clearing the access provider:

Germany
By judgment of 8 December 1999 the Landgericht, Munich acquitted the director of CompuServe GmbH, Felix Somm, of providing access to paedophile contents.

Mr Somm was charged with facilitating consultation of paedophile newsgroups (of the “alt.sex.pedophilia” type) by providing access to the news server of CompuServe Inc. In spite of the fact that these newsgroups were hosted in the United States by CompuServe Inc., he was convicted at first instance by judgment of the Amtsgericht Munich of 28 May 1998 and given a suspended sentence of two years’ imprisonment.

The Landgericht set aside the judgment delivered at first instance and confirmed the principle that access providers are not liable for the illicit content to which they provide access.

This principle was already established in the legislation in force in a number of countries (in particular Article 5(3) of the German Teledienstegesetz of 13 June 1997 and Section 512(a) of the United States Digital Millennium Copyright Act of 28 October 1998) and in an international convention (cf. the Joint Declaration concerning Article 8 of the WIPO Treaty on copyright of 20 December 1996) and in the amended proposal for the European Directive on certain legal aspects of electronic commerce of 1 September 1999 (Article 12).

The facts of the case were as follows:

 

The American company CompuServe Inc. hosted on its news server a number of newsgroups with a paedophile content. The German company CompuServe Information Services GmbH allowed German subscribers to access these newsgroups at reduced connection fees; CompuServe Inc. was the only company with contractual links with the German subscribers. Following a search (on 22 November 1995) the German State Attorney’s Office informed Mr Somm of the existence of the illicit newsgroups and sent him an initial list of five of them. Since CompuServe GmbH did not have the technical means to cut off access to the newsgroups, Mr Somm immediately sent the first list to CompuServe Inc. and asked it to cancel the offending newsgroups; on 29 November 1995 the State Attorney’s Office established that these newsgroups were no longer accessible. On 8 December 1995 a second list of 282 paedophile newsgroups was sent to Mr Somm, who again immediately forwarded it to CompuServe Inc. and asked it to cut off access to these newsgroups; between 22 December 1995 and 13 February 1996 CompuServe Inc. cancelled access to the majority of them. On 16 February 1996 CompuServe Inc. informed the press that it considered it was no longer required to intervene, since CompuServe Inc. and CompuServe GmbH now made available to their customers a control tool called “Cyber Patrol-Parental Control”, also available in German, which allowed subscribers themselves to censor access to the newsgroups of their choice. Since then new unlawful newsgroups had become accessible, and proceedings were initiated against Mr Somm.

 

On the thorny issue of aiding and abetting, the Landgericht decided that Mr Somm had not aided and abetted the offences. It took the view that this offence was conditional upon proof of misconduct on Mr Somm’s part and that in the instant case such misconduct could only result from the two following omissions:

 

The fact that Mr Somm had not reiterated his request to CompuServe Inc. to cut off access to the news groups in question was irrelevant. On this point, the Landgericht considered that such a step had no prospect of success, in view of the contrary official position (in the press) adopted by CompuServe Inc. Mr Somm was therefore not guilty of misconduct by failing to pursue the matter with the parent company.

 

The Landgericht also considered that Mr Somm should be acquitted pursuant to Article 5(3) of the German Teledienstegesetz of 13 June 1997, which provides:

   

"Providers shall not be responsible for any third-party content to which they only provide access. The automatic and temporary storage of third-party content due to user request shall be considered as providing access".

The crucial point is therefore the access provider’s knowledge of the illicit content of the information conveyed through his intermediary; this knowledge is held to be culpable if nothing is done to put an end to the interference.

Switzerland
The Federal Court has not yet had the opportunity to rule directly on the criminal liability of access providers in respect of the content of the information which they transmit.

However, legal commentators, and a group of experts from the Federal Justice Office, have taken the view that the decision which established that the PTT manager was liable (cf. 3.2.2 above) could be applied by analogy to access providers30. Access providers could be held accountable for the illicit publications which they convey through their access points. In its legal opinion of 24 December 199931, moreover, the Federal Justice Office concludes that even simple access providers might be liable as accessories if the author could not be prosecuted (Articles 22 and 322 bis of the Criminal Code)32. Owing to the distance between provider and author, criminal liability could only be considered within narrow limits; it would mean, in particular, that the access providers had been made clearly aware of the illegal content by a criminal prosecuting authority.

3.2.3.2. Access providers and civil liability

The essential function of the access provider is that of provider of technical services, responsible for connecting its subscribers with sites or other users. In the case of a purely technical activity, an access provider should incur civil liability only where he is aware of or able to control the information complained of.

France
On 15 March 1996 the Union of Jewish Students of France (UEJF)33 lodged an application for interim measures against nine French Internet access providers (Calvacom, Eunet, Axone, Oléane, CompuServe, Francenet, Internetway, GIP Renater and Imaginet) on the ground that these service providers were allowing their customers to access negationist servers and messages. The UEJF requested the court to order the respondents to prevent their customers from accessing messages and servers which did not comply with Article 24 bis of the Law of 1881 (as amended by the Law of 13 July 1990), and to pay a fine if they failed to comply with the order.

The court made an interim order on 12 June 1996 and, taking note of various ethical commitments given by some of the parties, rejected the UEJF’s application, on the ground that:

 

“... an access provider is under no legal obligation to regulate the information available on the network, whether this information can be consulted by its customers or whether it is transmitted by them, since the authors alone are liable in respect of such information”.

In a court decision of 22 May 2000, a French court enjoined the American access provider Yahoo! to take measures to “make it impossible” for French Internet users to gain access to its auction site offering Nazi objects for sale. The judge of the Paris Regional Court (Tribunal de grande instance), sitting in chambers to deal with matters of special urgency (juge des référés), gave Yahoo! a deadline of two months to present technical proposals as to how the problem might be resolved, commenting that this auction was an “insult to the collective memory” of France.

The Californian company Yahoo! Inc was brought before the courts by the Ligue internationale contre le racisme et l’antisémitisme (Licra) and the French Union of Jewish Students (UEJF). These two associations requested at the hearing of 15 May 2000 that “the necessary measures be taken to prevent, throughout the whole of the French territory, the exhibition and sale on this site of Nazi objects.”

The judge was of the opinion that “in allowing this site to be viewed in France, Yahoo! is committing an offence on French territory, even if this was not the intention.” “Yahoo! is in a position to identify the origin of calls, which should allow it to deny French Internet users access to view this site”, the judge concluded.
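As a hedged illustration of the kind of measure contemplated by the judge, the sketch below denies access when the caller’s IP address resolves to France. The `country_of` lookup is a placeholder for whatever geolocation source a provider might choose (none is specified here), and the accuracy limits of such techniques were precisely what the court asked Yahoo! to report on.

```python
# Hypothetical sketch of denying French users access to a given page,
# as contemplated in the Yahoo! interim order. The geolocation lookup
# is a placeholder: no real database or service is assumed here.

def country_of(ip_address):
    """Placeholder: return an ISO country code for `ip_address`.

    A real implementation would query a geolocation source; here we
    simply return None to signal that the origin is unknown.
    """
    return None

def may_view_auction(ip_address, blocked_countries=("FR",)):
    """Return True unless the caller appears to come from a blocked country."""
    country = country_of(ip_address)
    if country is None:
        return True  # unknown origin: a policy choice, not a technical given
    return country not in blocked_countries

# Example use (address invented):
# print(may_view_auction("192.0.2.34"))
```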

To summarise this section on the access provider: owing to its essentially technical functions, an access provider should not bear civil or criminal liability unless he is aware of and able to block access to the illicit contents.

3.3. Legislative solutions and measures in the process of preparation 

3.3.1. Legislation

Two countries, Germany and Austria, have enacted legislation on the liability of technical intermediaries. These legislative approaches are favourable to a purely technical intermediary and preclude any provision for automatic liability; instead, they prefer liability to be established a posteriori, on a case-by-case basis, depending on knowledge of the content and the means of controlling it.

Germany
Following the judgment at first instance concerning CompuServe, Germany legislated by promulgating the law on information and communications services (Informations- und Kommunikationsdienste-Gesetz)34 of 22 July 1997, thus generally defining the liability of a service provider for illegal contents. According to paragraph 5 of that law, liability is on a graduated scale and depends on the extent of knowledge of the illegal content:

 

"§ 5: Responsibility

 

(1) Providers shall be responsible in accordance with general laws for their own content, which they make available for use.

 

(2) Providers shall not be responsible for any third-party content which they make available for use unless they have knowledge of such content and are technically able and can reasonably be expected to block the use of such content.

 

(3) Providers shall not be responsible for any third-party content to which they only provide access. The automatic and temporary storage of third-party content due to user request shall be considered as providing access.

 

(4) The obligations in accordance with general laws to block the use of illegal content shall remain unaffected if the provider obtains knowledge of such content while complying with telecommunications secrecy under § 85 of the Telecommunications Act (Telekommunikationsgesetz) and if blocking is technically feasible and can reasonably be expected."

The distinction which German law draws between the various functions of the provider is similar to the Anglo-Saxon distinction between access provider and content provider. The provider may be a mere technical intermediary whose sole function is to provide access to information on the web.

The provider may also be the person who “uses” foreign contents and processes foreign information in any way whatsoever.

Thus a “provider” who merely conveys the contents is not responsible for them (3).

A provider is jointly liable in respect of illicit contents where he is aware of them, if he is technically able to block them and can reasonably be expected to do so (2).

A provider is fully liable in respect of contents of which he himself is the author. The illegality of the content is determined according to criminal law.

However, a provider is under no general duty to carry out a preventive control of the content accessible to his customers.

Austria
Similarly, Austria has provided, in Article 75 of the Law on Telecommunications35, that an access provider is not liable36 as a technical intermediary. However, an access provider can be reasonably expected to block access to sites whose content he knows to be illegal, and he himself will incur criminal responsibility if he fails to do so.

Because of his function, the “service provider” runs the same risk. He can be expected to carry out reasonable monitoring procedures in the form of specific controls which are not beyond his financial means. He is not held liable if he can show that he followed these procedures; but he will be held liable if it can be shown that he did not effect any control.

Italy
The only rule to refer to the “telematic” dissemination of illicit contents is Article 3 of Law No 269 of 3 August 1998 on the sexual exploitation of minors (which provides that anyone who “by any means, including by telematic means, distributes, disseminates or makes public pornographic material or distributes or disseminates messages or information for the purpose of attracting or sexually exploiting persons under the age of 18 years” is guilty of an offence). This provision is drafted in very broad terms and might be interpreted as extending liability to all technical intermediaries. It has attracted much criticism for that reason; a Bill providing that the distribution of pornographic material concerning minors is an offence only if it is done “consciously” was filed at the Senate on 14 January 1999.

3.3.2. Measures in the process of preparation

France
In France an initial Bill37 provided that companies hosting websites would not be liable unless “they themselves have contributed to creating or producing the offending content” or if “after being ordered to do so by a judicial authority they have failed to take prompt action to prevent access to this content, provided that they store it directly”. This text, introduced by the National Assembly, dealt with civil liability.

However, the Senate opted for quite a different solution, involving criminal liability. The text adopted by the Senate extended the cases in which access providers are liable. Thus they may be prosecuted if they have participated in creating or publishing the illicit contents or if they are initially responsible for transmitting the contents or for making them available.

They may also face prosecution if they refuse to reveal the identity of the authors or publishers of these contents to “third parties who show that they have a legitimate interest”.

A further innovation is that the Senators imposed an obligation to exercise diligence “to recognise and not to interfere with the technical measures put in place by owners of intellectual property rights to enable the works or recordings transmitted to be identified or protected”.

Following a second reading, the National Assembly on 22 March adopted Article 1A of the Bill on audiovisual media concerning the liability of Internet access providers or hosts. It sets out three cases where these service-providers may be liable in respect of the content itself and not merely for the breaches of the rights of others resulting from the content:

1) They may face prosecution if they have contributed to the creation or production of the documents in issue.

2) They will also risk prosecution if they have failed to take prompt action to prevent access when ordered to do so by a judicial authority.

3) Finally, these service providers will be liable if “they have been notified by a third party who considers that the content which they host directly and permanently is illicit and harmful to that person and have failed to act with due diligence”.

In the course of the debate it was stated that “appropriate measures” meant bringing the matter before a judge, by application for an interim order or by the normal procedure, and forwarding the complaints received to the author of the content so that he could alter it.

Unless the Senate adopts this text in the same terms, it will have to be given a third reading before the National Assembly before a final vote is taken.

Finally, after several debates, the French National Assembly adopted on 16 June 2000 a draft law on freedom of communication to amend the law of 30 September 1986. According to this law, host providers or editors, whether their services be free or fee-based, will in future be held responsible under criminal and civil law for site contents if, after being approached by the judicial authorities, they fail to put in place appropriate measures to prevent access to the sites in question. They will also be held responsible if, after being approached by a third party who considers that the contents they host are illegal or prejudicial to that third party, they have not carried out the appropriate checks.

Belgium
In an opinion of 28 March 1997 the Supreme Council for Audiovisual Media of the French Community expressed its preference for liability in accordance with the general law rather than cascade liability. By way of example, the Council considered “that an access provider who cannot exercise any a priori control over the Internet resources should not be concerned by the fact that he has omitted to exercise such a control”38.

3.3.3. The particular case of the European Union and the United States

3.3.3.1. The European Union

In view of the commercial stakes associated with the Internet, the European Union is currently considering a proposal for a directive on certain legal aspects of the services of the information society, and in particular electronic commerce, in the internal market39. The proposed directive seeks to establish a legal framework to ensure the free movement of the services of the information society between Member States, but not to harmonise the sphere of criminal law as such.

In the case of host services (Article 14), the proposed directive establishes an exemption from liability for a service provider who stores information, provided that:

 

“... the provider does not have actual knowledge that the activity or the information is illegal” or
“ ... the provider, upon obtaining such knowledge, acts expeditiously to remove or to disable access to the information”.

In Article 15 the directive proposes that providers should be under no general obligation to monitor the information concerned.

The directive is at present being debated.

3.3.3.2. The United States: Liability of Internet Service Providers (ISPs) and Internet Access Providers (IAPs)

Prior to the adoption of the Communications Decency Act (“CDA”), the development of American case law had led to a seemingly paradoxical situation concerning the liability of providers. Where the provider exercised little or no editorial control over the content it provided, the provider would not be liable unless it knew or had reason to know that such content was defamatory40, whereas a provider who exercised such control would be acting as a publisher and, as such, would be liable for any defamatory content41. An operator which assumed responsibility for at least attempting to keep defamatory or offensive material from being posted was liable as a publisher for defamatory postings, but an operator which made no such attempt escaped publisher liability42.

In order to encourage self-regulation within the industry, Congress specifically addressed this situation in the CDA by exempting access providers from liability for providing access or connection to or from a facility, network or system not under their control43 and by providing that service providers may not be held liable on account of any action voluntarily taken in good faith to restrict access to or the availability of material that the provider or user considers to be objectionable, whether or not such material is constitutionally protected44. Although other sections of the CDA have been declared unconstitutional, the provider defences to liability remain in force and the courts have interpreted these sections as providing broad immunity to providers45.

This immunity is not unlimited in every area, however. With respect to copyright and trademark rights, the Digital Millennium Copyright Act prescribes specific actions that an Internet provider must take, after it has been informed of a possible copyright infringement being carried out on its service, in order to avoid liability. Unlike in defamation cases, the requirement of monitoring content is deemed to be bearable in this context46.

3.4. Laws on the press/criminal responsibility 

The press is generally governed by its own legal regime, especially in respect of liability for the editorial content. Liability is exclusive, the idea being that only one person is to be held responsible47. The question therefore arises whether these laws on the press and the particular types of liability which they establish also apply to services offered by the Internet other than private correspondence services (e-mail).

Italy
Criminal law on the press:
Articles 57 and 57 bis of the Criminal Code govern criminal liability for offences committed by means of the press. In the case of the periodical press, the editor or deputy editor is liable if he has failed to monitor the periodical sufficiently to prevent the commission of offences. The penalty is that laid down for the offence in question, reduced by one third (Article 57). In the case of the non-periodical press, the law provides, in the same conditions as Article 57, that the publisher is liable, or the printer if the publisher is not indicated (Article 57 bis).

The question therefore arises as to whether these provisions can be applied to publications on the Internet and, if so, who is liable. There are no criminal decisions on this point. Thus far, however, legal commentators refuse to treat the Internet in the same way as the press48.

France
A number of existing laws apply to the press: the Law of 1881 on the freedom of the press, which is now applicable to “all audiovisual communications media”49 and the Law of 1986 on freedom of communication50. Although these laws do not refer expressly to the Internet, the courts have not been slow to classify the Internet as audiovisual communication and to apply the provisions on liability to it51.

French law has established cascade liability: in the event of a press offence52, proceedings are first brought against the editor and, in the alternative, the author and then the producer.

Legal commentators and case-law are divided as to whether this liability should be applied to the Internet. Thus in its report concerning the Internet53 the French Council of State accepted that

 

“editorial liability should be maintained in respect of relevant matters, i.e. the publication of the contents, but a system of liability in accordance with the general law should be retained for all other functions exercised on the network and in particular the functions of technical intermediation or website design”.

The French courts do not appear to have followed the line recommended by the Council of State, however. Thus the Criminal Chamber of the Court of Cassation delivered a judgment on 8 December 1998 in which it relied on Articles 93-2 and 93-3 of the Law of 29 July 1982 on audiovisual communications and held that the person described as the producer bore criminal liability. The person in question had opened a telematic service, “36-15 Renouveau”, a veritable discussion group, and had then been prosecuted after two anonymous (racist) messages had been disseminated on this site. The lower courts had acquitted him on the ground that since he had no control over the messages disseminated he could not be regarded as the producer. However, the Court of Cassation considered that, by taking the initiative to set up an audiovisual communications service for the purpose of exchanging political opinions, the accused knew beforehand which themes would be dealt with, actually stored the information found there and was required to ensure that the statements did not go beyond the context of the forum. He could therefore be prosecuted as producer without being able to plead that he had no knowledge of the contents.

 

“... having taken the initiative to set up an audiovisual communications service for the purpose of exchanging opinions on pre-defined themes, Mr R. could be prosecuted in his capacity as producer and could not plead in his defence that he had not monitored the offending messages.”

This judgment is an interesting application of cascade liability in respect of a telematic service.

However, the question whether a host, whose role is more remote and who has no influence over the content of the offending site, may be held liable in respect of its content remains open.

As regards the access provider, the Puteaux District Court held in a decision of 28 September 1999 that “the director of an audiovisual communications service is the person who can exercise control before publication, the person who has control of the content of the service”, and concluded that the access provider did not have editorial responsibility.

Finally, in a specific case involving the dissemination of racist statements, the Paris Court of Appeal, in a judgment of 15 December 1999, applied the Law of 1881 on the press and held that the changing of the address of an Internet site constitutes “a fresh publication” within the meaning of that Law. Accordingly, the three-month period provided for in Article 65 of the Law of 1881, after which the prosecution of offences committed by means of the press becomes time-barred, begins to run from the date of the change of address. The fact that the content of the site at the new address is the same as that at the original address does not mean that the prosecution is to be regarded as time-barred54.

The specific function of the Internet, with its vast amount of information accessible via hypertext links, rests not on control of the content but on an increased ability to consult and access it. The network is therefore far removed from the classic publication of contents, which makes it difficult to establish a single regime of liability based on the cascade principle.

Switzerland / liability by default
The recent revision of the criminal law on the media, which entered into force on 1 April 1998, limited the subsidiary responsibility of the network operator (based on Article 322 bis of the Criminal Code, see 3.2.3.1, note 78, above, which raises the question of the intentional element of the offence) to the sole case where liability cannot be attributed to another person, in particular because the author of the publication cannot be discovered or is abroad55. This may pave the way for liability “by default” for access providers but gives rise to a certain amount of controversy56.

However, this point of view may be tempered where racist matters are concerned. In a recent decision concerning a bookshop which had disseminated works of a racist nature, the Federal Court57 refused to give the accused the benefit of Article 27 of the Criminal Code because the application of that provision would produce a result contrary to the aim pursued by the law.

 

“Where a criminal provision is designed to prevent the publication of certain statements or to prohibit illicit publications, to allow those responsible for such publications to benefit from a special arrangement would amount to deviating from the aim pursued by the legislature”.

In this case the Swiss retailer of certain racist and revisionist works, whose known author (R. Garaudy) had already been convicted abroad in respect of the same publications, was acquitted at first instance of the charge of disseminating racist and revisionist statements in application of Article 27, in accordance with the following reasoning: since the author of the book had already been convicted, all those whose responsibility was merely subsidiary to that of the author should escape punishment, a fortiori a retailer, even though there was no specific reference to retailers in Article 27 of the Criminal Code.

The Federal Court rejected that argument, annulled the decision and remitted the case to the cantonal court. It delivered what in our view is a rather political decision which might be seen as a warning to potential disseminators of racist material: Article 27 of the Criminal Code will not allow a hateful statement to be spread with impunity.

Returning to the problem of the Internet, the criminal law on the media (Article 27 of the Criminal Code and liability by default) is therefore not applicable in cases of racial discrimination, hard pornography and the depiction of violence. According to the legal opinion of the Federal Office of Justice referred to above, the situation existing before the entry into force of the criminal law on the media prevails: access providers could therefore be punished for aiding and abetting the main offence.

It emerges from these cases that the transposition of the laws on the press and the media, together with their privileges and special features, to the Internet is in our view inadequate in the light of the number of actors involved on the web and the lack of clarity as to their roles. If cascade liability is to be envisaged, there should be specific provisions to that effect, and the task of each of the persons involved and the liability associated with those tasks should be clearly defined.







_______________

1. http://www.csa.fr/html/declar.htm. This site provides all the necessary information concerning a declaration to open a website.
2. The summary of the decision is taken from the site http://www.legalis.net/net/, archives for September 1999. This site regularly places case-law and commentary concerning the Internet on line.
3. The judgment and commentaries may be found at: http://www.droittechnologie.org/2.asp?month=1&year=2000
4. See Medialex, 2000, p. 7.
5. This proposed article provoked an outcry from the trade organisations both in France (e.g. AFA, the French association of access providers) and abroad. See the commentary by EuroIspa (http://www.euroispa.org): "Ironically, this law may have exactly the opposite effect from its perfectly honourable intentions. It could force French web authors into foreign jurisdictions and make it impossible for French plaintiffs and judges to obtain information on a French web author without recourse to international judiciary cooperation. The message to members of the French Parliament is simple. You should work with ISPs to provide maximum protection for all French citizens, not introduce a law which moves illegal content outside French jurisdiction, hurting French industry in the process."
6. On the problems of links, see Droit de l’informatique et des télécommunications 99/3, pp. 6 to 21, “L’utilisation des liens hypertextes, des frames ou des meta-tags sur les sites d’entreprises commerciales” (The use of hypertext links, frames or meta-tags on the sites of commercial undertakings), by C. Curtelin. It is also possible to consult the site www.jura.uni-tuebingen.de/ and search for “Stefan Bechtold”; legal commentary and case-law on hypertext links can then be found there.
7. See the response of the German Government (Drucksache 13/7757 of 22 May 1997) on the closing of racist sites when the “radikal” case was denounced by the Greens. See, in France, the remarks of alternB, the French host, on website http://www.internet.gouv.fr: joint regulation of the Internet, the position of the trade. “In respect of the 40,000 sites hosted free of charge, I receive an average of one complaint per day by e-mail, one registered letter per month and one judicial complaint every two months. Now, in order to avoid being overwhelmed by procedures, I destroy everything complained of which I consider obviously illegal or contrary to the charter of the service. I am therefore compelled to be judge of the evidence”.
8. In Medialex, 2/99 p. 106.
9. Bayerisches Oberstes Landesgericht, Decision of 11 November 1997, NJW 1998, p. 1087.
10. Oberlandesgericht Frankfurt a.M., Decision of 18 March 1998, NStZ 1999, p. 356.
11. See details of the case on http://www.juriscom.net/elaw/e-law11.htm
12. A summary of the decision can be found on: http://www.droit-technologie.org/2_asp?actu_id=1877271291&month=2&year=2000
13. Landgericht Berlin, 17.3.1998, NJW-RR 1998, p. 1634.
14. Decision published in: Diritto dell’informazione e dell’informatica 1998, p. 807.
15. NJA 1996, p. 79.
16. See the commentary by Per Furberg in Karnov CD-ROM, 1999/2000:1, note 1.
17. Furberg in Karnov CD-ROM, 1999/2000:1, note 10.
18. On aiding and abetting, see the article by Sébastien Canevet, “Fourniture d’accès à l’Internet et responsabilité pénale” (Provision of access to the Internet and criminal liability), available at: http://www.canevet.com/doctrine/resp-fai.htm
19. Cass. Crim. 15 November 1990, Bull. No 388.
20. ATF 121 IV 121
21. "it is irrelevant that he did not intend that the pornographic recordings should be heard by children. He is not charged with having committed the offence as author or co-author. He clearly pursued a different aim, namely the success of mailbox 156; it does not alter the fact that after being informed and given a formal warning by the Vaudois prosecutor, he agreed, by continuing to provide his services, to make a causal contribution to operators who to his knowledge were using this means to commit offences on a regular basis.”
22. Sources: Mr. Neil Stevenson, Community Relations Unit, Home Office [= Interior Ministry], and Detective Chief Superintendent Keith Akerman, Hampshire Constabulary, Chairman of the Computer Crime Working Group of the British Association of Chief Police Officers.
23. http://www.droit-technologie.org/2_1.asp?actu_id=1877271291&month=2&year=2000
24. Estelle Halliday v Valentin Lacambre, TGI, 9 June 1998 and Paris Court of Appeal, 10 February 1999; in this case Mrs Halliday discovered that 19 photographs showing her completely or partly naked were displayed on a web site and sought an injunction against Mr Lacambre, the host known by the name of altern.org
25. Interim order of the Nanterre Regional Court, 31 January 2000, http://www.legalis.net/jnet/
26. Links relating to the article: http://www.multimedia.fr and http://www.uejf.org
27. Order of the President of the Teramo District Court, 11 December 1997.
28. Cuneo District Court, 23 June 1997, in Giurisprudenza piemontese 1997, p. 493.
29. The French summary which follows is taken from the website: http://www.droit-technologie.org/2_asp?actu_id=1475345633&month=1&year=2000
30. Report of the group of experts cited, p. 9. See also I. Cherpillod, Quelques problèmes juridiques liés à Internet (Legal problems associated with the Internet), Plädoyer, 1997 p. 42.
31. See http://www.bj.admin.ch/themen/ri ir/access/intro-f.htm for the link in French.
32. Article 27 of the Criminal Code
Where an offence has been committed and perpetrated in the form of publication by one of the media, the author alone shall be liable, subject to the following provisions.
Where the identity of the author cannot be discovered or where he cannot be brought before a court in Switzerland, the editor responsible shall be liable, pursuant to Article 322 bis. Where there is no editor the person responsible for the publication in question shall be liable, pursuant to the same article.
Where the item concerned was published without the author’s knowledge or against his wishes the editor or, in the absence of an editor, the person responsible for the publication shall be liable as the author of the offence.
The author of an authentic account of public debates or official statements of an authority shall not be liable to any penalty.
Article 322 bis of the Criminal Code
The person responsible within the meaning of Article 27(2) or (3) for a publication constituting an offence shall be liable to a term of imprisonment or a fine if he deliberately did not oppose publication. Where he acted negligently he shall be liable to a short term of imprisonment or a fine.
33. Paris Regional Court, 12 June 1996, Réf. 53061/96
34. BGBl. I p. 1870, in force since 1 August 1997.
35. Telekommunikationsgesetz, BGBl. 1997/100
36. See Die Haftung des Providers, Arbeitsunterlage und Diskussionsgrundlage für die ISPA-Sitzung vom 13. Oktober 1998 (The liability of the provider, working paper and basis for discussion for the ISPA meeting of 13 October 1998), p. 7 et seq.
37. On a proposal from Deputy P. Bloche. The French Parliament is currently examining a Bill amending the Law of 1986 on freedom of communication, certain provisions of which lay down the conditions under which technical intermediaries on the Internet will be liable.
38. Opinion referred to in the Report of the French Council of State, The Internet and digital networks, 1998, p. 111.
39. See the common position adopted by the Council with a view to adopting a Directive of the European Parliament and the Council on certain legal aspects of information society services, in particular electronic commerce, in the internal market ("Directive on electronic commerce"), 14263/1/99 REV 1.
40. Walter Pincus, “The Internet Paradox: Libel, Slander & the First Amendment in Cyberspace”, 2 Green Bag 2d 279 (Spring 1999), discussing Cubby, Inc. v. CompuServe Inc., 776 F. Supp. 135 (S.D.N.Y. 1991).
41. Stratton-Oakmont, Inc. v. Prodigy Services Co., 1995 WL 323710 (N.Y.Sup. 1995).
42. Pincus, op. cit. at 282, quoting Douglas B. Luftmann, “Defamation Liability for On-Line Services: The Sky Is Not Falling", 65 Geo. Wash. L. Rev. 1071 (1997).
43. 47 U.S.C. § 223(e).
44. 47 U.S.C. § 230. This is sometimes referred to as the “good Samaritan defense”.
45. Zeran v. America Online, Inc., 129 F.3d 327 (4th Cir. 1997), cert. denied, 524 U.S. 937 (1998) (AOL not liable for failure to remove defamatory messages after repeated requests by the victim); Blumenthal v. Drudge, 992 F. Supp. 44 (D.D.C. 1998) (AOL not liable for the contents of a report it paid the author to produce); Doe v. America Online, Inc., 718 So. 2d 385 (Fla. Ct. App. 4th Dist. 1998), review granted, 729 So. 2d 390 (Fla. Sup. Ct. 1999) (CDA pre-empted state statutes and shields AOL from liability for the sale by one of its members of child pornography through a “chat room” despite notice to AOL of such sales).
46. Pincus, op. cit. at 287.
47. See D. Barrelet, Droit de la communication, Berne, 1998, p. 330. The author explains the origins of this special liability known as cascade liability, formerly envisaged as a way of allowing the publication of anonymous articles and avoiding lengthy and complicated proceedings.
48. None the less, in a decision in civil proceedings (unfair competition) the owner of an Internet domain name was assimilated to the proprietor of a newspaper (or a radio or television station) and held to be under an obligation to exercise diligence and, consequently, to have direct civil liability: Naples District Court, 8 August 1997, published in Diritto dell’informazione e dell’informatica, 1997, p. 970, and in Giustizia Civile, 1998, p. 259. The owner of a domain name is therefore liable in civil law for unlawful acts committed as a result of the content of pages placed on the site which he operates; he is under an obligation to check diligently whether the distinguishing mark belongs to the person inserting the relevant pages and to monitor the content of the message in order to ensure that the advertisement is clear, truthful and accurate. This principle applies even where the owner of the domain name is only involved with the technical maintenance of the site and the creation and management of the pages placed on the network, and the associated commercial negotiations, are entrusted to another person.
49. According to the legislative amendments introduced by Law No 85-1317 of 13 December 1985.
50. Law No 86-1067 of 30 September 1986 “on freedom of communication”, Journal Officiel, 1 October 1986, p. 11511.
51. Article 93-3 of the Law of 13 December 1985 defines the conditions for the application of editorial liability in audiovisual matters. The offending message must have been fixed (recorded) prior to its communication to the public. Liability is borne primarily by the editor of the publication, then by the author and finally by the producer.
52. The press offences already defined by the Law of 29 July 1881 (Article 23 et seq.) were, in particular: incitement to commit felonies and misdemeanours; incitement to discrimination, hatred or racial violence; and personal offences (defamation, insults).
53. Internet et les réseaux numériques, Report of the Council of State, 1998, is available on the following site: http://www.Ladocfrancaise.gouv.fr
54. Moreover, at first instance the court had taken the view that “the publication results from the renewed intention of the person transmitting it, who places the message on a site and chooses to keep it there or to remove it when he pleases. The act of publication is therefore continuous”.
55. Government explanatory report concerning its proposal to revise the criminal law on the media, Feuille Fédérale 1996 IV, p. 560.
56. J.P. Müller, op. cit., p. 203.
57. ATF 125 IV 206