Upload filters: a danger to free internet content?

The European Union is a political and economic union of 28 member states, which are subject to the obligations and enjoy the privileges of membership. The member countries are therefore characterized by their ongoing efforts to create a common European area with an internal market that promotes scientific and technological development.

What is more, the European Union and the United States are tied not only by the world’s largest bilateral trade relationship, but also by mutual investments, which together form the most integrated economic relationship in the world. Developments within the EU are therefore important building blocks of this relationship. One of the most recent of these developments concerns the European Union’s copyright reform, which has been underway for the past couple of years.

Apart from the ancillary copyright for press publishers introduced in Germany in 2013, Article 17 (formerly Article 13) has also raised a few eyebrows for effectively obliging internet platforms to use so-called upload filters. While supporters of the clause see it as an essential technological step to ensure that copyright in movies, music, and texts is properly enforced, its opponents fear that it will weaken network culture and threaten the right to freedom of expression, with unforeseeable consequences.

We must therefore ask ourselves what upload filters are all about, how they work, where they have already been implemented, and, most importantly, why they cause such heated debates.

The current status: The EU has decided that upload filters will come

Despite widespread protests, the EU Parliament decided on March 26, 2019 to adopt the copyright reform. Shortly before, opponents of the reform tried to sway MEPs by holding numerous public protests: demonstrations took place around Europe the weekend before the vote, and in Germany alone more than 100,000 people took to the streets. The German-language Wikipedia also shut down its encyclopedia for a day in a high-profile protest; visitors who tried to access the site were instead redirected to an information page about the protests. In the end, these actions were fruitless: 348 MEPs voted in favor, 274 against, and 36 abstained.

Article 17, which deals with content filtering, was formerly known as Article 13 and is still frequently referred to by that name. The directive does not explicitly call for upload filters, but its wording leaves little room for other options: platform operators are obliged to check content for copyright infringements before publishing videos, music, or pictures, otherwise the operators themselves are liable for the infringements. Theoretically, it would also be conceivable to check each upload by hand, but critics consider this unrealistic, especially for larger providers like YouTube.

Exceptions apply to online encyclopedias (notably Wikipedia) and other educational offerings, platforms for the development of open source software, and services that have been available for less than three years and generate less than €10 million ($11.25 million, approx.) in sales per year.

Time will tell what providers like Google and Facebook will do. First, the Council of the European Union needs to approve the reform, but this is more or less a formality and is due to take place at the beginning of April. The directive will then have to be transposed into national law; EU member states will have two years to incorporate the reform into their respective legislation.

EU copyright reform: upload filters not mentioned explicitly

Upload filters have long been discussed at the European level, as they could play a role in copyright rules for the digital single market. In July 2018, the European Parliament rejected a corresponding draft law. On September 12th, 2018, a revised draft was put to the vote, stipulating that the obligation to check content for copyright infringement would only apply to large sites, while smaller ones would be spared. Online encyclopedias such as Wikipedia are also to be exempted from the obligation to check content.

The European Commission has also presented another directive in which upload filters play a key role: as part of the fight against terrorism, internet platforms are to be forced to examine all content for terrorist propaganda. This regulation, however, contains no exception for small website operators or open source offerings, meaning the filtering obligation would apply across the board, regardless of a platform’s size.

In the context of upload filters, Article 17 (formerly Article 13) is of particular interest, even though upload filters are not explicitly mentioned in it. The European Parliament does not tell operators of online platforms how to ensure copyright protection, but critics and observers assume there is no other option: according to the draft, platforms must check their content for copyright infringements before publishing anything. Given the enormous volume of data, this is practically impossible without automated upload filters.

In the September 2018 vote, the European Parliament adopted the draft with 438 votes in favor, 226 against, and 39 abstentions.

What is upload filtering?

Upload filters are automated computer programs that scan data either when it is uploaded online or before it is published on a platform, and check it against certain criteria. If content does not conform to the previously defined criteria, there are three possible outcomes: the content is blocked, the upload is rejected, or the content is modified so that it complies with the rules. Upload filters can be installed on individual sites and apps, implemented by web hosts, or run by the user's internet provider. They can be used for the following purposes:

  • Prevention of extremist and criminal content
  • Limitation of false reports, insults, and cyberbullying
  • Filtering of pornographic or violent content
  • Identification of copyrighted material
  • Censorship, in the event of misuse

It is the last point that has recently triggered heated debates about upload filters among members of the European Parliament in the context of copyright law.
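
To make the possible outcomes described above more concrete, here is a minimal, purely illustrative sketch in Python. The BANNED_PHRASES set, the violates_criteria check, and the handle_upload function are hypothetical stand-ins, not any platform’s actual implementation.

```python
# Purely illustrative sketch of an upload filter's decision flow.
# BANNED_PHRASES and handle_upload are hypothetical stand-ins, not a real platform's logic.

from enum import Enum, auto

BANNED_PHRASES = {"example of banned text"}  # hypothetical filter criteria

class FilterAction(Enum):
    PUBLISH = auto()  # content passes the check and goes live
    BLOCK = auto()    # content is blocked from publication
    REJECT = auto()   # the upload itself is refused
    MODIFY = auto()   # content is altered to comply (e.g. audio muted)

def violates_criteria(text: str) -> bool:
    """Check an upload against the previously defined criteria."""
    lowered = text.lower()
    return any(phrase in lowered for phrase in BANNED_PHRASES)

def handle_upload(text: str, policy: FilterAction = FilterAction.BLOCK) -> FilterAction:
    """Publish clean content; otherwise apply the platform's configured policy."""
    return FilterAction.PUBLISH if not violates_criteria(text) else policy

print(handle_upload("harmless holiday video description"))   # FilterAction.PUBLISH
print(handle_upload("Example of banned text in a caption"))  # FilterAction.BLOCK
```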

How do upload filters work?

Upload filters require two essential components: a database of impermissible data stored in the form of hash values, and an algorithm that compares uploads against that database. In the case of the current EU-driven initiative, such a database would contain copyright-protected material.

Fact

Hash values, which are also used to store passwords securely, are strings of letters and numbers generated from source material by mathematical functions. The same source material always produces the same hash value. However, it is not possible to deduce the source material from a hash value.

An algorithm compares the hash values of copyrighted material with those of the uploaded data; a match prevents the file from being uploaded. However, upload filters are not limited to detecting completely identical or very similar files. With the aid of various machine learning methods, they can spot individual components in images, videos, song excerpts, or texts, and even model the underlying content to a certain extent. Take cat images as an example: from a database of cat images, algorithms can learn what a cat looks like and subsequently recognize new cat images that are not yet stored in any database.
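
The comparison step can be sketched in a few lines of Python. This is a minimal illustration assuming a simple blocklist of SHA-256 hash values; the PROTECTED_HASHES set and the sample data are hypothetical, and real systems additionally rely on fingerprinting and machine learning, since a plain cryptographic hash only catches bit-identical files.

```python
import hashlib

# Hypothetical blocklist: hash values of copyright-protected material
PROTECTED_HASHES = {
    hashlib.sha256(b"bytes of a copyrighted song").hexdigest(),
}

def upload_allowed(data: bytes) -> bool:
    """Allow the upload only if its hash value matches no protected work."""
    return hashlib.sha256(data).hexdigest() not in PROTECTED_HASHES

print(upload_allowed(b"bytes of a copyrighted song"))      # False: match found, upload blocked
print(upload_allowed(b"bytes of an original home video"))  # True: no match, upload allowed
```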

Where have upload filters already been implemented?

A blanket legal obligation to use upload filters would be a far-reaching step. However, large internet companies have long been using this technology to check the enormous amounts of data uploaded to their platforms every day.

YouTube

YouTube’s upload filter, Content ID, checks all newly uploaded videos for copyright infringements. If protected content is detected, copyright owners can act in three different ways:

  • Block the video so that it can no longer be accessed
  • Earn revenue from advertisements shown before the video starts
  • Stay informed about the number of views and other corresponding statistics

This method is primarily intended to prevent the unauthorized distribution of movies, shows, songs, and music videos. According to YouTube, the algorithm does work that would otherwise require around 180,000 human reviewers.

Facebook

The world’s largest social network, Facebook, predominantly uses upload filters to spot posts, images, and videos that are violent, offensive, or inappropriate for young viewers before they are published. To combat terrorist and extremist content, Facebook, Twitter, Microsoft, and YouTube rely on a shared database operated in coordination with Europol, the EU’s law enforcement agency.

Microsoft OneDrive

When individual files are uploaded to Microsoft’s cloud, the file hosting service automatically analyzes them using PhotoDNA, a technology predominantly used to combat child sexual abuse material. In 2015, for example, this enabled the German Federal Criminal Police Office to identify an offender with the aid of evidence provided by Microsoft.
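
PhotoDNA itself is proprietary, so the following sketch only illustrates the general idea behind such systems, perceptual hashing, using the third-party Python library imagehash together with Pillow. The file names and the distance threshold are made up, and this is not the technology Microsoft actually uses.

```python
# Conceptual illustration of perceptual hashing with the third-party
# "imagehash" library (pip install pillow imagehash). This is NOT PhotoDNA;
# the file names and distance threshold below are made up for the example.

from PIL import Image
import imagehash

# Hypothetical reference database of perceptual hashes of known illegal images
KNOWN_HASHES = [imagehash.phash(Image.open("reference_image.png"))]

def is_near_duplicate(path: str, max_distance: int = 8) -> bool:
    """Flag an upload whose perceptual hash is close to a known reference.

    Unlike cryptographic hashes, perceptual hashes change only slightly when
    an image is resized, recompressed, or lightly edited, so a small Hamming
    distance still counts as a match.
    """
    upload_hash = imagehash.phash(Image.open(path))
    return any(upload_hash - known <= max_distance for known in KNOWN_HASHES)

if is_near_duplicate("new_upload.jpg"):
    print("Upload flagged for review")
```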

ResearchGate

At the request of various publishers, this social network for scientific publications was forced to introduce upload filters that identify unauthorized secondary publications and plagiarism. The algorithm decides whether a publication is made available only to certain research groups or is deleted in its entirety.

Criticism of upload filters

Although the fight of upload filters against child pornography, extremism, and copyright infringement initially sounds like a cause worth supporting, the technology also carries considerable risks, which opponents of the new EU copyright law never tire of pointing out.

Susceptibility to errors and manipulation

Practical applications show that these algorithms have long been susceptible to errors. For instance, they are relatively easy to trick into letting copyrighted material slip past the filter. What seems even more worrying is that the programs often censor permissible content: algorithms fail to recognize parodies, remixes, or tributes, which are generally permitted under copyright law. Critics therefore speak not only of restrictions on artistic freedom, but also of an end to “meme culture.” This internet phenomenon is often based on recontextualizing copyright-protected images, videos, and songs, and sometimes involves redistributing content that has been completely transformed.

It is also possible to fraudulently claim ownership of material and register it in a database. Content that is not actually protected by copyright could then no longer be distributed until the true rights holder has been determined.

Possibility of censorship

At the same time, upload filters provide a means of pre-censorship and state control over information. If misused, they could facilitate restrictions on the right to freedom of expression and of the press. If databases were filled not with copyrighted material but with unpopular statements and other forms of criticism directed at the state, such views could no longer be freely voiced on the internet. China’s nationwide content filtering gives a taste of what the implementation of such technology can look like.

What is the current debate on upload filters actually all about?

The overhaul of copyright law in the European Union has considerably heightened public awareness of upload filters. Copyright holders such as publishers, film distributors, and the music industry demand better protection of their works on digital distribution channels and want to prevent the unauthorized distribution of their content, which already takes place on platforms such as YouTube.

On the other side of the fence are various associations, internet activists, civil rights campaigners, Wikipedia operators, and critically minded politicians from various parties. Although they support the aims of the proposed act and the protection of intellectual property, they argue that upload filters are the wrong means to achieve them: the filters go far beyond their stated objective, are not yet technically mature, and would pose a threat to freedom of expression.
