Sunday 14 February 2010

Sabam v Scarlet and ISP filters

As already noted on this blog, a reference has been made to the ECJ in Sabam v Scarlet (formerly the Belgian subsidiary of Tiscali). The questions are here. The judgment of 28 January is here - thanks to Cédric Manara.

Sabam (the Belgian Society of Authors, Composers and Publishers) sued Scarlet in 2004 over P2P copyright infringement. In 2007 Scarlet was ordered to adopt a system called Audible Magic to filter illegal file-sharing. Audible Magic didn't work and the order was overturned in 2008.
The Court of Appeal's questions are complex but the internet waits for no man. To save the ECJ time, here's my simplified version, with the excess directives filtered out:

1. Does Article 15 of the E-Commerce Directive prohibit an injunction compelling an ISP to filter P2P communications because it would be 'a general obligation to monitor'?
2. If not, should the Belgian court impose such an injunction on Scarlet?

To save even more time, the ECJ could pass on the second question because it has already provided national courts with a guideline in Promusicae: the Copyright Directive (including its provisions on injunctions) should be interpreted in accordance with fundamental rights (e.g. the rights to property, private life and freedom of expression) and other general principles of Community law (e.g. proportionality). In Sabam the court would balance the copyright owners' property rights against users' rights to privacy and freedom of expression, and then consider the proportionality of any measures (e.g. their effectiveness). Furthermore, Recital 59 of the Copyright Directive says that the conditions and modalities relating to injunctions against digital intermediaries should be left to the national law of the Member States. The ECJ could therefore decline to rule on whether such an injunction should be granted and leave that to the Belgian court.

That would leave just the one question – can ISPs be placed under an obligation to filter?

The answer to this question isn't spelled out in black and white in the E-Commerce Directive. It is a policy question that needs to be answered in the context of today's internet, not that of 2000 when the E-Commerce Directive was drafted. The copyright problem is on a different scale in 2010 (Napster was new on the block in 2000). Privacy concerns about 'monitoring' need to be reviewed in the context of the internet's mass voluntary waiver of privacy via AdWords etc. Concerns about whether ISPs can bear the financial burden of filtering need to be reassessed objectively in the light of the profits ISPs actually make today.

How is the ECJ likely to respond? If this leaked document concerning ACTA is anything to go by, the ECJ will not authorize a filtering obligation – nor is it likely to be created by ACTA (see here). Although the issue in ACTA would be a general statutory obligation rather than a single injunction, as in Sabam, the effect of a single unlimited injunction would be comparable. Interestingly, in the consultation process that has culminated in the Digital Economy Bill, the British government looked at a general obligation on ISPs to impose filters (BERR Consultation on Legislative Options to Address Illicit P2P File-sharing, July 2008, Option A4) but in the Government's response of 29 January 2009, that option had disappeared without trace.

Filtering seems to be deemed politically incorrect but is it really such a dud idea? Currently individual rightsholders must scan the endless vistas of the internet in the hope of spotting an infringement. Then they enter the long-winded, expensive process of removing it, knowing that they are only fire-fighting and that what they catch is just the tip of the iceberg. ISPs, by contrast, have a bird's-eye view of everything.

If a restaurant has a health and safety problem, who is best placed to fix it? The one-off customer who gets food poisoning? Or the restaurant owner who exercises day-to-day control over the restaurant's premises and practices?

12 comments:

Jim Killock said...

Isn't the restaurant owner more akin to Google, as in your Blogger example, than to the ISP, which is more like the owner of the road on which the restaurant is situated? Filtering is easily circumvented by encryption, akin to putting blacked-out windows in your car, which is why it was dropped.

Anonymous said...

Although there have been technical problems with filtering in the past, has it been proven once and for all that it can never work in the future, even if only partially? Presumably just as file-sharing technology advances, so filtering technology is capable of being improved. Just because it is not 100% effective does not mean it should be excluded as one of many tools for reducing copyright infringement. Not everyone can be bothered with encryption anyway.

Stephen Moffitt said...

I am not sure the case for filtering is as clear-cut as has been presented here. As the recent controversies over CView, Buzz and Facebook show, there is not a 'mass voluntary waiver of privacy'. There is still a strong aversion to certain activities, such as filtering, which cross the perceptual line into 'snooping'. Also, there are issues around the effectiveness of filtering since it often identifies legitimate uses. Given this, it seems prudent that there is no obligation to filter.

Anonymous said...

Stephen - I don't know the details of how different filtering systems work - I'm merely opening up the issue for debate - but page 27 of the Belgian judgment suggests that the Audible Magic filtering process isn't very intrusive - e.g. users aren't individually identified. I'm not sure that filtering invades users' privacy any more than the average web page. For example, if you click on the CQ counter on the right of the 1709 web page you can see the IP addresses (and rough geographical locations) of everyone who visits the site...

Unknown said...

Um, this is a fun one. To explain my background, I've worked as a programmer, am a singer/songwriter/recording engineer, and I'm currently working as a writer. I understand how filtering works (and doesn't work) because I've written comparison tools.

OK, let's take a song. We determine a checksum (a number that identifies that particular file) and set up our filter to look for files with that identifier as they cross our network. This is actually quite easy to do.

But what if the checksum changes? Believe it or not, changing the checksum of the file while leaving the song playable is a trivial exercise. We can add the new checksum to the filter once we know about it, but it may be months before we learn of it, as there's so much traffic that we can't play every song transmitted. If a dozen different people all rip the song from CD and all change the checksum, it gets worse. Then imagine that some of the people who download it change the checksum too. Now you have a hundred checksums that you don't know about.

The same basic techniques work with videos and books as well.

Filtering looks good on paper. It's a disaster to implement, because it's so easy to change checksums. Change one byte and the checksum changes. Or, if you want redundancy, change every 100th byte in the file: the overhead is low and you'll get an even greater change in the checksum.
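
To make that concrete, here's a toy sketch in Python (the byte values are placeholders, not a real MP3): flip a single byte that the player ignores and the checksum a filter relies on bears no resemblance to the original.

    import hashlib

    def checksum(data: bytes) -> str:
        """Hex SHA-256 digest of a byte string."""
        return hashlib.sha256(data).hexdigest()

    # Stand-in for the audio data of a ripped song (placeholder bytes, not a real MP3).
    original = b"ID3 fake tag" + bytes(1000)

    # Flip one byte in a region a decoder would ignore (e.g. padding).
    modified = bytearray(original)
    modified[100] ^= 0x01

    print(checksum(original))         # one 64-character hex digest
    print(checksum(bytes(modified)))  # a completely different digest
    # A filter keyed to the first digest will never match the modified copy,
    # even though both files play as the same song.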

Filtering assumes that the original file isn't encrypted. If the file is encrypted, the filter cannot recognize it. To decrypt the file you would need to know the key; if you don't, you can't.
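
Again purely as an illustration (a toy XOR keystream, not real cryptography): once the two peers scramble the payload with a key the filter doesn't have, a byte-pattern or checksum match against the known file finds nothing.

    import hashlib
    from itertools import count

    def keystream(key: bytes):
        """Toy keystream built by hashing the key with a counter (illustration only, not secure)."""
        for i in count():
            yield from hashlib.sha256(key + i.to_bytes(8, "big")).digest()

    def scramble(data: bytes, key: bytes) -> bytes:
        """XOR the data with the keystream; calling it again with the same key undoes it."""
        return bytes(b ^ k for b, k in zip(data, keystream(key)))

    song = b"some recognisable audio bytes " * 100
    fingerprint = song[:32]                     # the pattern a filter would look for

    ciphertext = scramble(song, key=b"shared only between the two peers")

    print(fingerprint in song)        # True:  plaintext traffic is easy to match
    print(fingerprint in ciphertext)  # False: without the key the stream is just noise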

And since encryption technology is so simple to write, requiring a backdoor isn't practical - any teenager with a compiler can build an encryption engine. If you legislate compilers out of existence, you destroy your IT infrastructure, and leave your country open to cyber-attacks.

Oh, and don't try to make MP3 files illegal - I'll sue your ass off. I work for artists who use MP3 as their distribution method, and any attempt to make MP3 files or the transmission of MP3 files illegal will be met by legal action from myself and a multitude of others.

A final note - the IFPI has been crowing about the court case in Italy where ISPs are being forced to block torrent sites. I have heard a rumor that a number of artists who use torrent sites to distribute their material are considering asking the court to remove the block, as they consider it an anti-competitive measure. I don't know if this is true, and of course I have no idea what the Italian courts would think of it if it is.

Anonymous said...

Hugo says "but page 27 of the Belgian judgment suggests that the Audible Magic filtering process isn't very intrusive". Yup, but the court also found that Audible Magic doesn't actually work. Perhaps we could use chocolate cake to filter the internet: like Audible Magic, it doesn't work for that purpose; like Audible Magic, it isn't very intrusive. However, unlike Audible Magic, it tastes nice.

Anonymous said...

Dear Mad Hatter - thank you for this informative comment. Perhaps the White Knight will be able to invent something: you say filtering looks good on paper, so maybe blotting paper will work? How fashionable defeatism seems to have become! Imagine Bletchley Park telling Churchill: 'I'm afraid this encryption is written by a teenager. There's absolutely no way we can crack it.'

Anonymous said...

Anonymous - thank you for the chocolate cake suggestion. I'm not sure about it myself but you definitely get brownie points for lateral thinking. Perhaps the White Knight can give it a try? Failing that, the Mad Hatter would probably appreciate something to soak up all that tea.

Anonymous said...

PS A bit of quick, amateur research suggests to me that encryption isn't a problem anyway:

http://www.ipoque.com/userfiles/file/DPI-Whitepaper.pdf

http://www.internetevolution.com/document.asp?doc_id=178633

Unknown said...

"Dear Mad Hatter - thank you for this informative comment. Perhaps the White Knight will be able to invent something: you say filtering looks good on paper, so maybe blotting paper will work? How fashionable defeatism seems to have become! Imagine Bletchley Park telling Churchill: 'I'm afraid this encryption is written by a teenager. There's absolutely no way we can crack it.'"
Hugo,

The German Enigma device was pattern-based: once you knew the pattern for the day, you could crack every message transmitted that day using that pattern (various organs of the German state used different Enigma systems, so Bletchley Park had to crack each system each day; the Luftwaffe, Kriegsmarine and Wehrmacht each had their own Enigma, and they couldn't read each other's mail). But because it was a pattern-based system, once you knew the basic patterns (the Poles captured an Enigma early in the war and shipped it to England before the fall of Poland), you could, using the Colossus computer, the Bombe and the other devices devised for code-breaking, break that day's pattern, and once the pattern was broken, read everything.

Modern encryption isn't pattern-based. While it's not uncrackable per se, you really need a quantum computer to do it; a normal supercomputer just doesn't have the power. Here's an explanation:

faculty.cs.tamu.edu/daugher/quantumBH2003.ppt

I'm not going to claim I understand it myself - I'm good, but not this good :)

If you have 1 million file-sharers using 1 million public keys, the cost of enough quantum computers to handle cracking the encryption using RSA-129 would be astronomical. Note the comment near the end:

'If you can build a big enough quantum computer, you can crack RSA-1024 (about 300 decimal digits) in your lifetime.'

If RSA-129 becomes crackable, the switch to RSA-1024 will occur, and if RSA-1024 becomes crackable RSA-2048 will be adopted. Note that RSA-2048 isn't merely twice as hard to crack as RSA-1024; it is orders of magnitude harder.
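
As a rough back-of-the-envelope check of that scaling (a sketch using the standard heuristic cost formula for the general number field sieve, the best known classical factoring algorithm, with its o(1) term ignored), doubling the modulus from 1024 to 2048 bits multiplies the classical work by something on the order of a billion, not a small constant:

    import math

    def gnfs_work(bits: int) -> float:
        """Heuristic GNFS cost exp((64/9)^(1/3) * (ln n)^(1/3) * (ln ln n)^(2/3)), o(1) term ignored."""
        ln_n = bits * math.log(2)
        return math.exp((64 / 9) ** (1 / 3) * ln_n ** (1 / 3) * math.log(ln_n) ** (2 / 3))

    ratio = gnfs_work(2048) / gnfs_work(1024)
    print(f"RSA-2048 vs RSA-1024 work ratio: about 2^{math.log2(ratio):.0f}")
    # Roughly 2^30, i.e. around a billion times more classical work,
    # which is why "twice the key length" is nothing like "twice as hard".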

"PS A bit of quick, amateur research suggests to me that encryption isn't a problem anyway"

You are confusing identifying what program is sending the data with identifying the data itself. For example, I downloaded the newest version of Mandriva Linux using BitTorrent yesterday. Since I didn't route this through anything to hide what I was doing, my ISP could tell I was using a torrent client. It could also tell what I was downloading. However, if I had used encryption, what I was downloading would have been unrecognizable. And of course there are other methods of obscuring what you are doing; for example, you could use a darknet, you could use IPREDator, etc.
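
To illustrate the distinction with a crude sketch (a toy check, not any real DPI product): the standard BitTorrent handshake begins with a fixed plaintext marker, so spotting "this is BitTorrent" is easy; but that says nothing about what the payload is, and once the peers use the protocol's encryption the marker isn't visible on the wire at all.

    # The standard (unencrypted) BitTorrent handshake starts with this fixed marker.
    BT_HANDSHAKE_PREFIX = b"\x13BitTorrent protocol"

    def looks_like_bittorrent(first_bytes: bytes) -> bool:
        """Crude DPI-style check on the opening bytes of a TCP stream."""
        return first_bytes.startswith(BT_HANDSHAKE_PREFIX)

    plain_stream = BT_HANDSHAKE_PREFIX + bytes(8) + b"<info hash><peer id>"  # placeholder fields
    obfuscated_stream = bytes([0x8F, 0x02, 0xA7, 0x4C])  # placeholder: encrypted handshake, no fixed marker

    print(looks_like_bittorrent(plain_stream))       # True:  the protocol is identified...
    print(looks_like_bittorrent(obfuscated_stream))  # False: nothing recognisable to match
    # ...and even the True case only says "BitTorrent", not whether the payload
    # is a Linux ISO or an infringing copy of a song.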

As I said above, filtering sounds good on paper; however, it is ineffective. Minor technological changes can render filtering useless until ways of cracking those changes are devised, which can take months or years even when the new P2P technology is open source.

The only effective way to prevent file-sharing over the internet is to shut down the internet. Is society going to find shutting down the internet an acceptable means of stopping file-sharing?

A final note. I know a lot of people who are making more money than ever in the music business. They are doing this by working directly with their customers. At one time, to 'make it' in the business you had to sign with a label. Now the general attitude among artists is that only failures sign with the labels. This may be the reason that CD sales by the labels are down: the most successful artists no longer sign with them.

Anonymous said...

Doesn't the restaurant provide the food?

Unknown said...

Great to see this interest in a small Belgian operator.

In this case there are several things at issue.

The first is the legitimacy of the request to force an operator to filter. I don't think there is a consensus on this. In Belgium a judge ordered the operator to filter; in Ireland, France and the UK there are arrangements by law or by self-regulation.

The second is the protection of privacy and net neutrality. In principle only the IP address will be known. However, in the verdicts and regulations the IP address has to be connected to a customer, which blows away privacy and net neutrality as such.

The third is the technological part, to me always the most fun.
The problem with P2P is more complex than with sites like YouTube, Facebook et al.: P2P requires real-time processing.
There are two approaches:
- in the middle (deep packet inspection and the like)
- as a participant in the P2P network
Only the second has been shown capable of working.
The one based on DPI will not work. First of all, it fails when encryption is used: the traffic can never be decrypted in real time before the exchange completes. Another aspect is sizing; we calculated that the filtering environment becomes (much) larger than the operator's own infrastructure.
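
For illustration only, a minimal sketch of what the participant approach amounts to (the fingerprint function and database here are hypothetical stand-ins, not any real system): the monitoring client joins the swarm like an ordinary peer, receives and reassembles pieces as any peer would, and only then compares the decoded content against a reference database, which is why it sidesteps the real-time decryption problem that defeats DPI.

    import hashlib

    # Hypothetical reference database of fingerprints for protected works.
    PROTECTED_FINGERPRINTS: set[str] = {
        "placeholder-fingerprint-of-some-protected-work",
    }

    def fingerprint(content: bytes) -> str:
        """Stand-in for a content fingerprint; a real system would use fuzzy audio/video matching, not a plain hash."""
        return hashlib.sha256(content).hexdigest()

    def flag_if_protected(pieces: list[bytes]) -> bool:
        """Reassemble what the monitoring peer has downloaded and check it against the database."""
        content = b"".join(pieces)
        return fingerprint(content) in PROTECTED_FINGERPRINTS

    print(flag_if_protected([b"piece one", b"piece two"]))  # False for these placeholder pieces
    # The monitoring peer sees the same decrypted pieces as every other participant,
    # so encryption on the wire is no obstacle; the cost is that it must actually
    # join each swarm and download the content like everyone else.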

Last but not least there is the content itself. Every technology will only work with a database of protected content, and that does not stop at music: we are also talking about movies, e-books and software. How will this be standardized to avoid multiple data formats for every type of content, or to avoid creating a monopoly for the content database (with fingerprints/watermarks)?

This discussion will not be over for a long time, because we are talking about changing a business that has to change, where the stakeholders are fossils (production companies and the people in the middle, like Sabam).

Eric