One of this blogger's favourite readings of all time is the Google Transparency Report [first published in 2012: see here], a most clever and precious source of information about many things, including real-life developments in the world of online copyright.
As 1709 Blog readers will know, the Transparency Report [by the way: have you checked the transparency reports by Apple, Twitter, Facebook and Yahoo! yet?] shows the requests that Google has received from copyright owners and the reporting organisations that represent them (eg the BPI, the RIAA ...) to remove search results [the Transparency Report is in fact limited to Search: it does not include requests for other products, eg YouTube or Blogger] that link to allegedly copyright-infringing content.
Each request names the specific URLs to be removed, and Google lists them in the Report grouped by the domains on which they appear.
Since its first publication, the number of requests received by the Mountain View-based internet giant has grown constantly, at a rather impressive pace.
According to the latest analysis of the Report published by TorrentFreak, while in 2011 Google was asked to remove fewer than 10m URLs, in 2012 the figure rose to 50m, and this year the number of requests exceeded 235m.
This means that the number of requests more than quadrupled from 2012 to 2013 and was over 23 times higher than in 2011. Wow. This is what can really be called "big data".
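For readers who like to see the sums, here is a quick back-of-the-envelope check of those multiples, using the rough figures quoted above (10m, 50m and 235m URLs):

```python
# Approximate URL-removal figures as reported in the post (Search only)
urls_requested = {2011: 10_000_000, 2012: 50_000_000, 2013: 235_000_000}

# Year-on-year and two-year growth multiples
growth_2012_to_2013 = urls_requested[2013] / urls_requested[2012]
growth_2011_to_2013 = urls_requested[2013] / urls_requested[2011]

print(f"2012 -> 2013: x{growth_2012_to_2013:.1f}")  # more than quadrupled
print(f"2011 -> 2013: x{growth_2011_to_2013:.1f}")  # over 23 times higher
```

Naturally these are round numbers, so the multiples are approximate.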
As of December 2012, the average processing time across all removal requests submitted via the web form for Search was approximately six hours [it was 10 hours when the Report was first released ... How long does it take now? Does anyone know?].
This year Google complied with 91% of requests [it was 97% between July and December 2011]. Google decided not to take action in respect of 21m URLs, either because the requests were illegitimate or because they duplicated URLs already submitted in previous notices [the issue of duplicates is also highlighted here].
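The 91% compliance figure and the 21m non-actioned URLs are consistent with one another, as a rough calculation shows (again using the post's approximate totals):

```python
total_urls = 235_000_000   # approximate URLs requested for removal in 2013
not_actioned = 21_000_000  # URLs Google declined to remove

# Share of requested URLs that Google actually removed
compliance_rate = 1 - not_actioned / total_urls
print(f"{compliance_rate:.0%}")  # in line with the reported 91%
```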
UPDATE: A great 1709 Blog friend let us know that you can learn more about Google's copyright notice and takedown process here.
Since search engines are not protected by the Hosting Defence, might it be more efficacious to sue than to keep sending takedown requests?
If the rate of increase in requests continues to grow by a factor of 4 or 5 each year, then this year will see nearly a billion requests and 2015 will see something like 3 to 4 billion takedown requests! That is several million takedown requests every day of the year; I presume that Google must process these requests by some sort of testing algorithm, not by an 'army' of human operators.