Digital Asia News Update
The article explains it, although it’s easy to miss:
Every month, hundreds of thousands of volunteers make decisions about what content to include on Wikipedia, what constitutes a copyright violation, and when those decisions need to be revised. We like it this way — it allows people, not algorithms, to make decisions about what knowledge should be presented back to the rest of the world.
The world should be concerned about new proposals to introduce a system that would automatically filter information before it appears online. Through pre-filtering obligations or increased liability for user uploads, platforms would be forced to create costly, often biased systems to automatically review and filter out potential copyright violations on their sites. We already know that these systems are historically faulty and often lead to false positives. For example, consider the experience of a German professor who repeatedly received copyright violation notices when using public domain music from Beethoven, Bartók, and Schubert in videos on YouTube.
Here’s another article about it from Wikia, which shares Wikipedia’s concerns about the directive:
If passed, the proposed new law appears to require Wikia to verify every edit and upload, and remove any content that appears to be used without permission. They want to make Wikia actively police all content added to our sites.
In other words: this law would require us to censor our communities.
Article 13 of the proposed European Copyright Directive seems to require that companies automatically verify the copyright status of all edits, posts, videos and images. This would be very costly, often unfair, and would restrict freedom of speech and access to information.