
The EU Is Enforcing the Digital Services Act and It Might Help You

According to the Electronic Frontier Foundation,

The DSA overhauls the EU’s core platform regulation, the e-Commerce Directive, and is intended to be an important tool in making the internet a fairer place by setting out new legal responsibilities for online platforms and educating users on why content is removed and what they can do about it. The powers of Big Tech are also reined in as the DSA subjects “very large online platforms (VLOPs)” to comply with far-reaching obligations and responsibly tackle systemic risks and abuse on their platform. These risks cover a variety of aspects, including the dissemination of illegal content, disinformation, and negative impact on fundamental rights. VLOPs also face oversight through independent audits, which will assess whether platforms respect the obligations under the DSA.

https://www.eff.org/deeplinks/2022/12/adoption-eus-digital-services-act-landmark-year-platform-regulation-2022-year

The EU Commission has identified 19 online platforms and search engines that will face new content moderation rules. To qualify as a VLOP, a company must have at least 45 million monthly active users in the EU, so it is very unlikely that you will be subject to these rules.

Unless, of course, you work for Amazon, Facebook, Instagram, LinkedIn, Snapchat, TikTok, or about a dozen others.

These companies need to become compliant by August 2025.

Will these big platforms create two processes – one for Europe and one for the rest of the world? It is certainly possible, but hopefully they won’t. Hopefully.

By that deadline, these platforms will also need to show that they have completed their first risk assessment.

Failure to comply can cost a platform a fine of up to 6% of its global annual revenue. That is a powerful incentive.

Among the features that these platforms will need to put in place is stronger content moderation – something that platforms like Facebook and Twitter don’t want. Some people come to a platform to consume disinformation and others come to lash out against it, but to the platforms it is all engagement, and if that means spewing lies, so be it. If you have any question about that, all you have to do is read the discovery documents from the $787 million settlement Fox paid to Dominion.

Among the practices that are banned: targeting ads at users based on religion, gender, or sexual orientation, and using dark patterns – deceptive web design that tricks people into clicking on things they did not intend to click on.

Stay tuned to see how this plays out. Credit: Computerworld

