Digital Services Act (DSA) – Europe tackling illegal content and personalized advertising
Europe has presented a second major tech law. The Digital Services Act is intended to ensure that EU citizens are better protected online. For example, children should no longer be shown targeted ads, and illegal content should be removed from social media more quickly.
The Digital Services Act (DSA) exists alongside the Digital Markets Act (DMA), and together these laws lay a new foundation for how online platforms should function. The DMA focuses on a level playing field for businesses, while the DSA focuses more on how platforms themselves are set up. As a result, internet users will notice the DSA more in everyday use.
Digital Services Act: No personalized ads for children
Many internet companies use information about users to display targeted advertisements, based among other things on recorded searches and website visits. This form of personalized advertising is going to be overhauled.
Sensitive user data may no longer be used for this purpose. This includes information such as religion, sexual orientation and ethnicity. Minors may not see any personalized ads at all.
Governments can turn to social media
Governments in the European Union can use the new legislation to demand that illegal content be removed from online platforms. This concerns, for example, posts containing terrorist content, child sexual abuse material or hate speech.
Companies will be required to draw up rules describing how they tackle illegal content. Major platforms such as Facebook and Google have already set up their own rules in recent years, but under the DSA this becomes mandatory.
European Commission President Ursula von der Leyen says this tightening will make the internet a safer place: “In practice, what is illegal offline is also illegal online.” Large companies will bear more responsibilities than smaller platforms.
Explanation of removal
Under the new legislation, when illegal content is removed, online platforms and hosting companies must explain why the posts were deleted. In addition, users must always be able to object, and companies must facilitate this in an accessible manner.
Incidentally, the DSA does not state what is illegal and what is not; that depends on the legislation in the individual countries.
More transparency about algorithms
Large online platforms, such as Facebook, must offer more insight into how their algorithms work. For example, it should be clear to users why certain posts on Facebook appear at the top of their news feed. On YouTube, for example, it should be clear why certain videos are recommended.
The companies need to better explain how their algorithms are shaped and what effects they have. Users should also be able to easily turn off personalized recommendations.
Incidentally, platforms are still allowed to display posts in a personalized order. During earlier negotiations, the EU discussed possibly requiring a chronological order of posts on users’ timelines, but that proposal didn’t go through.
Digital Services Act – Entry into force of the law still takes some time
Although the content of the law has been agreed, the details still need to be worked out. If all goes well, the Digital Services Act could come into force in 2024.