With the current pandemic, the time we spend on the Internet has increased. Every day, we are on social networks or we use one of the thousands of on-line services, such as on-line marketplaces or file-sharing platforms. However, the main legislation setting the rules in this area has remained mostly unchanged for more than two decades. The Digital Services Act should help us face the challenges of the present. Together with the Digital Markets Act, it updates the current rules for the digital environment and regulates on-line platforms. How will the new legislation affect our lives? And why should you care?
New times, new challenges
The year 2000 was marked by a number of milestones for Czechia. The electronic signature was approved, the sale of e-books became legal, and the movie “Night Talks with the Mother” became the first domestic film to premiere on the Internet. At the same time, at the turn of the millennium, the e-Commerce Directive, which set the rules for the Internet space in Europe, was adopted.
Since then, a number of new services have emerged, and there has been a boom in social networks and Internet platforms such as Facebook, Google, and Amazon. Even in the Czech digital pond, services such as the Seznam internet portal, the Alza and Mall e-shops, and many others have grown in recent years. The way we communicate with friends, shop, and do business has also changed. The new era brings not only benefits but also new threats. The legislation should therefore be updated to reflect and address the current situation.
What is the Digital Services Act?
The Digital Services Act aims to improve user protection, so that users feel safe in the on-line space and their rights are protected. We should return the on-line space to the hands of the people, as opposed to the technology giants who currently decide what content we can see and what may or may not be published, and who use our personal data to their advantage. Updating the European legal framework should set clear responsibilities and boundaries.
The Digital Services Act answers the questions “Who has the right to delete illegal or questionable content?” and “Should upload filters be used, or is human control necessary?”. In addition, it determines who is responsible for content that is not removed in a timely manner. The Digital Services Act covers the following topics:
- On-line content moderation;
- Responsibility for user-uploaded content;
- Regulation of targeted advertising: increasing the transparency of on-line advertising on Internet platforms;
- Restrictions on the spread of disinformation and illegal content;
- Specific obligations for very large platforms;
- Functioning and transparency of algorithms.
What is and what is not legal should be decided by independent courts
The case of former US President Donald Trump and the suspension of his social network accounts sparked a strong debate about the appropriateness of platform operators suspending accounts and deleting posts. To prevent such situations, the Digital Services Act should define the process for removing illegal content from the public on-line space. At the same time, it should prevent the removal of content that does not violate any rules. We have to follow the rule that what is legal off-line must be legal on-line.
One of the problems with the current Digital Services Act proposal is that it leaves too much of the decision-making to the platforms themselves. I want to work on this during the legislative process. What is or is not legal should be decided by independent courts under the relevant legal framework, not by private companies. Large tech companies often turn to automated tools and remove all potentially illegal content in order to avoid liability, which results in the over-removal of legal content. We also need to update the functioning of the courts, so that they can respond quickly and flexibly, as today’s digital age requires.
Delete it or keep it?
Together with Members of my political group, we published a model law based on the so-called Notice and Action principle, which distinguishes two levels of content. The first is manifestly illegal content, such as child pornography. In the first phase, it would be assessed by the platforms themselves, which would temporarily block it; however, the final decision should again rest with a court. The second level is potentially illegal content, such as potentially illegal threats or hate speech. Such content is by nature context-dependent and would always have to be judged by an independent court.
They follow our every step to better target advertising
A familiar situation: you are on social networks or reading the news, and advertising offers you products that you have recently searched for. Currently, we have no clear idea of what data multinational companies collect about us and exactly what they use it for. We know that attributes such as our location, gender, age, and political affiliation are combined into a precise profile of us, and on that basis it is presumed that we may be interested in buying a particular product or service.
It is necessary to introduce much greater transparency in this area. Every user should be able to find out whose advertisement is targeting them and on what basis, but also what the non-targeted ad looks like. Legislative limits need to be set on where companies can collect sensitive data. Having our every step on-line, every click, tracked and used for targeted advertising is an inadmissible invasion of privacy.
Greater transparency will prevent the spread of disinformation
When a disinformation campaign is well funded, disinformation spreads successfully. Targeted content also deepens disinformation rabbit holes, where users put on blinkers and read only material that deepens their disillusionment with the world and reinforces their worldview. In addition to transparent targeting, we therefore also need transparency about Internet advertisers and about the processes and rules of targeting.
As we saw with the communication and disinformation around Brexit, misleading targeted messages can also undermine democratic principles. We need to strictly regulate Internet advertising that relies on sensitive personal data and to monitor financial flows, so that it is not possible to simply fund a disinformation campaign. There must be regulation in this regard.
I am working on a change
In December 2020, the European Commission presented its proposal for the Digital Services Act. The legislative relay has now passed to the Council of the European Union and the European Parliament. As shadow rapporteur in the Committee on Culture and Education (CULT), I am actively working to ensure that the Digital Services Act provides rules for the Internet that respect privacy, ensure anonymity, prevent general monitoring, limit the excessive data gathering used for micro-targeted advertising, and ensure secure encrypted communication.