What sounds like a reduction in bureaucracy could turn into a data protection disaster for consumers.

The EU Commission is planning a major overhaul of European digital policy with its "Digital Omnibus" initiative. The goal: less paperwork, more innovation, clearer rules. That sounds good, but what is currently on the table is alarming consumer advocates and lawyers.

The reason? An inconspicuous provision in the draft, Article 88c, that cuts deep into the General Data Protection Regulation (GDPR) – and does so in favor of corporations that process large amounts of data. In plain language: anyone who classifies their data processing as an "AI application" in the future would be exempt from many of the existing data protection obligations.

What sounds like technological progress could in fact amount to a massive watering down of European data protection, with consequences for hundreds of millions of people.


AI as an excuse? When data protection suddenly no longer applies

According to a recent report by the law firm Spirit Legal, commissioned by the Federation of German Consumer Organizations, the new regulation opens up dangerous loopholes.

The trick: the term "AI system" is defined so vaguely and broadly in the proposal that virtually any automated data processing could fall under it. Companies could then simply claim that their systems are based on AI – and thus elegantly sidestep parts of the GDPR.

Particularly controversial: even sensitive data such as health information or political opinions could be processed far more easily – the more data, the better, by the logic of the draft. This would turn the principle of data minimization, which has characterized European data protection rules to date, completely on its head.


Losing rights without realizing it

According to the expert opinion, many protective mechanisms that are still firmly anchored in the GDPR today would be shifted into legally non-binding passages of text – rendering them toothless in case of doubt.

An example: the option to object to the use of one's own data is to be provided for technically in the future – but without any legally enforceable obligation behind it. Particularly in the case of so-called web scraping, i.e., the automated collection of publicly available data, those affected often have no idea that their information is being processed at all – let alone how they could prevent it.

In addition, there are currently few rules on how to prevent personal data from resurfacing in trained AI models – for example, in the responses that users later receive. Here, too, experts are calling for clear standards – so far in vain.


Youth data, parental responsibility—and a risky breach of trust

The authors are particularly critical of the handling of data relating to minors. Since many young people are unable to assess how their data will be used in the long term, the experts are calling for parental consent to be made mandatory. Furthermore, anyone who reaches the age of majority should automatically be given the option to object to the further use of their data without having to provide a reason.

For the Federation of German Consumer Organizations (vzbv), one thing is certain: the current draft primarily favors American platforms. They could exploit the legal gray areas created by unclear wording, while European companies and users would be left with weaker protection.

In addition, a survey commissioned by the vzbv shows that 87 percent of people in Germany only trust digital services if their data is secure. For them, the GDPR is the most important anchor of trust. If this is weakened, the EU risks not only the protection but also the social acceptance of new technologies.


What lies ahead for citizens?

Put simply: the EU is planning a two-tier data protection system. Those who work with AI would receive special privileges, while those who want to protect their data would be left out in the cold – all in the name of innovation. But progress without safeguards is not progress; it is a free pass for the wrong actors. Those who make data protection this porous should not be surprised when trust erodes. The winners here are not citizens, but those who already know far too much about us.


Source: heise.de
