Digital fairness – protecting consumers online

The European Commission plans to propose a Digital Fairness Act in 2026 and to improve enforcement of existing rules

The newly adopted 2030 Consumer Agenda outlines the EU’s strategic direction for consumer policy over the next five years, placing a strong emphasis on promoting digital fairness and strengthening consumer protection online.

The rapid growth of e-commerce has brought many benefits for consumers, introducing new products and services and changing the way people shop, enjoy entertainment, access information and interact with businesses. At the same time, these changes make it essential that EU consumer law continues to provide strong protection for consumers online.

In 2022, the European Commission carried out a Digital Fairness Fitness Check to evaluate the challenges consumers face in digital markets and to assess whether existing EU consumer laws offer sufficient protection or whether specific updates are needed. The Fitness Check found gaps and shortcomings in online consumer protection that need to be addressed to better protect consumers, give businesses clearer rules, avoid inconsistent laws and make enforcement easier.

In view of this, the commission not only aims to improve enforcement of existing rules but also plans to propose a Digital Fairness Act in 2026. The new law will give consumers stronger protection online against practices such as manipulative ‘dark patterns’ that trick users into unwanted subscriptions, addictive features in apps or video games, misleading promotions by social media influencers, unfair personalisation that takes advantage of consumer weaknesses, and problematic features in digital products such as social media, video games and e-commerce. The proposed act also aims to simplify rules for businesses, for example by clarifying what information they must provide in online contracts or when consumers make repeated purchases from the same seller.

The 2030 Consumer Agenda will also focus on improving online protection for minors. Young consumers are especially vulnerable, as they often have unique consumption patterns and are among the first to use new technologies and digital products. While the Digital Fairness Act will enhance online protection for minors, the European Commission will also carry out an EU-wide inquiry to examine how social media affects the well-being of young people.

“The EU remains strongly committed to protecting consumers in today’s fast-changing digital world”

Protecting consumers against online fraud will also be a priority over the next five years. Online fraud is one of the fastest-growing crimes on the internet and causes serious financial harm to consumers. To address this, the commission has proposed updating the Payment Services Directive by expanding fraud prevention rules for payment service providers and strengthening the refund rights of consumers who fall victim to fraud.

The commission will also publish an action plan aimed at improving prevention, making law enforcement more effective and helping fraud victims recover their money.

It will also continue to build on the Digital Services Act to combat online fraud and financial scams, ensuring that platforms and search engines protect consumers from deceptive practices.

Another important challenge is the rapid growth of AI in consumer markets. While AI can bring benefits, such as faster services, personalised advice and cost savings, it also carries risks, including reduced human interaction, errors, bias and system failures.

The EU’s Artificial Intelligence Act provides a strong framework to ensure AI systems are trustworthy and respect people’s fundamental rights. In its 2030 Agenda, the commission commits to working closely with member states to ensure that the AI Act, along with consumer protection and product safety rules, is applied consistently.

It also commits to reviewing the list of banned AI practices on a yearly basis and updating it as needed. It is essential that consumer protection keeps pace with AI developments and that consumers receive clear information when interacting with AI, so they understand the nature of the interaction and the potential risks.

In conclusion, the EU remains strongly committed to protecting consumers in today’s fast-changing digital world. Through initiatives such as the upcoming Digital Fairness Act, updates to the Payment Services Directive, the Digital Services Act and the Artificial Intelligence Act, the commission aims to address emerging online risks, safeguard vulnerable groups like minors and provide clearer rules for businesses.

By keeping consumer protection aligned with technological developments, digital markets can remain fair, safe and trustworthy for all.

Odette Vella is director, Information and Research Directorate, MCCAA.
