The European Union is stepping up regulation of social media platforms, focusing on cracking down on "addictive design" features that target minors in apps such as TikTok and Instagram. European Commission President Ursula von der Leyen said at the "Artificial Intelligence and Children" European Summit in Denmark on Tuesday that the EU will take action against these platforms later this year and push for the implementation of relevant regulations.

Von der Leyen singled out TikTok for EU measures against its "addictive design," which includes features such as endless scrolling, autoplay, and continuous push notifications. She said the same problems exist on Meta-owned Instagram and Facebook, platforms considered to have failed to effectively enforce their own minimum age requirement of 13. She stressed that the EU is investigating platforms that send children down a "rabbit hole of harmful content," such as videos promoting self-harm or eating disorders like anorexia.

To strengthen the protection of minors online, the European Commission has also developed its own age verification application, which von der Leyen said has the "highest privacy standards in the world." Member states will be able to integrate the application into their respective digital wallets, and online platforms will be able to access and run the age verification process easily. "There are no excuses anymore - the age verification technology is here," she said. The European Commission could have a legal proposal ready as soon as this summer and is currently awaiting submissions from its Special Expert Group on Children's Online Safety.

This round of tightened child-safety regulation of social platforms comes against the backdrop of the European Union's broader regulatory crackdown on U.S. technology giants. Over the past year, the EU has issued fines totaling more than $7 billion to a number of large technology companies under its antitrust and competition laws, drawing dissatisfaction from the United States; some U.S. officials have warned that the EU may miss economic opportunities in artificial intelligence as a result. U.S. President Donald Trump is seeking to counter these penalties against U.S. companies. Companies such as Apple, Meta and Google have all faced hefty fines for allegedly violating EU rules and have lodged objections.

In February this year, Trump signed a presidential memorandum proposing to consider tariffs and other measures to "fight back against digital service taxes, fines, and related unfair policies imposed by foreign governments against U.S. companies." Meanwhile, the European Union launched an investigation earlier this year into Elon Musk's X (formerly Twitter) over allegations that the platform's chatbot Grok generated and distributed non-consensual pornographic images of women and children.

Legal scrutiny of minors' safety on social media is intensifying worldwide. In March, a high-profile U.S. court ruling found that certain design features of Meta and YouTube, including infinite scrolling and autoplay, encouraged addictive behavior in teenagers and harmed their mental health. Recently, a preliminary European Commission investigation also found that Meta failed to effectively prevent minors under the age of 13 from accessing its platform: minors could easily bypass the existing detection mechanisms, and Meta was therefore found to have violated the relevant requirements of the EU's Digital Services Act.

In addition to EU-level initiatives, some countries are going further. In December, Australia became the first country to implement a "total ban" on social media for people under the age of 16, barring minors from mainstream social platforms. European countries such as Spain, France and the United Kingdom are also pushing for similar legislation, seeking to keep children and teenagers away from social media as far as possible by raising age thresholds, mandating identity verification, or directly restricting access. Under this mounting regulatory pressure, platforms such as TikTok and Instagram may have to make deep adjustments to their core product designs to comply with the higher standards for children's online protection set by the European Union and other countries.