ER Editor: A reminder that the shoe is expected to drop today, August 25. For the article below we issue a Severe MSM Warning, level 5.
Oblivious to the chilling effect of this law, the article does give us an indication that such censorship will be used to interfere in national elections of EU member states.
See this piece we published back in early July —
Government censorship of public online discourse in the West’s ostensibly liberal democracies has been largely covert until now, as revealed by the Twitter Files. But thanks to the EU’s Digital Services Act, it is about to become overt.
Next month, a little-known development will occur that could end up having huge repercussions for the nature of public discourse on the Internet all over the planet. August 25, 2023 is the date by which big social media platforms will have to begin fully complying with the European Union’s Digital Services Act, or DSA. The DSA, among many other things, obliges all “Very Large Online Platforms”, or VLOPs, to speedily remove illegal content, hate speech and so-called disinformation from their platforms. If not, they risk fines of up to 6% of their annual global revenue.
With the GDPR, Europe rewrote the rules of online privacy. Now, it’s turning its attention to making the internet safer.
After setting far-reaching benchmarks for privacy standards, the EU is ready to usher in rules that could reshape the web, forcing Big Tech companies like Facebook, YouTube and Instagram to curb the spread of toxic content on their platforms and increase their operations’ transparency.
As soon as Friday, 19 of the world’s largest social media companies, e-commerce platforms and search engines will be required to comply with the Digital Services Act (DSA) — or else face sweeping fines of up to 6 percent of their global annual revenue.
Daunting though the new rulebook may be, it will only be as powerful as its enforcement on Big Tech firms, which is largely in the hands of the European Commission. In the opposite corner stand the tech giants themselves and the weight of expectations — and impatience — across Europe.
“Europe is now effectively the first jurisdiction in the world where online platforms no longer benefit from a ‘free pass’ and set their own rules,” said Internal Market Commissioner Thierry Breton, who spearheads the Commission directorate that will enforce the DSA.
“Technology has been ‘stress testing’ our society. It was time to turn the tables and ensure that no online platform behaves as if it was ‘too big to care.’”
Reshaping users’ online experience
Some of the requirements companies face include swiftly removing illegal content; stopping the use of people’s sensitive data, like their health information and sexual orientation, to show them personalized ads; and revealing previously secret information about how they operate. Companies will have to tell users if they remove their content, limit its visibility, or stop its monetization — and explain why.
Social media networks like Instagram and Facebook have already announced that European users will be able to tailor their feeds to see posts shared by accounts they follow, or in chronological order. TikTok said its users could choose to be shown videos based on their location, or on worldwide popularity, instead of based on the company’s own algorithm. Other companies like Snapchat announced they would make it impossible for advertisers to use teenagers’ data to show them personalized ads.
Companies will also have to identify, and implement concrete measures to counter, major long-term risks that their platforms pose for society, such as disinformation and negative effects on mental health, all under the scrutiny of the Commission, auditors and vetted researchers.
“My expectation is that throughout the DSA enforcement saga, we will see a change in the business structures of platforms,” said Renate Nikolay, the deputy director of the directorate supervising the law, in a late-June public discussion with activists, consumer representatives, lawyers, tech executives and academics.
The Commission plays cop
It’s one thing to come up with an ambitious rulebook. It’s another to successfully enforce it.
The content-moderation law has serious potential to bite. The law provides for stronger fines than its GDPR sister rulebook — 6 percent of companies’ annual revenue, compared with 4 percent. Led by the team that wrote the law — and knows it inside and out — the Commission will have broad enforcement powers, similar to antitrust investigators’, to oversee and ensure the compliance of the biggest tech firms. It will also receive extra yearly funding — an estimated €45 million for 2024 — funded through an annual levy from the Big Tech firms themselves.
The teams in Brussels will be backed by dozens of artificial intelligence and computer scientists at the Commission’s European Centre for Algorithmic Transparency (ECAT). And the Commission will also cooperate with national EU digital regulators, including in Ireland, where most of the affected tech firms have their EU headquarters.
“My services and I will thoroughly enforce the DSA, and fully use our new powers to investigate and sanction platforms where warranted,” Breton said in comments shared with journalists.
Priorities will include checking whether the designated companies are doing enough to protect children online and to fight disinformation campaigns, especially ahead of crucial national elections in Slovakia and Poland next year, as well as those for the European Parliament in June 2024, Breton added.