As COVID-19 spread rapidly across the world in 2020, people everywhere were hungry for reliable information. A global network of volunteers rose to the challenge, integrating information from scientists, journalists and medical professionals, and making it accessible to the general public.
Two of them live about 3,200 km apart: Dr. Alaa Najjar is a Wikipedia volunteer and medical doctor who spends his leisure time between emergency room shifts addressing COVID-19 misinformation on the Arabic version of the site. In Sweden, Dr. Netha Hussain, a clinical neuroscientist and doctor, spent her downtime editing COVID-19 articles in English and Malayalam (a language of southwest India), focusing her efforts on improving Wikipedia articles about COVID-19 vaccines.
Thanks to Najjar, Hussain and more than 280,000 other volunteers, Wikipedia has emerged as one of the most trusted sources of up-to-date, comprehensive information about COVID-19, spread over 7,000 articles in 188 languages. Wikipedia’s reach and ability to support knowledge-sharing on a global scale – from informing the public about a major disease to helping students study for exams – is only possible because of laws that enable its collaborative, volunteer-led model to flourish.
As the European Parliament considers new rules aimed at holding Big Tech platforms accountable for illegal content amplified on their websites and apps through packages such as the Digital Services Act (DSA), it should protect citizens’ ability to collaborate in service of the public interest.
Lawmakers are right to attempt to prevent the dissemination of material that causes physical or psychological harm, including material that is illegal in many jurisdictions. As they consider a range of provisions for the comprehensive DSA, we welcome some of the proposed elements, including requirements for greater transparency in how platforms’ content moderation works.
But the current draft also prescribes how platforms must enforce their terms of service. At first glance, these measures may seem necessary to curb the growing power of social media, stop the spread of illegal content, and ensure the safety of online spaces. But what happens to projects like Wikipedia? Some of the proposed requirements would shift power away from people to platform operators, harming digital platforms that operate differently from the large commercial platforms.
Big Tech platforms operate in fundamentally different ways than non-profit, collaborative websites like Wikipedia. All articles created by Wikipedia volunteers are available for free, without ads and without tracking our readers’ browsing habits. The incentive structures of commercial platforms maximize profits and time on site, using algorithms that exploit detailed user profiles to target people with the content most likely to keep them engaged. They deploy further algorithms to moderate content automatically, resulting in errors of both over- and under-enforcement. For example, computer programs often confuse artwork and satire with illegal content, while lacking the understanding of human nuance and context necessary to enforce a platform’s actual rules.
The Wikimedia Foundation and affiliates located in specific countries, such as Wikimedia Germany, support Wikipedia volunteers and their autonomy to decide what information should and should not be on Wikipedia. The online encyclopedia’s open editing model is premised on the belief that people themselves, applying established, volunteer-developed rules on neutrality and reliable sources, should decide what information belongs on Wikipedia.
This model ensures that for any Wikipedia article, people who know and care about its topic enforce the rules about what content is allowed on its page. What’s more, our content moderation is transparent and accountable: all conversations between editors on the platform are publicly accessible. It’s not a perfect system, but it has largely worked to make Wikipedia a global source of neutral and verified information.
Forcing Wikipedia to operate like a commercial platform, with a top-down power structure that excludes our communities from critical decisions about content and leaves us less accountable to our readers and editors, would surely defeat the DSA’s genuine public-interest intentions.
The Internet is at an inflection point. Democracy and civic space are under attack in Europe and around the world. Now, more than ever, we all need to think carefully about how new rules will promote, not hinder, an online environment that allows for new forms of culture, science, participation and knowledge.
Lawmakers can engage with public interest communities like ours to develop standards and principles that are more inclusive, more enforceable, and more effective. But they should not impose on everyone rules designed solely for the most powerful commercial Internet platforms.
We all deserve a better, safer internet. We call on lawmakers to work with partners across all fields, including Wikimedia, to create rules that empower citizens to improve it.