Child rights groups accuse TikTok of ‘design discrimination’

A study examining the default settings and terms that social media giants TikTok, WhatsApp and Instagram offer to minors in 14 different countries, including the US, Brazil, Indonesia and the UK, has found that the three platforms do not provide the same level of privacy and safety protections for children across all the markets where they operate.

The new report, titled Global Platforms, Partial Protections, found “significant” differences in the experiences of children across countries on “seemingly identical platforms”.

The study was carried out by Fairplay, a non-profit organization that advocates for an end to marketing aimed at children.

TikTok proved particularly problematic in this regard. Alongside the release of the Fairplay report, the company is the subject of a joint letter, signed by nearly 40 child safety and digital rights advocacy groups, calling on it to take a Safety By Design and Children’s Rights By Design approach globally, rather than only applying the highest standards in regions like Europe, where regulators have taken early action to protect children online.

Citing information in the Fairplay report, 39 child advocacy and digital rights organizations from 11 countries, including the UK-based 5Rights Foundation, the Tech Transparency Project, the Africa Digital Rights Hub in Ghana and the Eating Disorders Coalition for Research, Policy & Action, to name a few, signed a letter to TikTok CEO Shou Zi Chew urging him to address the key design differences highlighted in the report.

These include inconsistencies in where TikTok offers an “age-appropriate” design for minors, such as defaulting their accounts to private (as it does in the UK and some EU markets), while elsewhere it was found to default 17-year-old users to public accounts.

The report also identifies many (non-European) markets where TikTok does not provide its terms of service in young people’s native languages. It is also critical of a lack of transparency around minimum age requirements: TikTok sometimes gives users contradictory information, making it difficult for minors to know whether the service is appropriate for them to use.

“Many of TikTok’s young users are not European; TikTok’s biggest markets are in the US, Indonesia and Brazil. All children and young people deserve an age-appropriate experience, not just Europeans,” the report’s authors argue.

Fairplay’s research methodology involved core researchers based in London and Sydney reviewing the platforms’ privacy policies and terms and conditions, supported by a global network of local research organizations whose work included creating experimental accounts to explore the default settings offered to 17-year-olds in different markets.

The researchers suggest their findings cast doubt on the social media giants’ claims that they care about protecting children, as they clearly don’t provide the same standards of safety and privacy for minors everywhere.

Instead, social media platforms appear to be exploiting gaps in the global patchwork of legal protections for minors to prioritize commercial goals, such as increased engagement, at the expense of children’s safety and privacy.

Notably, children in the Global South and some other regions are more exposed to manipulative design than children in Europe, where legal frameworks already exist to protect their online experience, such as the UK’s Age Appropriate Design Code (in force since September 2020), or the European Union’s General Data Protection Regulation (GDPR), which came into effect in May 2018 and requires data processors to take extra care where services process minors’ information, with the risk of large fines for non-compliance.

Asked to summarize the study’s findings in one line, a Fairplay spokesperson told TechCrunch: “In terms of a one-line summary, it’s that regulation works and tech companies don’t act without it.” She also suggested the correct conclusion to draw is that a lack of regulation leaves users more vulnerable to “the vagaries of the platform’s business model.”

In the report, the authors expressly urge lawmakers to implement settings and policies that provide “maximum protection for the well-being and privacy of young people.”

The report’s findings are likely to add to calls for lawmakers outside Europe to step up their efforts to pass legislation protecting children in the digital era, and to avoid the risk that platforms concentrate their most discriminatory and predatory behavior on minors living in markets where the law does not check “datafication” by default.

In recent months, California lawmakers have sought to pass a UK-style age-appropriate design code. And earlier this year, a number of US senators proposed the Kids Online Safety Act as the issue of children’s online safety gained attention, though passing federal privacy legislation of any stripe in the US remains a major challenge.

In an accompanying statement, Rys Farthing, report author and researcher at Fairplay, said: “It is disturbing to think that these companies are picking and choosing which young people receive the best possible safety and privacy protections. It’s reasonable to expect that once a company has figured out how to make its products a little better for kids, it would extend that to all young people. But social media companies are failing us again and continue to design unnecessary risks into their platforms. Legislators need to step in and pass rules that require digital service providers to design their products in a way that works for young people.”

“Many jurisdictions around the world are exploring this kind of regulation,” she also noted in comments accompanying the report’s publication. “In California, the age-appropriate design code that is before the State Assembly could ensure that some of these risks are removed for young people. Otherwise, you can expect social media companies to keep offering them second-rate privacy and safety.”

Asked why Meta, which owns Instagram and WhatsApp, is not also being singled out by the advocacy groups, a Fairplay spokeswoman said its researchers found TikTok to be “by far the worst platform”, so the groups felt “the greatest urgency” to focus their advocacy on it. (Though the report itself also discusses issues with the two Meta-owned platforms.)

“TikTok has over a billion active users, and various global estimates suggest that between a quarter and a third of them are minors. The safety and privacy decisions your company makes have the potential to affect 250 million young people worldwide, and these decisions must ensure that the best interests of children and young people are upheld, and upheld equally,” the advocacy groups write in the letter.

“We encourage you to adopt a Safety By Design and Children’s Rights By Design approach and immediately undertake a global risk assessment of your products to identify and mitigate privacy and safety risks on your platform. Where a local practice or policy is found to provide maximum safety or privacy for children, TikTok should adopt it globally. All of TikTok’s younger users deserve the strongest possible protections and maximum privacy, not just children from European jurisdictions where regulators have taken early action.”

While European lawmakers may have reason to feel a little complacent in light of the relatively higher standards of protection that Fairplay’s researchers found are offered to children in the region, the key word here is relative: even in Europe, a region considered the de facto global leader in data protection standards, TikTok has in recent years faced a series of complaints over child safety and privacy, including class-action litigation and regulatory investigations into how it handles children’s data.

Criticism of TikTok’s approach to child safety in the region persists, especially over its extensive profiling and targeting of users, and many of the aforementioned lawsuits and investigations remain ongoing and unresolved even as fresh concerns emerge.

Just this week, for example, Italy’s data protection agency sounded the alarm over a planned change to TikTok’s privacy policy, which it suggested does not comply with existing EU privacy laws, and issued a formal warning. It urged the platform not to proceed with the switch, warning of troubling consequences for minors on the service, who could be shown unsuitable “personalized” ads.

Back in 2021, Italian authorities also intervened over child safety concerns linked to a TikTok challenge, ordering the company to block users whose age it could not verify. TikTok went on to delete over half a million accounts in the country that it could not confirm belonged to users aged 13 or over.


Credit: techcrunch.com
