The FDA should regulate Instagram’s algorithm as a drug


The Wall Street Journal on Tuesday reported Silicon Valley's worst-kept secret: Instagram harms teen mental health. Indeed, its effects are so negative that for some users it introduces suicidal thoughts.


Thirty-two percent of teenage girls who feel bad about their bodies report that Instagram makes them feel worse. The WSJ report states that among teens with suicidal thoughts, 13% of British users and 6% of American users trace those thoughts to Instagram. This is Facebook's own internal data. The truth is almost certainly worse.

President Theodore Roosevelt and Congress created the Food and Drug Administration in 1906 precisely because Big Food and Big Pharma had failed to protect the general welfare. As its executives paraded the unattainable 0.01% lifestyles and bodies of the Met Gala that we mere mortals will never achieve, Instagram's reluctance to do the right thing became a clear call for regulation: give the FDA express, codified authority over algorithms. The emphasis should be on Instagram as a drug.


The FDA should consider algorithms a drug affecting our nation's mental health. The Federal Food, Drug and Cosmetic Act, which gives the FDA its authority to regulate drugs, defines drugs in part as "articles (other than food) intended to affect the structure or any function of the body of man or other animals." Instagram's internal data shows that its technology is exactly such an article: one that alters our minds. If this effort fails, Congress and President Joe Biden should create a mental health FDA.

The public needs to understand what the Facebook and Instagram algorithms prioritize. Our government is well equipped to conduct clinical trials of products that may cause physical harm to the public. Researchers can study what Facebook privileges and how those decisions affect our brains. How do we know this is possible? Because Facebook is already doing it – it's just burying the results.


In November 2020, as Cecilia Kang and Sheera Frenkel report in "An Ugly Truth," Facebook made an emergency change to its News Feed, placing more emphasis on "News Ecosystem Quality" (NEQ) scores. High-NEQ sources were trustworthy; low-NEQ sources were less so. Facebook altered the algorithm to privilege content from higher-NEQ sources. As a result, for the five days around the election, users saw a "nicer News Feed" with less fake news and fewer conspiracy theories. But Mark Zuckerberg reversed the change because it reduced engagement and could provoke a conservative backlash. The public bore the consequences.
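Facebook's actual ranking system is proprietary, but the mechanic described above – blending an engagement score with a publisher-reliability score – can be illustrated with a toy sketch. Everything here (the `Post` fields, the `neq_weight` parameter, the scores) is a hypothetical stand-in, not Facebook's real code:

```python
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    engagement: float   # predicted engagement score, 0..1 (hypothetical)
    source_neq: float   # publisher reliability score, 0..1 (hypothetical)

def rank_feed(posts, neq_weight=0.0):
    """Rank posts by a blend of engagement and source reliability.

    With neq_weight=0 the feed is pure engagement ranking; raising
    neq_weight privileges posts from high-NEQ (more reliable) sources,
    roughly the kind of knob the post-election change turned up.
    """
    score = lambda p: p.engagement * (1 - neq_weight) + p.source_neq * neq_weight
    return sorted(posts, key=score, reverse=True)

posts = [
    Post("conspiracy clickbait", engagement=0.9, source_neq=0.1),
    Post("wire-service report", engagement=0.5, source_neq=0.9),
]

# Engagement-only ranking puts the clickbait first;
# weighting NEQ at 0.7 puts the reliable source first.
print([p.text for p in rank_feed(posts)])
print([p.text for p in rank_feed(posts, neq_weight=0.7)])
```

The point of the sketch is how small the dial is: a single weight, set privately by one company, decides which of the two feeds the public sees.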

Facebook has similarly studied what happens when the algorithm privileges "good for the world" content over "bad for the world" content. Lo and behold, engagement goes down. Facebook knows that its algorithms have a remarkable impact on the minds of the American public. How can the government allow one person to set those standards based on his business imperatives rather than the general welfare?

Upton Sinclair memorably exposed dangerous abuses in "The Jungle," causing public outrage. The free market had failed. Consumers needed protection. The Pure Food and Drug Act of 1906 established safety standards for the first time, regulating consumables that affect our physical health. Today, we need to regulate the algorithms that affect our mental health. Teen depression has risen alarmingly since 2007. Similarly, suicides among those aged 10 to 24 increased nearly 60% between 2007 and 2018.

Of course it is impossible to prove that social media is solely responsible for these increases, but it is absurd to argue that it has not contributed. Filter bubbles distort our views and make them more extreme. Online bullying is easy and relentless. Regulators should audit the algorithms and question the choices of companies like Facebook.

When it comes to Facebook's biggest problem – what the product does to us – regulators have struggled to articulate the issue. Section 230 is correct in both intent and application; the Internet cannot function if platforms are liable for every user's utterance. And a private company like Facebook loses the trust of its community if it enforces arbitrary rules that target users based on their background or political beliefs. Facebook as a company has no explicit duty to uphold the First Amendment, but public perception of its fairness is essential to the brand.

And so Zuckerberg has equivocated for years, only belatedly banning Holocaust deniers, Donald Trump, anti-vaccine activists and other bad actors. In deciding which speech is privileged or allowed on its platform, Facebook will always be too slow, too cautious and ineffective. Zuckerberg cares only about engagement and growth. Our hearts and minds hang in the balance.

The most frightening part of "An Ugly Truth," the passage that had everyone in Silicon Valley talking, was the memo that gives the book its name: Andrew "Boz" Bosworth's 2016 "The Ugly."

In the memo, Zuckerberg’s longtime deputy Bosworth writes:

"So we connect more people. That can be bad if they make it negative. Maybe it costs a life by exposing someone to bullies. Maybe someone dies in a terrorist attack coordinated on our tools. And still we connect people. The ugly truth is that we believe in connecting people so deeply that anything that allows us to connect more people more often is de facto good."

Zuckerberg and Sheryl Sandberg made Bosworth walk back his statement when employees objected, but to outsiders the memo represents the ugly truth: Facebook's unvarnished id. Facebook's monopoly, its stranglehold on our social and political fabric, its growth-at-all-costs mantra of "connection," are not de facto good. As Bosworth acknowledged, Facebook causes suicides and allows terrorists to organize. This much power, concentrated in a corporation run by one person, is a threat to our democracy and way of life.

Critics of FDA regulation of social media will call it Big Brother encroaching on our personal liberties. But what is the alternative? Why would it be bad for our government to demand that Facebook account for its internal calculus to the public? Are session counts, time spent and revenue growth the only results that matter? What about the collective mental health of the nation and the world?

Refusing to study a problem does not mean it does not exist. In the absence of action, we are left with a single man deciding what is right. What price do we pay for "connection"? That is not Zuckerberg's call to make. The FDA should decide.
