The FDA should regulate Instagram’s algorithm as a drug

On Tuesday, The Wall Street Journal reported Silicon Valley’s worst-kept secret: Instagram harms teens’ mental health. In fact, its effect is so negative that it introduces suicidal thoughts.

Thirty-two percent of teenage girls who feel bad about their bodies report that Instagram makes them feel worse. Among teens with suicidal thoughts, the WSJ report states, 13% of British users and 6% of American users trace those thoughts to Instagram. This is Facebook’s own internal data. The truth is surely worse.

President Theodore Roosevelt and Congress formed the Food and Drug Administration in 1906 precisely because Big Food and Big Pharma failed to protect the general welfare. As its executives parade at the Met Gala in a celebration of the unattainable 0.01% lifestyle and bodies that we mere mortals will never achieve, Instagram’s unwillingness to do what is right is a clear call for regulation: the FDA should assert its codified authority to regulate the algorithm that powers the drug of Instagram.

The FDA should consider algorithms a drug affecting our nation’s mental health: the federal Food, Drug, and Cosmetic Act empowers the FDA to regulate drugs, defining them in part as “articles (other than food) intended to affect the structure or any function of the body of man or other animals.” Instagram’s internal data show that its technology is exactly such an article, one that alters our brains. If this effort fails, Congress and President Joe Biden should create a mental health FDA.

The public needs to understand what the algorithms of Facebook and Instagram prioritize. Our government is well equipped to run clinical trials of products that may cause physical harm to the public. Researchers can study what Facebook privileges and how those decisions affect our minds. How do we know this is possible? Because Facebook is already doing it — and burying the results.

In November 2020, Facebook made an emergency change to its News Feed, placing more emphasis on its “News Ecosystem Quality” (NEQ) score, as reported by Cecilia Kang and Sheera Frenkel in “An Ugly Truth.” High-NEQ sources were trustworthy; low-NEQ sources were not. Facebook changed the algorithm to privilege high-NEQ scores. As a result, for the five days around the election, users saw a calmer news feed with less fake news and fewer conspiracy theories. But Mark Zuckerberg reversed the change because it reduced engagement and could provoke a conservative backlash. The public bore the brunt of it.

Facebook has similarly studied what happens when the algorithm privileges content that is “good for the world” over content that is “bad for the world.” Lo and behold, engagement goes down. Facebook knows that its algorithms have a remarkable impact on the minds of the American public. How can the government let one man set the standard based on his business imperatives rather than the general welfare?

Upton Sinclair memorably exposed dangerous abuses in “The Jungle,” which provoked public outrage. The free market had failed. Consumers needed protection. The 1906 Pure Food and Drug Act promulgated safety standards for the first time, regulating consumables that affect our physical health. Today, we need to regulate the algorithms that affect our mental health. Teen depression has risen alarmingly since 2007. Similarly, suicides among those aged 10 to 24 rose nearly 60% between 2007 and 2018.

Of course it is impossible to prove that social media is solely responsible for this increase, but it is absurd to argue that it has not contributed. Filter bubbles distort our views and make them more extreme. Online bullying is easy and constant. Regulators should audit the algorithm and hold the likes of Facebook to account.

When it comes to the biggest problem Facebook poses — what the product does to us — regulators have struggled to articulate the problem. Section 230 is correct in its intent and application; the internet cannot function if platforms are liable for every user utterance. And a private company like Facebook loses the trust of its community if it enforces arbitrary rules that target users based on their background or political beliefs. Facebook as a company has no explicit duty to uphold the First Amendment, but a public perception of its fairness is essential to the brand.

And so, in recent years, Zuckerberg has banned Holocaust deniers, Donald Trump, anti-vaccine activists and other bad actors — but only after years of equivocation. In deciding which speech is privileged or allowed on its platform, Facebook will always be too slow and too cautious to react effectively. Zuckerberg cares only about engagement and growth. Our hearts and minds hang in the balance.

The most frightening part of “An Ugly Truth,” the passage that got everyone in Silicon Valley talking, was the eponymous memo: Andrew “Boz” Bosworth’s 2016 “The Ugly.”

In the memo, Bosworth, Zuckerberg’s longtime deputy, writes:

So we connect more people. That can be bad if they make it negative. Maybe it costs a life by exposing someone to bullies. Maybe someone dies in a terrorist attack coordinated on our tools. And still we connect people. The ugly truth is that we believe in connecting people so deeply that anything that allows us to connect more people more often is de facto good.

Zuckerberg and Sheryl Sandberg made Bosworth walk back his statements when employees objected, but to outsiders the memo represents the ugly truth: Facebook’s unvarnished id. Facebook’s monopoly, its stranglehold on our social and political fabric, its growth-at-all-costs mantra of “connection,” is not de facto good. As Bosworth acknowledges, Facebook causes suicides and allows terrorists to organize. That much power, concentrated in a corporation run by one man, is a threat to our democracy and way of life.

Critics of FDA regulation of social media will claim it is a Big Brother intrusion on our personal liberties. But what is the alternative? Why would it be bad for our government to demand that Facebook account for its internal calculations to the public? Are sessions, time spent and revenue growth the only outcomes that matter? What about the collective mental health of the nation and the world?

Refusing to study the problem does not mean it does not exist. In the absence of action, we are left with a single man deciding what is right. What price do we pay for “connection”? That should not be up to Zuckerberg. The FDA should decide.
