Rather than a CEO-slamming soundbite free-for-all, Tuesday’s big tech hearing on algorithms aimed for more of a listening session, and in that sense it was mostly successful.
The policy-focused hearing took testimony from policy leads at Facebook, YouTube and Twitter rather than the CEOs of those companies. No large-scale revelations emerged over the few hours, but the session was still arguably more productive than squeezing some of the world’s most powerful people for promises to “get back to you on that.”
At the hearing, lawmakers weighed concerns that social media echo chambers, and the algorithms that pump content through platforms, are able to fundamentally change human behavior.
“… This advanced technology is harnessed into algorithms designed to attract our time and attention on social media, and the results can be harmful to our kids’ attention spans, to the quality of our public discourse, to our public health, and even to our democracy itself,” said Chris Coons (D-DE), chair of the Senate Judiciary Subcommittee on Privacy, Technology and the Law, which held the hearing.
Coons struck a cooperative note, observing that algorithms drive innovation but that their dark side comes with considerable costs.
None of this is new, of course. But Congress is inching closer to solutions, one repetitive tech hearing at a time. Tuesday’s hearing highlighted some areas of bipartisan agreement that could determine the prospects of a tech reform bill passing the Democratic-controlled Senate. Coons expressed optimism that a “broadly bipartisan solution” could be reached.
What might that look like? Likely a carve-out to Section 230 of the Communications Decency Act, which we’ve written about extensively over the past few years. The law shields social media companies from liability for user-created content, and it has been a major focus of tech regulation efforts in both the Democratic Senate under Biden and the previous Republican-led Senate, which took its cues from Trump.
A Broken Business Model
At the hearing, lawmakers pointed to the inherent flaws in how major social media companies make money as the heart of the problem. Rather than criticizing companies for specific failures, they mostly focused on the core business model from which many of social media’s ills spring.
“I think it’s very important for us to push back on the idea that there are actually simple quantitative solutions to complex, qualitative problems,” said Sen. Ben Sasse (R-NE). He argued that because social media companies make money by keeping users hooked on their products, any real solution would have to upend that business model altogether.
“The business model of these companies is addiction,” said Josh Hawley (R-MO), calling social media an “attention treadmill” by design.
Ex-Googler and frequent tech critic Tristan Harris didn’t mince words in his own testimony about how tech companies talk around that central design tenet. “It’s a little like listening to a hostage in a hostage video,” Harris said, likening the engagement-seeking business model to a gun sitting just out of frame.
Spotlight on Section 230
One big way lawmakers have proposed to disrupt those deeply entrenched incentives? Adding algorithm-focused exceptions to the Section 230 protections that social media companies enjoy. A few bills floating around take that approach.
A bill from Sen. John Kennedy (R-LA) and Reps. Paul Gosar (R-AZ) and Tulsi Gabbard (D-HI) would require platforms with 10 million or more users to obtain consent before serving users content based on their behavioral or demographic data if they want to keep their Section 230 protections. The idea is to revoke 230 immunity from platforms that boost engagement by amplifying content that polarizes users’ views, unless users specifically opt in.
Another bill, the Protecting Americans from Dangerous Algorithms Act from Reps. Anna Eshoo (D-CA) and Tom Malinowski (D-NJ), proposes suspending Section 230 protections and holding companies liable “if their algorithms amplify misinformation that leads to offline violence.” The bill would amend Section 230 with respect to existing civil rights laws.
Defenders of Section 230 argue that targeted changes to the law could disrupt the modern internet as we know it, with negative effects reaching far beyond the intended scope of reform efforts. An outright repeal of the law is almost certainly off the table, but even small tweaks could fundamentally reshape internet businesses, for better or worse.
During the hearing, Hawley floated a sweeping suggestion aimed at companies that use algorithms to chase profits. “Why shouldn’t we just remove Section 230 protections from any platform that engages in behavioral advertising or algorithmic amplification?” he asked, adding that he wouldn’t oppose an outright repeal of the law either.
Sen. Klobuchar, who chairs the Senate’s antitrust subcommittee, connected the algorithm concerns to anti-competitive behavior in the tech industry. “If you have a company that buys out everyone from under them … we’ll never know if they could have developed the bells and whistles to help us with misinformation, because there’s no competition,” Klobuchar said.
Subcommittee members Klobuchar and Sen. Mazie Hirono (D-HI) have their own major Section 230 reform bill, the SAFE TECH Act, but that legislation is less concerned with algorithms than with ads and paid content.
At least one more major bill looking at Section 230 through the lens of algorithms is still on the way. Leading big tech critic House Rep. David Cicilline (D-RI) is expected to soon introduce a Section 230 bill that would strip liability protections from companies that use algorithms to boost engagement and line their pockets.
“That’s a very complicated algorithm that is designed to maximize engagement to drive up advertising prices to produce greater profits for the company,” Cicilline told Axios last month. “… That’s a set of business decisions for which, it might be quite easy to argue, a company should be held accountable.”