Following the suggestion of whistleblower Frances Haugen
Democratic lawmakers want the social network to face legal liability for recommending harmful content to users. Reps. Anna Eshoo (D-CA), Frank Pallone Jr. (D-NJ), Mike Doyle (D-PA), and Jan Schakowsky (D-IL) introduced the “Justice Against Malicious Algorithms Act,” which would amend Section 230 to exclude its protections from covering “personalized recommendations” of content that contributed to physical or severe emotional injury.
The bill follows a recommendation that Facebook whistleblower Frances Haugen made in testimony before Congress last week. Haugen, a former employee who leaked extensive internal Facebook research, encouraged lawmakers to crack down on algorithms that promote, rank, or otherwise order content based on user engagement. The bill applies to web services with more than 5 million monthly visitors and excludes certain categories of services, including infrastructure such as web hosting and systems that return search results.
For covered platforms, the bill targets Section 230 of the Communications Decency Act, which prevents people from suing web services over third-party content posted by users. The new exception would let these suits proceed if the service knowingly or negligently used a “personalized algorithm” to recommend the third-party content. That could include posts, groups, accounts, and other user-supplied information.
The bill would not necessarily let people sue over the kinds of material Haugen criticized, including hate speech and content promoting anorexia. Much of that content is legal in the United States, so platforms don’t need an additional liability shield to host it. (A Pallone statement criticized sites for promoting “extremism” and “propaganda,” which are not necessarily illegal either.) The bill covers only personalized recommendations, defined as sorting content with an algorithm that “relies on information specific to an individual.” Companies could still broadly use analytics to recommend generally popular content.
In her testimony, Haugen suggested that the goal was to add enough general legal risk that Facebook and similar companies would stop using personalized recommendations altogether. “If we reformed [Section] 230 to make Facebook responsible for the consequences of their intentional ranking decisions, I think they would get rid of engagement-based ranking,” she said.