Topic: Google warns Supreme Court against ‘gutting’ controversial tech provision

legendary
Activity: 4410
Merit: 4788
the thing is, if google had better algos they would not need to see government wanting to force google's hand on moderation

take the youtube crap where everything bitcoin-related has scammers commenting to all viewers with whatsapp numbers.
it's friggin easy for google to make an algo to instant-ban anyone making accounts with funky fonts that are mostly numeric

even things like the fake tesla accounts that show an old livestream about crypto but then have chat bots promoting scams. those can easily be banned by google.
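
a crude version of that kind of check is easy to sketch. below is a purely hypothetical python illustration (the function name and thresholds are made up, not anything youtube/google actually runs): it flags handles that are mostly digits (whatsapp-number bait) or that lean on the styled unicode letters scammers use to dodge plain keyword filters.

Code:
import unicodedata

# hypothetical heuristic, just to illustrate the idea above: flag handles
# that are mostly digits (phone-number bait) or that lean on "styled"
# unicode letters (mathematical bold/script look-alikes used to dodge
# plain keyword filters). thresholds are made-up placeholders.

STYLED_HINTS = ("MATHEMATICAL", "FULLWIDTH", "CIRCLED", "SQUARED")

def looks_like_scam_handle(name, digit_ratio=0.5, styled_ratio=0.3):
    chars = [c for c in name if not c.isspace()]
    if not chars:
        return False

    # share of characters that are plain digits
    digits = sum(c.isdigit() for c in chars)

    # share of characters whose unicode name marks them as "styled" letters
    styled = sum(
        any(hint in unicodedata.name(c, "") for hint in STYLED_HINTS)
        for c in chars
    )

    return (digits / len(chars) >= digit_ratio
            or styled / len(chars) >= styled_ratio)

if __name__ == "__main__":
    samples = [
        "Alice",                      # ordinary handle -> keep
        "+1 415 555 0123 whatsapp",   # mostly digits -> flag
        "𝐓𝐞𝐬𝐥𝐚 𝐒𝐮𝐩𝐩𝐨𝐫𝐭",              # styled unicode letters -> flag
    ]
    for s in samples:
        print(s, "->", "flag" if looks_like_scam_handle(s) else "keep")

obviously a real filter would look at a lot more signals, but the point is that this level of detection is cheap.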

by moderating scams themselves, they then won't need to have laws changed to force them to moderate the scams that can be moderated

it does not need to involve hiring "overwatch" personnel. it simply involves better detection algos
legendary
Activity: 2562
Merit: 1441
Quote
Google argued that if the Supreme Court rules to scale back a liability shield for internet companies, the decision could lead to more censorship and hate speech online, according to a brief filed Thursday.

The filing showcases Google’s argument in a case facing the high court that centers around Section 230 of the Communications Decency Act, a controversial provision that protects companies from being sued over content posted by third parties.

“Gutting Section 230 … would upend the internet and perversely encourage both wide-ranging suppression of speech and the proliferation of more offensive speech,” the filing states.

Sites with resources to take down objectionable content could “become beholden to heckler’s vetoes, removing anything anyone found objectionable,” while other sites could take “the see-no-evil approach” and disable filtering to “avoid any inference of constructive knowledge of third-party content,” the company argued.

The case is based on allegations against Google raised by the family of Nohemi Gonzalez, a 23-year-old U.S. citizen killed in a 2015 Islamic State terror attack in France. Gonzalez’s family alleges Google-owned video-sharing site YouTube provided a platform for terrorist content and recommended content inciting violence and recruiting potential Islamic State supporters through YouTube’s recommendation algorithm.

At the crux of the case is a question of whether Section 230 protects Google against the allegations.

Oral arguments before the Supreme Court are scheduled for Feb. 21.

Lawmakers on both sides of the aisle have been pushing for reforms to Section 230, but for different reasons, meaning there is likely to be little consensus by way of policy reform.

President Biden doubled down on his calls to reform Section 230 in an op-ed published in The Wall Street Journal on Wednesday.

Democrats argue the provision leads to more hate speech and misinformation online, since it protects tech platforms from being legally responsible for such content. Meanwhile, Republicans argue it allows platforms to censor content with anti-conservative biases.

The Justice Department filed a brief in the case last month warning the Supreme Court against an “overly broad” interpretation of Section 230. The department argued that the provision protects YouTube from liability for hosting or “failing to remove” ISIS-related content, but not from claims based on YouTube’s “own conduct in designing and implementing its targeted-recommendation algorithms.”



https://news.yahoo.com/google-warns-supreme-court-against-214357642.html


....


My take on this: if parents can't be expected to watch their kids 24/7 and screen all of the content and activities they come into contact with, corporations with an internet presence also shouldn't be expected to scrutinize every tiny minutia of activity on their platforms. While corporations could conceivably hire enormous amounts of manpower to watch over everything that happens on their servers, it may not even be feasible in terms of cost effectiveness.

There is a potential option in the form of markets like Amazon Mechanical Turk, which pay out small fees to random internet workers for carrying out specific tasks:

Amazon turk:  https://www.mturk.com/worker

It is also known that internet tasks are often outsourced to nations with lower labor costs.

Perhaps these models could be applied to allow platforms like pornhub to do a better job of screening their content.
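
As a rough sketch of how that kind of outsourcing could be wired up, here is a hypothetical example using the boto3 MTurk client. The sandbox endpoint, reward amount, review form and content URL below are placeholders for illustration, not a real moderation pipeline.

Code:
import boto3

# Hypothetical sketch: a platform posting a single content-review task
# (a "HIT") to Amazon Mechanical Turk. A real pipeline would also need
# worker qualifications, result aggregation across reviewers, and appeals.

# Sandbox endpoint so the sketch doesn't spend real money.
MTURK_SANDBOX = "https://mturk-requester-sandbox.us-east-1.amazonaws.com"

mturk = boto3.client("mturk", region_name="us-east-1",
                     endpoint_url=MTURK_SANDBOX)

# Simplified HTMLQuestion form asking a worker to flag a piece of content.
# The content URL is a placeholder.
QUESTION_XML = """<HTMLQuestion xmlns="http://mechanicalturk.amazonaws.com/AWSMechanicalTurkDataSchemas/2011-11-11/HTMLQuestion.xsd">
  <HTMLContent><![CDATA[
    <html><body>
      <form name="mturk_form" method="post" action="https://www.mturk.com/mturk/externalSubmit">
        <input type="hidden" name="assignmentId" value="" />
        <p>Review the content at: https://example.com/content/12345</p>
        <p>Does it violate the platform's rules?</p>
        <input type="radio" name="verdict" value="ok" /> No<br/>
        <input type="radio" name="verdict" value="violation" /> Yes<br/>
        <input type="submit" value="Submit" />
      </form>
    </body></html>
  ]]></HTMLContent>
  <FrameHeight>450</FrameHeight>
</HTMLQuestion>"""

hit = mturk.create_hit(
    Title="Review one piece of user-submitted content",
    Description="Decide whether the linked content violates platform rules.",
    Keywords="moderation, content review",
    Reward="0.05",                    # paid per assignment, in USD
    MaxAssignments=3,                 # three independent reviews per item
    LifetimeInSeconds=24 * 3600,      # task stays listed for one day
    AssignmentDurationInSeconds=600,  # ten minutes per review
    Question=QUESTION_XML,
)

print("HIT id:", hit["HIT"]["HITId"])

The interesting design question is less the plumbing and more the economics: how many independent reviews per item you pay for, and how you resolve disagreements between reviewers.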

Although, I get the feeling that state entities around the world want such content and end-user audits to be conducted entirely within the borders of their own nations, which could complicate things.