
Topic: Zuckerberg: why he thinks FB shouldn't take down posts that deny the Holocaust

sr. member
Activity: 854
Merit: 277
life threw a tempest at you? be a coconut!
the revolt? Zuck the great decided to say fuck to Mr. Brennan? ohhh, big courage.

No, myself I understand Zuck the great, because when you see Mr. Brennan, former director of the CIA, you quickly get that the treason is on, and those working for/with/under Mr. Brennan don't seem to understand the situation...

And I don't blame Zuck, it's a civilian project... which explains the meeting with the men in black...

If he means what he says, then I can agree with most of it.

yeah me too, as much as I understand the plague the people of myanmar are facing with extreme breeding strategy used by the muslims conquerors and annihilators (look at what they left behind beyond the palaces of their rulers), I wouldn't like to know that facebook is used to facilitate population removal (those intend to invade, multiply and then split the country with the part they control under the own laws of the sand masters).

I think the Myanmar people should use or develop their own open-source application to handle the existential threat they face...

frankly what I hate is the media's cover-up for Facebook and co., fostering only their systems, out of fear of getting banned or delisted from those platforms... anyway, most of the old dying media did it to themselves, pushing Facebook, Twitter or whatever other private (and censorable-at-will, like here) platforms, believing it would "spare" them...

but then the fb censors (shareholders, managers and co.) must understand that when they are forced to, or want to, work against the constitutional republic of the United States of America, known by some as the USA and by others as the great satan, then if large swaths of the men in black..., so be it. the only question is what should be put on their graves: born American, died - traitors?

once upon a time, there was an alligator made into a pet by a talented manipulator... one day a little bird landed before the alligator, laughed and said: did you forget who you are? and in an instant the alligator snapped back to the swamps, and the bird said, reptile to reptile, I never saw such clean teeth... it was never confirmed that the talented manipulator wore a tuxedo the day of his disappearance...

moral of the story: alligators digest slowly...

jr. member
Activity: 124
Merit: 8
If he means what he says, then I can agree with most of it.
full member
Activity: 448
Merit: 110
tl;dr: if you have an offensive and violent page with intent to cause harm, either to yourself or others, Facebook's AI will not surface it in News Feed.
hero member
Activity: 672
Merit: 526
Zuckerberg: The Recode interview
Everything was on the table — and after Facebook’s wildest year yet, that’s a really big table.



Okay. “Sandy Hook didn’t happen” is not a debate. It is false. You can’t just take that down?

I agree that it is false.

I also think that going to someone who is a victim of Sandy Hook and telling them, “Hey, no, you’re a liar” — that is harassment, and we actually will take that down. But overall, let’s take this whole closer to home...

I’m Jewish, and there’s a set of people who deny that the Holocaust happened.

I find that deeply offensive. But at the end of the day, I don’t believe that our platform should take that down because I think there are things that different people get wrong. I don’t think that they’re intentionally getting it wrong, but I think-

In the case of the Holocaust deniers, they might be, but go ahead.

It’s hard to impugn intent and to understand the intent. I just think, as abhorrent as some of those examples are, I think the reality is also that I get things wrong when I speak publicly. I’m sure you do. I’m sure a lot of leaders and public figures we respect do too, and I just don’t think that it is the right thing to say, “We’re going to take someone off the platform if they get things wrong, even multiple times.” (Update: Mark has clarified these remarks here: “I personally find Holocaust denial deeply offensive, and I absolutely didn’t intend to defend the intent of people who deny that.”)

What we will do is we’ll say, “Okay, you have your page, and if you’re not trying to organize harm against someone, or attacking someone, then you can put up that content on your page, even if people might disagree with it or find it offensive.” But that doesn’t mean that we have a responsibility to make it widely distributed in News Feed. I think we, actually, to the contrary-

So you move them down? Versus, in Myanmar, where you remove it?

Yes.

Can I ask you that, specifically about Myanmar? How did you feel about those killings and the blame that some people put on Facebook? Do you feel responsible for those deaths?

I think that we have a responsibility to be doing more there.

I want to know how you felt.

Yes, I think that there’s a terrible situation where there’s underlying sectarian violence and intention. It is clearly the responsibility of all of the players who were involved there. So, the government, civil society, the different folks who were involved, and I think that we have an important role, given the platform, that we play, so we need to make sure that we do what we need to. We’ve significantly ramped up the investment in people who speak Burmese. It’s often hard, from where we sit, to identify who are the figures who are promoting hate and what is going to... which is the content that is going to incite violence? So it’s important that we build relationships with civil society and folks there who can help us identify that.

I want to make sure that our products are used for good. At the end of the day, other people blaming us or not is actually not the thing that matters to me. It’s not that every single thing that happens on Facebook is gonna be good. This is humanity. People use tools for good and bad, but I think that we have a clear responsibility to make sure that the good is amplified and to do everything we can to mitigate the bad.

Let me give you another example. When Live came up, one of the terrible use cases where people were using ... There were a small number of uses of this, but people were using it to ... show themselves [doing] self-harm, or there were even a few cases of suicide. We looked at this, we’re like, “This is terrible. This is not what we want the product to be. This is terrible, and if this is happening and we can help prevent it, then we have a responsibility to.”

So, what did we do? We took the time to build AI tools and to hire a team of 3,000 people to be able to respond to those live videos within 10 minutes. Most content on Facebook, we try to get to within hours or within a day, if it comes up, and obviously, if someone’s gonna harm themselves, you don’t have a day or hours. You have to get to that quickly. With all the millions of videos that are posted, we had to build this combination of an AI system that could flag content that our reviewers should look at, and then hire a specific team trained and dedicated to that, so that way they could review all the things very quickly and have a very low latency.

In the last six months, we’ve been able to help first responders get to more than a thousand people who needed help quickly because of that effort.
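
Purely as an illustration of the triage pattern described in the interview above (an automated scorer routing the riskiest live-video reports to a dedicated fast-path review queue, while everything else waits on the normal schedule), here is a minimal Python sketch. Every name, threshold and scoring rule in it is a made-up stand-in, not anything from Facebook's actual pipeline.

Code:
import heapq
import time
from dataclasses import dataclass, field

def score_report(report_text: str) -> float:
    """Toy stand-in for an ML classifier: flags reports mentioning self-harm."""
    keywords = ("self-harm", "suicide", "hurt myself")
    return 0.95 if any(k in report_text.lower() for k in keywords) else 0.10

@dataclass(order=True)
class ReviewItem:
    priority: float                         # lower value = reviewed sooner
    created_at: float = field(compare=False)
    video_id: str = field(compare=False)
    report_text: str = field(compare=False)

class ReviewQueues:
    """Routes high-risk items to a fast-path queue, everything else to the normal queue."""
    def __init__(self, risk_threshold: float = 0.8):
        self.risk_threshold = risk_threshold
        self.urgent = []   # target: minutes
        self.normal = []   # target: hours or days

    def submit(self, video_id: str, report_text: str) -> None:
        score = score_report(report_text)
        item = ReviewItem(priority=-score, created_at=time.time(),
                          video_id=video_id, report_text=report_text)
        queue = self.urgent if score >= self.risk_threshold else self.normal
        heapq.heappush(queue, item)

    def next_urgent(self):
        """Give a trained reviewer the riskiest pending live video, if any."""
        return heapq.heappop(self.urgent) if self.urgent else None

# Usage: the self-harm report jumps the queue, the spam report waits.
queues = ReviewQueues()
queues.submit("live/123", "viewer report: the streamer is talking about suicide")
queues.submit("live/456", "viewer report: spam link in the comments")
print(queues.next_urgent())

The point of the two-tier priority queue is simply that the urgent path stays short no matter how large the ordinary backlog gets, which is what makes a minutes-level response target plausible.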

