Author

Topic: Facebook Whistleblower Frances Haugen: The 60 Minutes Interview (Read 123 times)

legendary
Activity: 4690
Merit: 1276

Comedy Central:
Tristan Harris - Facebook & Rethinking Big Tech | The Daily Show
https://www.youtube.com/watch?v=YruYAnzirxw&t=760s
...

Watched a couple of minutes from the cut.

These two losers are feeling the effects of being in a cult.  Inevitably the cult leadership will structure things such that you don't communicate outside of the cult, meaning with people who are not explicitly approved by the cult, or who gain approval in real-time by ticking off the correct 'virtue signaling' flags and what-not.  Often this will be associated with style.  Dress, for instance, and especially linguistic style, down to the sentence structure, cadence, cult-appropriate buzz-words, etc.

Anyway, yes, 'big tech' certainly knows who's in what cult and caters to them.  What this Facebook 'whistleblower' fraud wants is to have big tech be more pro-active in shunting people into her (and the OP's) cult and keeping the communications walls around them.  Legal measures to achieve this, imposed and enforced by the state, are now what it's going to take, because their cult is well past the event horizon of complete crazy-land and it's hard to obtain and keep new cult members.

What true-believer bottom feeders such as _Miracle don't realize is that inevitably the state and the character of the cult will change, probably to the point of exhibiting/enforcing things which are 180 degrees from what he/she believes is 'good' now.  Fortunately for him/her, he/she will be turned with the ship, and by the time it happens these things will be a good thing.

The other equally likely scenario is that _Miracle will become worm-dirt via the gene therapy injections by the time the ship completes its 180-degree turn, so it actually won't matter much.  If he/she had any assets, they will first be stripped to pay for the ultimately ineffective cancer treatment he/she will need.

hero member
Activity: 912
Merit: 661
Do due diligence
Has anyone watched Fox or CNN's take on the hearings yet?


Comedy Central:
Tristan Harris - Facebook & Rethinking Big Tech | The Daily Show
https://www.youtube.com/watch?v=YruYAnzirxw&t=760s

Good points being made, including conversations being turned into sound bites; is it the app, or is it just amplifying human nature?
@ about 5:40 instead of "the invisible hand" where people are choosing freely, what we have is "The digital hand of Mark Zuckerberg" ---->
@ 10min 15sec China's response to social media
@ 11min 30sec questioning our legislators' ability to create laws when they don't even know what an app is (although I thought the hearing went better than those of the past).

Not ranking for what creates "crazy town"

How social media hooks you ---> comparing them to a digital drug lord, giving you a taste when you try to take a break.


BTW I would rather there be more transparency from companies like this as opposed to direct government oversight.
Remember the Surgeon General's warning on packs of cigarettes? I used to look at it at least a dozen times a day when I smoked.

hero member
Activity: 912
Merit: 661
Do due diligence



I would respect him more had he not been in full damage control, but he presides over the most garbage platform in the world, one that harvests user data and milks every ounce of cash from its userbase, so I won't say I'm disappointed or pretend to be shocked that a billionaire acts in the interest of his company.

All the talk is about Facebook because that's where all the "Covid misinformation" happens. Twitter, on the other hand, is the left-wing version of Facebook, where you have vaccine junkies proudly displaying their 5th dose of the Pfizer shot, or some sort of "pride" flag to demonstrate their 388 genders. They sure have the same problems Facebook does, but we won't hear about that.
 

Corporate laws are built to favor fiduciary levels of responsibility to profit and shareholders, and we citizens have [almost] accepted that. It would be nice to see some changes in both of those areas.
It's rare to see the Senate so cooperative with each other, but you can tell they're excited to have the documents.
Wondering if Facebook will try to sue, slander, or rehire her? Maybe they'll let it go?

I decided early on to keep my FB friend count below 150, based on Dunbar's number. I know, kinda quirky, but it works for me. My FB experience has been cultivated into a pretty good one for years; those are people I love, following hobbies that I like. Over the years my friends have learned that I'm going to be like...can you please at least Snopes this before you post it on my page  Cheesy. I use several Twitter feeds to keep some areas of interest "segregated"; one is aimed at children's issues, Rotary, foster care, and non-profits: we can have conversations about child sex trafficking and not get a flood of nonsense ---- a whole different Twitterverse than some of my other feeds.

My daughter is in her 30s, so for her it was MySpace, FB, and online video games, all of which had to be accessed from our family room; what parents are facing today is an avalanche compared to then.
legendary
Activity: 4690
Merit: 1276

What a laughable gas-light charade.  Just political cover to go even more totalitarian vis-a-vis control of information and speech, which is what they and the rest of corp/gov want to do anyway.

As for the 'psychological harm' caused by the platform, that was leaked like a decade ago by real insider 'whistleblowers' (if you want to call them that) who were there from the beginning.  Not news to anyone who is paying attention and doesn't get their news from mainstream media, and it's pretty obvious besides.  Certainly the phenomenon isn't limited to Facefuck, and 'interdiction' by 'regulators' will certainly make it 10 times worse.  That would be the point of doing it, in fact.

legendary
Activity: 2828
Merit: 1515
...

I would respect him more had he not been in full damage control, but he presides over the most garbage platform in the world, one that harvests user data and milks every ounce of cash from its userbase, so I won't say I'm disappointed or pretend to be shocked that a billionaire acts in the interest of his company.

All the talk is about Facebook because that's where all the "Covid misinformation" happens. Twitter, on the other hand, is the left-wing version of Facebook, where you have vaccine junkies proudly displaying their 5th dose of the Pfizer shot, or some sort of "pride" flag to demonstrate their 388 genders. They sure have the same problems Facebook does, but we won't hear about that.
member
Activity: 478
Merit: 66
I'm just going to say that I did not watch the interview but got the gist of what this person did. So it is supposedly good that she is exposing how "corrupt" Facebook is by letting anyone post anything, even if it is considered "Hate Speech" or "Misinformation"; basically, how it is bad that they are doing this for the almighty dollar, advertising on this posted content much like sugary drink companies market themselves heavily to children. I'm sure the interview also had her talk about the toxic work culture, stereotypical of these interviews.

To the mainstream media that is OK with removing free speech: what about Weapons of Mass Destruction, or the Gulf of Tonkin incident; would this not be misinformation? What about CRT? Is this not hate speech in and of itself against white people?

Basically, I'm saying that this person was put on by 60 Minutes to cause a false furor over Facebook's policies, which allowed this horrible thing called the First Amendment to exist on their platform while all they would do is advertise on said content. What they want the majority-Democrat Congress and Senate to do is create a Patriot Act 2.0 that will revoke the First Amendment from social media, and perhaps altogether. For those that don't know, the Patriot Act was created after the events of 9/11, out of fear, by the Congress/Senate of the time, aimed at those who would commit terrorism. It applied to foreigners and domestic people alike, but was really designed to clamp down on those domestically. So I'm saying that history will repeat, and Congress/Senate will pass a law that is contrary to the First Amendment and will cement the US's descent into becoming more and more like China.

Aside from the whistleblower, I found it interesting how Facebook's DNS went down on Monday, as none of their empire's sites were up. I bet they were scrubbing content or setting up new blockades against free speech, much like Twitter and YouTube have done.
hero member
Activity: 912
Merit: 661
Do due diligence
And a Facebook Post Today from Mark Zuckerberg:

"I wanted to share a note I wrote to everyone at our company.
---
Hey everyone: it's been quite a week, and I wanted to share some thoughts with all of you.
First, the SEV that took down all our services yesterday was the worst outage we've had in years. We've spent the past 24 hours debriefing how we can strengthen our systems against this kind of failure. This was also a reminder of how much our work matters to people. The deeper concern with an outage like this isn't how many people switch to competitive services or how much money we lose, but what it means for the people who rely on our services to communicate with loved ones, run their businesses, or support their communities.
Second, now that today's testimony is over, I wanted to reflect on the public debate we're in. I'm sure many of you have found the recent coverage hard to read because it just doesn't reflect the company we know. We care deeply about issues like safety, well-being and mental health. It's difficult to see coverage that misrepresents our work and our motives. At the most basic level, I think most of us just don't recognize the false picture of the company that is being painted.
Many of the claims don't make any sense. If we wanted to ignore research, why would we create an industry-leading research program to understand these important issues in the first place? If we didn't care about fighting harmful content, then why would we employ so many more people dedicated to this than any other company in our space -- even ones larger than us? If we wanted to hide our results, why would we have established an industry-leading standard for transparency and reporting on what we're doing? And if social media were as responsible for polarizing society as some people claim, then why are we seeing polarization increase in the US while it stays flat or declines in many countries with just as heavy use of social media around the world?
At the heart of these accusations is this idea that we prioritize profit over safety and well-being. That's just not true. For example, one move that has been called into question is when we introduced the Meaningful Social Interactions change to News Feed. This change showed fewer viral videos and more content from friends and family -- which we did knowing it would mean people spent less time on Facebook, but that research suggested it was the right thing for people's well-being. Is that something a company focused on profits over people would do?
The argument that we deliberately push content that makes people angry for profit is deeply illogical. We make money from ads, and advertisers consistently tell us they don't want their ads next to harmful or angry content. And I don't know any tech company that sets out to build products that make people angry or depressed. The moral, business and product incentives all point in the opposite direction.
But of everything published, I'm particularly focused on the questions raised about our work with kids. I've spent a lot of time reflecting on the kinds of experiences I want my kids and others to have online, and it's very important to me that everything we build is safe and good for kids.
The reality is that young people use technology. Think about how many school-age kids have phones. Rather than ignoring this, technology companies should build experiences that meet their needs while also keeping them safe. We're deeply committed to doing industry-leading work in this area. A good example of this work is Messenger Kids, which is widely recognized as better and safer than alternatives.
We've also worked on bringing this kind of age-appropriate experience with parental controls for Instagram too. But given all the questions about whether this would actually be better for kids, we've paused that project to take more time to engage with experts and make sure anything we do would be helpful.
Like many of you, I found it difficult to read the mischaracterization of the research into how Instagram affects young people. As we wrote in our Newsroom post explaining this: "The research actually demonstrated that many teens we heard from feel that using Instagram helps them when they are struggling with the kinds of hard moments and issues teenagers have always faced. In fact, in 11 of 12 areas on the slide referenced by the Journal -- including serious areas like loneliness, anxiety, sadness and eating issues -- more teenage girls who said they struggled with that issue also said Instagram made those difficult times better rather than worse."
But when it comes to young people's health or well-being, every negative experience matters. It is incredibly sad to think of a young person in a moment of distress who, instead of being comforted, has their experience made worse. We have worked for years on industry-leading efforts to help people in these moments and I'm proud of the work we've done. We constantly use our research to improve this work further.
Similar to balancing other social issues, I don't believe private companies should make all of the decisions on their own. That's why we have advocated for updated internet regulations for several years now. I have testified in Congress multiple times and asked them to update these regulations. I've written op-eds outlining the areas of regulation we think are most important related to elections, harmful content, privacy, and competition.
We're committed to doing the best work we can, but at some level the right body to assess tradeoffs between social equities is our democratically elected Congress. For example, what is the right age for teens to be able to use internet services? How should internet services verify people's ages? And how should companies balance teens' privacy while giving parents visibility into their activity?
If we're going to have an informed conversation about the effects of social media on young people, it's important to start with a full picture. We're committed to doing more research ourselves and making more research publicly available.
That said, I'm worried about the incentives that are being set here. We have an industry-leading research program so that we can identify important issues and work on them. It's disheartening to see that work taken out of context and used to construct a false narrative that we don't care. If we attack organizations making an effort to study their impact on the world, we're effectively sending the message that it's safer not to look at all, in case you find something that could be held against you. That's the conclusion other companies seem to have reached, and I think that leads to a place that would be far worse for society. Even though it might be easier for us to follow that path, we're going to keep doing research because it's the right thing to do.
I know it's frustrating to see the good work we do get mischaracterized, especially for those of you who are making important contributions across safety, integrity, research and product. But I believe that over the long term if we keep trying to do what's right and delivering experiences that improve people's lives, it will be better for our community and our business. I've asked leaders across the company to do deep dives on our work across many areas over the next few days so you can see everything that we're doing to get there.
When I reflect on our work, I think about the real impact we have on the world -- the people who can now stay in touch with their loved ones, create opportunities to support themselves, and find community. This is why billions of people love our products. I'm proud of everything we do to keep building the best social products in the world and grateful to all of you for the work you do here every day."
hero member
Activity: 912
Merit: 661
Do due diligence
https://www.youtube.com/watch?v=GOnpVQnv5Cw&ab_channel=C-SPAN

I was able to get through the 3 1/2 hours

@ around 2hr:48min---- Section 230

@ around 2hr:56min----potential effect on children

If you want to go deeper on the effects on children, you could watch The Social Dilemma on Netflix or here
https://www.youtube.com/watch?v=7mqR_e2seeM&ab_channel=Netflix
hero member
Activity: 912
Merit: 661
Do due diligence

There are other issues in the Facebook Files that are worth discussing, such as mental health for young children who use the various Facebook applications.
You are right, that could be an entire discussion on its own.
One of my favorite authors, Jonathan Haidt (The Righteous Mind: Why Good People Are Divided by Politics and Religion / The Coddling of the American Mind),
has focused his studies on the changes social media is having on children. All of us are exposed to social distortions in every ad, book, show, and movie we watch; I do think it's on overdrive with things like Instagram.

I'm not finished watching the C-SPAN hearing. One of the issues that has gotten my knickers in a twist about legislators when it comes to anything tech (for over 30 years)
is that they don't even seem to know the right questions to ask, so we'll see if I can finish it.
(skip the first 8 mins)
https://www.youtube.com/watch?v=GOnpVQnv5Cw&ab_channel=C-SPAN

One of the other issues in question [simplified] is: a company choosing profit over public good, knowingly making the calculation.
And that is an unfortunately old question, from choosing subpar safety equipment in cars to polluting natural resources.





Particularly good portion of the podcast; I'll find it in my history and repost.
03 Facebook Tried to Make Its Platform a Healthier Place. It Got Angrier Instead.
By Keach Hagey and Jeff Horwitz
https://www.wsj.com/podcasts/the-journal/the-facebook-files-part-4-the-outrage-algorithm/e619fbb7-43b0-485b-877f-18a98ffa773f?mod=article_inline

Partial----TRANSCRIPT
"This transcript was prepared by a transcription service. This version may not be in its final form and may be updated.

Ryan Knutson: This is the Facebook Files, a series from The Journal. We're looking deep inside Facebook through its own internal documents. If you haven't already heard parts one, two, and three, they're in your feed. Facebook's algorithm is something of a black box. It's a complex set of mathematical equations, all adding up to a mysterious calculation that ultimately decides what you see when you log on or open the app, and in early 2018, Facebook said it was making a big change to that algorithm.

Keach Hagey: So Facebook's algorithm changes all the time. They're constantly fiddling with it but this change was a paradigm shift, more than just a tweak.

Ryan Knutson: That's our colleague, Keach Hagey.

Keach Hagey: It was a completely different emphasis for what was going into your newsfeed and they actually came forward and discussed it, which is not normal for an algorithm change.

Speaker 3: The major announcement from Facebook. If you didn't notice, they overhauled your newsfeed overnight. They're trying to enhance your connections with family and friends.

Speaker 4: Zuckerberg writing on Facebook, of course, "Facebook has a lot of work to do whether it's protecting our community from abuse and hate, defending against interference by nation states or making sure that the time spent on Facebook is time well spent."

Ryan Knutson: At a congressional hearing after the company overhauled the algorithm, Facebook CEO, Mark Zuckerberg, said the change would clean up the platform and make it a healthier place.

Mark Zuckerberg: It's not enough to just connect people, we have to make sure that those connections are positive. It's not enough to just give people a voice, we need to make sure that people aren't using it to harm other people or to spread misinformation. We need to now take a more active view in policing the ecosystem, but I'm committed to getting this right and I believe that people will see real differences.

Keach Hagey: When Mark Zuckerberg came out and explained this, he described it as something of a sacrifice that the company was going to make. He said it was the right thing to do for the good of humanity and for the good of the mental health of their users, perhaps, at least in the short term, at the expense of the business.

Ryan Knutson: Was Mark Zuckerberg telling the full story about the reasoning behind this change?

Keach Hagey: According to what we saw in the documents, no.

Ryan Knutson: An array of internal documents reviewed by the Wall Street Journal reveal an entirely different story behind Facebook's algorithm overhaul.

Keach Hagey: From the documents that we have seen, there was a panic going on inside the company.

Ryan Knutson: The panic wasn't over misinformation or harm stemming from the platform. It was about a troubling trend with Facebook's business. The company was noticing a steep decline in user engagement. Facebook hoped the algorithm change would reverse that decline. It did that and more."



*Let me know if you hit a paywall and/or have free versions
 
copper member
Activity: 1652
Merit: 1901
Amazon Prime Member #7
I didn't watch the 60 minutes interview, but I did read the WSJ "Facebook Files" and the article in which Haugen outed herself as the source for the Facebook Files.

One thing that stuck out to me about Haugen is that she claimed a friend of hers became a "white nationalist" after being exposed to "online misinformation", and she ended their relationship. Many on the left will claim someone is a "white supremacist" or a "racist" or a "white nationalist" if they have moderate, left-of-center, mainstream views that differ from their own, even if they believe in things such as equality and treating everyone the same. This friend of hers helped Haugen function while she was recovering from a temporary disability by doing things such as buying groceries, taking her to the doctor's office, and helping her walk.

I think that Haugen is an activist and is trying to profit from her stealing information that is damaging to Facebook's reputation.

There are no whistleblower protections for employees who leak information to the press.


I found the first article in the Facebook Files to be the most interesting, the one about the "Whitelist" that is not actually a whitelist, known as XCheck or cross check.

It appears there were about 5.8 million Facebook users (accounts?) on XCheck as of the middle of last year. The users on this special list are subject to the same moderation rules as everyone else; however, their posts/content will typically not be run through automated detection to determine if a rule has been broken. Most users will have their posts/content run through a number of classification models, depending on a number of factors, to determine if a post breaks one of Facebook's rules. Users who are part of XCheck will instead have their posts reviewed "manually" by a specialized team upon a number of triggers.
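
To make that two-tier routing concrete, here is a minimal Python sketch of the flow as described above. Every name in it (route_post, xcheck_ids, the trigger condition) is a hypothetical illustration; the Facebook Files describe the behavior, not the code.

def triggers_manual_review(post: dict) -> bool:
    # Hypothetical trigger: the reporting only says "a number of triggers"
    # send an XCheck user's post to a specialized review team.
    return post.get("user_reports", 0) > 0

def route_post(post: dict, author_id: int, xcheck_ids: set, classifiers: list) -> str:
    """Two-tier moderation: XCheck authors bypass automated enforcement."""
    if author_id in xcheck_ids:
        # Same written rules apply, but no automated takedown; humans decide.
        return "manual_review" if triggers_manual_review(post) else "published"
    # Everyone else: any classifier that fires can act on the post directly.
    if any(model(post) for model in classifiers):
        return "removed"
    return "published"

# Toy usage: the same post is removed for a normal account (id 7) but only
# queued for human review when the author (id 42) is on XCheck.
spam_model = lambda p: "giveaway" in p.get("text", "")
post = {"text": "crypto giveaway!!", "user_reports": 3}
print(route_post(post, 7, {42}, [spam_model]))    # removed
print(route_post(post, 42, {42}, [spam_model]))   # manual_review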

The problem with the above classification models is that these types of models are not sufficiently accurate. There are likely well over a billion posts made on Facebook per day, and even if a model is 99% accurate, roughly 10 million posts will be misclassified every day. I don't know that Facebook has been able to get its automation to be correct even 99% of the time.
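
A quick back-of-the-envelope check of that arithmetic in Python (both inputs are the post's own illustrative figures, not measured values):

# Scale math from the paragraph above: daily volume times error rate.
daily_posts = 1_000_000_000      # "well over a billion posts per day"
model_accuracy = 0.99            # hypothetical 99%-accurate classifier

errors_per_day = daily_posts * (1 - model_accuracy)
print(f"{errors_per_day:,.0f} misclassified posts per day")  # 10,000,000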

Binary classification models that are trained on imbalanced data (that is, data that is overwhelmingly classified one way more than the other) are typically evaluated by a metric called the Area Under the Curve (AUC), which is somewhat analogous to "accuracy" on balanced data: it is the probability that a randomly chosen positive example is scored higher than a randomly chosen negative one. A classification model that predicts whether a patient has a disease or cancer with an AUC score above 0.90 (roughly comparable to 90% accuracy on balanced data) is typically considered "excellent".
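
Here is a minimal scikit-learn sketch of why plain accuracy misleads on data this skewed and what AUC measures instead. The 99/1 class split, feature count, and model choice are illustrative assumptions, not anything from the Facebook Files:

from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score, roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic, heavily imbalanced data: ~1% of samples are the "positive"
# (rule-breaking) class, mirroring the skew described above.
X, y = make_classification(n_samples=100_000, n_features=20,
                           weights=[0.99], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)

# Accuracy looks great here even for a model that adds little value, because
# always predicting "not violating" is already right ~99% of the time.
print("accuracy:", accuracy_score(y_te, model.predict(X_te)))
# AUC scores ranking quality: the probability that a random positive example
# gets a higher score than a random negative one (0.5 = coin flip).
print("AUC:", roc_auc_score(y_te, model.predict_proba(X_te)[:, 1]))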

For high-profile users, it is simply too risky to rely on a series of models to make moderation decisions. A model getting a decision wrong has the potential to cause negative PR problems when done to high-profile users. If even 1% of XCheck users had one post mistakenly moderated each day, Facebook would be mistakenly taking down roughly 58,000 posts from these high-profile users every day, hundreds of thousands every week. That is obviously not something that is acceptable to any rational company.


There are other issues in the Facebook Files that are worth discussing, such as mental health for young children who use the various Facebook applications.
hero member
Activity: 912
Merit: 661
Do due diligence
Facebook Whistleblower Frances Haugen: The 60 Minutes Interview

https://www.youtube.com/watch?v=_Lx5VmAdZSI&ab_channel=60Minutes



https://www.cbsnews.com/news/facebook-whistleblower-frances-haugen-misinformation-public-60-minutes-2021-10-03/
Part of the text in the above link:

"Her name is Frances Haugen. That is a fact that Facebook has been anxious to know since last month when an anonymous former employee filed complaints with federal law enforcement. The complaints say Facebook's own research shows that it amplifies hate, misinformation and political unrest—but the company hides what it knows. One complaint alleges that Facebook's Instagram harms teenage girls. What makes Haugen's complaints unprecedented is the trove of private Facebook research she took when she quit in May. The documents appeared first, last month, in the Wall Street Journal. But tonight, Frances Haugen is revealing her identity to explain why she became the Facebook whistleblower.

Frances Haugen: The thing I saw at Facebook over and over again was there were conflicts of interest between what was good for the public and what was good for Facebook. And Facebook, over and over again, chose to optimize for its own interests, like making more money.

Frances Haugen is 37, a data scientist from Iowa with a degree in computer engineering and a Harvard master's degree in business. For 15 years she's worked for companies including Google and Pinterest.

Frances Haugen: I've seen a bunch of social networks and it was substantially worse at Facebook than anything I'd seen before.

Scott Pelley: You know, someone else might have just quit and moved on. And I wonder why you take this stand.

Frances Haugen: Imagine you know what's going on inside of Facebook and you know no one on the outside knows. I knew what my future looked like if I continued to stay inside of Facebook, which is person after person after person has tackled this inside of Facebook and ground themselves to the ground.

Scott Pelley: When and how did it occur to you to take all of these documents out of the company?

Frances Haugen: At some point in 2021, I realized, "Okay, I'm gonna have to do this in a systemic way, and I have to get out enough that no one can question that this is real."

Not sure if you will hit a paywall with this link
https://www.wsj.com/articles/the-facebook-files-11631713039
Podcast
https://www.wsj.com/podcasts/the-journal/the-facebook-files-part-1-the-whitelist/aa216713-15af-474e-9fd4-5070ccaa774c?mod=article_inline



If you watched the video or read the article, what do you think about it?