Showing posts with label #BustThisTrust. Show all posts

August 29, 2020

This week in Facebook: Zuckerberg throws contractors under the bus for Facebook's Kenosha fail, while deflecting responsibility...

... and if that isn't peak Facebook in a nutshell, I don't know what is.

As reported by TIME:

In a video posted to Facebook on Friday, Facebook chief executive Mark Zuckerberg said that the social media giant made a mistake by not removing a page and event that urged people in Kenosha, Wis., to carry weapons amid protests. On Tuesday night, a 17-year-old named Kyle Rittenhouse allegedly fatally shot two people and injured a third.

Zuckerberg admitted that “a bunch of people” had reported the page and said the decision to not remove it was “largely an operational mistake.”

“The contractors, the reviewers who the initial complaints were funneled to, didn’t, basically, didn’t pick this up,” Zuckerberg said in the Friday video, which was taken from a larger company-wide meeting. “And on second review, doing it more sensitively, the team that’s responsible for dangerous organizations recognized that this violated the policies and we took it down.”

He went on to deny that the shooter had followed this particular Facebook group, as if following it were required for him to have decided to show up for an event the group organized on Facebook; he then announced that the shooter's Facebook and Instagram pages had been "suspended," and that the "Kenosha Guard" page had also been taken down... just hours after the public outcry started about white supremacist militia groups organizing events on Facebook that led to the shootings.

On the plus side, though, Zuckerberg did describe the shootings, accurately, as a "mass murder," so at least he's finally stopped pandering to these asshats.

At this point, it's pretty clear that Facebook is not a positive force in society; their corporate culture is, and always has been, morally bankrupt, suffering from a total lack of anything resembling actual principles. And the problem is pervasive, the result of a corporate leadership which views rules as being for other people, and morals as the small-minded thinking of the unintelligent; Facebook is a fish that has rotted from the head down, and it is thoroughly rotten.

As long as Facebook is allowed to continue policing itself, subject only to "internal investigations" of its own failures, no matter how many lives are lost as a result of those failures, its problems will not be solved. Facebook does not have problems; Facebook is the problem. And the only solution to that problem is for Facebook to stop being Facebook, most likely via antitrust action breaking them up into chunks of manageable size. No other remedy can possibly begin to bring the problem of Facebook to heel.

May 26, 2020

This week in Facebook

It's been a while since I last posted one of these.

I mean, with all the COVID-19 chaos currently sweeping the world, it's just been hard to get all that worked up about Facebook's essentially evil nature. It helps that they've had very few major screwups lately; there have been no more Cambridge Analytica-style scandals, no more Congressional testimony, no major developments on the antitrust front... It had been so quiet, in fact, that Facebook's image looked like it might be about to recover from years of terrible PR.

And then the Wall Street Journal came along, and reminded us just how awful Facebook actually is:
A Facebook team had a blunt message for senior executives. The company's algorithms weren't bringing people together. They were driving people apart.

"Our algorithms exploit the human brain's attraction to divisiveness," read a slide from a 2018 presentation. "If left unchecked," it warned, Facebook would feed users "more and more divisive content in an effort to gain user attention & increase time on the platform."

That presentation went to the heart of a question dogging Facebook almost since its founding: Does its platform aggravate polarization and tribal behavior? The answer it found, in some cases, was yes.