Showing posts with label Cambridge Analytica.

January 26, 2019

This week in Facebook

After starting the new year with a few largely scandal-free weeks, Mark Zuckerberg apparently decided that he was bored, or something, because the Facebook shit resumed flying fast and thick, and Gizmodo had pretty good coverage of it all.

First up: Mark Zuckerberg's thirsty op-ed, in which he opined that people distrust Facebook only because they don't understand it:
On Thursday, the Wall Street Journal published a 1,000-word screed by Mark Zuckerberg about the company’s data collecting practices titled “The Facts About Facebook.” In it, Zuckerberg makes noise about the company being about “people,” and insists—as he has been for the majority of his company’s 15-year history—that we should trust it. Zuckerberg appears to think the primary reason users have little faith in the company’s ability to responsibly or ethically handle their data is because of its targeted advertising practices, about which he writes: “This model can feel opaque, and we’re all distrustful of systems we don’t understand.” 
I guess the apology tour is over; Zuck is back to his normal, condescending self.

Gizmodo's Catie Keck goes on to list a few of the reasons why people who understand Facebook just fine also distrust Zuck & Co., starting with FB's lack of transparency, continuing on through Cambridge Analytica, and ending with their scraping data about their users (and about people who've never used Facebook themselves) and then sharing it with advertisers, among other lowlights:
In 2018, we learned that Facebook was data-sharing with other companies like Microsoft’s Bing, Spotify, Netflix, and others in exchange for more information about its users. There were also the revelations that Cambridge Analytica data-scraping was worse than we thought; that Facebook was sharing shadow contact information with advertisers; and that turning off Facebook location-sharing doesn’t stop it from tracking you. That’s obviously totally aside from the George Soros conspiracy theory fiasco; its mishandling of Myanmar genocide; and its standing as a hotbed for rampant misinformation.
As with his year-end Facebook post—which I’ll note here also largely ignored the tsunami of public relations problems the company faced last year—Zuckerberg appears to remain bafflingly optimistic about the function of his company. To be clear, this is the same founder of Facebook who once called users of his product “dumb fucks” for trusting him with their sensitive information.
Lots of links in the original article, if you missed some of those earlier "hits" when they happened.

So, not an auspicious beginning. Zuck wasn't done yet, though; not by a long shot.

December 20, 2018

Facebook's very bad year gets even worse

It turns out that Facebook couldn't even make it through one more day before getting hit with more bad news. This time, though, it's not news of their incompetence, or their outright malice, that's wrecking their week; rather, it's news of actual consequences for Facebook. Finally.

As reported by The Washington Post:
[...]
The D.C. case threatens to develop into an even worse headache for Facebook. Racine told reporters that his office has “had discussions with a number of other states that are similarly interested in protecting the data and personal information of their consumers,” though he cautioned there is no formal agreement for them to proceed jointly. And the attorney general’s aides said they could add additional charges to their lawsuit as other details about Facebook’s privacy lapses become public.
Hello, again, Christopher Wylie! I'd honestly forgotten that he even existed. But I digress...

July 02, 2018

Facebook’s disclosures under scrutiny by the FBI, SEC, FTC, and DOJ

When the extent of the Cambridge Analytica scandal was first breaking back in March, I wrote this:
There are people at Facebook who signed off on a business plan that involved collecting legally protected information about people with neither their knowledge nor their consent, and selling that data to third parties; people who then decided not to notify users when it was crystal clear that the whole shady business had gone very, very wrong. Those people will not just be facing lawsuits; those people will be facing jail time... in addition to the lawsuits.
Some readers (all two of you 😃) may have thought that I was being somewhat hyperbolic with that statement. And, in fairness, apart from a few relatively uneventful appearances before lawmakers in the U.S. and EU, Facebook was looking like they might have escaped the worst of the possible outcomes that they could have been facing. But appearances can deceive, and Facebook themselves are now confirming that they've been under investigation, by multiple U.S. federal agencies, since at least May.

As reported by the Washington Post:
The questioning from federal investigators centers on what Facebook knew three years ago and why the company didn’t reveal it at the time to its users or investors, as well as any discrepancies in more recent accounts, among other issues, according to these people. The Capitol Hill testimony of Facebook officials, including Chief Executive Mark Zuckerberg, also is being scrutinized as part of the probe, said people familiar with the federal inquiries.
Facebook confirmed that it had received questions from the federal agencies and said it was sharing information and cooperating in other ways. “We are cooperating with officials in the US, UK and beyond,” said Facebook spokesman Matt Steinfeld.
This puts yesterday's revelations (from last Friday's midnight document dump) in a different light. Who wants to bet that Facebook's 747-page infodump will be mostly information that investigators already know? Who else thinks that they were trying to get ahead of the narrative, with investigative heat about to get way hotter, in addition to burying as many juicy details as possible in the Friday night news graveyard?

Who else thinks that they might not get away with either of those things, this time around?

May 03, 2018

Cambridge Analytica finally killed by the scandal that they caused...

... and somehow, they didn't see it coming. From CBC News:
The British data analysis firm at the centre of Facebook's privacy scandal is declaring bankruptcy and shutting down.
London-based Cambridge Analytica blamed "unfairly negative media coverage" and said it has been "vilified" for actions it says are both legal and widely accepted as part of online advertising.
"The siege of media coverage has driven away virtually all of the company's customers and suppliers," the company said in a statement on Tuesday. "As a result, it has been determined that it is no longer viable to continue operating the business."
The company said it has filed papers to begin insolvency proceedings in the U.K. and will seek bankruptcy protection in a federal court in New York. Employees were told on Wednesday to turn in their computers, according to the Wall Street Journal.
Facebook said it will keep looking into data misuse by Cambridge Analytica even though the firm is closing down. And Jeff Chester of the Center for Digital Democracy, a digital advocacy group in Washington, said criticisms of Facebook's privacy practices won't go away just because Cambridge Analytica has.
"Cambridge Analytica's practices, although it crossed ethical boundaries, is really emblematic of how data-driven digital marketing occurs worldwide," Chester said.
"Rather than rejoicing that a bad actor has met its just reward, we should recognize that many more Cambridge Analytica-like companies are operating in the conjoined commercial and political marketplace."
Just a little reminder, in case you still needed it, that there's more where Cambridge Analytica came from, and Facebook's fiasco is far from over. I have to disagree with Jeff Chester on one point, though: I think that most of us can still remember that, while also rejoicing in Cambridge Analytica's demise.

The other Facebook histoire du jour? The Facebook engineer, and professional stalker, that they had to fire for abusing FB's user information database, of course.

April 02, 2018

An utterly inadequate response...

As if to ram home the point that Facebook really aren't serious about confronting their actual problems, we get this news today about their half-hearted efforts at reform. As reported by c|net:
So, they're still going to collect as much information about you as possible, and they're still going to sell that data to interests outside of Facebook, but those outside interests will have to pinkie swear not to use the data unless you've given them permission first. What could possibly go wrong?

That's a rhetorical question, of course, since we already know what can go wrong. Advertisers, and others, can always simply claim that they received user permissions that they didn't, in fact, receive. People can and do lie, something we know because the most famous of Facebook's customers did exactly that. Remember Cambridge Analytica, who "confirmed" that the Facebook data they'd harvested would only be used for academic research purposes, and then used it for paid political consultancy?

The problem with allowing anyone outside Facebook to extract user data from Facebook is that the uses of that data cannot be controlled once the data itself resides on servers outside of Facebook's direct control. It can and will be used for anything and everything, for both good and evil, with Facebook reaping the profits of the trade while Facebook's users bear the cost. That shit ain't right.

I've said it before, and I'll say it again: Facebook is the problem, here. Their entire philosophy ranks growth as more important than the human cost of that growth; they care nothing for the privacy and security of their customers, and they never will. Not unless forces outside Facebook force them to care.

Personally, I don't think that any of the services that Facebook provide are worth their hidden costs; they've gone out of their way to make their product as addictive as possible, but I don't think that people actually need Facebook. You may feel like you do, but you don't. It's up to you, but if you were ever thinking of maybe taking a break from Facebook to find out if you can live without it... well, now is probably that time. Just saying.

#FacebookIsTheProblem, 
#DeleteFacebook

That's not the point, Zuck, and you damned well know it...

Last week, Apple CEO Tim Cook weighed in on Facebook's fiasco by claiming that Apple had never done, and would never do, the things that Facebook did with and to their users. As reported by Brinkwire:
Apple CEO Tim Cook didn’t mince words when discussing the controversy that has engulfed fellow tech giant Mark Zuckerberg, stating unequivocally that the data leak scandal that affected an estimated 50 million Facebook users would never have happened to Apple because the company doesn’t treat its customers like “products.”
“I think it’s an invasion of privacy,” Cook said during a Wednesday interview snippet with Kara Swisher and Chris Hayes, hosts of an upcoming MSNBC and Recode special on Apple. “Privacy to us is a human right. It’s a civil liberty, and is something that is unique to America. This is like freedom of speech and freedom of the press, and privacy is right up there for us.”
When asked what he would do if he were in Zuckerberg’s position, the Apple CEO quickly answered: “What would I do? I wouldn’t be in this situation.”
“We could make a ton of money if we monetized our customers, if our customers were our product,” Cook said. “We’ve elected not to do that.”
Yes, that's Apple, whose most successful product is the profoundly anti-consumer iOS and its associated walled-garden app store, taking Facebook to task for being too evil, even for Apple's tastes. And, I might add, correctly: Apple might do everything it can to keep its users trapped inside the walls of its iOS garden, but they didn't try to turn macOS into the same kind of walled-garden user experience (the way Microsoft did, with Windows 8 & 10), and they haven't monetized their customer base in any other way that we're aware of. There are things that Apple simply won't do.

Of course, it helps that selling software and online services isn't actually Apple's core business. Apple make consumer devices; the iOS app store is meant to add value to the expensive device you've already bought, in addition to generating extra revenue for Apple in the form of licensing fees. Facebook have only their service; in order to be a viable business, they do have to find some way of generating revenue from the service itself. But the problem isn't that Facebook are collecting a lot of information about their users, and then using that data as fuel for a targeted advertising service which they then sell to the advertisers who want to reach you. Google does that, too, and nobody much gives a shit.

No, the problem with Facebook is that they collect data about you that you're not aware they have access to, with zero visibility or accountability, and (until five minutes ago) no real option for the user to opt out. Not only are they collecting information about you, but they're also doing the same for everybody on your contact list, without their knowledge or informed consent, either; instead, there are click-through legalese pop-ups in which you agree that Facebook can harvest information about the people you know, as if you had power of attorney, or something, and thus the legal right to approve the harvesting of personal information belonging to anyone other than yourself.

But wait... it gets worse! Because Facebook aren't just building these data profiles about their users; they're also building data profiles of contacts that you might have in apps and on sites other than Facebook, if you ever used Facebook to log into them. And, having built this Orwellian data mining system, they're not just using it to target you more effectively with advertising. Oh, no, precious.


Facebook were not just selling advertising space. They were selling access to the data itself, to interests outside Facebook.


Remember Cambridge Analytica? Yes, it does look like Cambridge Analytica harvested more data from Facebook than they strictly should have, but Facebook let them do it, and they let them do it for money. And so, when you read today that Mark Zuckerberg is dismissing Tim Cook's criticisms as "glib," and saying things like (from Vox):
“The reality here is that if you want to build a service that helps connect everyone in the world, then there are a lot of people who can’t afford to pay,” Zuckerberg said. “And therefore, as with a lot of media, having an advertising-supported model is the only rational model that can support building this service to reach people.”
take a moment to remember how we got here, and to appreciate just how glib Zuckerberg is being himself, in this moment.

This isn't why Facebook is in so much trouble right now, Zuck, and you damned well know it.

Again, I'm no huge fan of Tim Cook or Apple, but when Tim Cook tells you that there are business practices that Apple rejected as being simply too scummy, even for them, you can believe that; Apple's behaviour over the years bears him out. When Google, whose founding principle was "Don't Be Evil," and who have for years provided more robust privacy management tools than Facebook, and better transparency about privacy issues, tell you that your data is safe with them, you can believe it; their behaviour over the years bears it out.

On the other hand, Facebook are still lying to you about this shit. Mark Zuckerberg's leadership, and that of FB leadership team members like Andrew "Boz" Bosworth, make a mockery of everything that they're now claiming to stand for. In the meantime, Facebook is now saying that it will take a couple of years for them to actually clean up their act. As reported, again, by Vox:
“I wish I could solve all these issues in three months or six months, but I just think the reality is that solving some of these questions is just going to take a longer period of time,” he said in a podcast interview with media publication Vox.
The company started investing more in security at least a year ago, Zuckerberg said, “so if this is going to be a three-year process, then I think we’re about a year in already. Hopefully by the end of this year, we’ll have really started to turn the corner on some of these issues.”
Honestly, I'll be surprised if they're "turning the corner" on anything in a year's time, given that they're still not being honest, with anyone, apparently, about what their problems actually are. And even if they can, the one thing that's become painfully clear over the last few weeks is that Facebook, and everybody who works there, simply cannot be trusted to do the right thing for their users, let alone for society as a whole. Facebook must face penalties for what they've already done wrong, and strong regulations which will discourage them, or anyone else, from doing anything like this again.

That's not "glib," Zuck. It's just the truth. Something with which you might want to start acquainting yourself, while there's still time.

#FacebookIsTheProblem
#DeleteFacebook

March 29, 2018

Today in Facebook

As we come up on the two week mark of Facebook's fiasco, there are finally signs that Facebook might actually be taking the matter of users' privacy seriously enough to do something, this time.

First, from Ars Technica:
Facebook will (soon) yank third-party ad data in the name of privacy
In the wake of the Cambridge Analytica scandal and rising public pressure against Facebook, the social media giant announced on Wednesday evening that it will restrict how much data advertisers can have access to.
Facebook will soon stop allowing advertisers access to data about individuals held by companies like Experian and Acxiom.
Prior to this change, Facebook allowed advertisers to target groups of people based on an amalgamation of both datasets.
Baby steps, I guess. Ars is also reporting that Facebook is going to make it easier for users to find and change their accounts' privacy settings. "If that sounds familiar, it's because Facebook has made that exact kind of announcement many times over."

Facebook are also blocking new apps from joining the platform, as reported by The Verge:
Facebook paused its app review process last week to “implement new changes,” the company quietly announced yesterday. Facebook’s move to momentarily prevent new apps and chatbots onto its platform comes after the Cambridge Analytica data privacy scandal that’s unfolded over the last two weeks. The ongoing situation has embroiled the company in an existential crisis of unprecedented magnitude after up to 50 million Facebook users’ profiles were compromised by a third-party app. Last week, Facebook said it will further limit developers’ access to user data.
[...]
One co-founder of a digital agency took to Facebook to complain about the sudden pause, as spotted by Mashable. “Imagine hundreds of hours of work, tens to hundreds of thousands of dollars in investment capital, and dozens of clients disappearing at any given moment at the whim of a few lines of code,” Troy Osinoff wrote, as he set his status to “thinking about the meaning of life.”
Yes, the Facebook Effect is now expanding to cause harm to companies that have done nothing wrong... except to get into business with Facebook. Expect more lawsuits, in addition to the fourteen class actions that have already been filed.

So serious is Facebook's situation that market analysts, who were predicting that Facebook would ride out this storm just fine, are now predicting much more gloom ahead, not only for Facebook, but for tech giants in general.

March 23, 2018

Today in Facebook...

I have a feeling that this will be a regular thing for a while.

To start with, I'd like to draw your attention to this great piece from Engadget:
Let’s stop pretending Facebook cares
[...]
The really great thing to come out of the Cambridge Analytica scandal is that Facebook will now start doing that thing we were previously assured at every turn they were doing all along. And all it took was everyone finding out about the harvesting and sale of everyone's data to right-wing zealots like Steve Bannon for political power. Not Facebook finding out, because they already knew. For years. In fact, Facebook knew it so well, the company legally threatened Observer and NYT to prevent their reporting on it; to keep everyone else from finding out.
[...]
When The Guardian's 2015 article came out, Facebook pretended to care. "And then," former Cambridge Analytica employee Christopher Wylie told The Observer, "all they did was write a letter."
"But literally all I had to do was tick a box and sign it and send it back, and that was it," says Wylie. "Facebook made zero effort to get the data back."
[...]
It wasn't until the NYT and The Observer prepared to publish their articles last Friday that Facebook decided to suspend Cambridge Analytica and Christopher Wylie from the platform -- in a weak attempt to get ahead of the story. Even then, it was after Facebook made legal threats on both NYT and The Observer in an effort to silence both publications.
[...]
It almost goes without saying that this whole sickening affair is more proof we didn't need that Facebook only cares when it is forced to. When the company decides it has a reputation problem. Which is the only problem they actually care about fixing. Other than that, it's all about creating more data dealer WMD's, like Facebook's impending patent to determine social class, which we can all assume will be abused until press who can afford to stand up to Facebook write an article about it.
Yes, Cambridge Analytica have definitely done bad things, but Facebook is the problem. It's heartening to see that the media is increasingly seeing past the Cambridge Analytica trees to the out-of-control Facebook forest fire. Some of them have also started paying attention to Facebook's corrosive social and psychological effects.

Engadget is also keeping tabs on the class action lawsuit situation (up to four, so far), #deleteFacebook picked up steam today when Elon Musk deleted Tesla's and SpaceX's Facebook pages, and Facebook's share price is down 13% for the week - although, if you've got nerves of steel, now is either a great time to take a short position on Facebook, or to pick up some FB stock cheap, in the hope that they can ride this shitstorm out... and good luck with that.

The Verge has a very detailed, step-by-step guide up for deleting Facebook (#deleteFacebook), and LifeHacker has a detailed guide to finding out everything that Facebook knows about you (spoiler alert: it's really, really not easy). GQ has just posted an article about how consumers can kill Facebook. Oh, and the notoriously feckless and ineffectual U.S. Congress has apparently smelled the cross-spectrum, bi-partisan outrage, and summoned Zuckerberg to the Hill so that he can lie to them again.

I've probably missed quite a bit. This story is now so big, and so hot, that a dozen new articles are being posted about it hourly. Make no mistake about it, folks; Facebook are in some real trouble, here.

March 22, 2018

Yes, Facebook's fiasco really did get worse...

Remember just yesterday, when Mark Zuckerberg was trying to explain their Cambridge Analytica dealings away as some sort of outlier, and talking about how, sure, in hindsight, they probably shouldn't have taken CA's money, but how were they to know at the time? Well, pretty much all of that was horseshit. CA wasn't any sort of an outlier, and the amount of data they received was not at all abnormal.

From The Guardian:
Before Facebook suspended Aleksandr Kogan from its platform for the data harvesting scam at the centre of the unfolding Cambridge Analytica scandal, the social media company enjoyed a close enough relationship with the researcher that it provided him with an anonymised, aggregate dataset of 57bn Facebook friendships.
Facebook provided the dataset of “every friendship formed in 2011 in every country in the world at the national aggregate level” to Kogan’s University of Cambridge laboratory for a study on international friendships published in Personality and Individual Differences in 2015. Two Facebook employees were named as co-authors of the study, alongside researchers from Cambridge, Harvard and the University of California, Berkeley. Kogan was publishing under the name Aleksandr Spectre at the time.
[...]
“The sheer volume of the 57bn friend pairs implies a pre-existing relationship,” said Jonathan Albright, research director at the Tow Center for Digital Journalism at Columbia University. “It’s not common for Facebook to share that kind of data. It suggests a trusted partnership between Aleksandr Kogan/Spectre and Facebook.”
[...]
Facebook has not explained how it came to have such a close relationship with Kogan that it was co-authoring research papers with him, nor why it took until this week – more than two years after the Guardian initially reported on Kogan’s data harvesting activities – for it to inform the users whose personal information was improperly shared.
[...]
“We made clear the app was for commercial use – we never mentioned academic research nor the University of Cambridge,” Kogan wrote. “We clearly stated that the users were granting us the right to use the data in broad scope, including selling and licensing the data. These changes were all made on the Facebook app platform and thus they had full ability to review the nature of the app and raise issues. Facebook at no point raised any concerns at all about any of these changes.”
Kogan is not alone in criticising Facebook’s apparent efforts to place the blame on him.
“In my view, it’s Facebook that did most of the sharing,” said Albright, who questioned why Facebook created a system for third parties to access so much personal information in the first place. That system “was designed to share their users’ data in meaningful ways in exchange for stock value”, he added.
Whistleblower Christopher Wylie told the Observer that Facebook was aware of the volume of data being pulled by Kogan’s app. “Their security protocols were triggered because Kogan’s apps were pulling this enormous amount of data, but apparently Kogan told them it was for academic use,” Wylie said. “So they were like: ‘Fine.’”
As I wrote yesterday, Facebook is the problem, here. They didn't just fall in with bad company, through no fault of their own; they jumped into shark-infested waters with a bucket of chum, ignored the circling fins (the warning signs that their own processes threw up), and raked in the money quite cheerfully right up until the moment when it became apparent that they were, indeed, bleeding heavily and about to lose an unknown number of corporate limbs. They didn't care when it mattered, and they didn't act when it mattered, and they damned well knew better at the time.

March 21, 2018

Facebook is the problem

I don't think my previous post quite made this clear, but there's a very simple reason why I've been posting about the Cambridge Analytica story here, on my tech blog, rather than over there, at my political blog. It's because the political angle of this never struck me as being the most important part of the story; because the problem here really isn't Cambridge Analytica, per se.

Yes, Steve Bannon was (and probably still is) a real piece of work, and the company to which he was attached did do some very bad things, but Cambridge Analytica didn't do anything that Facebook didn't allow them to do, at the time. Yes, CA scraped waaayyy more data from FB than Zuckerberg's crew expected, and clearly abused it, and then behaved in almost cartoonishly villainous ways, but the real problem is that FB had the data available to sell in the first place.

To get a real idea of how big, and bad, the problem is, consider the following hypothetical scenario:
  1. You "friend" or "follow" your doctor on Facebook. This is useful; it allows you to book appointments more easily, and keeps your doctor's contact info readily available if you need it...
  2. ... and you do need it, because you've just been diagnosed with something that's chronic, serious, and both difficult and expensive to treat. Your doctor mentions a few different medications that they might want you to try, and tells you who makes them, so you...
  3. ... follow those pharmaceutical companies online. After all, they make medications that you're now intensely interested in.
  4. Meanwhile, your doctor has reached out to some of their colleagues via a professional FB group. Your name is never mentioned, of course, just the basic fact that they have "a patient" with a difficult and unusual diagnosis, and they'd appreciate some advice.
  5. Facebook now know (a) your name, (b) your doctor's name, and (c) your interest in companies that make medications to treat (d) the condition that your doctor now also wants advice about, because it's a rare diagnosis and they've never seen an actual case before.
  6. ( a + b + c + d ) = details of your medical history, which you never divulged to anyone, but which Facebook now has in their database, access to which they now sell to...
  7. (e) anyone who might have a financial interest in knowing about the sudden increase in medical bills that you're about to incur. Have you applied for a mortgage recently? Or a job? Or extended medical insurance coverage? Would any or all of those companies maybe appreciate a solid cost-saving heads-up about your circumstance?
This may sound like a far-fetched hypothetical, but it's not. The data that Cambridge Analytica scraped from Facebook's database was of exactly this kind, and you'd better believe that they weren't the only firm to buy access to the data profile that Facebook has built of you, with neither your knowledge nor informed consent, and then sold to God knows who.
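
To make that aggregation a little more concrete, here's a minimal, purely hypothetical sketch in Python of the kind of join being described. None of the names, records, or structures below come from Facebook's actual systems; the point is only that a handful of individually innocuous records, once linked, yield an inference you never shared with anyone.

# Hypothetical illustration only: not Facebook's schema, API, or code.

# (a) + (b) + (c): the social graph links "you" to your doctor and to drug makers.
follows = {
    "you": ["Dr. Smith", "PharmaCo A", "PharmaCo B"],
}

# (d): the doctor's post in a professional group mentions a rare condition (no patient named).
group_posts = [
    {"author": "Dr. Smith", "topic": "rare chronic condition X"},
]

# Public (or cheaply licensed) knowledge: which companies make treatments for which conditions.
treats = {"PharmaCo A": "rare chronic condition X", "PharmaCo B": "rare chronic condition X"}


def infer_conditions(user):
    """Join follows, drug-maker interest, and the doctor's activity into a medical inference."""
    inferred = set()
    followed = follows.get(user, [])
    for name in followed:
        condition = treats.get(name)
        if condition is None:
            continue  # not a drug maker, nothing to infer from this edge
        # Is someone else this user follows (e.g. their doctor) discussing the same condition?
        for post in group_posts:
            if post["author"] in followed and post["topic"] == condition:
                inferred.add(condition)
    return inferred


print(infer_conditions("you"))  # {'rare chronic condition X'} -- (a+b+c+d), never divulged by you

Twenty-odd lines of toy code, and a condition you never posted about becomes a field in a profile that can be sold. That's the whole problem in miniature.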

This is a problem because data, once sold, can't be un-sold; once Cambridge Analytica had scraped FB's data trove onto their own servers, there was nothing FB could do about it anymore. Do you know how many criminal organizations might have gained access to personal information about Facebook's users, and then re-sold it on the darknet? Because I don't, and neither do Facebook. The fact that they've just recently stopped/are about to stop doing these evil things doesn't begin to un-do all the previous evil they've already done... the effects of which their product's users (i.e. you) will now be living with for years to come, at the very least.