Categories: Google, SurCap, Surveillance capitalism

Google wants you to pay them $2.79/month (and ignore the bad stuff they’re doing)

Like billions of people, I have happily used Google search, Gmail and Docs for years, marvelling at how they could provide all that free storage space. However, the fun recently came to an end when I started getting this message from Google:

The message doesn’t suggest deleting things to clear space, let alone provide any helpful tips on how to do so. The message is clearly aimed at scaring people consumed by FOMO (the fear of missing out) into paying for storage.

I’m sure Google told me about the 15GB limit when I signed up years ago, but I had long since forgotten, as I’m sure most people have. So, now we have the choice of deleting lots of things, moving them to physical hard drives or finding a free storage service (good luck with the last one).

I tried deleting things but, after wiping out my biggest files, I still had 14GB of stuff and couldn’t figure out how all my little files could add up to that much. These issues, however, pale next to the larger one: Google is asking us to pay to store the very content from which it already makes huge profits, partly by enabling things like hate speech and disinformation.

As I explained in my rabble article COVID-19 could mean we lose and surveillance capitalists win — again, Google and Facebook make lots of money off hate speech and disinformation like conspiracy theories because they generate lots of engagement. The more engagement, the more they can charge for ads.

The excerpts below from Roger McNamee’s book, Zucked: Waking Up to the Facebook Catastrophe, explain why this matters.

Here’s what McNamee said at the May 28, 2019 hearing of Canada’s federal Standing Committee on Access to Information, Privacy and Ethics:

“For Google and Facebook, the business is behavioural prediction. They build a high-resolution data avatar of every consumer – a voodoo doll, if you will. They gather a tiny amount of data from user posts and queries but the vast majority of their data comes from surveillance: web tracking, scanning emails and documents, data from apps and third parties and ambient surveillance from products like Alexa, Google Assistant…and Pokemon Go!. Google and Facebook use data voodoo dolls to provide their customers – who are marketers – with perfect information about every consumer. They use the same data to manipulate consumer choices. Just as in China, behavioural manipulation is the goal. The algorithms of Google and Facebook are tuned to keep users on site and active, preferably by pressing emotional buttons that reveal each user’s true self. For most users, this means content that provokes fear or outrage. Hate speech, disinformation and conspiracy theories are catnip for these algorithms.”

In Zucked, McNamee argues that what people do, or influence others to do, when they get access to our data voodoo dolls, legally or illegally, can be devastating:

“The vast majority of the data in your voodoo doll got into the hands of internet platforms without your participation or permission. It bears little relation to the services you value. The harm it causes is generally to other people, which means that other people’s data can harm you. That is what happened to the victims in El Paso, Christchurch, and so many other places. Do we want the power of roughly three billion data voodoo dolls to be available to anyone willing to pay for access? Would it not be better to prevent antivaxxers from leveraging Google’s predictions about pregnancy to indoctrinate unsuspecting mothers-to-be with their conspiracy theory, placing many people at risk of infectious disease? The same question needs to be asked about climate change denial and white supremacy, both of which are amplified by internet platforms. How about election interference and voter suppression? Internet platforms did not create these ills, but they have magnified them. Is it really acceptable for corporations to profit from the algorithmic amplification of hate speech, disinformation, and conspiracy theories? Do we want to reward corporations for damaging society?”

Do you want to pay Google $2.79 a month to damage society? If we don’t, what can we do?

One thing would be to have massive Global Google Deletion Days (with the cool #G2D2 hashtag) where millions of users simultaneously delete big chunks of their data to avoid going over the 15GB limit. This would deprive Google of the extra revenue they’d get from people paying for Google One cloud storage, and the revenue they were making off the deleted content.

Start going through your stuff…and enjoy finding and sharing some gems from your past.

Categories: GDPR, GoC, Regulation, SurCap

All the great “free” stuff we get from Google and Facebook is costing us a lot more than our privacy

In my post, COVID-19 could mean we lose and surveillance capitalists win — again, I discussed some of the challenges of surveillance capitalism (surcap). This post starts the discussion of what we can collectively do about those challenges.

One of the first issues is who is “we”? Only those who think there’s a problem will see the need for a solution. However, unlike the rise of the resistance to industrial capitalism that was partly fuelled by people slaving under horrible working conditions, surveillance capitalism’s most negative effects are mostly cloaked. Most people see only the benefits like free search, email and YouTube.  

For those who do see a problem, there are thus two challenges:

  1. How to fight back.
  2. How to get more people to join the fight.

As knowing how to fight back is key to getting more people to join the fight, let’s focus on that for now.

The success of surveillance capitalism rests on a lot more than people feeling hooked on great free tools like Facebook and Gmail. In her book The Age of Surveillance Capitalism, released in February 2019, Shoshana Zuboff identifies 16 reasons for surcap’s success. Here are 7 of them, including the most personal ones:

  1. Unprecedented – Surcap is a completely new phenomenon so we have a hard time fully understanding it as we tend to compare it to things we know.
  2. Velocity – “Surcap rose from invention to domination in record time”, says Zuboff. She says this is by design to freeze resistance while distracting us with immediate gratification.
  3. Inevitability – Surcap rhetoric makes us believe that it’s all inevitable and we should simply accept it, enjoy its benefits – and don’t think too much about any possible down side.
  4. Inclusion – Paraphrasing Zuboff, “Many people feel that if you’re not on Facebook, you don’t exist. People all over the world raced to participate in Pokemon Go. With so much energy, success and money flowing into surcap, standing outside of it, let alone against it, can feel like a lonely and risky prospect.”
  5. Ignorance – Surcap’s inner workings are secretive by design. Their systems are intended to ensnare us, preying on our vulnerabilities bred by an unequal balance of knowledge, and amplified by our scarcity of time, resources and support.
  6. Dependency – Most people find it difficult to withdraw from using surcap’s free tools and many wonder if it is even possible.
  7. No alternatives – There just aren’t great alternatives to Google Search, in terms of quality, and Facebook in terms of ubiquity. There are better alternatives to things like Gmail and Google Docs but, with so many people using them, it’s very hard to switch.

However we decide to combat surcap, one thing is clear: we can’t do it on our own. Stopping surcap’s march will require many of us constantly pushing our governments to bring in effective regulation – and working to get more folks to join the fight. 

In terms of government regulations, Zuboff says many hopes today are pinned on the EU’s new General Data Protection Regulation (GDPR), which became enforceable in May 2018. The EU approach fundamentally differs from that of the US in that companies must justify their data activities within the GDPR’s regulatory framework. The regulations introduce several key new substantive and procedural features, including:

  • a requirement to notify people when personal data is breached;
  • a high threshold for the definition of “consent” that puts limits on a company’s reliance on this tactic to approve personal data use;
  • a prohibition on making personal information public by default;
  • a requirement to use privacy by design when building systems;
  • a right to erasure of data; and
  • expanded protections against decision making authored by automated systems that imposes “consequential” effects on a person’s life.

The new regulatory framework also imposes substantial fines for violations, which will rise to a possible 4% of a company’s global revenue, and it allows for class-action lawsuits in which users can combine to assert their rights to privacy and data protection. 

In May 2019, Jim Balsillie, co-founder of Research In Motion, the company that created the BlackBerry, made other suggestions for what governments can do. Balsillie appeared as a witness, alongside Zuboff and Zucked author Roger McNamee, at a hearing of the International Grand Committee on Big Data, Privacy and Democracy, in Ottawa, and suggested:

  1. Eliminating tax deductions of specific categories of online ads.
  2. Banning personalized online advertising during elections.
  3. Implementing strict data governance regulations for political parties.
  4. Providing effective whistle-blower protections.
  5. Adding explicit personal liability alongside corporate responsibility to affect CEO and board of director decision-making.
  6. Creating a new institution for like-minded nations to address digital co-operation and stability.

The Grand Committee is a first step towards achieving #6, as it has members from around the globe, some of whom come from countries, like the Philippines, that are already experiencing life-and-death consequences of uncontrolled surcap. However, it is #5 that may matter most, given one of surcap’s most dangerous effects: an increase in online hate fuelled by real “fake news”.

The point that came through most clearly at the Committee hearing is also the most disturbing: surcap companies resist removing hateful content and fake news because such content generates by far the most engagement, and therefore the most money. Balsillie’s point was that making CEOs and board members personally liable for such content would make them think twice about letting it proliferate on their platforms.

So now it’s up to us to demand that our political leaders start implementing ideas like Balsillie’s.

Our very freedom is at stake just like it was during the Second World War. Only this time, instead of being controlled through blatant terror by a power that knows very little about us, we’re being controlled through hidden systems by powers that know almost everything about us.

Categories: COVID19, SurCap

COVID-19 could mean we lose and surveillance capitalists win — again

This post is on rabble.ca

Update, April 18 – Since posting this, I found some good blog posts on The Age of Surveillance Capitalism. Barbara Fisher’s post, The Age of Surveillance Capitalism: A Mixed Review on the Inside Higher Ed site, nicely explains how businesses aren’t just using the predictions about our behaviour that they buy from surveillance capitalists to help them better target their ads:

“That Fitbit your employer paid for? It feeds information to insurers that can use it to change your behaviour and reduce costs – or charge you more if you don’t comply. Google drove into our neighbourhoods with camera-equipped cars to capture images of our communities and create detailed maps that will be useful for routing their self-driving cars and even planning entire cities where everything will be connected and everyone’s life experience moment by moment can be rendered as data.”