Privacy is fundamental to Humane Tech (and Democracy)!

I’ve been working in tech for over 20 years, and have always been aware of (and worried about) privacy issues. Over the last couple of months I have read many, many articles and discussions on the subject, and it is WAY, WAY WORSE than I ever suspected. The Cambridge Analytica scandal (among others) that led everyone to focus on FB is just the tiniest tip of a humongous iceberg.

I think that regaining control of our privacy is essential if we ever want to successfully fight tech addiction and the other problems identified by the CHT. More attention to this subject is in order, IMHO.

There is a vicious cycle at play that drags us ever deeper into a privacy nightmare, one that will amplify our tech problems if we don’t break it. Simplified, the cycle goes something like this:

  1. Start: Ad-based business models have proven to be very lucrative, and there is much to gain still, making it an attractive choice
  2. Attract: In order to increase ad effectiveness (and revenue) user retention and attention needs to be increased
  3. Retain: Behavioral designers, psychologists and marketers continuously optimize apps and services to achieve this, making us addicts of their products
  4. Collect: Users of these apps and services are sucked dry of an ASTOUNDING amount of personal information (amounting to many gigabytes)
  5. Re-sell: This information is sold on to many 3rd parties, adding an additional revenue stream, and those 3rd parties in turn sell the data on to yet more parties (eventually numbering in the thousands), each with their own revenue models, thereby further increasing the value of the data
  6. Analyze: Big data analytics, machine learning and AIs analyze, combine, enrich, de-anonymize and aggregate the data. 3rd parties feed data back to app and service providers
  7. Aggregate: This all leads to incredibly detailed, non-anonymous user (psycho)profiles - other people can now know you better than you know yourself
  8. Advertise: These profiles are exposed via products or APIs, or sold directly as data packages, to advertisers and marketers who buy ad space and create targeted, optimized ads to entice us
  9. Hook: Both the ads AND increasingly the content (!) are turned into tailor-made, personal customer experiences. A/B testing (creating slight variations in layout and content) is used to optimize the experience in the next cycle
  10. Profit: Profit has been made, maximized based on the available data and on ‘quality of service’ (for anyone but us)
  11. Repeat: And now we go back to step 2, and the cycle repeats itself, becoming ever more pervasive
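The A/B testing mentioned in step 9 can be sketched in a few lines (variant and experiment names here are hypothetical, not any company’s actual code): users are hashed into stable buckets so each person consistently sees one variation, and engagement can then be compared per variant in the next cycle.

```python
import hashlib

VARIANTS = ["red_button", "blue_button"]  # hypothetical layout variations

def assign_variant(user_id: str, experiment: str) -> str:
    """Deterministically bucket a user so they always see the same variant."""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return VARIANTS[int(digest, 16) % len(VARIANTS)]

# The same user always lands in the same bucket, so engagement metrics
# can be attributed cleanly to a single variation.
assert assign_variant("alice", "signup_flow") == assign_variant("alice", "signup_flow")
```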

If CHT wants to be successful we have to come up with solutions that break this chain at one or more of these steps (note: we should work on all aspects and be full-spectrum).

While most members of this community are, I think, focused on steps 3 and 9 (retain and hook), where our addictions are created, I think the most prominent issue is the data collection, which boils down to privacy.
Sure, we can come up with guidelines and practices for better behaviour, provide alternatives to dark patterns, etc. But in this cycle there is very little incentive for the perpetrators to adopt them, beyond cosmetic improvements to address some criticism - just enough to satisfy the larger public.

Therefore I argue: Privacy is fundamental to Humane Tech!

Protect your Privacy

If we don’t provide our data, or provide only the data that we choose and keep control over it, then the complete cycle breaks down!

We - the users - have to do this. We have to take the initiative. As with addictiveness, the big tech players that dominate our world have no incentive to improve privacy unless they are absolutely forced to do so (e.g. by regulation or huge public pressure, which usually ebbs away after a while).

One example of big tech having no incentive to improve is Google/Alphabet (and Samsung) with their Android devices. Google could easily ensure more privacy/security for apps in their Play Store… were it not that about 70% of all Android apps contain Google trackers. Google is the biggest perpetrator worldwide (followed at a distance by Facebook).

On any Android device they have pervasive surveillance of your activity, harvesting among other things your Android ID and IMEI number, which uniquely identify you and your device.
Their trackers/spyware are often packaged as nice, free utilities for software and web developers (Google Analytics - there is none better; the Google Fonts API - reduces site traffic costs; etc.).

This great (technical) article has the details of a study that was conducted on Android apps:

Note that it is not just ad-based models - where you are the product - that are problematic. For instance Sony and Samsung - hardware providers where you are a paying customer - are huge violators of privacy. This Hacker News discussion, for example, is about smart TVs doing their utmost to spy on you:

I am curious to hear all of your thoughts on this, so WDYT?


Agree 100%, nice post and information. I’m also an app developer, plus I have 35 years of wireless industry experience, including senior executive experience. The term that this community needs to adopt is “Surveillance Capitalism” for the business model that supports online services such as Facebook and/or connected products supported by the Android OS, Apple iOS and Microsoft Windows OS.

Surveillance Capitalism is the root problem behind the predatory surveillance and data acquisition (“data mining”) business practices employed by data-driven technology providers such as Google, Apple, Microsoft, Facebook, Baidu, and other tech giants who are in the business of exploiting technology users for financial gain at the expense of privacy, safety, and cyber security.

The solution is simple: we need an Electronic Bill of Rights to protect consumers and children from data-driven technology providers who employ predatory surveillance and data mining business practices, especially when it comes to products and services that require payment to participate, such as smartphones.

People need to get off their A$$ and get involved by engaging lawmakers - especially industry insiders who claim that there is a problem yet refuse to take action by engaging lawmakers. I wrote a policy change proposal, an Electronic Bill of Rights, that I’ve been submitting to lawmakers such as senators and congressional representatives, plus I’ve submitted the proposal to my service providers such as AT&T, coupled with official consumer complaints against their business partners, which include Google & Apple.

For more information, go to this link and click on the green button:

You are very knowledgeable and right on track. Contact me at if you want to connect and take action. Keep posting relevant information such as the post I’m replying to. Regards - Rex


My take on this is that better regulation of user data, through well-crafted laws and implementation, would give consumers leverage over the tech companies: better products and services, and protection from bad actors. The European Union’s General Data Protection Regulation is a significant step in this direction; it takes effect this coming May 2018. The regulation ensures that consumers own their private information and thus have the right to control its usage, and that the tech companies have an obligation to give consumers tools to control their data. The rules also give consumers the ability to erase data that they want to be forgotten. I think consumer control over their data is key to solving tech addiction, the threat to democracy, the emotional health of users, etc.


Richard Stallman - a true hero in tech - wrote this in The Guardian:

There is an interesting discussion on Hacker News as well, with lots of food for thought:

A Tough Task for Facebook: European-Type Privacy for All

While I hope that companies will implement GDPR for U.S. consumers as well, I’m expecting that it won’t happen…maybe pretending that I’m an EU citizen will work? (Make a social media post about it, get an EU-based VPN, etc.)

From an application/device development perspective, I think that data collection can work, but it has to be opt-in rather than opt-out. If users want to provide you data, they can; if nobody’s providing you data, maybe your value prop isn’t strong enough for them.
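As a rough illustration of that opt-in principle (a minimal sketch with hypothetical names, not any particular product’s design): the default must be that nothing is collected, and consent has to be granted explicitly per field.

```python
class OptInCollector:
    """Minimal sketch: nothing is collected unless the user explicitly opts in."""

    def __init__(self):
        self.consented = set()  # fields the user has explicitly allowed

    def opt_in(self, field):
        self.consented.add(field)

    def collect(self, profile):
        # Opt-in default: fields without explicit consent never pass through.
        return {k: v for k, v in profile.items() if k in self.consented}


collector = OptInCollector()
collector.opt_in("email")
shared = collector.collect({"email": "a@example.com", "location": "NL"})
# shared == {"email": "a@example.com"} - location was never consented, so never collected
```

Opt-out designs invert the last line: everything flows unless the user finds the setting and disables it, which is exactly the asymmetry being criticized here.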

Yes, this might work, but it could be hard to make it look like you are hailing from the EU, given all the personal data you are broadcasting.

There was a discussion on Hacker News this week, where this topic was brought up. See:

Because GDPR fines can amount to huge sums of money, the social network might err on the safe side and extend the GDPR privacy umbrella to you. But they may also opt to detect ‘falsified’ profiles and take legal action against you, if only to make a case and discourage others from doing the same (there is so much interest in this worldwide that people will be lining up in droves).

The best thing, of course, would be if the social network company decided to implement GDPR-like privacy worldwide. But they may also decide to attack and hollow out the GDPR instead.
I suspect that there will be some disappointment initially among internet users, because the GDPR-enforcing institutions will be so overburdened. They’ll only take on some big cases at first, I think, to establish jurisprudence.

From the HN comments:

"> 1. A Data Subject under GDPR is anyone within the borders of the EU at the time of processing of their personal data. However, they can also be anyone and anywhere in the context of EU established Data Controllers an Data Processors.

It’s not even residency - the bar is far lower. A US resident on holidays to Europe is covered.

See (linked in a sibling comment)"

So, worst case, all I would need to do is go on a quick vacation.

Based on the comments, it may be difficult for Facebook to come up with a way to accurately determine if a person is a GDPR “data subject”. I agree that it would be best (and easiest) to just apply GDPR everywhere. But, as the old saying goes, expect the worst, hope for the best. :slight_smile:


I think Facebook needs to be regulated. From the article:

Sandy Parakilas: I led Facebook’s efforts to fix privacy problems on its developer platform in advance of its 2012 initial public offering. What I saw from the inside was a company that prioritized data collection from its users over protecting them from abuse.


Thx @anon74302558! I edited your post a bit so it’s more clearly related to the topic :slight_smile:

There is an opportunity now to really do something to help this regulation come about!

Activism alert:

Don’t just #deletefacebook (not yet anyway), but engage Facebook, and engage your government representatives!! Now is the time!

  1. If you live outside of the USA, then contact Facebook through all the channels you can find, and demand more Privacy

  2. Additionally, if you are a USA resident, contact your local legislator, your senator, governor, etc. and make your demands clearly heard!

  3. Activate others to do the same! Share this idea on your social media channels

Note that the government by itself is not all that interested in good privacy - partly in light of the War on Terror (which is also used as an excuse), partly because they are being heavily lobbied by the tech corporations (privacy measures threaten their bottom line).

Note also that Democracy requires Freedom of Speech, and that again requires Privacy.

According to CNN the Senators fired “a sea of very weak questions” at Zuckerberg in his hearing yesterday, and he was excellently prepared. One Senator asked “How do you sustain your business model, given your users use it for free?”. Zuckerberg looked surprised, then smiled: “Senator, we run ads.”… and the public burst into laughter. As a result FB stock jumped 4.5%…

So: Your action is required!

Facebook and GDPR

Facebook has been considering rolling out the EU privacy regulation GDPR worldwide, but is now backpedaling and getting hesitant. Having GDPR applied worldwide would be A Big Win for Humane Tech, so use this in your arguments when taking action!

To help you collect arguments, here’s a search query to a whole bunch of Hacker News discussions on the topic:

(PS I’ll also add the activism label to this topic)


I don’t understand why so many people are upset with Mark Zuckerberg. He is just giving you what you want, although you might be wondering why he collects all your personal information.

Turns out Senator Orrin Hatch would like to know…

“So, how do you sustain a business model in which users don’t pay for your service?” asked Sen. Orrin Hatch, the Utah Republican. Zuckerberg could hardly contain a smirk as he replied: “Senator, we run ads.”

Two reasons, really: one, he doesn’t think you care, and two, he can sell it to cover his expenses.

  1. An early email exchange…

Zuck: Yeah so if you ever need info about anyone at Harvard
Zuck: Just ask
Zuck: I have over 4,000 emails, pictures, addresses, SNS
[Redacted Friend’s Name]: What? How’d you manage that one?
Zuck: People just submitted it.
Zuck: I don’t know why.
Zuck: They “trust me”
Zuck: Dumb f….(expletive deleted)

Now he deletes his sent messages from the recipients’ computers.

  2. Creative Strategies ran a study across 1,000 Americans who are representative of the US population in gender and age. “When we asked our panelists if they would be interested in paying for a Facebook version without advertising and with stricter guarantees of privacy protection, 59% said no.”

Facebook’s 2017 revenue was $40 billion, with about two billion active users. If the roughly 40 percent who are willing to pay covered the cost of a no-tracking, no-ads service, it would cost less than $5.00 a month. I pay $5.00 just to read Medium articles, and I doubt that advertisers would still want to serve ads to the remaining 59 percenters in any case.
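A quick back-of-the-envelope check of that math, using the figures quoted above:

```python
revenue_2017 = 40e9   # Facebook's 2017 revenue in USD, as quoted above
active_users = 2e9    # roughly two billion active users
paying_share = 0.40   # ~40% willing to pay, per the Creative Strategies study

paying_users = active_users * paying_share        # 800 million subscribers
price_per_month = revenue_2017 / paying_users / 12

print(round(price_per_month, 2))  # 4.17 - indeed under $5.00 a month
```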

Problem Solved

@cowboy: Do you think that because it’s free, there is a license to harm the mental health of users, manipulate democracy (helping elect dangerous leaders, with worldwide consequences), promote and enable hate speech that harms society (the Rohingya genocide), and show zero respect for privacy, among other things? It is obvious that people all over the world have realized that Facebook’s business model - which treats user data as a kind of product - and its monopoly harm their interests in a big way. But we cannot turn back the clock; that is why many people are uniting to reform or change the system. But you do have a point about needing an alternative business model.

“The scandal swirling around Facebook and Cambridge Analytica has begun to usher in a new era for this once-ignored community of privacy researchers and developers. After years of largely disregarding their warnings about exactly what companies like Facebook were doing — that is, collecting enormous amounts of information on its users and making it available to third parties with little to no oversight — the general public suddenly seemed to care about what they were saying.”

‘Many tech companies understand that Washington’s renewed focus on privacy is likely to reach them soon, said Dean Garfield, head of the Information Technology Industry Council, a trade group representing the largest tech companies.’

“The tide has shifted,” he said. “I don’t think this is passing us by.”


I would like to throw in a couple of thoughts:

  1. There are hundreds of billions of dollars dedicated to finding ways to separate consumers from their data. Any effort at regulation is unlikely to succeed because the stakes of gaming the system are so high. That doesn’t mean we shouldn’t use regulation as part of the solution - just not as the whole solution. Alternatively, it is possible to look for a shift in the market that makes it possible to disrupt this business model. The whole-home personal assistant is that shift. The data that will be generated from microphones, cameras and digital communications will dwarf what is happening with Facebook. If this system is correctly crafted to protect consumers, it will tend to starve out the existing players. Don’t use government as a brake; use innovation to obsolete the model. Quick sanity check: do you really think any government will tell consumers that they can’t allow a company to collect their data? If not, then the task for companies is just to convince you to sign up - trivial circumvention of almost any regulation.
  2. I’ve been working on forecasting the evolution of the smart home and parts beyond where this system comes into play. It has the following characteristics:
    a) Data is owned by the consumer. The consumer is paid for its use.
    b) Data can be collected but never sold.
    c) Data can be used in an “escrowed manner” to extract data value without giving the data away.
    d) Right to be forgotten is required.
    e) Most data is kept physically in the home. An understanding of the emerging architecture makes this more rational.
    f) A business structure is created that generates money for the consumer and aligns its interests with the consumer’s (the business profits when the consumer profits). The business initially works to use consumer data to verify identity; identity services are provided for banks and others (identity fraud prevention).
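Point (c), escrowed use, could be illustrated roughly like this (a toy sketch under my own assumptions, not the poster’s actual design): the raw records stay on the consumer’s hardware, and only an aggregate answer, plus a payment, crosses the boundary.

```python
class DataEscrow:
    """Toy sketch of points (a)-(c): raw data stays with the consumer, a buyer
    submits a query and receives only the derived answer, and the consumer is
    paid for each use. Class and field names are illustrative only."""

    def __init__(self, records):
        self._records = records   # kept physically at home, never handed over
        self.earnings = 0.0       # what the consumer has been paid so far

    def answer(self, query_fn, fee):
        self.earnings += fee              # consumer is paid for the use
        return query_fn(self._records)    # only the aggregate result leaves


escrow = DataEscrow([("thermostat", 21.0), ("thermostat", 19.5), ("thermostat", 20.5)])
avg = escrow.answer(lambda rows: sum(v for _, v in rows) / len(rows), fee=0.05)
# avg is about 20.33; escrow.earnings is 0.05; the raw records never left the home
```

A real escrow would also have to constrain what the query can compute (e.g. with differential privacy), since an unrestricted query could leak the raw records; the sketch only illustrates the data-stays-home idea.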

By way of a little background: I’ve spent about 16 years developing models to explain the evolution of technology markets in order to predict behavior. Technology is a self-assembling system that is not random; it can be forecast reliably out to about 15 years. There is an event building that is centered on the smart home and smart car. The details above are the results of analysis that I’ve been doing on this sector.

The “consumption convergence” occurring in the home can be leveraged to disrupt the existing data rape-and-pillage models. What it requires is a handful of major companies who are not currently participating in those models to create the core of a new system. Those companies should come from telecom, utilities, auto, and at least one major internet player who is reasonably ethical and forward-looking. The potential to get paid for data and to defuse identity theft risk could cause consumers to flock to the service. The system builds momentum as data is added and the consumer payments escalate.



What you are saying is basically more of the same. These devices monitor what you say and do. Most personal assistants on the market are owned by Google, Amazon and Apple. The business model of the personal assistant is still based on a surveillance machine maximizing the harvest of data and human attention; this is just another scheme for doing it. The data can be targeted for ads and product recommendations, or worse, misused for political manipulation of voters or large populations of people. That is why regulation is needed here.

I’m impressed that you’ve been able to predict the evolution of technology for so long.

However, I don’t understand your current assessment. “Smart” assistants require many sophisticated servers to process the data, currently run by the biggest multi-billion-dollar companies out there: Google, Apple and Amazon.

In that respect I don’t see any difference between the current web and app space and the “smart” home. I also don’t see the “smart” home becoming that popular.

Also, anyone weird enough to actually use a “smart” assistant or “smart” home is pretty much asking to be taken advantage of. The kind of indiscriminate consumers who would buy such products don’t really care about their privacy; they just want addictive mind candy.

Old habits are hard to change.

“Facebook Wednesday announced changes to how it asks users for permission to collect their personal information, in order to comply with strict new European privacy rules. But critics say Facebook’s new offerings seem designed to encourage users to make few changes and share as much information as possible.”

Thanks for sharing! Yeah, this was already to be expected. The good part of the upcoming GDPR is that for Europeans certain privacy settings will be opt-in, while for the rest of the world they will be opt-out - like the new automatic facial recognition features that, when enabled, kick in on all images FB can lay their hands on.

It remains to be seen how Facebook will package this opt-in setting, though. Also to be expected is that they will apply some dark pattern to make it seem really attractive, or no real choice at all. Let’s be watchful and see what they come up with.


The stakes, as the Cambridge Analytica debacle makes clear, are high. In her remarks at the World Economic Forum’s Annual Meeting in Davos this year, German chancellor Angela Merkel linked the data governance question to the very health of democracy itself. “The question ‘who owns that data?’ will decide whether democracy, the participatory social model, and economic prosperity can be combined,” Merkel said.