Privacy is fundamental to Humane Tech (and Democracy)!

‘Many tech companies understand that Washington’s renewed focus on privacy is likely to reach them soon, said Dean Garfield, head of the Information Technology Industry Council, a trade group representing the largest tech companies.’

“The tide has shifted,” he said. “I don’t think this is passing us by.”


I would like to throw in a couple of thoughts:

  1. There are hundreds of billions of dollars dedicated to finding ways to separate consumers from their data. Any effort at regulation is unlikely to succeed on its own because the stakes of gaming the system are so high. That doesn’t mean we shouldn’t use regulation as part of the solution, just not the whole solution. Alternatively, we can look for a shift in the market that makes it possible to disrupt this business model. The whole-home personal assistant is that shift. The data generated from microphones, cameras, and digital communications will dwarf what is happening with Facebook. If this system is correctly crafted to protect consumers, it will tend to starve out the existing players. Don’t use government as a brake; use innovation to make the model obsolete. Quick sanity check: do you really think any government will tell consumers that they can’t allow a company to collect their data? If not, then the task for companies is just to convince you to sign up - trivial circumvention of almost any regulation.
  2. I’ve been working on forecasting the evolution of the smart home and parts beyond where this system comes into play. It has the following characteristics:
    a) Data is owned by the consumer. The consumer is paid for its use.
    b) Data can be collected but never sold.
    c) Data can be used in an “escrowed manner” to extract data value without giving the data away.
    d) Right to be forgotten is required.
    e) Most data is kept physically in the home. An understanding of the emerging architecture makes this more rational.
    f) A business structure is created that generates money for the consumer and aligns its interests with the consumer (biz profits when consumer profits). The biz initially works to use consumer data to verify identity. Identity services are provided for banks and others (identity fraud prevention).
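Points (a), (c), and (d) above can be made concrete with a minimal Python sketch of an "escrowed" data store: raw records stay local, callers receive only derived answers, each use pays the consumer, and everything can be destroyed on request. All class and field names here are invented for illustration; this is not an existing system.

```python
# Sketch of escrowed consumer data: raw records never leave the object,
# queries return only derived values, usage is paid, and the consumer
# can invoke the right to be forgotten. Hypothetical names throughout.

class HomeDataEscrow:
    def __init__(self, fee_per_query=0.05):
        self._records = []            # raw data stays local to this store
        self.fee_per_query = fee_per_query
        self.consumer_balance = 0.0   # payments accumulate for the consumer

    def collect(self, record):
        """Store a reading (e.g. from a home sensor) locally."""
        self._records.append(record)

    def answer_query(self, derive):
        """Run a derivation over the raw data and return only the result.

        The caller pays the consumer; raw records are never exposed.
        """
        self.consumer_balance += self.fee_per_query
        return derive(self._records)

    def forget(self):
        """Right to be forgotten: destroy all raw data."""
        self._records.clear()


escrow = HomeDataEscrow()
for temp in (19.5, 20.1, 21.0):
    escrow.collect({"kind": "temperature", "value": temp})

# A utility buys an aggregate, not the raw readings.
avg = escrow.answer_query(
    lambda recs: sum(r["value"] for r in recs) / len(recs))

escrow.forget()
```

The key design choice is that `answer_query` takes a function rather than handing out records: value is extracted from the data without the data itself ever being given away.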

By way of a little background: I’ve spent about 16 years developing models that explain the evolution of technology markets in order to predict behavior. Technology is a self-assembling system that is not random; it can be reliably forecast about 15 years out. There is an event building that is centered on the smart home and smart car. The details above are the results of analysis I’ve been doing on this sector.

The “consumption convergence” occurring in the home can be leveraged to disrupt the existing data rape and pillage models. What it requires is a handful of major companies who are not currently participating in the rape and pillage models to create the core of a new system. Those companies should be telecom, utilities, auto and at least one major internet player who is reasonably ethical and forward looking. The potential for getting paid for data and the ability to defuse identity theft risk has the potential to cause consumers to flock to the service. The system builds momentum as data is added and the consumer payments escalate.


@wood.stephenafrontie

What you are saying is basically more of the same. These devices monitor what you say and do. Most personal assistants on the market are owned by Google, Amazon, and Apple. The business model of the personal assistant is still a surveillance machine maximizing the harvest of data and human attention; this is just another scheme for doing it. That data can be targeted for ads and product recommendations, or worse, misused for political manipulation of voters or large populations of people. That is why regulation is needed here.

I’m impressed that you’ve been able to predict the evolution of technology for so long.

However, I don’t understand your current assessment. “Smart” assistants require many sophisticated servers to process the data, currently run by the biggest multibillion-dollar companies out there: Google, Apple, and Amazon.

In that respect I don’t see any difference between the current web and app space and the “smart” home. I also don’t see the “smart” home becoming that popular.

Also, anyone weird enough to actually use a “smart” assistant or “smart” home is pretty much asking to be taken advantage of. The kind of indiscriminate consumers who would buy such products don’t really care about their privacy; they just want addictive mind candy.

Old habits are hard to change.

“FACEBOOK WEDNESDAY ANNOUNCED changes to how it asks users for permission to collect their personal information, in order to comply with strict new European privacy rules. But critics say Facebook’s new offerings seem designed to encourage users to make few changes and share as much information as possible.”

Thanks for sharing! Yeah, this was already to be expected. The good part of the upcoming GDPR is that for Europeans certain privacy settings will be opt-in, while for the rest of the world they will be opt-out - like the new automatic facial recognition features that, when enabled, kick in on all images FB can lay their hands on.

It remains to be seen how Facebook will package this opt-in setting, though. Also to be expected is that they will apply some dark pattern to make it seem really attractive, or no real choice at all. Let’s be watchful and see what they come up with.


The stakes, as the Cambridge Analytica debacle makes clear, are high. In her remarks at the World Economic Forum’s Annual Meeting in Davos this year, German chancellor Angela Merkel linked the data governance question to the very health of democracy itself. “The question ‘who owns that data?’ will decide whether democracy, the participatory social model, and economic prosperity can be combined,” Merkel said.


Your assessment is correct: the privacy problem is bigger than Facebook and Cambridge Analytica. However, people are having a hard time identifying the underlying problem behind the privacy, cyber security, and consumer exploitation threats that were brought to light by the Facebook-Cambridge Analytica revelation.

The privacy, cyber security, and consumer and child exploitation threats brought to light by the Facebook-Cambridge Analytica incident are systemic to all online services, such as social media and connected products, supported by predatory surveillance and data mining business practices rooted in “Surveillance Capitalism”.

Lawmakers, the FTC, FCC, DOJ, and other relevant agencies need to address Surveillance Capitalism as a whole, because data-driven technology providers such as Google, Apple, Microsoft, Facebook, Samsung, Amazon, and other tech giants have all adopted the Surveillance Capitalism business model.

For more information please read this topic: Surveillance Capitalism- The Need for an Electronic Bill of Rights

Regards- Rex

The loss of privacy will be all-pervasive if we do not act to protect it:

Three things that should frighten you:

  1. In China, the government is using data to control the country’s population…
  2. The recent coverage of Cambridge Analytica and Facebook shows just how much corporations in the West know about us without us knowing.
  3. Quantum computing will soon be able to break modern encryption, laying open everything we so far thought was private and safe, and more powerful computers will be able to search and map this data going back through digital time.

Most of all, I feel for our children. They are growing up in a world where everything is connected, viewable, shared. They obsess over their image, worry about their following and who likes their posts. […] We should anticipate a moment when a future version of Google can search for every image with your face in it. […] A generation back, you could make mistakes, do the stupid things teenagers do, and let it be buried by time. That is over.

Agreed. I reviewed a Vice news report focused on China’s use of surveillance technology, surveillance data, and sensitive user data to apply a social ranking to citizens.

There are security solutions other than encryption that may provide a breakthrough in protection, such as the use of cognitive ID.

I agree with your assessment regarding child privacy. My biggest concern is the use of “Digital Discrimination” to lawfully discriminate against people based on their data rating, much like China is doing today.

Lawmakers, the FCC, FTC, and privacy advocates really need to focus on Surveillance Capitalism, which is the root of all evil when it comes to threats to civil liberties such as due process and privacy. - Rex


The cognitive ID idea :wink: is interesting… I bumped into this commercial website before (is this what you are referring to?): http://cognitive-id.com/

I am wondering whether - while this may solve the problem with passwords - it is a privacy nightmare in itself. Anyone can (and will) use this to identify the person behind the screen, even without them being aware of it, as all that is needed is to add a ‘single line of Javascript in your webpage’.
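To see why behavioural identification is such a fingerprinting concern, here is a deliberately naive Python sketch: even crude inter-keystroke timing statistics can distinguish one typist from another. This is purely illustrative (real behavioural biometrics use far richer features), and all names and numbers are invented.

```python
# Naive keystroke-dynamics sketch: summarise inter-key intervals and
# compare users by distance between profiles. Illustrative only.
from statistics import mean, stdev

def timing_profile(intervals_ms):
    """Summarise inter-keystroke intervals (ms) as (mean, spread)."""
    return (mean(intervals_ms), stdev(intervals_ms))

def looks_like_same_user(p, q, tolerance_ms=25):
    """Crude match: profiles agree within tolerance on both statistics."""
    return abs(p[0] - q[0]) < tolerance_ms and abs(p[1] - q[1]) < tolerance_ms

# Two sessions from the same (hypothetical) typist, one from another.
alice_monday = timing_profile([120, 135, 110, 140, 125])
alice_friday = timing_profile([118, 130, 115, 138, 127])
bob = timing_profile([220, 260, 240, 280, 210])

print(looks_like_same_user(alice_monday, alice_friday))  # True
print(looks_like_same_user(alice_monday, bob))           # False
```

The uncomfortable point is that none of this requires consent or even awareness: the timings are observable by any page the user types into.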

I’m wondering about the role of societal norms. Reading Harcourt’s book on the expository society (http://www.hup.harvard.edu/catalog.php?isbn=9780674504578&content=reviews) made me reflect not just on insidious business models and surveillance, but also on the normalisation of sharing, ‘transparency’ (in a very singular way), being surveilled, and, yes, surveilling others, up to the point where we encounter sousveillance and the quantified self. We can change the business models and the settings, but is society in a place to pull back from this laissez-faire attitude towards privacy and sharing (which are different, as one implies control and choice)?


So everyone’s all about privacy - and I agree, I’m also not into people with nefarious agendas knowing all about me. Or is it that I don’t want them using it for nefarious agendas? Meaning it’s not my privacy that I care about - it’s how you’re using it as a tool in a way that hurts me and helps you. I bring this up because I feel that in some ways anonymity has actually been the most destructive aspect of the internet, and I want it to go away! There’s a difference between privacy and anonymity. You shouldn’t be able to talk to people without revealing your name, or likewise post or influence or troll.

The Facebook scandal involving Cambridge Analytica is only the latest in a line of ethically questionable business practices by a tech company, and it will hardly be the last. Google confirmed in 2014 that its Gmail system read all emails, and Uber admitted that personal information of drivers and customers was compromised in 2016 but chose not to inform them for more than a year. The public reaction remains the same: these tech companies need oversight, governance, and accountability. We cannot let them continue to regulate themselves. What I am trying to say is that there must be some sort of privacy regulation - and much better if it is a worldwide privacy regulation, because tech companies’ reach knows no borders. The stakes are too high: our lives and future generations’ lives are under constant surveillance.

This continuing saga of data collection by tech companies like Google, in spite of the backlash - and this time with better data gathering - makes it feel like the good old days once more.

“As Facebook becomes increasingly synonymous with the tech backlash, a growing chorus of voices is urging more focus on Google’s data-collection and -analysis practices, which The Wall Street Journal warns could be worse than Facebook’s. But consumers may be too busy Googling to care”


I cannot even tell you how strongly I believe in this issue. It’s a civil rights issue.

As much as I respect Stallman, I worry that “the only safe database is the one that was never collected” is a losing position. Why? Data collection is simply so beneficial, for many reasons, from a personal and family perspective, that even if, say, Europe could keep massive data collection somewhat in control with its stricter laws, the pressure in areas like health-related services would be huge. Think about it this way: one day we learn that China has really advanced AI in health care, superior to what others have to offer - because they had a billion-person database for deep learning (and other kinds of AI that benefit from all the data one can feed them).
So: we should probably try to come up with solutions that guarantee researchers good access to anonymised data, keep the AI community as open as possible, and still keep quite strict laws. It will be tricky, but I think we have to do this well, or much is in danger. I will be writing more about this at dreamsorientedcomputing.com ; “Own our own data” is the draft’s title (owning data is a funny way to put it, on purpose - the term takes us further than thinking only from a privacy viewpoint).
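One simple mechanism for the "researchers get access to anonymised data" idea is generalisation plus suppression, in the spirit of k-anonymity: drop direct identifiers, coarsen quasi-identifiers (here, age into decade bands), and release only records whose combination appears at least k times. This Python sketch uses invented field names and toy data, and real anonymisation needs much more care than this.

```python
# k-anonymity-style sketch: strip names, bucket ages into decades,
# and suppress any (age_band, diagnosis) group smaller than k.
from collections import Counter

def anonymise(records, k=2):
    """Return a release-safe view of health-style records."""
    generalised = [
        {"age_band": f"{(r['age'] // 10) * 10}s", "diagnosis": r["diagnosis"]}
        for r in records
    ]
    counts = Counter((g["age_band"], g["diagnosis"]) for g in generalised)
    return [g for g in generalised
            if counts[(g["age_band"], g["diagnosis"])] >= k]

patients = [
    {"name": "A", "age": 34, "diagnosis": "flu"},
    {"name": "B", "age": 37, "diagnosis": "flu"},
    {"name": "C", "age": 52, "diagnosis": "rare-x"},  # unique, so suppressed
]
released = anonymise(patients, k=2)
```

The unique "rare-x" record is withheld precisely because releasing it would make its owner re-identifiable, which is the tension the post describes between research value and strict protection.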


Yes, I agree, @Mikko. Stallman’s database that was never collected could be achieved in part by regulation, but for other ways to have real control of your own data, better technology is needed to offer that control and protection. Of course, it would also help if people knew in what ways they leak data, and actively tried to minimize that leakage.

Especially in the workplace, people are subject to increasing privacy invasions by their employers (it’s the boss’s time, after all, right?):

Tech companies are coming up with ever more bizarre and intrusive ways to monitor workforces.

How to balance freedom and safety? Start by ensuring that the digital world, like the real one, has places where law-abiding people can enjoy privacy. Citizens of liberal democracies do not expect to be frisked without good cause, or have their homes searched without a warrant. Similarly, a mobile phone in a person’s pocket should be treated like a filing cabinet at home. Just as filing cabinets can be locked, encryption should not be curtailed.

And the discussion on Hacker News: https://news.ycombinator.com/item?id=17247288