Since the internet has existed, and even before that (on BBSes and similar systems), threaded conversations have been the primary means of social interaction and debate online. From this conversation to Facebook, Twitter, Slack, YouTube comments, the Disqus comment sections of hundreds of newspapers, IRC, and Usenet — in practically any system anywhere, the user's primary social interactions rely on threaded conversations.
In 40 years, the combined intelligence of UI designers and developers has produced nothing better, so every new product falls back on the same way of communicating.
This tool has no sense of history or nuance. It provides no background information. It does not help its users learn or evolve. Even when we build more intelligent and complex systems, the communication hub is invariably a threaded conversation (see StackOverflow, GitHub, etc.).
So here we are, having another threaded conversation. It might work, and it might even end up productive, but I believe that is only possible because people on this forum already share a point of view and an interest. On the other hand, a single troll arriving could soon turn this thread into a flame war and condemn it to the same demise that most communication on the internet suffers today.
I think there are a few serious problems with threads as the primary method of interaction on the internet:
Threaded conversations don’t scale - the more people are involved, the less likely anyone is to engage their frontal cortex before replying to posts.
One troll can hijack a thread and destroy any useful conversation and rapport between participants. Trolls have too much power to influence the conversation.
No one feels heard, so everyone resorts to stronger language and blunter points in the hope of being noticed at all.
Being a troll is incentivised, because people get more attention when they say something outrageous. It is a perverse incentive: the system rewards exactly the behaviour we least want.
There’s no real incentive to spend time and effort backing up your point with research and facts. Most debate is shallow, driven by automatic prejudice and reaction. Longer posts don’t get read, and you get more attention for saying things people already thought and believed, whether or not they are true.
For a long time I’ve thought this is partly a UX issue: we as humans and developers have given ourselves an inappropriate tool, one that encourages abuse and discourages thought.
As a software developer I feel some responsibility for this. As an industry we need to design better tools and interfaces that encourage constructive criticism and fact-based debate, and that incentivise that sort of interaction. They should discourage hate speech and unthinking replies - not just by banning abusers, but by ensuring the game doesn’t favour such actors. Right now, the game does favour them - two cases in point: Milo Yiannopoulos and the US president.
We do have some tools like this already. StackOverflow, for example, though largely thread-based, incentivises thoughtful answers and helpfulness to some extent: it rewards users for their work, for novel solutions to problems, and for their expertise.
My question is - how can we do things better? How can we move from “the loudest wins” to something more like deliberative democracy?
What tools can we build so that the facile “like” is no longer the only way to incentivise people’s interactions?