Thank you, @Blumsday. I asked questions like these a while back, and they didn’t get answered. I am guessing that the absence of an answer reflects some limitation of the Discourse software.
For those who didn’t see my initial comment on this topic, I’d like to ask if we could have an icon that simply indicates “read that.” There is too great a gap between a heart and no response at all; some sort of intermediary is needed, I think.
So interesting to see these books paired together.
In Reader, Come Home, Maryanne Wolf explains why reading is not an instinctive act like speaking, i.e., why we have to be taught to read. She goes into the science of the brain, and her explanation helps us understand why we don’t all comprehend and interpret the texts we read in the same ways. Furthermore, when the ways we are taught to read change, and when what we read changes, we can lose certain analytical skills. And certain communication skills.
Going back to social networks: perhaps they replace certain kinds of “reading” so that we don’t have to understand; and by popular vote, such social conveniences are deemed “good.”
@scottcapener had this to say about going in the opposite direction: cultivating a genuine love of learning.
This is a heads-up to community members. We will soon be launching an alternative to Facebook that is secure and private: No Ads - No Tracking - No Likes - No Nagging - No Feeds - No Registration - No App
Hi @frrrst - very interesting question, and very best of luck with your research project.
I’d like to answer your query with a question if I may, which is essentially the same question I ask any startup I come across which is trying to build a ‘new Facebook’ or ‘human social media’ or anything like that. It’s a rather long, cumbersome question, but I think it gets to the core issue of building such an alternative social media platform.
How are you going to fund the industry-level salaries of locally-employed, university-educated, permanently-contracted content moderators, as well as their training, benefits, and psychiatric supports? And you do realise that you will have to employ at least one of these for every thousand members your platform gains?
The point being that many of the problems we have with social media are due to the fact that their owners have not adequately invested in content moderation. It has to be carried out by properly trained and well-resourced individuals, who have knowledge of local customs and morals, and who do not have ridiculous workloads. That’s the only way you can create a social media environment which is humane - proper content moderation.
So unless your blueprint for a new social media platform contains a plan to fund those roles, it has not learned the lesson of any of the recent controversies.
Hi @cjamcmahon, thanks for posting. I have to say I messed up this topic a bit and the post dates are wrong. This was originally posted in April, I believe (sorry, should’ve mentioned in a comment).
The point you make is a very good one, though. This is a huge problem for large-scale social networks that target a broad audience (like FB, IG, Snapchat et al.). There are smaller-scale networks that are doing just fine. My favourite is Hacker News - the social network for (Silicon Valley) techies - which is moderated by just a few people, with the help of a very good reputation system (upvoting/downvoting by users, karma scores, plus ranking algorithms).
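To give a flavour of how lightweight such a reputation system can be: Hacker News’s exact ranking code isn’t public, but a commonly cited approximation decays a post’s vote count by its age, so fresh, moderately upvoted posts can outrank stale hits. This is a minimal sketch of that idea; the function name, the sample posts, and the gravity constant of 1.8 are illustrative assumptions, not HN’s actual implementation.

```python
def rank_score(points: int, age_hours: float, gravity: float = 1.8) -> float:
    """Commonly cited HN-style score: votes (minus the submitter's own vote,
    hence points - 1) divided by a power of the post's age in hours."""
    return (points - 1) / (age_hours + 2) ** gravity

# Illustrative data (hypothetical posts): a fresh post with modest votes
# outranks an older post with far more votes, because of the age decay.
posts = [
    {"title": "old hit", "points": 200, "age_hours": 48},
    {"title": "fresh find", "points": 30, "age_hours": 1},
]
ranked = sorted(
    posts,
    key=lambda p: rank_score(p["points"], p["age_hours"]),
    reverse=True,
)
print([p["title"] for p in ranked])  # → ['fresh find', 'old hit']
```

The point of the decay exponent is that moderation pressure is spread across every voter rather than concentrated in a small paid team, which is part of why a handful of moderators suffices there.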
I am also a fan of the Fediverse, where moderation is distributed among individual federated servers. On Mastodon, for instance, each server instance has its own moderators who can set their own moderation policies. Divide and conquer. Most (if not all) of these are volunteers.