Two days ago, a topic blew up online and trended on Weibo for two days in a row: the "information cocoon." Many people had heard the term before and understood it, but the reaction was so strong this time because it was the first time people had come face to face with the information cocoon so directly.
Here is what happened.
Some netizens discovered that under a video of a couple quarreling, different accounts saw different comment sections.
In the comment section shown on the boyfriend's account, the top comments were all from male netizens, all arguing from a male perspective; in the comment section shown on his girlfriend's account, the opposite was true.
Left male, right female ▼
Under that Weibo post, many netizens asked whether this could be invisibly shaping our judgment.
The next day, a blogger saw the video and decided to run a test of her own.
She registered a new Douyin account, followed Yixiaoqingcheng, kept liking videos aimed at older viewers, and posed online as a middle-aged or elderly user.
After an hour of this, she found herself in a new world.
The hosts battling it out on livestreams were no longer young anchors but a few old men with sparse, shiny heads; the netizens in the comments were aunties of about the same age.
Below a video of an old man tasting tea, the top comments were all from genuine middle-aged and elderly users.
But when the blogger switched back to her own account and found the same tea-tasting video, the first comment in the comment section was one she had never seen before.
In other words, users of different ages see different comment sections.
Once the matter came to light, it immediately drew skepticism from many netizens. Besides reposting and commenting, they went to the original video's comment section to test it themselves.
Some asked whether the netizens who could see their comment were male or female;
some posted screenshots of their own comment section so others could check whether it matched;
many recalled past experiences and concluded that "the algorithm really is customizing the comment section."
For example, a netizen with a Sichuan IP said that every time he looked, the first comment was from Sichuan.
Plenty of others said it was no wonder that whenever they commented, other people seemed to be arguing with thin air; it turns out they were not even looking at the same version of the comment section.
There were even "conspiracy theories" claiming that the short-video platform was deliberately using the algorithm to provoke confrontation between men and women.
In fact, after hearing about this, I borrowed the phones of three colleagues (one male, two female) and ran the test myself.
We found, however, that below the original video, apart from slight differences in the order of a few individual comments, the comment sections were largely identical.
To rule out a shared IP, I also asked a friend dozens of kilometers away to test, with the same result.
Maybe we were too late, or maybe the feature was still in a gray-release (limited rollout) test.
Later, I also checked a few other bloggers: a glamorous streamer, an andrology doctor, and a lawyer discussing bride price.
These are topics that easily provoke gender antagonism, and I wanted to see whether the "gender customization" netizens described showed up in their comment sections.
In the end, only the lawyer's comment section showed clearly different top comments across accounts; the other two were completely consistent.
Based on this and past experience, I cannot say that short-video platforms' comment ranking is deliberately pushing anything, but I can say this:
it is by no means sorted purely by popularity and time.
In the past, when we opened the comment section on some social platforms, we would see two sorting options: popularity and time.
On short-video platforms, however, users have no say in how comments are ordered.
For example, comments with low popularity on Douyin sometimes appear before those with high popularity;
A similar situation also occurs on Kuaishou.
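The difference between transparent, user-chosen sorting and an opaque personalized ordering can be shown with a toy sketch. Everything here is invented for illustration: the comments, the per-user `affinity` field, and the blending weight are assumptions, not any platform's actual formula.

```python
comments = [
    {"text": "A", "likes": 9500, "affinity": 0.1},
    {"text": "B", "likes": 120,  "affinity": 0.9},
    {"text": "C", "likes": 4300, "affinity": 0.4},
]

# Transparent ordering a user could choose on older platforms:
by_popularity = sorted(comments, key=lambda c: c["likes"], reverse=True)

# Hypothetical personalized ordering: raw likes blended with a per-user
# "affinity" score (how likely the model thinks THIS user is to engage).
# The weight is arbitrary; real ranking systems are far more complex.
def personalized_score(c, w_affinity=20000):
    return c["likes"] + w_affinity * c["affinity"]

personalized = sorted(comments, key=personalized_score, reverse=True)

print([c["text"] for c in by_popularity])  # → ['A', 'C', 'B']
print([c["text"] for c in personalized])   # → ['B', 'C', 'A']
```

The point of the sketch: once a hidden per-user term enters the score, a comment with only 120 likes can outrank one with 9,500, and two users looking at the same video see different "first comments."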
We cannot tell whether short-video comment sections are wired into the recommendation algorithm. But if users are not even given the right to choose how comments are sorted, it will undoubtedly thicken the information cocoon and skew everyone's perspective.
The first thing to make clear is that the "information cocoon" is not a product of the algorithmic era. The term comes from Cass Sunstein's 2006 book Infotopia, which describes a phenomenon:
the public pays attention only to the things it chooses and the topics that please it, and over time people seal themselves, like silkworms, inside a "cocoon" of information.
Algorithms intensify the formation of these information cocoons,
because we are constantly fed what we like to watch and what we want to watch. Once our information intake narrows, our view of things becomes one-dimensional and our thinking becomes narrow.
The German film theorist Siegfried Kracauer tells a story in his book Theory of Film.
A director made a short urban film and showed it to African indigenous people who had never been exposed to movies.
The video showed bright lights and high-rise buildings, but after watching it, the audience had no reaction to these, and only enthusiastically discussed a chicken that briefly appeared in the short film.
The director himself had not even known there was a chicken in the film; only later did he discover one wandering through the corner of a shot lasting about a second.
Why did the indigenous viewers fixate on the chicken? Because the chicken was the only thing they recognized: it became the protagonist, while the unfamiliar high-rises faded into the background.
Film studies later coined a saying from this: "Did you see the chicken?"
It means that when we watch a work, what we see is only the chicken in our own eyes, and that depends on the information we have already absorbed.
It is like asking everyone to name their favorite movie. You might choose "Oppenheimer", your friend might choose "Barbie", and your cousin might choose "Wolf Warrior".
But whoever chooses, the answer is necessarily limited to "movies they have watched."
What determines the answer is experience, cognition, and the information fed into the brain. Once an algorithm makes the information you receive monolithic, your view and analysis of things become one-sided.
One-sidedness is one problem; extremism is the other.
When we hear only the opinions we already agree with, repetition deepens them until our thinking hardens and excludes dissent. Eventually an echo-chamber effect sets in, and opinions are amplified, inflated, and pushed toward extremes in our minds.
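That amplification loop can be sketched as a toy simulation. The model is entirely hypothetical: we assume the feed surfaces only opinions that already agree with the user, skewed toward the more extreme end, and that the user drifts slightly toward the average of what is shown.

```python
import random

random.seed(42)
belief = 0.1  # a mildly positive stance on a -1..1 opinion spectrum

for day in range(100):
    # Hypothetical filter: the feed shows only agreeing opinions,
    # and engagement bias favors the more extreme ones among them.
    shown = [random.uniform(belief, 1.0) for _ in range(20)]
    # We drift a little toward the average of what we see.
    belief += 0.1 * (sum(shown) / len(shown) - belief)

print(round(belief, 2))  # drifts close to 1.0, the extreme end
```

Even though each daily step is tiny, the one-sided filter guarantees every nudge points the same way, so a mild stance ratchets toward the extreme.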
On the Internet, we often see people with different opinions quarreling.
Because in the world each side sees, everyone feels they are right and in the majority, and anyone who differs is simply incomprehensible.
But in the real and complex world, things are not black and white.
I do not know whether you have felt the same: even in the era when comments were sorted by popularity, many threads would see the whole discussion skewed by the most-liked comments, with opposing views visible only at the very bottom.
People have a herd mentality: rather than broadcast their own judgment, they prefer not to isolate themselves, and sometimes wait to see which way the wind blows before settling their own views.
So what happens when the comment section is no longer sorted by popularity but ruled by an algorithm?
People who share a label (gender, hobbies) get pulled into the same comment-section bubble, and opposing opinions that you would otherwise have scrolled past vanish entirely.
People are more likely to converge, become more extreme, and become more isolated from other groups.
Imagine it: if men and women really did surf the Internet separately, and neither side could hear the other's thoughts at all, would gender antagonism shrink, or deepen dramatically?
Of course, all of the above remains, for most people, only a latent worry.
The conditions for a true information cocoon to form are actually quite demanding.
Two scholars, from Tsinghua University and Communication University of China, once published a paper arguing that the "information cocoon" is a specious concept: no strong research confirms its existence, and the environment required to produce one is hard to come by.
For example, in the third quarter of 2019 Douyin had 606 million users and Kuaishou 414 million, with an overlap rate of 36.4%, which means most people are unlikely to live in the kind of "single information environment" that could form a cocoon.
After all, we usually take in information through many channels, including various social platforms and WeChat Moments, all of which help us understand the world.
What really deserves worry are the neglected groups, such as the middle-aged and elderly.
They often live in a low-frequency, single-channel information environment; their window onto the Internet may consist of nothing beyond WeChat and a single short-video platform. Is it really fine for them to receive, long term, only the information they are "supposed" to receive?
However, even if the conditions for the formation of information cocoon rooms are harsh, it does not prevent everyone from paying attention to and being vigilant about it.
Toutiao was one of the earliest news apps to use algorithmic recommendation. Within four years of launch it had more than 60 million daily active users, who spent an average of 76 minutes a day in the app.
That is the magic of personalized recommendation: it attracts users and it keeps them.
At the time, nobody thought anything was wrong with it; it just felt fresh, even a little addictive.
In the years since, more and more apps have plugged into algorithmic feeds: Weibo, built around its follow system; Qiqiudi, which began as professional football coverage; Hupu, the traditional forum beloved by netizens. One after another, they were all redesigned.
Countless apps would rather abandon their own traditions and DNA than miss out, throwing themselves into the algorithm's arms without looking back.
Users were a little uncomfortable at first, but the daily-active numbers and other data said it all: it simply worked.
Over time, users came to find it acceptable, even as the content grew increasingly cluttered.
It was not until this incident was exposed that everyone began to feel that something was wrong.
Because this time, it finally cut into the audience's own interests.
As we all know, a video's creator and its commenters do not have equal voice: the influence of an opinion expressed in the video itself can rival that of ten thousand commenters holding the same view.
What everyone fears is that the algorithm will strip them of the right to band together and be heard.
Moreover, when the algorithm declines to recommend a piece of content to you, you can still search for it.
But when it declines to surface a comment, you can hardly locate it at all:
it lies buried among tens of thousands of others, and may even be hidden outright, vanishing from your Internet entirely.
I understand that algorithms are an inevitable product of today's information overload, a way to simplify how people take in information, and a technique most platforms will adopt sooner or later to extend user retention.
But the algorithm deserves our vigilance. It is like a quietly encroaching robot, moving from feeding us videos to subtly influencing, even shaping, who we are.
In fact, it is nothing more than a mathematical model. It can be tuned and optimized, and it could even be adjusted to keep us out of an overly uniform information environment; but the power to decide lies with the brains behind it.
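As one sketch of what "adjustable" could mean, here is a hypothetical greedy re-ranker that discounts repeated topics so a feed cannot collapse into a single subject. The candidate items, relevance scores, and penalty weight are all made up for illustration; no platform's real system is implied.

```python
# Hypothetical candidate items: (topic, relevance score from the model)
candidates = [("gaming", 0.95), ("gaming", 0.93), ("gaming", 0.91),
              ("news", 0.60), ("cooking", 0.55), ("travel", 0.50)]

def rerank_with_diversity(items, penalty=0.4):
    """Greedy re-ranking: each repeat of a topic is discounted, so the
    feed cannot collapse into a single subject."""
    chosen, seen, pool = [], {}, list(items)
    while pool:
        best = max(pool, key=lambda it: it[1] - penalty * seen.get(it[0], 0))
        chosen.append(best)
        seen[best[0]] = seen.get(best[0], 0) + 1
        pool.remove(best)
    return chosen

feed = rerank_with_diversity(candidates)
print([topic for topic, _ in feed[:3]])  # → ['gaming', 'news', 'cooking']
```

Pure relevance ranking would fill the top three slots with "gaming"; one added penalty term is enough to let other topics through. Whether such a term exists, and how strong it is, is exactly the decision that rests with the people behind the model.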
A long time ago, a joke made the rounds:
the Internet was supposed to open the eyes of the frogs at the bottom of the well and show them the world beyond its mouth. In reality, thousands of frogs in wells found one another through the Internet, recognized and affirmed one another, and after long exchanges reached a consensus: the world really is only as big as the mouth of a well.
Behind the humor lies a heavy truth.
And now, it seems, the algorithm has made it heavier still.