
How the polarizing effect of social media is speeding up


The YouTube algorithm has been criticized for pushing content that radicalizes users instead of informing them.

AFP via Getty Images



For many people, checking social media has become a routine: they log on, see something that angers or upsets them, and then repeat the cycle over and over.

If that feels right to you, it’s not your imagination.

Max Fisher is a journalist who focuses on the impact of social media on global conflicts and our daily lives, and he has covered the subject extensively for The New York Times.

In his new book, The Chaos Machine, Fisher details how the polarizing effect of social media is speeding up. He joined All Things Considered to talk about why tech companies benefit from this outrage, and the danger it could pose to society.

“Remember that the number of seconds in your day never changes. The amount of social media content competing for those seconds, however, doubles every year or so, depending on how you measure it. Imagine, for instance, that your network produces 200 posts a day, of which you have time to read about 100. Because of the platform’s tilt, you will see the most outraged half of your feed. Next year, when 200 doubles to 400, you will see the most outraged quarter; the year after that, the most outraged eighth. Over time, your impression of your own community becomes more moralizing, more aggrandizing and more outraged, and so do you. At the same time, less engaging forms of content fade from view, like the stars over Times Square.”

– An excerpt from The Chaos Machine

This interview has been lightly edited for length and clarity.

Highlights of the interview

On why social media algorithms drive users to outrage

When you log on to Facebook, Twitter or YouTube, you think that what you’re seeing is a neutral reflection of your community and what [your community] is talking about. When you interact with it, you think you are getting feedback from your peers, from other people online. But in fact, what you’re seeing and what you’re experiencing are choices made by these incredibly sophisticated automated systems, which are designed to figure out exactly which combination of posts, ordered and presented in exactly which way, will most engage certain very specific cognitive triggers and cognitive weak points that are meant to produce certain emotions. They have the effect of activating certain impulses and instincts that make you feel compelled to come back to the platform and spend more time on it.

Those [upsetting posts] are the ones that are most engaging to us, because they speak to a sense of social compulsion, of a group identity that is “under threat.” Moral outrage, specifically, is probably the most powerful form of content online. And it’s the kind of content that grabs your eyeballs, and really your emotions, because it taps into deeply evolved instincts that we have as social animals, as group animals, instincts that are essentially about self-preservation.

Meta, formerly known as the Facebook company, owns Facebook, WhatsApp and Instagram.

Leon Neal/Getty Images



On how this ties into social media platforms’ viewership goals

So what the systems that govern YouTube and govern what you see realized is that, to serve that [viewership] goal, they need to deliver content that creates some sort of sense of crisis, and some sort of feeling that you and your identity are under threat.

So that could mean that if you’re looking up, say, health tips or vaccine information, the best thing for YouTube to show you isn’t straightforward health information. The best thing for YouTube to show you is something that makes you feel like you’re part of a community, say, mothers who are concerned about their children’s health, and that this community is threatened by some outside danger. That will trigger a sense of alarm and make you want to come back and spend more time watching.

On how so many people can use social media without being radicalized

For the vast majority of us, the effect is subtler, but still real. Spending more time on social media will make you significantly more polarized. It will give you a much harsher view of people on the other side, or even of people who back a different candidate within the party you support. It will make you take harsher views of outgroups in general, and it will make you more prone to feeling, within yourself, more moralizing and more moral outrage. That’s something I think we all feel. And that may ring true for those of us who spend time on social media and don’t turn into crazy conspiracy theorists, but who still feel that pull on us.

On possible solutions

Whenever I ask the experts who research this what they think, it’s always some version of turning it off. Not shutting down the entire platform, not shutting down the website, but turning off the algorithm. Turning off likes, the little counter at the bottom of a post that tells you how many people have liked or retweeted it. That’s something that even Jack Dorsey, the former head of Twitter, floated, because he saw how harmful it was.

But turning off these engagement maximizers is something we have actually tested. And that version of social media, I think, can deliver a lot of the good that [social media] brings, which is real, while minimizing some of the harm.

This story has been adapted for the web by Manuela Lopez Restrepo.


