
The Chaos Machine: The Inside Story of How Social Media Rewired Our Minds and Our World

Shadow Wolf

Certified People sTabber & Business Owner
You want to ban people from seeing what they want to see?
A lot of the time it's not what you want to see and not what you set out to find. Things I've seen without looking for them include pro-Trump content, anime rape, pro-Confederate material and even things for black women's hair.
 

Secret Chief

Degrow!
Regarding Facebook's pivot to the "emotionally engaging interactions" algorithm in 2014, p. 121:

... while the relationship between a cable TV network and the viewer is one-way, the relationship between a Facebook algorithm and the user is bidirectional. Each trains the other. The process, Facebook researchers put it somewhat gingerly, in an implied warning that the company did not heed, was "associated with adopting more extreme attitudes over time and misperceiving facts about current events."

That seems like the understatement of the century, looking back from 2024.

Regarding YouTube's recommendation algorithm, analyzed after an outbreak of violence in Chemnitz, Germany, that was fomented and organized online, p. 200:

Disturbingly, YouTube's recommendations clustered tightly around a handful of conspiracy or far-right videos. This suggested that any user who entered the network of Chemnitz videos - say, by searching for news updates or watching a clip sent to them by a friend - would be pulled by YouTube's algorithm toward extremist content. Asked how many steps it would take, on average, for a YouTube viewer who pulled up a Chemnitz news clip to find themselves watching far-right propaganda, Serrato answered, "Only two." He added, "By the second you're quite knee-deep in the alt right."
Recommendations rarely led users back to mainstream news coverage, or to liberal or apolitical content of any kind. Once among extremists, the algorithm tended to stay there, as if that had been the destination all along.
Just up to p.96. Frightening stuff...
 

Secret Chief

Degrow!
Just finished it. Thanks for the heads up. Sobering reading. I'm going to offer it to the computer science teacher at my school to read.
 

Secret Chief

Degrow!


The Chaos Machine: The Inside Story of How Social Media Rewired Our Minds and Our World
By Max Fisher

NYT bio: Max Fisher is a New York-based international reporter and columnist.
He has reported from five continents on conflict, diplomacy, social change and other topics.


The Chaos Machine details how Facebook, YouTube, Twitter, Instagram, TikTok, and (more so outside the U.S.) WhatsApp have not only amplified fringe ideas that led to real-world harm, but have actually been the Petri dishes where hoaxes, rumors, and conspiracies were first cultivated before going viral in a way that took users from online rhetoric to physical violence.

Max Fisher is an international investigative journalist who traced the radicalization of internet users from 4chan and Gamergate through the early days of Facebook and YouTube, into the platforms' forays into “emotionally engaging interactions” and “deep learning.” In a word, algorithms. Algorithms which fed hyper-partisanship, racism, and extremism; which could turn relatively moderate, even apolitical users into radicalized conspiracy theorists who truly believed they were in an us-against-them fight, sometimes literally to the death. What radicalized users fear is more often a figment concocted from the lies and misinformation of their echo chambers, which they perceive as a threat to their demographic status. The algorithms work most efficiently on fear, anger, and outrage, relentlessly delivering users to the fringes of their ideologies and keeping them there.

Tech companies made billions off their platforms’ fomenting of unrest not just in the United States (Stop the Steal, QAnon, Plandemic), but around the world. Fisher went from the United States to Sri Lanka, India, Myanmar, Germany, and Brazil, investigating how rumors about immigrants, elections, Covid, vaccines, birth control, teachers brainwashing students, and child trafficking moved from social media to real-life violence: massacres, riots, and the disruption of individual people's lives and safety via harassment and threats. Anything that would cause fear or outrage in one particular demographic and pit it against another: the common denominators, again and again, were Facebook and YouTube.

Said a Sri Lankan presidential advisor after riots driven by Facebook rumors that spread like wildfire through populations whose main source of news was Facebook:

“You, the United States itself, should fight the algorithm. What compels Facebook beyond that?”

What indeed? Money.

What started with Facebook invariably moved on to YouTube. The many digital experts in countries across the globe who consulted with Fisher reached the same consensus: “look at YouTube.” Sociologist Zeynep Tufekci calls YouTube “one of the most powerful radicalizing instruments of the twenty-first century.”

The 2019 massacre at two mosques in Christchurch, New Zealand, was perpetrated by a killer who was radicalized online, who live-streamed the attack on Facebook, and who thanked Candace Owens on YouTube for teaching him to embrace violence. Says Fisher: “When New Zealand government investigators finished their yearlong examination of how the Christchurch massacre had happened, the greater culpability lay, they indicated, with YouTube.”

I can’t even begin to do justice to the information this book provides, but I hope this will be enough to pique the interest of anyone wanting to know about how and why the right-wing hoaxes and conspiracies develop, amplify, and become so fixed in the minds of conservatives that they will not believe anything other than that they are true. Because as Fisher explains, the algorithm pushes more people to the fringe right than the left. This is a right-wing problem that affects everyone.

Not mentioned in the book, but as an addendum to this OP:

Leon Festinger was a social psychologist who developed the theory of cognitive dissonance. He studied a doomsday cult that believed the world would end on December 21, 1954. When the world didn’t end, rather than admit they’d been duped into preparing for the apocalypse by leaving their jobs and disposing of their possessions, the cult came up with alternative explanations that reinforced their beliefs. They doubled down. As the wiki link outlines, Festinger and his team identified the conditions under which believers will resist disconfirmation of their beliefs:
1. The belief must be held with deep conviction and be relevant to the believer's actions or behavior.
2. The belief must have produced actions that are arguably difficult to undo.
3. The belief must be sufficiently specific and concerned with the real world such that it can be clearly disconfirmed.
4. The disconfirmatory evidence must be recognized by the believer.
5. The believer must have social support from other believers.

When people believe in Trump enough to storm the Capitol and lose their jobs and end up in prison, when they believe in QAnon enough to gather in Dealey Plaza to see an undead JFK and lose family members and friends in the process, when they believe the Covid vaccine will inject a microchip into them, when they believe in other countries that false rumors about immigrants or minorities are threatening enough to converge on neighborhoods to riot and attack, when they fear even routine vaccinations for themselves or their children because of rumors and conspiracies, when social media can propel a president into power (Bolsonaro) or subvert an election to keep one there unconstitutionally (Trump) - you can see those points play out from beginning to end. The social support comes from social media groups of the like-minded, whose likes and shares can move a fringe idea, via the algorithm, from the shadows into the spotlight of millions and millions of views. It only takes one of them to commit another Christchurch.

"We have all experienced the futility of trying to change a strong conviction, especially if the convinced person has some investment in his belief. We are familiar with the variety of ingenious defenses with which people protect their convictions, managing to keep them unscathed through the most devastating attacks.
But man’s resourcefulness goes beyond simply protecting a belief. Suppose an individual believes something with his whole heart; suppose further that he has a commitment to this belief, that he has taken irrevocable actions because of it; finally, suppose that he is presented with evidence, unequivocal and undeniable evidence, that his belief is wrong: what will happen? The individual will frequently emerge, not only unshaken, but even more convinced of the truth of his beliefs than ever before. Indeed, he may even show a new fervor about convincing and converting other people to his view.”
― Leon Festinger, When Prophecy Fails: A Social & Psychological Study of a Modern Group That Predicted the Destruction of the World
I'm now reading a related book - Stolen Focus: Why You Can't Pay Attention by Johann Hari. You might want to check it out.

- https://www.amazon.co.uk/gp/aw/d/15...oaB6n4foNJ1P2_GFaFrOVAw&qid=1716500388&sr=8-1
 

Here’s a mathematician’s perspective.

Suspend Your Disbelief (or, how to ruin everything in 7 steps)

 

Twilight Hue

Twilight, not bright nor dark, good nor bad.
I agree with most everything you've excerpted and said in this thread... but...

Let's not kid ourselves into thinking that social media's pernicious and devastating impact on the world is limited to the right wing.

Any time we see people advocating for magical thinking (religious or otherwise), and/or spouting dogma, they should be regarded with great suspicion.

You can see the programming working all too well in the behavior of leftists and their fanatical, chronic obsession with vilifying the right at every turn and opportunity.
 