
The Chaos Machine: The Inside Story of How Social Media Rewired Our Minds and Our World

anna.

colors your eyes with what's not there


The Chaos Machine: The Inside Story of How Social Media Rewired Our Minds and Our World
By Max Fisher

NYT bio: Max Fisher is a New York-based international reporter and columnist.
He has reported from five continents on conflict, diplomacy, social change and other topics.


The Chaos Machine details how Facebook, YouTube, Twitter, Instagram, TikTok, and (more so outside the U.S.) WhatsApp have not only amplified fringe ideas that led to real-world harm, but have actually been the Petri dishes where hoaxes, rumors, and conspiracies were first cultivated before going viral in a way that took users from online rhetoric to physical violence.

Max Fisher is an international investigative journalist who traced the radicalization of internet users from 4chan and Gamergate through the early days of Facebook and YouTube, into the platforms' forays into “emotionally engaging interactions” and “deep learning.” In a word, algorithms. Algorithms which fed hyper-partisanship, racism, and extremism; which had the ability to turn relatively moderate, even apolitical users into radicalized conspiracy theorists who truly believed they were in an us-against-them fight, sometimes literally to the death. What radicalized users fear is most likely a concocted figment of the lies and misinformation of their echo chambers, which they perceive as a threat to their demographic status. The algorithms work most efficiently on fear, anger, and outrage, relentlessly delivering users to the fringes of their ideologies and keeping them there.
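To make that mechanism concrete, here's a toy sketch of engagement-based ranking - my own illustration with made-up posts and numbers, not anything from the book and certainly not any platform's actual code. When the ranking signal is predicted reactions, comments, and shares, the post engineered to provoke outrage wins the feed almost by definition:

```python
# Toy sketch of engagement-based feed ranking (made-up numbers, not any
# platform's real system): posts are ordered by predicted engagement,
# and the outrage-provoking post comes out on top.

posts = [
    {"text": "Local library extends weekend hours",
     "reactions": 40, "comments": 5, "shares": 2},
    {"text": "Cute dog learns to skateboard",
     "reactions": 300, "comments": 30, "shares": 60},
    {"text": "THEY are coming for your children - share before it's deleted!",
     "reactions": 900, "comments": 400, "shares": 700},
]

def engagement_score(post):
    # Weight comments and shares above passive reactions, mirroring the
    # general idea behind optimizing for "engaging interactions".
    return post["reactions"] + 5 * post["comments"] + 10 * post["shares"]

for post in sorted(posts, key=engagement_score, reverse=True):
    print(engagement_score(post), post["text"])
```

Nothing in that scoring step asks whether a post is true; it only asks whether people will interact with it.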

Tech companies made billions off their platforms’ fomenting of unrest not just in the United States (Stop the Steal, QAnon, Plandemic), but around the world. Fisher went from the United States to Sri Lanka, India, Myanmar, Germany, and Brazil, investigating how rumors about immigrants, elections, Covid, vaccines, birth control, teachers brainwashing students, and child trafficking moved from social media to real-life violence - massacres, riots, and individual disruption of people's lives and safety via harassment and threats. Anything that would cause fear or outrage in one particular demographic and pit it against another: the common denominators, again and again, were Facebook and YouTube.

Said a Sri Lankan presidential advisor after riots driven by Facebook rumors that spread like wildfire through populations whose main source of news was Facebook:

“You, the United States itself, should fight the algorithm. What compels Facebook beyond that?”

What indeed? Money.

What started with Facebook invariably moved on to YouTube. The many digital experts across the globe whom Fisher consulted arrived at the same consensus: “look at YouTube.” Sociologist Zeynep Tufekci calls YouTube “one of the most powerful radicalizing instruments of the twenty-first century.”

The 2019 massacre at two mosques in Christchurch, New Zealand, was perpetrated by a killer who was radicalized online, who live-streamed the attack on Facebook, and who thanked Candace Owens on YouTube for teaching him to embrace violence. Says Fisher: “When New Zealand government investigators finished their yearlong examination of how the Christchurch massacre had happened, the greater culpability lay, they indicated, with YouTube.”

I can’t even begin to do justice to the information this book provides, but I hope this will be enough to pique the interest of anyone wanting to know about how and why the right-wing hoaxes and conspiracies develop, amplify, and become so fixed in the minds of conservatives that they will not believe anything other than that they are true. Because as Fisher explains, the algorithm pushes more people to the fringe right than the left. This is a right-wing problem that affects everyone.

Not mentioned in the book, but as an addendum to this OP:

Leon Festinger was the social psychologist who came up with the theory of cognitive dissonance. He studied a doomsday cult that believed the world was going to end on December 21, 1954. When the world didn't end as they'd prepared for - they had left their jobs and disposed of their possessions - rather than admit they'd been duped into believing the apocalypse was upon them, the cult came up with alternative explanations that reinforced their beliefs. They doubled down. As the Wikipedia article outlines, Festinger and his team identified the conditions under which believers, instead of abandoning a disconfirmed belief, hold to it even more fervently:
1. The belief must be held with deep conviction and be relevant to the believer's actions or behavior.​
2. The belief must have produced actions that are arguably difficult to undo.​
3. The belief must be sufficiently specific and concerned with the real world such that it can be clearly disconfirmed.​
4. The disconfirmatory evidence must be recognized by the believer.​
5. The believer must have social support from other believers.​

When people believe in Trump enough to storm the Capitol and lose their jobs and end up in prison, when they believe in QAnon enough to gather in Dealey Plaza to see an undead JFK and lose family members and friends in the process, when they believe the Covid vaccine will inject a microchip into them, when they believe in other countries that false rumors about immigrants or minorities are threatening enough to converge on neighborhoods to riot and attack, when they fear even routine vaccinations for themselves or their children because of rumors and conspiracies, when social media can propel a president into power (Bolsonaro) or subvert an election to keep one there unconstitutionally (Trump) - you can see those points play out from beginning to end. The social support comes from social media groups of like-minded people whose likes and shares can move, via the algorithm, a fringe idea from the shadows into the spotlight of millions and millions of views. It only takes one of them to commit another Christchurch.

"We have all experienced the futility of trying to change a strong conviction, especially if the convinced person has some investment in his belief. We are familiar with the variety of ingenious defenses with which people protect their convictions, managing to keep them unscathed through the most devastating attacks.

But man’s resourcefulness goes beyond simply protecting a belief. Suppose an individual believes something with his whole heart; suppose further that he has a commitment to this belief, that he has taken irrevocable actions because of it; finally, suppose that he is presented with evidence, unequivocal and undeniable evidence, that his belief is wrong: what will happen? The individual will frequently emerge, not only unshaken, but even more convinced of the truth of his beliefs than ever before. Indeed, he may even show a new fervor about convincing and converting other people to his view.”​
― Leon Festinger - When Prophecy Fails: A Social & Psychological Study of a Modern Group that Predicted the Destruction of the World​
 

anna.

colors your eyes with what's not there
I wasn't expecting this, but I'm also not surprised. Pg. 209 ff:

One of the online alt right's most important gateways is the YouTube page of Jordan Peterson, a Canadian psychology professor. In 2013, Peterson began posting videos addressing, amid esoteric Jungian philosophy, youth male distress. He offered life advice (clean your room, sit up straight) and exhortations against racial and gender equality as imperiling "the masculine spirit."​
YouTube searches for "depression" or certain self-help keywords often led to Peterson. His videos' unusual length, sixty minutes or more, aligns with the algorithm's drive to maximize watch time. So does his college-syllabus method of serializing his argument over weeks, which requires returning for the next lecture and the next. But most of all, Peterson appeals to what the sociologist Michael Kimmel calls "aggrieved entitlement."
. . . .​
Users who commented on Peterson's videos subsequently became twice as likely to pop up in the comments of extreme-right YouTube channels, a Princeton study found. Peterson himself doesn't recommend the channels - the algorithm makes the connection. . . .
The social platforms had arrived, however unintentionally, at a recruitment strategy embraced by generations of extremists. The scholar J.M. Berger calls it "the crisis-solution construct." When people feel destabilized, they often reach for a strong group identity to regain a sense of control. It can be as broad as nationality or as narrow as a church group. Identities that promise to recontextualize individual hardships into a wider conflict hold special appeal. You're not unhappy because of your struggle to contend with personal circumstances; you're unhappy because of Them and their persecution of Us. It makes those hardships feel comprehensible and, because you're no longer facing them alone, a lot less scary.
. . . .​
YouTube can be an especially effective indoctrinator because it moves users in increments. Jordan Peterson tells viewers that their individual travails stem from a conflict pitting them against social justice warriors - crisis. Millennial Woes rallies them to collectively defend themselves against the feminists and minorities opposing them - resolution.
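The Princeton finding quoted above - commenters on Peterson's videos turning up in extreme-right comment sections at twice the usual rate - is the kind of thing you can measure with a simple overlap count. A toy sketch with invented users and channel names, not the study's actual data or method:

```python
# Toy version of a commenter-overlap measurement (invented users and
# channels, not the Princeton study's data or code).

comments = [
    ("u1", "peterson"), ("u1", "far_right"),
    ("u2", "peterson"), ("u2", "cooking"),
    ("u3", "peterson"), ("u3", "far_right"),
    ("u4", "cooking"),
    ("u5", "cooking"), ("u5", "far_right"),
    ("u6", "cooking"),
]

def commenters_on(channel):
    return {user for user, ch in comments if ch == channel}

peterson = commenters_on("peterson")
baseline = commenters_on("cooking")      # an unrelated control channel
extreme = commenters_on("far_right")

print(f"Peterson commenters also in far-right comments: {len(peterson & extreme) / len(peterson):.0%}")
print(f"Control commenters also in far-right comments:  {len(baseline & extreme) / len(baseline):.0%}")
```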
 

anna.

colors your eyes with what's not there
Regarding Facebook's pivot to the "emotionally engaging interactions" algorithm in 2014, p. 121:

... while the relationship between a cable TV network and the viewer is one-way, the relationship between a Facebook algorithm and the user is bidirectional. Each trains the other. The process, as Facebook researchers put it, somewhat gingerly, in an implied warning that the company did not heed, was "associated with adopting more extreme attitudes over time and misperceiving facts about current events."

That seems like the understatement of the century, looking back from 2024.
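A crude way to picture that two-way training - toy numbers of my own, not how Facebook's ranker actually works - is a loop where the algorithm keeps updating its guess about what the user will engage with, while the user's appetite drifts toward whatever keeps getting shown:

```python
# Toy two-way feedback loop (invented numbers, not Facebook's system):
# the ranker learns from what the user engages with, and the user's
# appetite shifts toward whatever keeps getting shown.

appetite = {"calm": 0.40, "outrage": 0.45}   # the user's actual engagement rates
estimate = {"calm": 0.50, "outrage": 0.50}   # the ranker's belief about the user

for step in range(8):
    shown = max(estimate, key=estimate.get)                   # show what looks most engaging
    engagement = appetite[shown]                              # how the user responds (toy, deterministic)
    estimate[shown] += 0.3 * (engagement - estimate[shown])   # the algorithm learns from the user
    appetite[shown] = min(1.0, appetite[shown] + 0.05)        # the user habituates to what's shown
    print(step, shown, round(estimate[shown], 2), round(appetite[shown], 2))
```

Run it and both numbers ratchet upward together: the ranker settles on the outrage content, and the user becomes a better and better customer for it.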

Regarding YouTube's recommendation algorithm, analyzed after an outbreak of violence in Chemnitz, Germany, which was fomented and organized online, p. 200:

Disturbingly, YouTube's recommendations clustered tightly around a handful of conspiracy or far-right videos. This suggested that any user who entered the network of Chemnitz videos - say, by searching for news updates or watching a clip sent to them by a friend - would be pulled by YouTube's algorithm toward extremist content. Asked how many steps it would take, on average, for a YouTube viewer who pulled up a Chemnitz news clip to find themselves watching far-right propaganda, Serrato answered, "Only two." He added, "By the second you're quite knee-deep in the alt right."
Recommendations rarely led users back to mainstream news coverage, or to liberal or apolitical content of any kind. Once among extremists, the algorithm tended to stay there, as if that had been the destination all along.
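Serrato's "only two" is essentially a graph-distance measurement over the recommendation network. Here's a toy sketch of that kind of measurement, with invented video labels rather than the researchers' actual data:

```python
from collections import deque

# Toy recommendation graph (invented video labels, not YouTube data):
# edges point from a video to the videos recommended alongside it.
recommendations = {
    "chemnitz_news_clip": ["protest_footage", "crime_statistics_rant"],
    "protest_footage": ["immigration_panic_vlog"],
    "crime_statistics_rant": ["immigration_panic_vlog", "great_replacement_video"],
    "immigration_panic_vlog": ["great_replacement_video"],
    "great_replacement_video": [],
}

def hops_to(start, target):
    """Breadth-first search: how many recommendation clicks from start to target."""
    queue, seen = deque([(start, 0)]), {start}
    while queue:
        video, depth = queue.popleft()
        if video == target:
            return depth
        for nxt in recommendations.get(video, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, depth + 1))
    return None

print(hops_to("chemnitz_news_clip", "great_replacement_video"))  # -> 2
```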
 

Stevicus

Veteran Member
Staff member
Premium Member
It makes sense. The people who built this technology were probably just thinking in terms of profits and how much money they would make, but they may not have been aware of how it would be used by certain segments of the population.

Even back to when I was a kid, I recall people talking about conspiracy theories and all kinds of weird stuff, but it was usually in private conversations. It wasn't the kind of thing you'd see on TV or in the regular news - although people had doubts about that, too. It didn't seem like they were telling the full story, and too many things didn't make sense, so people would fill in the blanks on their own with their own theories. Just idle speculation that usually went nowhere.

Occasionally, one might find someone with leaflets or some "underground" or alternative newspaper with something about aliens or silent radio controlling people's minds (with tinfoil being a way of blocking out the mind control rays). I've also run across a number of people who fall into a Bircher way of thinking.

Of course, to be able to put this all to paper, print it up, and try to distribute it costs money, so one didn't find much printed material on this unless one went looking for it deliberately. Likewise, it was expensive to broadcast on radio or TV, so it wasn't on the airwaves either. However, I volunteered at the local community access cable station for a while and got a pretty clear idea of just how dedicated people are to their particular cause or way of thinking - so dedicated that they've just got to get their words out there, however they can, with whatever meager resources they have. This was before the WWW came on the scene and years before YouTube.

Once people started connecting to the internet, the technology opened up new avenues of disseminating information. Those who had been practically shut out of the broadcast and print media suddenly had a new medium at their disposal. It seemed like a whole new level of communication, with the potential to bring people together, to promote greater understanding of other people's views - and this has happened to a large extent.

But there's been a downside, as noted here.

Trying to control it or censor it won't really work in the long run, as there are also weaknesses in the technology which have been exploited and continue to be exploited. But there may be ways of making people more aware and able to defend and protect against it.

Another part of the problem is that the mainstream media and the "official" sources of government aren't always reliable or honest either, which is how people get put on to alternative media in the first place. If the government could simply tell the truth and operate with a greater degree of transparency and openness, then it would probably take the wind out of the sails of most of the conspiracy theories out there.
 

anna.

colors your eyes with what's not there
page 213:

In 2018, an outlet called Bellingcat scoured an archive of private far-right chat rooms that totaled hundreds of thousands of messages. The investigators scanned for instances where users had mentioned how they'd arrived at the cause. The single most common entry point they cited: YouTube. They would start with banal videos, many said, then be recommended into channels that were more and more extreme.​
page 216-17, researchers Kaiser and Rauchfleisch:

"Being a conservative on YouTube means that you're only one or two clicks away from extreme far-right channels, conspiracy theories, and radicalizing content."​
Others soon confirmed the "radicalization pipeline," as the Brazilian researcher Manoel Horta Ribeiro called it. His team, analyzing 72 million comments across 330,000 videos, found that "users consistently migrate from milder to more extreme content." Right-wing users, a huge population, moved from "intellectual dark web" contrarians like Jordan Peterson to alt-right voices like Milo Yiannopoulos to hate leaders like the neo-Nazis Andrew Anglin and Mike Peinovich. And the users moved in parallel with YouTube's recommendations, further evidence that it was the algorithm that drove them.​
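Ribeiro's migration finding comes down to tracking, for each commenter, the most extreme tier of channel they show up in over time. A toy sketch with invented users and tiers - not the team's 72-million-comment dataset or their actual method:

```python
# Toy version of the "radicalization pipeline" measurement (invented users
# and channel tiers, purely illustrative): for each commenter, take the most
# extreme tier they commented in per year and watch how the cohort shifts.

TIERS = ["mainstream", "idw", "alt_right", "far_right"]   # mild -> extreme

comments = [
    # (user, year, tier of the channel they commented on)
    ("u1", 2016, "mainstream"), ("u1", 2017, "idw"),        ("u1", 2018, "alt_right"),
    ("u2", 2016, "idw"),        ("u2", 2017, "alt_right"),  ("u2", 2018, "far_right"),
    ("u3", 2016, "mainstream"), ("u3", 2017, "mainstream"), ("u3", 2018, "idw"),
]

def most_extreme_tier(user, year):
    tiers = [t for u, y, t in comments if u == user and y == year]
    return max(tiers, key=TIERS.index) if tiers else None

for year in (2016, 2017, 2018):
    print(year, {u: most_extreme_tier(u, year) for u in ("u1", "u2", "u3")})
```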
 

anna.

colors your eyes with what's not there
It makes sense. The people who built this technology were probably just thinking in terms of profits and how much money they would make, but they may not have been aware of how it would be used by certain segments of the population.

Even back to when I was a kid, I recall people talking about conspiracy theories and all kinds of weird stuff, but it was usually in private conversations. It wasn't the kind of thing you'd see on TV or in the regular news - although people had doubts about that, too. It didn't seem like they were telling the full story, and too many things didn't make sense, so people would fill in the blanks on their own with their own theories. Just idle speculation that usually went nowhere.

Occasionally, one might find someone with leaflets or some "underground" or alternative newspaper with something about aliens or silent radio controlling people's minds (with tinfoil being a way of blocking out the mind control rays). I've also run across a number of people who fall into a Bircher way of thinking.

Of course, to be able to put this all to paper, print it up, and try to distribute it costs money, so one didn't find much printed material on this unless one went looking for it deliberately. Likewise, it was expensive to broadcast on radio or TV, so it wasn't on the airwaves either. However, I volunteered at the local community access cable station for a while and got a pretty clear idea of just how dedicated people are to their particular cause or way of thinking - so dedicated that they've just got to get their words out there, however they can, with whatever meager resources they have. This was before the WWW came on the scene and years before YouTube.

Once people started connecting to the internet, the technology opened up new avenues of disseminating information. Those who had been practically shut out of the broadcast and print media suddenly had a new medium at their disposal. It seemed like a whole new level of communication, with the potential to bring people together, to promote greater understanding of other people's views - and this has happened to a large extent.

But there's been a downside, as noted here.

Trying to control it or censor it won't really work in the long run, as there are also weaknesses in the technology which have been exploited and continue to be exploited. But there may be ways of making people more aware and able to defend and protect against it.

Another part of the problem is that the mainstream media and the "official" sources of government aren't always reliable or honest either, which is how people get put on to alternative media in the first place. If the government could simply tell the truth and operate with a greater degree of transparency and openness, then it would probably take the wind out of the sails of most of the conspiracy theories out there.


I agree with almost 100% of your really excellent post. Thank you for contributing. Remember Coast to Coast? It was like your community access station: people would listen, but they couldn't really connect with other listeners the way they can now, with Facebook groups and YouTube channels. The book credits YouTube with a huge increase in interest in the flat-earth conspiracy. Who could've known, back in the Bircher and Jack Chick pamphlet days?

The only place I might differ is your last sentence. I don't think it's possible anymore to take the wind out of the sails of conspiracy theories. Even as far as we've come in the last few years in the scope and proliferation of conspiracies, all you have to do is mention the name Clinton, and certain people will revert right back to the Clinton Body Count. It's demographic reflex.
 

icehorse

......unaffiliated...... anti-dogmatist
Premium Member
Recommendations rarely led users back to mainstream news coverage, or to liberal or apolitical content of any kind. Once among extremists, the algorithm tended to stay there, as if that had been the destination all along.

I agree with most everything you've excerpted and said in this thread... but...

Let's not kid ourselves into thinking that social media's pernicious and devastating impact on the world is limited to the right wing.

Any time we see people advocating for magical thinking (religious or otherwise), and/or spouting dogma, they should be regarded with great suspicion.
 

anna.

colors your eyes with what's not there
I agree with most everything you've excerpted and said in this thread... but...

Let's not kid ourselves into thinking that social media's pernicious and devastating impact on the world is limited to the right wing.

Any time we see people advocating for magical thinking (religious or otherwise), and/or spouting dogma, they should be regarded with great suspicion.

Sorry, I'm not gonna get into bothsidesism with you. There were multiple experts in their fields of data analysis cited in the book who detailed how the algorithms pull users inexorably to the right, into the world of the conspiratorial fringe. I'm gonna go with their expertise. And I'll suggest you read the book for yourself, if you're interested in challenging your bias.
 

Secret Chief

Degrow!
Thanks, I'll get this and also Stolen Focus by Johann Hari.
 

Stevicus

Veteran Member
Staff member
Premium Member
I agree with almost 100% of your really excellent post. Thank you for contributing. Remember Coast to Coast? It was like your community access station: people would listen, but they couldn't really connect with other listeners the way they can now, with Facebook groups and YouTube channels. The book credits YouTube with a huge increase in interest in the flat-earth conspiracy. Who could've known, back in the Bircher and Jack Chick pamphlet days?

The only place I might differ is your last sentence. I don't think it's possible anymore to take the wind out of the sails of conspiracy theories. Even as far as we've come in the last few years in the scope and proliferation of conspiracies, all you have to do is mention the name Clinton, and certain people will revert right back to the Clinton Body Count. It's demographic reflex.

Was Coast to Coast operated by Art Bell? I do vaguely remember that, although I didn't really listen to him very much.

Some of what we're seeing might also be a clash of ideas and philosophical differences which people might be inclined to avoid in real life situations. That seems to be underlying much of what gets discussed, with the conspiracy theories and other similar "flotsam" appearing at the surface level, but not getting much deeper.
 

anna.

colors your eyes with what's not there
Thanks, I'll get this and also Stolen Focus by Johann Hari.

Oooh, I like the looks of Stolen Focus, I'm putting that on my list. Thank you!

Coincidentally, before Chaos, I read How To Do Nothing: Resisting the Attention Economy by Jenny Odell.

For many years now, I've been watching the ideas mentioned in this thread unfold. I never did join Facebook or Instagram, being resistant to the idea of over-sharing and over-connectedness, and I've never regretted it. I don't log in to YouTube (which I use mostly for music, sometimes for recipes, and less than sometimes for breaking news on a big story from a reputable source), but I recently realized it was tracking me by my device, because it offered me a chance to delete/pause my history "on this device." So yeah, I cleared and paused, for whatever help it'll give me. But I've watched right-wing users of forums I've belonged to start more and more threads based on links to, or what they heard on, not only Facebook and YouTube but also farther-right platforms like Rumble, Gab, Gettr, Truth Social, etc.

Anyway. Being online for twenty-odd years - it's really messed with my attention span. I've been working to get it back, I know it can be done and I've already seen improvement.
 

anna.

colors your eyes with what's not there
Was Coast to Coast operated by Art Bell? I do vaguely remember that, although I didn't really listen to him very much.

Some of what we're seeing might also be a clash of ideas and philosophical differences which people might be inclined to avoid in real life situations. That seems to be underlying much of what gets discussed, with the conspiracy theories and other similar "flotsam" appearing at the surface level, but not getting much deeper.

Yes, Art Bell. I never listened to it, but someone I've known for many years - smart guy, an engineer - went from that to all the current Covid and Stop the Steal conspiracies. I just can't understand the dichotomy. Well, I guess I understand how, but still, this friend... how?!

A lot of it's plain psychology, too. Some of which I've studied, particularly the psychology of prejudice and stereotypes. The book goes into this aspect as well. Status threat. Choice-supportive bias, identity validation, deindividuation, and a lot more. Social media takes human nature with all its pathologies and cognitive biases and filters, gives it steroids and attaches it to a rocket.
 

icehorse

......unaffiliated...... anti-dogmatist
Premium Member
Sorry, I'm not gonna get into bothsidesism with you. There were multiple experts in their fields of data analysis cited in the book who detailed how the algorithms pull users inexorably to the right, into the world of the conspiratorial fringe. I'm gonna go with their expertise. And I'll suggest you read the book for yourself, if you're interested in challenging your bias.
Not sure how you got to "bothsidesism" based on what I wrote? I was NOT making a left wing vs. right wing comparison.

My observation was much broader than that. Some of the most vicious social media sites on the internet have to do with pet care. Not a left or right topic. The brain chemistry concerning addictive behaviors is not a right wing only issue. When Vegas casinos use variable intermittent rewards in their slot machines, they are targeting ALL people, it's not political. I was in Silicon Valley when gamification was all the rage. FourSquare was built on a gamification model, which is again attempting to hack ALL human brains, not just right wing brains. A game like Farmville is the same. Hardly a hot spot for the alt-right, but clearly designed to tap into the brain chemistry that produces the flow state, that again ALL humans share.
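For what it's worth, the slot-machine mechanism is simple enough to sketch - made-up numbers, not any casino's or platform's real code. A variable-ratio schedule just means the reward lands after an unpredictable number of actions, and that unpredictability is what makes the loop so hard to put down:

```python
import random

random.seed(1)

# Toy variable-ratio ("intermittent") reward schedule - the same basic
# mechanism as a slot machine or an unpredictable trickle of likes and
# notifications. Purely illustrative numbers.

def pulls_until_reward(win_probability=0.15):
    """Keep 'pulling the lever' until a reward lands; return how many pulls it took."""
    pulls = 1
    while random.random() > win_probability:
        pulls += 1
    return pulls

# The unpredictable gaps between rewards are what make the loop compulsive:
print([pulls_until_reward() for _ in range(10)])
```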

And whenever a slogan reaches meme status, these same mechanisms are in play.

So whatever your political leanings might be, social media giants have you on their radar.
 

anna.

colors your eyes with what's not there
Not sure how you got to "bothsidesism" based on what I wrote? I was NOT making a left wing vs. right wing comparison.

My observation was much broader than that. Some of the most vicious social media sites on the internet have to do with pet care. Not a left or right topic. The brain chemistry concerning addictive behaviors is not a right wing only issue. When Vegas casinos use variable intermittent rewards in their slot machines, they are targeting ALL people, it's not political. I was in Silicon Valley when gamification was all the rage. FourSquare was built on a gamification model, which is again attempting to hack ALL human brains, not just right wing brains. A game like Farmville is the same. Hardly a hot spot for the alt-right, but clearly designed to tap into the brain chemistry that produces the flow state, that again ALL humans share.

And whenever a slogan reaches meme status, these same mechanisms are in play.

So whatever your political leanings might be, social media giants have you on their radar.

You were blunting the research references to the algorithms' push to the right wing more than to the left. Look, it's well known what likes and engagement do to our brains. The book is looking at how social media platforms knew their algorithms were causing outbreaks of violence in real life, based on conspiracies and rumors from the fringes that they knew were being pushed front and center in user feeds and amplified exponentially. And the algorithms persistently pushed farther and farther to the right.
 

icehorse

......unaffiliated...... anti-dogmatist
Premium Member
You were blunting the research references to the algorithms' push to the right wing more than to the left. Look, it's well known what likes and engagement do to our brains. The book is looking at how social media platforms knew their algorithms were causing outbreaks of violence in real life, based on conspiracies and rumors from the fringes that they knew were being pushed front and center in user feeds and amplified exponentially. And the algorithms persistently pushed farther and farther to the right.

I understand these algorithms - they polarize along many different dimensions. To the right, of course. But many other dimensions as well.

I listened to an interview with the author, and in that hour long interview he did not push the right-wing imbalance at all. Perhaps he does in the book, but in this talk he did not. Instead, he talked about what I've been talking about, a sort of cynical, shotgun polarization put in place by social media to make money by preying on everyone's emotional and psychological weak spots.

We agree on most of this - they knew what they were doing. But they amplified EVERYTHING, not just right wing crap.
 

anna.

colors your eyes with what's not there
I understand these algorithms - they polarize along many different dimensions. To the right, of course. But many other dimensions as well.

I listened to an interview with the author, and in that hour long interview he did not push the right-wing imbalance at all. Perhaps he does in the book, but in this talk he did not. Instead, he talked about what I've been talking about, a sort of cynical, shotgun polarization put in place by social media to make money by preying on everyone's emotional and psychological weak spots.

We agree on most of this - they knew what they were doing. But they amplified EVERYTHING, not just right wing crap.

I posted excerpts from his book. I highly recommend it.
 

Brickjectivity

Veteran Member
Staff member
Premium Member
@Brickjectivity @Shadow Wolf

I forgot to tag you - here's the thread, I hope it's of interest to you!
Thanks for providing some handy information and getting a conversation going! What do you think of services like Ground News? I see them advertised frequently on a few channels. They supposedly list the current news stories and tell you which media sites are covering them and which are left- or right-leaning.

Another part of the problem is that the mainstream media and the "official" sources of government aren't always reliable or honest either, which is how people get put on to alternative media in the first place.
Yes! What is the alternative to the very disappointing 'Journalists' that are on TV and on the main news sites? But of course that is not really why we resort to these channels. We desire to be entertained and informed at the same time. We're looking for psychological, medical, holistic, technical, and all kinds of other information. We aren't merely scrolling or just taking whatever the algorithm serves up. We surf.
 

Shadow Wolf

Certified People sTabber & Business Owner
Trying to control it or censor it won't really work in the long run, as there are also weaknesses in the technology which have been exploited and continue to be exploited.
I wonder if a crew of AI mods could? This would be something that could effectively read and watch the entirety of something, in mere seconds, long before it gets suggested to someone.
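Something along those lines could look like this in principle - a very rough sketch with made-up scores and a stand-in classifier, since the hard parts would be training the model and deciding who reviews what it flags:

```python
# Very rough sketch of the idea (made-up scores and threshold; a real system
# would need an actual trained classifier plus human review of what it flags):
# score every upload before it becomes eligible for recommendation.

def extremism_risk(video_id: str) -> float:
    """Stand-in for a trained model scoring content from 0.0 (benign) to 1.0 (extremist)."""
    fake_scores = {"cat_video": 0.02, "news_clip": 0.10, "recruitment_video": 0.93}
    return fake_scores.get(video_id, 0.5)

RISK_THRESHOLD = 0.8

def eligible_for_recommendation(video_id: str) -> bool:
    # Anything over the threshold is held back from the recommendation pool
    # and routed to human moderators instead of being suggested to anyone.
    return extremism_risk(video_id) < RISK_THRESHOLD

for vid in ("cat_video", "news_clip", "recruitment_video"):
    print(vid, eligible_for_recommendation(vid))
```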
 