• @Minotaur@lemm.ee · 158 points · 8 months ago

    I really don’t like cases like this, nor do I like how much the legal system seems to be pushing “guilty by proxy” rulings for a lot of school shooting cases.

    It just feels very dangerous, and likely to end badly, to set a precedent where, when someone commits an atrocity, essentially every person and thing they interacted with can be held accountable with nearly the same weight as if they had committed the crime themselves.

    Obviously some basic civil responsibility is needed. If someone says “I am going to blow up XYZ school, here is how”, and you hear that, yeah, that’s on you to report it. But it feels like we’re quickly slipping toward a point where you have to report a vast number of people to the police en masse, just because they said something vaguely questionable, simply to avoid the potential fallout of being associated with someone who commits a crime.

    It makes me really worried. I really think the internet has made it easy to be able to ‘justifiably’ accuse almost anyone or any business of a crime if a person with enough power / the state needs them put away for a time.

    • Dave. · 143 points · 8 months ago

      This appears to be more the angle of the person being fed an endless stream of hate on social media and thus becoming radicalised.

      What causes them to be fed an endless stream of hate? Algorithms. Who provides those algorithms? Social media companies. Why do they do this? To maintain engagement with their sites so they can make money via advertising.

      And so here we are, with sites that see you viewed 65 percent of a stream showing an angry mob, therefore you would like to see more angry mobs in your feed. Is it any wonder that shit like this happens?

      • Ð Greıt Þu̇mpkin · 38 points · 8 months ago

        It’s also known to intentionally show you content that’s likely to provoke you into fights online

        Which makes all the sanctimonious screeds about avoiding echo chambers a bunch of horse shit, because that’s not how social behavior works outside the net. Offline, if you go out of your way to keep arguing with people who wildly disagree with you, you’re not avoiding echo chambers; you’re building a class-action restraining-order case against yourself.

        • @Monument@lemmy.sdf.org · 8 points · 8 months ago

          I’ve long held this hunch that when people’s beliefs are challenged, they tend to ‘dig in’ and wind up more resolute. (I think it’s actual science - I learned it in a sociology class many years ago - but it’s been so long that I can’t say with confidence whether that’s the case.)

          Assuming my hunch is right (or at least right enough), I think that side of social media - driving up engagement by increasing discord - also winds up radicalizing people as a side effect of chasing profits.

          It’s one of the things I appreciate about Lemmy. Not everyone here seems to just be looking for a fight all the time.

          • @Kalysta@lemmy.world · 3 points · 8 months ago

            It depends on how their beliefs are challenged. Calling them morons won’t work. You have to gently question them about their ideas and not seem to be judging them.

            • @Monument@lemmy.sdf.org · 3 points · 8 months ago

              Oh, yeah, absolutely. Another commenter on this post suggested my belief on it was from an Oatmeal comic. That prompted me to search it out, and seeing it spelled out again sort of opened up the memory for me.

              The class was a sociology class about 20 years ago, and the professor was talking about cognitive dissonance as it relates to folks choosing whether or not they wanted to adopt the beliefs of another group. I don’t think he got into how to actually challenge beliefs in a constructive way, since he was discussing how seemingly small rifts can turn into big disagreements between social groups, but subsequent life experience and a lot of good articles about folks working with radicals to reform their beliefs confirm exactly what you commented.

            • @Monument@lemmy.sdf.org · 2 points · 8 months ago

              Nah. I picked that up about 20 years ago, but the comic is a great one.
              I haven’t read The Oatmeal in a while. I guess I know what I’ll be doing later tonight!

        • deweydecibel · 5 points · 8 months ago

          People have been fighting online long before algorithmic content suggestions. They may amplify it, but you can’t blame that on them entirely.

          The truth is many people would argue and fight like that in real life if they could be anonymous.

      • @Eldritch@lemmy.world · 2 points · 8 months ago

        Absolutely. There’s a huge difference between hate speech merely existing and funneling a firehose of it at someone to keep them engaged. It’s not clear how this will shake out, but I doubt it will be the end of free speech. If it exists and you actively seek it out, that’s something else.

    • @Zak@lemmy.world · 64 points · 8 months ago

      I think the design of media products around maximally addictive individually targeted algorithms in combination with content the platform does not control and isn’t responsible for is dangerous. Such an algorithm will find the people most susceptible to everything from racist conspiracy theories to eating disorder content and show them more of that. Attempts to moderate away the worst examples of it just result in people making variations that don’t technically violate the rules.

      With that said, laws made and legal precedents set in response to tragedies are often ill-considered, and I don’t like this case. I especially don’t like that it includes Reddit, which was not using that type of individualized algorithm to my knowledge.

      • Refurbished Refurbisher · 19 points · 8 months ago

        This is the real shit right here. The problem is that social media companies’ data show that negativity and hate keep people on their sites longer, which means they view more advertisements compared to positivity.

        It is human nature to engage with disagreeable topics more than agreeable ones, and social media companies are exploiting that for profit.

        We need to regulate algorithms and force them to be open source, so that anybody can audit them. They will try to hide behind “AI” and “trade secret” excuses, but lawmakers have to see above that bullshit.

        Unfortunately, US lawmakers are both stupid and corrupt, so it’s unlikely that we’ll see proper change, and more likely that we’ll see shit like “banning all social media from foreign adversaries” when the US-based social media companies are largely the cause of all these problems. I’m sure the US intelligence agencies don’t want them to change either, since those companies provide large swaths of personal data to them.

        • just another dev · 3 points · 8 months ago

          While this is true for Facebook and YouTube - last time I checked, reddit doesn’t personalise feeds in that way. It was my impression that if two people subscribe to the same subreddits, they will see the exact same posts, based on time and upvotes.

          Then again, I only ever used third party apps and old.reddit.com, so that might have changed since then.

          • CopHater69 · 4 points · 8 months ago

            Mate, I never got the same homepage twice on my old reddit account. I dunno how you can claim that two people with identical subs would see the same page. That’s just patently not true and hasn’t been for years.

            • just another dev · 3 points · edited · 8 months ago

              Quite simple, aniki. The feeds were ordered by hot, new, or top.

              New was ORDER BY date DESC. Top was ORDER BY upvotes DESC. And hot was a slightly more complicated order that used a mixture of upvotes and time.

              You can easily verify this by opening 2 different browsers in incognito mode and going to the old reddit frontpage - I get the same results in either. Again, I can’t account for the new reddit site because I never used it for more than a few minutes, but that’s definitely how the old one worked and still seems to.
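              For the curious, the “slightly more complicated” hot ordering was open-sourced by reddit years ago. A sketch of that published version in Python (current reddit may well have changed it since):

```python
from datetime import datetime, timezone
from math import log10

EPOCH = datetime(1970, 1, 1, tzinfo=timezone.utc)

def epoch_seconds(date):
    """Seconds since the Unix epoch."""
    return (date - EPOCH).total_seconds()

def hot(ups, downs, date):
    """Reddit's published 'hot' score: log-weighted net votes plus a time term."""
    s = ups - downs
    order = log10(max(abs(s), 1))
    sign = 1 if s > 0 else -1 if s < 0 else 0
    # The divisor means a post needs roughly 10x the net votes to outrank an
    # otherwise-equal post submitted 45000 seconds (12.5 hours) later.
    seconds = epoch_seconds(date) - 1134028003
    return round(sign * order + seconds / 45000, 7)
```

              Note the inputs: votes and submission time only. Nothing about the individual viewer enters the score, which is the point being made here.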

          • deweydecibel · 2 points · edited · 8 months ago

            It’s probably not true anymore, but at the time this guy was being radicalized, you’re right, it wasn’t algorithmically catered to them. At least not in the sense that it was intentionally exposing them to a specific type of content.

            I suppose you can think of the way reddit works (or used to work) as being content agnostic. The algorithm is not aware of the sorts of things it’s suggesting to you, it’s just showing you things based on subreddit popularity and user voting, regardless of what it is.

            In the case of YouTube and Facebook, their algorithms are taking into account the actual content and funneling you towards similar content algorithmically, in a way that is unique to you. Which means at some point their algorithm is acknowledging “this content has problematic elements, let’s suggest more problematic content”

            (Again, modern reddit, at least on the app, is likely engaging in this now to some degree)
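            To make that distinction concrete, here is a toy illustration (hypothetical code, not any platform’s real system): a content-agnostic ranker that produces one list for everyone, versus a personalized ranker whose output depends on a per-user interest vector:

```python
# Content-agnostic: one global ranking, identical for every user.
def rank_global(posts):
    # posts: dicts with "upvotes" and "age_hours"; votes decay with age
    return sorted(posts,
                  key=lambda p: p["upvotes"] / (1.0 + p["age_hours"]),
                  reverse=True)

# Personalized: the ranking depends on who is looking. Each post carries
# a topic vector, and posts most similar to the user's interest vector
# float to the top - whatever those topics happen to be.
def rank_personalized(posts, user_interests):
    def score(post):
        return sum(u * t for u, t in zip(user_interests, post["topics"]))
    return sorted(posts, key=score, reverse=True)
```

            In the first function a user’s history never enters the computation; in the second it is the whole point - the difference between old-reddit-style sorting and the YouTube/Facebook-style funneling described here.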

            • CopHater69 · 3 points · edited · 8 months ago

              That’s a lot of baseless suppositions you have there. Stuff you cannot possibly know - like how reddit content algos work.

      • deweydecibel · 5 points · 8 months ago

        Attempts to moderate away the worst examples of it just result in people making variations that don’t technically violate the rules.

        The problem then becomes if the clearly defined rules aren’t enough, then the people that run these sites need to start making individual judgment calls based on…well, their gut, really. And that creates a lot of issues if the site in question could be held accountable for making a poor call or overlooking something.

        The threat of legal repercussions hanging over them is going to make them default to the most strict actions, and that’s kind of a problem if there isn’t a clear definition of what things need to be actioned against.

        • @rambaroo@lemmynsfw.com · 4 points · 8 months ago

          Bullshit. There’s no slippery slope here. You act like these social media companies just stumbled onto these algorithms. They didn’t; they designed them intentionally to drive engagement up.

          Demanding that they change their algorithms to stop intentionally driving negativity and extremism isn’t dystopian at all, and it’s very frustrating that you think it is. If you choose to do nothing about this issue I promise you we’ll be living in a fascist nation within 10 years, and it won’t be an accident.

        • HACKthePRISONS · 1 point · 8 months ago

          This is exactly why Section 230 exists. Sites aren’t responsible for what other people post, and they are allowed to moderate however they want.

        • @VirtualOdour@sh.itjust.works · 0 points · 8 months ago

          It’s the chilling effect they use in China: don’t make it clear what will get you in trouble, and then people are too scared to say anything.

          Just another group looking to control expression by the back door

          • @rambaroo@lemmynsfw.com · 9 points · edited · 8 months ago

            There’s nothing ambiguous about this. Give me a break. We’re demanding that social media companies stop deliberately driving negativity and extremism to get clicks. This has fuck all to do with free speech. What they’re doing isn’t “free speech”, it’s mass manipulation, and it’s very deliberate. And it isn’t disclosed to users at any point, which also makes it fraudulent.

            It’s incredibly ironic that you’re accusing people of an effort to control expression when that’s literally what social media has been doing since the beginning. They’re the ones trying to turn the world into a dystopia, not the other way around.

      • @rambaroo@lemmynsfw.com · 3 points · 8 months ago

        Reddit is the same thing. They intentionally enable and cultivate hostility and bullying there to drive up engagement.

          • @Kalysta@lemmy.world · 1 point · 8 months ago

            Which is even worse because more people see the bullying and hatred, especially when it shows up on a default sub.

    • @galoisghost@aussie.zone · 39 points · 8 months ago

      Nah. This isn’t guilt by association.

      In her decision, the judge said that the plaintiffs may proceed with their lawsuit, which claims social media companies - like Meta, Alphabet, Reddit and 4chan - “profit from the racist, antisemitic, and violent material displayed on their platforms to maximize user engagement.”

      Which, despite their denials, they actually know: https://www.nbcnews.com/tech/tech-news/facebook-knew-radicalized-users-rcna3581

    • @rambaroo@lemmynsfw.com · 23 points · edited · 8 months ago

      I don’t think you understand the issue. I’m very disappointed to see that this is the top comment. This wasn’t an accident. These social media companies deliberately feed people the most upsetting and extreme material they can. They’re intentionally radicalizing people to make money from engagement.

      They’re absolutely responsible for what they’ve done, and it isn’t “by proxy”, it’s extremely direct and deliberate. It’s long past time that courts held them liable. What they’re doing is criminal.

      • @rbesfe@lemmy.ca · 6 points · edited · 8 months ago

        Proving this “intent to radicalize” in court is impossible. What evidence exists to back up your claim beyond a reasonable doubt?

        • @Kalysta@lemmy.world · 3 points · 8 months ago

          The algorithms themselves. This decision opens the algorithms up to discovery and now we get to see exactly how various topics are weighted. These companies will sink or swim by their algorithms.

      • @Minotaur@lemm.ee · 3 points · 8 months ago

        I do. I also understand the extent to which the justice system will take decisions like this and utilize them to accuse any person or business (including you!) of a crime that they can then “prove” they were at fault for.

    • @WarlordSdocy@lemmy.world · 19 points · 8 months ago

      I think the distinction here is between people and businesses. Is it the fault of people on social media for the acts of others? No. Is it the fault of social media for cultivating an environment that radicalizes people into committing mass shootings? Yes. The blame here is on the social media companies for not doing more to stop the spread of this kind of content. Even though that won’t stop this kind of content from existing, making it harder to access and find will at least reduce the number of people who go down this path.

      • @rambaroo@lemmynsfw.com · 6 points · 8 months ago

        I agree, but I want to clarify. It’s not about making this material harder to access. It’s about not deliberately serving that material to people who weren’t looking it up in the first place in order to get more clicks.

        There’s a huge difference between a user looking up extreme content on purpose and social media serving extreme content to unsuspecting people because the company knows it will upset them.

      • @0x0@programming.dev · -6 points · edited · 8 months ago

        Is it the fault of social media for cultivating an environment that radicalizes people into committing mass shootings? Yes.

        Really? Then add video games and heavy metal to the list. And why not most organized religions? Same argument, zero sense. There’s way more at play than “person watches X content = person is now radicalized”, unless we’re talking about someone with a severe cognitive deficit.

        And since this is the US… perhaps add easy access to guns? Nah, that’s totally unrelated.

        • @Chetzemoka@lemmy.world · 6 points · 8 months ago

          “Person watches X creative and clearly fictional content” is not analogous in any way to “person watches X video essay crafted to look like a documentary, but actually just full of lies and propaganda”

          Don’t be ridiculous

          • @0x0@programming.dev · -1 points · 8 months ago

            So it’s the severe cognitive deficit. Ok. Watching anything inherently bad and thinking it’s ok to do so because it seems legit… that’s ridiculous.

            • @Chetzemoka@lemmy.world · 6 points · 8 months ago

              I mean, yes. People are stupid. That’s why we have safety regulations. This court case is about a lack of safety regulations.

      • @Minotaur@lemm.ee · 0 points · 8 months ago

        Sure, and I get that for like, healthcare. But ‘systemic solutions’ as they pertain to “what constitutes a crime” lead to police states really quickly imo

        • @rambaroo@lemmynsfw.com · 1 point · edited · 8 months ago

          The article is about lawsuits. Where are you getting this idea that anyone suggested criminalizing people? Stop putting words in other people’s mouths. The most that’s been suggested in this thread is regulating social media algorithms, not locking people up.

          Drop the melodrama and paranoia. It’s getting difficult to take you seriously when you keep making shit up about other people’s positions.

    • @morrowind@lemmy.ml · 11 points · 8 months ago

      Do you not think if someone encouraged a murderer they should be held accountable? It’s not everyone they interacted with, there has to be reasonable suspicion they contributed.

      Also I’m pretty sure this is nothing new

      • deweydecibel · 8 points · 8 months ago

        Depends on what you mean by “encouraged”. That is going to need a very precise definition in these cases.

        And the point isn’t that people shouldn’t be held accountable, it’s that there are a lot of gray areas here, we need to be careful how we navigate them. Irresponsible rulings or poorly implemented laws can destabilize everything that makes the internet worthwhile.

      • @Minotaur@lemm.ee · 5 points · 8 months ago

        I didn’t say that at all, and I think you know I didn’t unless you really didn’t actually read my comment.

        I am not talking about encouraging someone to murder. I specifically said that in overt cases there is some common-sense civil responsibility. I am talking about the potential for the police to break down your door because you Facebook-messaged a guy you’re friends with about your favorite local gun store, and that guy also happens to listen to death metal and take antidepressants, and the state has deemed him a risk factor level 3.

        • @morrowind@lemmy.ml · 3 points · 8 months ago

          I must have misunderstood you then, but this still seems like a pretty clear case where the platforms themselves, not even other people, did the encouraging. I don’t think there’s any new precedent being set here.

          • @Minotaur@lemm.ee · 0 points · 8 months ago

            Rulings often start at the corporation / large major entity level and work their way down to the individual. Think piracy laws. At first, only giant, clear bootlegging operations were really prosecuted for that, and then people torrenting content for profit, and then people torrenting large amounts of content for free - and now we currently exist in an environment where you can torrent a movie or whatever and probably be fine, but also if the criminal justice system wants to they can (and have) easily hit anyone who does with a charge for tens of thousands of dollars or years of jail time.

            Will it happen to the vast majority of people who torrent media casually? No. But we currently exist in an environment where if you get unlucky enough or someone wants to punish you for it enough, you can essentially have this massive sentence handed down to you almost “at random”.

        • Ð Greıt Þu̇mpkin · 6 points · 8 months ago

          Is there currently a national crisis of Jacobins kidnapping oligarchs and beheading them in public I am unaware of?

        • @rambaroo@lemmynsfw.com · 1 point · 8 months ago

          Literally no one suggested that end users should be arrested for jokes on the internet. Fuck off with your attempts at trying to distract from the real issue.

    • @Socsa@sh.itjust.works · 10 points · 8 months ago

      This wasn’t just a content issue. Reddit actively banned people for reporting violent content too much. They literally engaged with and protected these communities, even as people yelled that they were going to get someone hurt.

    • deweydecibel · 8 points · 8 months ago

      Also worth remembering, this opens up avenues for lawsuits on other types of “harm”.

      We have states that have outlawed abortion. What do those sites do when those states argue social media should be “held accountable” for all the women who are provided information on abortion access through YouTube, Facebook, reddit, etc?

    • Ð Greıt Þu̇mpkin · 2 points · 8 months ago

      I dunno about social media companies but I quite agree that the party who got the gunman the gun should share the punishment for the crime.

      Firearms should be titled and insured, and the owner should have an imposed duty to secure them. The owner ought to face criminal penalty if the firearm titled to them was used by someone else to commit a crime: either they handed a killer a loaded gun, or they inadequately secured a firearm which was then stolen and used to commit a crime. Either way, they failed their responsibility to society as a firearm owner and must face consequences for it.

      • @solrize@lemmy.world · 4 points · 8 months ago

        This guy seems to have bought the gun legally at a gun store, after filling out the forms and passing the background check. You may be thinking of the guy in Maine whose parents bought him a gun when he was obviously dangerous. They were just convicted of involuntary manslaughter for that, iirc.

          • @solrize@lemmy.world · 2 points · edited · 8 months ago

            Well you were talking about charging the gun owner if someone else commits a crime with their gun. That’s unrelated to this case where the shooter was the gun owner.

            The lawsuit here is about radicalization but if we’re pursuing companies who do that, I’d start with Fox News.

      • @Minotaur@lemm.ee · 0 points · 8 months ago

        If you lend your brother, who you know is on antidepressants, a long extension cord he tells you is for his back patio - and he hangs himself with it - are you ready to be accused of being culpable for your brother’s death?

        • @jkrtn@lemmy.ml · 4 points · 8 months ago

          Oh, it turns out an extension cord has a side use that isn’t related to its primary purpose. What’s the analogous innocuous use of a semiautomatic handgun?

          • @Minotaur@lemm.ee · -1 points · 8 months ago

            Self defense? You don’t have to be a 2A diehard to understand that it’s still a legal object. What’s the “innocuous use” of a VPN? Or a torrenting client? Should we imprison everyone who ever sends a link about one of these to someone who seems interested in their use?

            • @jkrtn@lemmy.ml · 2 points · 8 months ago

              You’re deliberately ignoring the point that the primary use of a semiautomatic pistol is killing people, whether self-defense or mass murder.

              Should you be culpable for giving your brother an extension cord if he lies that it is for the porch? Not really.

              Should you be culpable for giving your brother a gun if he lies that he needs it for self defense? IDK the answer, but it’s absolutely not equivalent.

              It is a higher level of responsibility, you know lives are in danger if you give them a tool for killing. I don’t think it’s unreasonable if there is a higher standard for loaning it out or leaving it unsecured.

              • @Minotaur@lemm.ee · -1 points · edited · 8 months ago

                “Sorry bro. I’d love to go target shooting with you, but you started taking Vyvanse 6 months ago and I’m worried that if you blow your brains out the state will throw me in prison for 15 years.”

                Besides, you’re ignoring the point. This article isn’t about a gun; it’s basically about “this person saw content we didn’t make on our website”. You think that won’t be extended to general content sent from one person to another? That if you send some pro-Palestine articles to your buddy, and then a year or two later your buddy gets busted at an anti-Zionist rally, now you’re a felon because you enabled that? Boy, that would be an easy way for some hypothetical future administration to control speech!

                You might live in a very nice bubble, but not everyone will.

                • @jkrtn@lemmy.ml · 2 points · 8 months ago

                  So you need a strawman argument transitioning from loaning a weapon unsupervised to someone we know is depressed. Now it is just target shooting with them, so distancing the loan aspect and adding a presumption of using the item together.

                  This is a side discussion. You are the one who decided to write strawman arguments relating guns to extension cords, so I thought it was reasonable to respond to that. It seems like you’re upset that your argument doesn’t make sense under closer inspection and you want to pull the ejection lever to escape. Okay, it’s done.

                  The article is about a civil lawsuit, nobody is going to jail. Nobody is going to be able to take a precedent and sue me, an individual, over sharing articles to friends and family, because the algorithm is a key part of the argument.

        • @rambaroo@lemmynsfw.com · 0 points · edited · 8 months ago

          Knowingly manipulating people into suicide is a crime and people have already been found guilty of doing it.

          So the answer is obvious. If you knowingly encourage a vulnerable person to commit suicide, and your intent can be proved, you can and should be held accountable for manslaughter.

          That’s what social media companies are doing. They aren’t loaning you extremist ideas to help you. That’s a terrible analogy. They’re intentionally serving extreme content to drive you into more and more upsetting spaces, while pretending that there aren’t any consequences for doing so.

        • Ð Greıt Þu̇mpkin · -5 points · 8 months ago

          Did he also use it as improvised ammunition to shoot up the local elementary school, to warrant the cord being considered a firearm?

          I’m more confused about where I got such a lengthy extension cord from! Am I an event manager? Do I have generators I’m running cable from? Do I get to meet famous people on the job? Do I specialize in fairground festivals?

          • @Minotaur@lemm.ee · -1 points · 8 months ago

            …. Aside from everything else, are you under the impression that a 10-15 ft extension cord is an odd thing to own…?

    • @jumjummy@lemmy.world · 1 point · 8 months ago

      And ironically the gun manufacturers or politicians who support lax gun laws are not included in these “nets”. A radicalized individual with a butcher knife can’t possibly do as much damage as one with a gun.

      • CopHater69 · 23 points · 8 months ago

        Marilyn Manson led a charge to overthrow the government??

            • @Passerby6497@lemmy.world · 0 points · 8 months ago

              Because I don’t like that an artist I once enjoyed is a drugged-out and drunken mess? Based on the reaction, it definitely sounds like it.

              Didn’t think that many lemmings liked washed-up has-been metal acts, but to each their own I guess.

              • CopHater69 · 5 points · 8 months ago

                I actually responded to the wrong person and I apologize. I’ve actually heard the same thing about MM lately – just washed-up and sad.

        • CopHater69 · 10 points · 8 months ago

          Because it’s not funny or relevant and is an attempt to join two things - satanic panic with legal culpability in social media platforms.

            • @allcopsarebad@lemm.ee · 3 points · edited · 8 months ago

              And this is neither of those things. This is something much more tangible, with actual science behind it.

              • This is fine🔥🐶☕🔥 · 2 points · 8 months ago

                Yes, that exactly is the point.

                How people who supposedly care about children’s safety are willing to ignore science and instead raise a hue and cry about bullshit they perceive (or are told by their favourite TV personality) to be evil.

                Have you got it now? Or should I explain it further?

                Didn’t expect Lemmy to have people who lack reading comprehension.

      • @isles@lemmy.world
        link
        fedilink
        English
        48 months ago

        People don’t appreciate having spurious claims attached to their legitimate claims, even in jest. It invokes the idea that since the previous targets of blame were false that these likely are as well.

        • @0x0@programming.dev
          link
          fedilink
          English
          18 months ago

          They’re all external factors. Music and videogames have been (wrongly, imo) blamed in the past. Media, especially nowadays, is probably more “blameable” than music and games, but I still think it’s bs to use external factors as an excuse to justify mass shootings.

          • @isles@lemmy.world
            link
            fedilink
            English
            18 months ago

            What are the internal factors of a person that are not influenced by the environment or culture?

  • Phanatik
    link
    fedilink
    72
    edit-2
    8 months ago

    I don’t understand the comments suggesting this is “guilty by proxy”. These platforms have algorithms designed to keep you engaged and through their callousness, have allowed extremist content to remain visible.

    Are we going to ignore all the anti-vaxxer groups who fueled vaccine hesitancy which resulted in long dead diseases making a resurgence?

    To call Facebook anything less than complicit in the rise of extremist ideologies and conspiratorial beliefs, is extremely short-sighted.

    “But Freedom of Speech!”

    If that speech causes harm, like convincing a teenager that walking into a grocery store and gunning people down is a good idea, you don’t deserve to have that speech. Sorry, you’ve violated the social contract and those people’s blood is on your hands.

    • @firadin@lemmy.world
      link
      fedilink
      English
      298 months ago

      Not just “remain visible” - actively promoted. There’s a reason people talk about YouTube’s right-wing content pipeline. If you start watching anything male-oriented, YouTube will start slowly promoting more and more right-wing content to you until you’re watching Ben Shapiro and Andrew Tate.

      • @Ragnarok314159@sopuli.xyz
        link
        fedilink
        English
        188 months ago

        I got into painting mini Warhammer 40k figurines during covid, and thought the lore was pretty interesting.

        Every time I watch a video, my suggested feed goes from videos related to my hobbies to being entirely replaced with red-pill garbage. The right-wing channels must be highly profitable for YouTube to funnel people into: just an endless tornado of rage and constant viewing.

        • WolfdadCigarette@threads.net
          link
          fedilink
          English
          7
          edit-2
          8 months ago

          The algorithm is, after all, optimized for nothing other than advertisements per time period. So long as the algorithm believes that a video suggestion will keep you on the website for a minute more, it will suggest it. I occasionally wonder about the implications of one topic leading to another. Is WH40k funneled into that pipeline by demographics alone, or something more?

          Irritation at suggestions was actually what originally led me to invidious. I just wanted to watch a speech without hitting the “____ GETS DUNKED ON LIKE A TINY LITTLE BITCH” zone. Fuck me for trying to verify information.

        • r3df0x ✡️✝☪️
          link
          fedilink
          English
          38 months ago

          One thing to consider is that conservatives are likely paying for progressives to see their content, and geeks tend to have liberal views and follow the harm principle without many conditions.

          Otherwise, it really shows the demographics of the people who play Warhammer. Before my sister transitioned, she played Warhammer and was a socialist but had a lot of really wehraboo interests. She has been talking about getting back into it, but she passes really well and imagines how it would go with the neckbeards.

      • @BeMoreCareful@lemmy.world
        link
        fedilink
        English
        138 months ago

        YouTube is really bad about trying to show you right wing crap. It’s overwhelming. The shorts are even worse. Every few minutes there’s some new suggestion for some stuff that is way out of the norm.

        Tiktok doesn’t have this problem and is being attacked by politicians?

      • Alien Nathan Edward
        link
        fedilink
        English
        88 months ago

        it legit took youtube’s autoplay about half an hour after I searched “counting macros” to bring me to american monarchist content

    • Kühe sind toll
      link
      fedilink
      English
      118 months ago

      “But freedom of speech”

      If that speech causes harm, like convincing a teenager that walking into a grocery store and gunning people down is a good idea, you don’t deserve to have that speech.

      In Germany we have a very good rule for this (it’s not written down, but it’s something you can usually count on): your freedom ends where it violates the freedom of others. Examples: everyone has the right to live a healthy life, and everyone has the right to walk wherever they want. If I now use my right to walk wherever I want and cause a car accident in which people get hurt (and it was only my fault), my freedom violated the injured person’s right to live a healthy life. That’s not freedom.

      • @Syringe@lemmy.world
        link
        fedilink
        English
        78 months ago

        In Canada, they have an idea called “right to peace”. It means that you can’t stand outside of an abortion clinic and scream at people because your right to free speech doesn’t exceed that person’s right to peace.

        I don’t know if that’s 100% how it works so someone can sort me out, but I kind of liked that idea

    • @SuperSaiyanSwag@lemmy.zip
      link
      fedilink
      English
      68 months ago

      This may seem baseless, but I have seen this from years of experience in online forums. You don’t have to take it seriously, but maybe you can relate. We have seen time and time again that if there is no moderation then the shit floats to the top. The reason being that when people can’t post something creative or fun, but they still want the attention, they will post negativity. It’s the loud minority, but it’s a very dedicated loud minority. Let’s say we have 5 people and 4 of them are very creative and funny, but 1 of them complains all the time. If they all post to the same community, there is a very good chance that the one negative person will make a lot more posts than the 4 creative types.

      • Kaity
        link
        fedilink
        English
        18 months ago

        Oh absolutely, and making something creative takes days, weeks, months.

        Drama, complaining, conspiracy theorizing, and hate videos take only a few minutes more to make than the video itself lasts.

    • @driving_crooner@lemmy.eco.br
      link
      fedilink
      English
      58 months ago

      What about YouTube? They actually paid those people to spread their sick ideas, making the world a worse place and getting rich while doing it.

      • Phanatik
        link
        fedilink
        18 months ago

        YouTube will actually take action and has done in most instances. I won’t say they’re the fastest but they do kick people off the platform if they deem them high risk.

  • nomad
    link
    fedilink
    English
    698 months ago

    Nice, now do all religions and churches next

  • @Socsa@sh.itjust.works
    link
    fedilink
    English
    688 months ago

    Please let me know if you want me to testify that reddit actively protected white supremacist communities and even banned users who engaged in direct activism against these communities

    • FenrirIII
      link
      fedilink
      English
      378 months ago

      I was banned for activism against genocide. Reddit is a shithole.

      • @misspacific@lemmy.blahaj.zone
        link
        fedilink
        English
        118 months ago

        i was banned for similar reasons.

        seems like a lot of mods just have the ability to say whatever about whoever and the admins just nuke any account they target.

        • @Ragnarok314159@sopuli.xyz
          link
          fedilink
          English
          28 months ago

          I have noticed a massive drop in the quality of posting in Reddit over the last year. It was on a decline, but there was a massive drop off.

    It’s anecdotal, matching what I have read on Lemmy, but a lot of high-karma accounts have been nuked due to mods and admins being ridiculously overzealous in handing out permabans.

  • PorkSoda
    link
    fedilink
    English
    67
    edit-2
    8 months ago

    Back when I was on reddit, I subscribed to about 120 subreddits. Starting a couple years ago though, I noticed that my front page really only showed content for 15-20 subreddits at a time and it was heavily weighted towards recent visits and interactions.

    For example, if I hadn’t visited r/3DPrinting in a couple weeks, it slowly faded from my front page until it disappeared all together. It was so bad that I ended up writing a browser automation script to visit all 120 of my subreddits at night and click the top link. This ended up giving me a more balanced front page that mixed in all of my subreddits and interests.

    My point is these algorithms are fucking toxic. They’re focused 100% on increasing time on page and interaction with zero consideration for side effects. I would love to see social media algorithms required by law to be open source. We have a public interest in knowing how we’re being manipulated.
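    The nightly workaround described above can be sketched roughly like this. The original script wasn’t shared, so the subreddit names, the choice of Playwright for browser automation, and the link selector are all assumptions:

    ```python
    # Sketch of a nightly "visit every subscribed subreddit and click the top link"
    # script, to counteract recency-weighted front-page filtering.
    # Subreddit list is hypothetical; selector and tooling are guesses.

    SUBREDDITS = ["3Dprinting", "woodworking", "AskHistorians"]  # hypothetical list

    def subreddit_top_urls(names, period="day"):
        """Build the 'top posts' URL for each subscribed subreddit."""
        return [f"https://old.reddit.com/r/{name}/top/?t={period}" for name in names]

    def visit_all_nightly(names):
        # Requires: pip install playwright && playwright install chromium
        from playwright.sync_api import sync_playwright  # assumed tooling choice
        with sync_playwright() as p:
            browser = p.chromium.launch(headless=True)
            page = browser.new_page()
            for url in subreddit_top_urls(names):
                page.goto(url)                  # register a recent "visit"
                page.click("a.title >> nth=0")  # click the top link (selector is a guess)
            browser.close()

    # Scheduled nightly, e.g. via cron: visit_all_nightly(SUBREDDITS)
    ```

    Run on a schedule, every subreddit counts as "recently visited," so the weighting stops burying the quiet ones.
    
    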

    • @Fedizen@lemmy.world
      link
      fedilink
      English
      158 months ago

      I used the Google News phone widget years ago and clicked on a giant-asteroid article, and for whatever reason my entire feed became asteroid/meteor articles. It’s also just such a dumb way to populate feeds.

  • @skozzii@lemmy.ca
    link
    fedilink
    English
    648 months ago

    YouTube feeds me so much right wing bullshit I’m constantly marking it as not interested. It’s a definite problem.

    • @Duamerthrax@lemmy.world
      link
      fedilink
      English
      138 months ago

      It’s amazing how often I get a video from some right-wing source suggested to me complaining about censorship and being buried by YouTube. I ended up installing a third-party channel blocker to deal with it.

    • @afraid_of_zombies@lemmy.world
      link
      fedilink
      English
      118 months ago

      I can’t prove that they were related but I used to report all conservative ads (Hillsdale Epoch times etc) to Google with all caps messages how I was going to start calling the advertisers directly and yell at them for the ads, about 2-3 days after I started doing that the ads stopped.

      I would love for other people to start doing this to confirm that it works and to be free of the ads.

    • @S_H_K@lemmy.dbzer0.com
      link
      fedilink
      English
      78 months ago

      It’s fucking insane how much that happens. I stopped using Instagram for that reason; at least YT listened to my “not interested” choices. I also have ReVanced so IDK what ads it would shoot at me.

    • @CaptPretentious@lemmy.world
      link
      fedilink
      English
      38 months ago

      YouTube started feeding me that stuff too. Weirdly once I started reporting all of them as misinformation they stop showing up for some reason…

  • @Krudler@lemmy.world
    link
    fedilink
    English
    50
    edit-2
    8 months ago

    I just would like to show something about Reddit. Below is a post I made about how Reddit was literally harassing and specifically targeting me, after I let slip in a comment one day that I was sober - I had previously never made such a comment because my sobriety journey was personal, and I never wanted to define myself or pigeonhole myself as a “recovering person”.

    I reported the recommended subs and ads to Reddit Admins multiple times and was told there was nothing they could do about it.

    I posted a screenshot to DangerousDesign and it flew up to like 5K+ votes in like 30 minutes before admins removed it. I later reposted it to AssholeDesign where it nestled into 2K+ votes before shadow-vanishing.

    Yes, Reddit and similar are definitely responsible for a lot of suffering and pain at the expense of humans in the pursuit of profit. After it blew up and front-paged, “magically” my home page didn’t have booze-related ads/subs/recs any more! What a total mystery how that happened /s

    The post in question, and a perfect “outing” of how Reddit continually tracks and tailors the User Experience specifically to exploit human frailty for their own gains.

    Edit: Oh and the hilarious part that many people won’t let go (when shown this) is that it says it’s based on my activity in the Drunk reddit which I had never once been to, commented in, posted in, or was even aware of. So that just makes it worse.

    • @mlg@lemmy.world
      link
      fedilink
      English
      188 months ago

      It’s not Reddit if posts don’t get nuked or shadowbanned by literal sitewide admins

      • @Krudler@lemmy.world
        link
        fedilink
        English
        98 months ago

        Yes I was advised in the removal notice that it had been removed by the Reddit Administrators so that they could keep Reddit “safe”.

        I guess their idea of “safe” isn’t 4+ million users going into their privacy panel and turning off exploitative sub recommendations.

        Idk though I’m just a humble bird lawyer.

    • @KairuByte@lemmy.dbzer0.com
      link
      fedilink
      English
      8
      edit-2
      8 months ago

      Yeah this happens a lot more than people think. I used to work at a hotel, and when the large sobriety group got together yearly, they changed bar hours from the normal hours, to as close to 24/7 as they could legally get. They also raised the prices on alcohol.

  • @ristoril_zip@lemmy.zip
    link
    fedilink
    English
    458 months ago

    “Noooo it’s our algorithm we can’t be held liable for the program we made specifically to discover what people find a little interesting and keep feeding it to them!”

    • @RagingRobot@lemmy.world
      link
      fedilink
      English
      168 months ago

      I wonder if you built a social media site where the main feature was that the algorithm just showed you things in sequential order like in the old days, would it be popular

      • @RaoulDook@lemmy.world
        link
        fedilink
        English
        148 months ago

        I enjoy using Lemmy mostly that way, just sorting the feed by new / hot / whatever and looking at new posts of random shit. Much more entertaining than video-spamming bullshit.

      • @Hillock@feddit.de
        link
        fedilink
        English
        118 months ago

        No, there is too much content for that nowadays. YouTube has over 3 million new videos each day. Facebook, TikTok, and Instagram also have ridiculous amounts of new posts every day. Browsing Reddit on New was a terrible experience on r/all or even many of the bigger subs. Even on the fediverse, sorting by new is not enjoyable. You are swarmed with reposts and content that’s entirely uninteresting to you.

        It works in smaller communities but there it isn’t really necessary. You usually have an overview of all the content anyhow and it doesn’t matter how it’s ordered.

        Any social media that plans on scaling up needs a more advanced system.
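        The trade-off being argued here, chronological versus engagement-weighted ranking, can be shown with a toy example. The `Post` fields and the scoring formula are invented for illustration and are not any platform’s real algorithm:

        ```python
        # Toy comparison: a chronological feed vs. an engagement-weighted one.
        from dataclasses import dataclass

        @dataclass
        class Post:
            title: str
            age_hours: float
            predicted_watch_time: float  # what the platform thinks keeps you on-site

        def chronological(posts):
            # Newest first: no inference about the viewer at all.
            return sorted(posts, key=lambda p: p.age_hours)

        def engagement_ranked(posts):
            # Favor whatever maximizes expected time-on-site, lightly decayed by age.
            return sorted(posts,
                          key=lambda p: p.predicted_watch_time / (1 + p.age_hours),
                          reverse=True)

        posts = [
            Post("calm hobby tutorial", 1.0, 2.0),
            Post("outrage bait", 5.0, 30.0),
        ]
        print([p.title for p in chronological(posts)])      # newer post first
        print([p.title for p in engagement_ranked(posts)])  # outrage bait floats up
        ```

        Even in this two-post sketch, any post that inflates predicted watch time outranks recency, which is the dynamic the thread is describing.
        
        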

      • The Quuuuuill
        link
        fedilink
        English
        28 months ago

        People complain about Mastodon’s lack of algorithms a lot. It’s part of how Misskey, Iceshrimp, and Catodon came to be

    • @John_McMurray@lemmy.world
      link
      fedilink
      English
      -28 months ago

      I find it very weird to be living in a country legalizing drugs and assisted suicide (even for depression) but simultaneously trying to severely curtail free speech, media freedom and passing legislation to jail people at risk of breaking the law who don’t meet the “conspiracy to commit” threshold.

  • @The_Tired_Horizon@lemmy.world
    link
    fedilink
    English
    258 months ago

    I gave up reporting on major sites where I saw abuse. Stuff that, if you said it in public, witnessed by others, would get you investigated. Twitter was also bad for responding to reports with “this doesn’t break our rules” when a) it clearly did and b) it probably broke a few laws too.

    • Alien Nathan Edward
      link
      fedilink
      English
      20
      edit-2
      8 months ago

      I gave up after I was told that people DMing me photographs of people committing suicide was not harassment but me referencing Yo La Tengo’s album “I Am Not Afraid Of You And I Will Beat Your Ass” was worthy of a 30 day ban

      • Panda (he/him)
        link
        fedilink
        English
        88 months ago

        I remember one time somebody tweeted asking what the third track off Whole Lotta Red was, and I watched at least 50 people get perma’d before my eyes.

        The third track is named Stop Breathing.

        • LiveLM
          link
          fedilink
          English
          28 months ago

          I TAKE MY SHIRT OFF AND ALL THE HOES STOP BREATHIN’ accessing their Twitter accounts WHEH? 🧛‍♂️🦇🩸

      • @The_Tired_Horizon@lemmy.world
        link
        fedilink
        English
        78 months ago

        On youtube I had a persistent one who only stopped threatening to track me down and kill me (for a road safety video) when I posted the address of a local police station and said “pop in, any time!”

      • Kühe sind toll
        link
        fedilink
        English
        08 months ago

        That’s true, but a lot of things are illegal everywhere. Sexual harassment or death threats will get you a lawsuit in probably every single country in the world.

        • prole
          link
          fedilink
          English
          4
          edit-2
          8 months ago

          Lawsuits are for civil cases. If someone breaks a law, they’re charged by authorities at their discretion.

  • Jaysyn
    link
    fedilink
    25
    edit-2
    8 months ago

    Good.

    There should be no quarter for fascists, violent racists, or their enablers.

    Conspiracy for cash isn’t a free speech issue.

  • @Fedizen@lemmy.world
    link
    fedilink
    English
    228 months ago

    media: Video games cause violence

    media: Weird music causes violence.

    media: Social media could never cause violence this is censorship (also we don’t want to pay moderators)

  • Scott
    link
    fedilink
    English
    228 months ago

    Excuse me what in the Kentucky fried fuck?

    As much as everyone says fuck these big guys all day this hurts everyone.

    • athos77
      link
      fedilink
      22
      edit-2
      8 months ago

      I agree with you, but … I was on reddit since the Digg exodus. It always had its bad side (violentacrez, jailbait, etc), but it got so much worse after GamerGate/Ellen Pao - the misogyny became weaponized. And then the alt-right moved in, deliberately trying to radicalize people, and we worked so. fucking. hard to keep their voices out of our subreddits. And we kept reporting users and other subreddits that were breaking rules, promoting violence and hatred, and all fucking spez would do is shrug and say, “hey it’s a free speech issue”, which was somewhere between “hey, I agree with those guys” and “nah, I can’t be bothered”.

      So it’s not like this was something reddit wasn’t aware of (I’m not on Facebook or YouTube). They were warned, repeatedly, vehemently, starting all the way back in 2014, that something was going wrong with their platform and they needed to do something. And they deliberately and repeatedly chose to ignore it, all the way up to the summer of 2021. Seven fucking years of warnings they ignored, from a massive range of users and moderators, including some of the top moderators on the site. And all reddit would do is shrug its shoulders and say, “hey, free speech!” like it was a magic wand, and very occasionally try to defend itself by quoting its ‘hate speech policy’, which they invoke with the same regular repetitiveness and ‘thoughts and prayers’ inaction as a school shooting brings. In fact, they did it in this very article:

      In a statement to CNN, Reddit said, “Hate and violence have no place on Reddit. Our sitewide policies explicitly prohibit content that promotes hate based on identity or vulnerability, as well as content that encourages, glorifies, incites, or calls for violence or physical harm against an individual or group of people. We are constantly evaluating ways to improve our detection and removal of this content, including through enhanced image-hashing systems, and we will continue to review the communities on our platform to ensure they are upholding our rules.”

      As someone who modded for a number of years, that’s just bullshit.

      Edit: fuck spez.

      • Binthinkin
        link
        fedilink
        2
        edit-2
        8 months ago

        Yep that’s how the Nazis work on every site. The question is who lets them on these sites so easily to do this work on society. And why do sites fight for them to stay? Are Nazis high up in government? Is it the wealthy? Probably something like that.

        • BirdEnjoyer
          link
          fedilink
          18 months ago

          Part of the reason they get so high up on nerd sites (And Reddit at least started as a nerd site) is that they hunger for power, and the right people are too shy to seek power themselves.

          This would all be greatly relieved if communities nominated members themselves: the type of folks who would mostly only consider the position if asked, or if they were write-ins.

          People with the capacity who are overlooked because they maybe lack the ego or self-confidence to seek such power.

          This works especially well in smaller communities under 4K users or so, which kinda falls apart in our Big Internet world, sadly…

  • @Not_mikey@slrpnk.net
    link
    fedilink
    English
    188 months ago

    Sweet, I’m sure this won’t be used by AIPAC to sue all the tech companies for causing October 7th somehow, like UNRWA, and force them to shut down or suppress all talk on Palestine. People hearing about a genocide happening might radicalize them; maybe we could get away with allowing discussion, but better safe than sorry, to the banned words list it goes.

    This isn’t going to end in the tech companies hiring a team of skilled moderators who understand the nuance between passion and radical intention trying to preserve a safe space for political discussion, that costs money. This is going to end up with a dictionary of banned and suppressed words.

      • @Alpha71@lemmy.world
        link
        fedilink
        English
        188 months ago

        It’s already out there. For example, you can’t use the words “suicide” or “rape” or “murder” on YouTube, TikTok, etc., even when the discussion is clearly about trying to educate people. Heck, you can’t even mention Onlyfans on Twitch…

        • Makhno
          link
          fedilink
          English
          128 months ago

          Heck, you can’t even mention Onlyfans on Twitch…

          They don’t like users mentioning their direct competition