The very first place everybody’s minds need to go here is: was it Russia?
Yeah, it could be MAGA, it wouldn’t be surprising. Which Russia knows. And is 100% going to try to exploit.
This should be the default assumption, until proven otherwise.
“We’ve said it before and we’ll say it again – there is no place for hate in a democracy.”
Trump is getting rid of that democracy problem
What’s really scary about this is that someone apparently has a list of black people
Now wait until you hear what happened with 23andMe.
And I’m both amazed and quite worried that nothing has come of it yet.
Yet. Give them time, they’ll need to figure out where it can be best integrated into Project 2025.
That’s why I’m worried.
Google has that list, and it’s for sale. Regardless of whether you tell them, they know.
Google doesn’t sell its private data collection. It uses it to target ads, so it’s the most valuable thing the company has (or, as they’d call it, their “unfair advantage”), and selling it directly is out of the question. But there are multiple data brokers with similarly comprehensive lists who will happily sell them to anyone.
They sell the targeting; from there it’s simple for advertisers to compile the list from the tracking cookies left behind.
Given the state of data harvesting I imagine purchasing a list of phone numbers associated with a given demographic is trivial.
You could get a high school kid to build this with available data.
Nothing to hide, nothing to fear. Till suddenly you’ve got something to hide. I hate to say it, but I fucking told you so.
Not that hard to buy people’s data these days.
No kidding. When I heard about this, I asked: how the fuck do they even know it’s a black person?
I think there is a decent chance that this is Russia trying to divide America still. That is Putin’s long term game because he knows he can’t win an actual war with all of America.
Yep.
::rant incoming::
Their psyops game is horrendously effective, and it makes me furious.
And because it’s so effective, it makes sense for them to continue it, to continue to fracture the English-speaking world.
(It’s also cheap compared to manufacturing weapons: get a bunch of laptops, hire a bunch of trolls at a slightly-better-than-average wage in the local currency for a job that ISN’T hard labor and probably seems cushy, put them in a room, and have them work from a script. Very cheap compared to actually designing and manufacturing real weapons, or running real wartime logistics. Cheap and effective = huge incentive to continue indefinitely.)
The goal I’ve noticed is to make no place online safe. Poison everything. Texts in this particular case, to make PoC in America and elsewhere apt to vent and lash out under the pressure, but they also poison forums, social media of any and all stripes, etc. Divide and conquer: anyone, everyone, everywhere.
Have a hobby? They’ll slide into the hobby discussion sites and start flame wars.
(I saw this happen a lot on Reddit’s Star Trek sub. Why was that sub a target? My suspicion is that it’s because Star Trek is a comfort show for a lot of people over generations and across nationalities in the West, and also acts as a way to promote Western values of tolerance and thoughtfulness and curiosity, so they want to poison the comforting retreat people go to when they can’t stand the overt hopelessness of the political or news subs.)
It happens with all sorts of loves and hobbies too. Have a favorite team, a favorite book, a favorite movie and want to just geek out over it? They slide into that and start shit.
And it’s really insidious sometimes…they’ll take an already-hot topic and start up a new thread with wording that makes them look clueless but not aggressive. So because they’re not obviously an asshole, people hop in giving them the benefit of the doubt and the flame war on (whatever topic specific to that interest) ignites again.
Unless the mods know how to spot that and moderate accordingly (which is fairly rare). But even then, the problem is that if the mods do their jobs, a truly clueless newbie coming in won’t know the history of this or that topic and will accidentally get hit by a sudden banhammer without knowing why, which in its own way also starts shit, because the real person caught in the net gets their feelings hurt. So it’s a catch-22: with bad mods, an online social space is easily manipulated into a cesspool, but good mods sometimes accidentally catch a real person in their net too, so shit STILL goes down and poisons the well.
Nations running psyops online play both sides, too. They won’t always start shit by posting a far-right viewpoint; they’ll pick a lefty viewpoint as well (it costs them nothing to lie), but they’ll speckle it with enough “tinder” that flames still ignite. Or they’ll have multiple accounts responding to each other.
The only reason I notice this, btw, is because I was a geek in fandoms BEFORE this sort of manipulation started, so I remember what a “legit” forum SHOULD look like. Like, there were always trolls and people with shitty social skills…but it was a very different type of trolling than this psyops shit we see now, because real people with real egos and desires and motivations were behind it. It had a different rise and fall, a different pattern. I guess it was more like real life–with allowances that people will say things anonymously that they’d never say face to face?
Younger folks who have never known a “good” and sane discussion forum think the toxicity and hopelessness online everywhere in every topic is NORMAL. But it’s not.
(When Lemmy was unknown, it was more like the forums of old, but now that it’s on various antagonists’ radar there’s been an uptick of bad actors starting shit in comments.)
Are there any ways we can fight this? In my head “guerilla counter psyops” sounds really cool…
But like is the move to ignore? Point out what they’re doing? Verbal judo? Report and move on?
Lemmy is still fine if you block hexbear, grad, and .ml. .ee also gets infested sometimes but the mods do their best.
This should be required reading before being allowed on lemmy.
Get the word out more, we need to remind people or explain to people.
Honestly we need what you wrote, but way more succinct, in a form we can just copy and paste onto every post and news article we see that’s trying to divide us.
Computers to the rescue. AI succinctification:
Here’s a distilled version of the article:
Russian Psyops: Poisoning Online Communities
Russia runs a highly effective online psyops campaign, using cheap, widely deployed methods to manipulate public opinion and sow discord. The goal is to create an environment where no online space feels safe or trustworthy.
Tactics:
- Creating fake accounts to spread false information and ignite conflicts
- Targeting sensitive topics like race, politics, and hobbies to exploit emotions and provoke reactions
- Posting “clueless” comments to elicit responses from genuine users, then fueling the resulting flame wars
- Playing both sides by promoting opposing views with a mix of truth and disinformation
Consequences:
- Fracturing English-speaking communities and eroding trust online
- Normalizing toxic behavior and making it seem like “just how things are”
- Disrupting healthy discussions and debates, creating an atmosphere of hopelessness and cynicism
The Threat:
Russia’s online psyops campaign is a real and significant threat to global democracy and community cohesion. By recognizing this threat and taking steps to mitigate its effects, we can work towards preserving the integrity and safety of online spaces.
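(For anyone curious how that step could be reproduced: below is a minimal sketch of automating the succinctification, assuming the OpenAI Python client is available. The model name, prompt, and file name are placeholders, not necessarily what was actually used above.)

```python
# Minimal sketch: distill a long forum comment into a short, shareable digest.
# Assumes the openai package (v1+) is installed and OPENAI_API_KEY is set.
from openai import OpenAI

client = OpenAI()  # picks up OPENAI_API_KEY from the environment

def succinctify(long_comment: str) -> str:
    """Ask a chat model for a short digest with a title and a few bullet points."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder; any capable chat model works
        messages=[
            {
                "role": "system",
                "content": (
                    "Summarize the following forum comment into a short, "
                    "plain-language digest with a title and a few bullet points."
                ),
            },
            {"role": "user", "content": long_comment},
        ],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    # rant.txt is a hypothetical file holding the long comment to distill.
    with open("rant.txt") as f:
        print(succinctify(f.read()))
```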
Although that summary may be effective for some, the format is too dry and science-y to reach the people who most need to hear it.
We need the power of a human being’s impassioned words, presented in the context of a natural conversation, converted into a meme-able format. The simplest way would be to copy/paste the original comment and start sharing it on other platforms. If there is a way to make the message more succinct, without losing that crucial human touch that inspires people to relate to the message, that would be ideal.
Trump winning is also Putin’s long term game, for all we know
It’s not his long term plan, it was a short term plan to help his long term plans. Trump is a pawn
Oh that’s what I meant - a step in his long term plan, not the ultimate goal, of course
Not even in question.
It’s true.
The question is: how do we look after our brothers and sisters?
I don’t care who’s sending it, it’s not ok.
The South is celebrating; they’re finally going to get justice for the country daring to elect a black man over them.
Again
No, he didn’t finish the job the first time.
This time they’re drooling.
“The FBI is aware of the offensive and racist text messages sent to individuals around the country and is in contact with the Justice Department and other federal authorities on the matter,” the FBI said on Thursday.
FBI Link: https://www.fbi.gov/news/press-releases/fbi-statement-on-offensive-and-racist-text-messages
Brian Hughes, of the Trump campaign, told NBC that they would take legal action “if we can find the origin of these messages which promote this kind of ugliness in our name.”
“President Trump built a diverse and broad coalition of support, with voters of all races and backgrounds,” he said in a statement to NBC. “The result was a landslide victory for his commonsense mandate for change. This will result in a second term that is beneficial to every working man and woman in our nation.”
Website scheduled for deactivation on January 21, 2025.
Why have FBI when you can have the USSS?
Where do you see that?
Sorry. I should’ve tagged for sarcasm.
Trump will be sworn in on January 20th.
… and if not in the second term, we’ll change the term limits so we can best follow this up.
What the FBI isn’t saying is they’re the ones who gave the suspects the list.
coincidence? hell no!
Yes, everyone got insane amounts of text messages this year. I think they may have been part of voter fraud schemes.
Also, did anyone see that Joe Rogan asserts Elon Musk had an app with election results before anyone else? https://grabien.com/story.php?id=499986