Russia is the top source of disinformation on Facebook, the company said

Facebook's report, published Wednesday, shows how foreign and domestic covert influence operators have changed their tactics and grown more sophisticated in response to efforts by social media companies to remove fake accounts and influence operations.

Facebook has removed more than 150 networks engaged in coordinated inauthentic behavior since 2017, the report said. Twenty-seven of the networks were linked to Russia and 23 to Iran. Nine originated within the United States.

The United States is the top target of foreign influence campaigns, according to Facebook's report, with various actors directing 26 such efforts at the country from 2017 to 2020. (Ukraine was the next most-targeted country.)

But it was American domestic actors, not foreign ones, who were increasingly responsible for sowing division during the 2020 election season. In the run-up to that election, Facebook removed as many US-based networks engaged in so-called coordinated inauthentic behavior (CIB) targeting the United States as it did Russian or Iranian networks, the company reported.

“Most notably, one of the CIB networks we found was operated by Rally Forge, a US-based marketing firm working on behalf of its clients, including the Political Action Committee Turning Point USA,” the report said. “This campaign leveraged authentic communities and recruited a staff of teenagers to run fake and duplicate accounts that posed as unaffiliated voters to comment on news pages and the pages of political actors.”

That campaign was first reported by The Washington Post in September 2020. In a statement to the Post at the time, a spokesman for Turning Point described the effort as “not an anonymous troll farm in Russia” but “sincere political activism conducted by real people who passionately hold the beliefs they describe online.” At the time, the group declined to comment on a request from CNN.
Another US network, which Facebook announced it had removed in July 2020, was linked to Roger Stone, a friend and political adviser to former President Donald Trump. The network included more than 50 accounts, 50 pages and four Instagram accounts, and its reach covered more than 260,000 Facebook accounts and more than 60,000 Instagram accounts. (After Facebook's takedown, Stone shared news of his ban on the alternative social media site Parler, along with a statement: “We are now exposing just how deep and dirty the railroad job in my case really was, which is why they need to silence me. As they will soon learn, I cannot and will not be silenced.”)
In the wake of the 2016 US election, the presence of fake and misleading content on social media became a major issue for big tech platforms, including Facebook, Twitter and YouTube, following revelations about Russian attempts to meddle in the American democratic process. By posing as American voters, targeting voters with misleading digital advertising, creating false news stories and using other methods, foreign influence campaigns have sought to deepen divisions in the electorate.

The discovery of those campaigns has put intense political and regulatory pressure on Big Tech and has repeatedly raised questions about the industry's outsized power in politics and the wider economy. Many critics have since called for the breakup of major tech companies and for new laws governing how social media platforms moderate content on their websites.

Tech companies such as Facebook have responded by adding more content moderators and establishing new platform policies on inauthentic activity.

In a separate announcement on Wednesday, Facebook said it was extending penalties to individual Facebook users who repeatedly share content that the company's fact-checking partners have debunked as misinformation. Currently, when a user shares a post that contains debunked claims, Facebook's algorithm demotes that post in the news feed, making it less visible to other users. But under Wednesday's change, repeat offenders risk having all of their posts demoted.

Facebook says it has previously applied blanket account-level demotions to pages and groups that repeatedly share fact-checked misinformation, but Wednesday's announcement covers individual users for the first time. (The change does not apply to politicians' accounts, because political figures are exempt from Facebook's fact-checking program.)

But as Facebook has stepped up its moderation efforts, many covert influence operators have evolved their tactics, the report said. By building more tailored and targeted campaigns and by outsourcing their operations to third parties, threat actors are trying to evade Facebook's enforcement in an increasingly complex game of cat and mouse, the company said.

“So when you put together the cumulative effect of four years, what are the trends?” Ben Nimmo, a co-author of the report, wrote on Twitter on Wednesday. “More operators are trying, but more operators have been caught. The challenge is to stay ahead and keep catching them.”
