SAN FRANCISCO — A Kremlin-backed group of internet trolls that meddled in the 2016 presidential election appeared to be trying to influence American voters using Facebook days ahead of the midterm elections, the social network said Tuesday night.
Just hours after most of the polls had closed, the company blocked more than 100 Facebook and Instagram accounts “due to concerns that they were linked to the Russia-based Internet Research Agency,” Nathaniel Gleicher, Facebook’s head of cybersecurity policy, said in a statement.
In February, more than a dozen members of the group, also called the I.R.A., were indicted in connection with a far-reaching conspiracy to illegally influence the 2016 election through elaborate social media campaigns on Facebook, Instagram, YouTube and Twitter.
“This is a timely reminder that these bad actors won’t give up — and why it’s so important we work with the U.S. government and other technology companies to stay ahead,” Mr. Gleicher said.
It is also a reminder that despite Facebook’s efforts to stem election meddling, the platform continues to be a battleground for mischief makers looking to spread disinformation to the widest possible audience.
The accounts recently removed by Facebook were identified following a tip from law enforcement agencies. It was the first time the company had publicly acknowledged acting on an influence campaign based on intelligence received from a government agency.
In a joint statement issued with the F.B.I., Facebook pledged to continue working with government agencies.
“Americans should be aware that foreign actors — and Russia in particular — continue to try to influence public sentiment and voter perceptions,” the statement said. “The United States will not tolerate foreign interference in our elections from Russia, China, Iran or other nations.”
While Facebook linked the new influence network to the Internet Research Agency, the company stopped short of fully tying the Russian troll farm to the activity. In the statement, Mr. Gleicher said that a website that claimed to be affiliated with the group appeared to take responsibility for some of the accounts Facebook had removed.
Facebook took down the accounts on Monday, a day before it provided more details. The Facebook accounts were largely communicating in French and Russian, while the Instagram accounts were aimed at an English-language audience, Mr. Gleicher said in a blog post.
The company has spent the past 18 months grappling with the scope of its disinformation problem, and it has spent millions on additional resources and employees to deal with it. Mark Zuckerberg, the chief executive of Facebook, has compared the effort to an “arms race” between company security teams and groups trying to spread disinformation.
The Internet Research Agency has been linked to influence campaigns on social media since as far back as 2013. Based in St. Petersburg, Russia, the agency served as a kind of professional online trolling network, carrying out information operations intended to sway public opinion among the groups it targeted.
The group, which has been linked to the Kremlin, specializes in pumping out legions of Twitter bots, YouTube videos and Facebook posts under the guise of activism, often posing as both far-right and left-wing groups.
In 2014, the troll farm began targeting candidates in the 2016 United States presidential race, according to a February indictment naming 13 Russians linked to the agency. More than 126 million Americans were exposed to Russia-linked content through Facebook alone, the company said last year. The Internet Research Agency also uploaded more than 1,000 videos to YouTube and posted more than 131,000 Twitter messages.
Last month, the Justice Department announced it had uncovered another attempt by Russian state-backed actors to meddle in the midterm elections.
The criminal complaint charged Elena Alekseevna Khusyaynova, a Russian citizen, with managing a multimillion-dollar budget to “sow division and discord” in the United States ahead of Tuesday’s voting. Ms. Khusyaynova’s budget allowed her to buy websites, as well as Facebook and Instagram ads aimed at spreading divisive content.
Since the 2016 election, Facebook has hired 10,000 people to work on security, and created tools dedicated to spotting disinformation around major elections. Last month, it unveiled a “war room” at its Menlo Park, Calif., headquarters, where employees monitor the company’s networks for disinformation.
Facebook has announced a series of actions in recent months to remove active influence campaigns by foreign states.
In August, Facebook said it had found an influence operation that originated in Iran and Russia. And in October, it announced a second Iran-run network of Facebook and Instagram accounts that was followed by approximately one million people in the United States and Britain.