
Pentagon reviews psychological operations amid Facebook, Twitter complaints


Originally reported on www.washingtonpost.com by Ellen Nakashima.


The Pentagon has ordered a sweeping audit of how it conducts clandestine information warfare after major social media companies identified and took offline fake accounts suspected of being run by the U.S. military in violation of the platforms’ rules.

Colin Kahl, the undersecretary of defense for policy, last week instructed the military commands that engage in psychological operations online to provide a full accounting of their activities by next month after the White House and some federal agencies expressed mounting concerns over the Defense Department’s attempted manipulation of audiences overseas, according to several defense and administration officials familiar with the matter.

The takedowns in recent years by Twitter and Facebook of more than 150 bogus personas and media sites created in the United States were disclosed last month by internet researchers Graphika and the Stanford Internet Observatory. While the researchers did not attribute the sham accounts to the U.S. military, two officials familiar with the matter said that U.S. Central Command is among those whose activities are facing scrutiny. Like others interviewed for this report, they spoke on the condition of anonymity to discuss sensitive military operations.

The researchers did not specify when the takedowns occurred, but those familiar with the matter said they were within the past two or three years. Some were recent, they said, and involved posts from the summer that advanced anti-Russia narratives citing the Kremlin’s “imperialist” war in Ukraine and warning of the conflict’s direct impact on Central Asian countries. Significantly, they found that the pretend personas — employing tactics used by countries such as Russia and China — did not gain much traction, and that overt accounts actually attracted more followers.

Centcom, headquartered in Tampa, has purview over military operations across 21 countries in the Middle East, North Africa and Central and South Asia. A spokesman declined to comment.

Air Force Brig. Gen. Patrick Ryder, the Pentagon press secretary, said in a statement that the military’s information operations “support our national security priorities” and must be conducted in compliance with relevant laws and policies. “We are committed to enforcing those safeguards,” he said.

Spokespersons for Facebook and Twitter declined to comment.


According to the researchers’ report, the accounts taken down included a made-up Persian-language media site that shared content reposted from the U.S.-funded Voice of America Farsi and Radio Free Europe. Another, it said, was linked to a Twitter handle that in the past had claimed to operate on behalf of Centcom.

One fake account posted an inflammatory tweet claiming that relatives of deceased Afghan refugees had reported bodies being returned from Iran with missing organs, according to the report. The tweet linked to a video that was part of an article posted on a U.S.-military affiliated website.

Centcom has not commented on whether these accounts were created by its personnel or contractors. If the organ-harvesting tweet is shown to be Centcom’s, one defense official said, it would “absolutely be a violation of doctrine and training practices.”

Independent of the report, The Washington Post has learned that in 2020 Facebook disabled fictitious personas created by Centcom to counter disinformation spread by China suggesting the coronavirus responsible for covid-19 was created at a U.S. Army lab in Fort Detrick, Md., according to officials familiar with the matter. The pseudo profiles — active in Facebook groups that conversed in Arabic, Farsi and Urdu, the officials said — were used to amplify truthful information from the U.S. Centers for Disease Control and Prevention about the virus’s origin in China.

The U.S. government’s use of ersatz social media accounts, though authorized by law and policy, has stirred controversy inside the Biden administration, with the White House pressing the Pentagon to clarify and justify its policies. The White House, agencies such as the State Department and even some officials within the Defense Department have been concerned that the policies are too broad, allowing leeway for tactics that, even if used to spread truthful information, risk eroding U.S. credibility, several U.S. officials said.

“Our adversaries are absolutely operating in the information domain,” said a second senior defense official. “There are some who think we shouldn’t do anything clandestine in that space. Ceding an entire domain to an adversary would be unwise. But we need stronger policy guardrails.”

A spokeswoman for the National Security Council, which is part of the White House, declined to comment.

Kahl disclosed his review at a virtual meeting convened by the National Security Council on Tuesday, saying he wants to know what types of operations have been carried out, who they’re targeting, what tools are being used, why military commanders have chosen those tactics, and how effective they have been, several officials said.

The message was essentially, “You have to justify to me why you’re doing these types of things,” the first defense official said.

Pentagon policy and doctrine discourage the military from peddling falsehoods, but there are no specific rules mandating the use of truthful information for psychological operations. For instance, the military sometimes employs fiction and satire for persuasion purposes, but generally the messages are supposed to stick to facts, officials said.

In 2020, officials at Facebook and Twitter contacted the Pentagon to raise concerns about the phony accounts they were having to remove, suspecting they were associated with the military. That summer, David Agranovich, Facebook’s director for global threat disruption, spoke to Christopher C. Miller, then assistant director for Special Operations/Low Intensity Conflict, which oversees influence operations policy, warning him that if Facebook could sniff them out, so could U.S. adversaries, several people familiar with the conversation said.

“His point,” one person said, “was ‘Guys, you got caught. That’s a problem.’ ”

Before Miller could take action, he was tapped to head a different agency — the National Counterterrorism Center. Then the November election happened and time ran out for the Trump administration to address the matter, although Miller did spend the last few weeks of Donald Trump’s presidency serving as acting defense secretary.

With the rise of Russia and China as strategic competitors, military commanders have wanted to fight back, including online. And Congress supported that. Frustrated with perceived legal obstacles to the Defense Department’s ability to conduct clandestine activities in cyberspace, Congress in late 2019 passed a law affirming that the military could conduct operations in the “information environment” to defend the United States and to push back against foreign disinformation aimed at undermining its interests. The measure, known as Section 1631, allows the military to carry out clandestine psychological operations without crossing what the CIA has claimed as its covert authority, alleviating some of the friction that had hindered such operations previously.

“Combatant commanders got really excited,” recalled the first defense official. “They were very eager to utilize these new authorities. The defense contractors were equally eager to land lucrative classified contracts to enable clandestine influence operations.”

At the same time, the official said, military leaders were not trained to oversee “technically complex operations conducted by contractors” or coordinate such activities with other stakeholders elsewhere in the U.S. government.

Last year, with a new administration in place, Facebook’s Agranovich tried again. This time he took his complaint to President Biden’s deputy national security adviser for cyber, Anne Neuberger. Agranovich, who had worked at the NSC under Trump, told Neuberger that Facebook was taking down fake accounts because they violated the company’s terms of service, according to people familiar with the exchange.

The accounts were easily detected by Facebook, which since Russia’s campaign to interfere in the 2016 presidential election has enhanced its ability to identify mock personas and sites. In some cases, the company had removed profiles, which appeared to be associated with the military, that promoted information deemed by fact-checkers to be false, said a person familiar with the matter.


Agranovich also spoke to officials at the Pentagon. His message was: “We know what DOD is doing. It violates our policies. We will enforce our policies” and so “DOD should knock it off,” said a U.S. official briefed on the matter.

In response to White House concerns, Kahl ordered a review of Military Information Support Operations, or MISO, the Pentagon’s moniker for psychological operations. A draft concluded that policies, training and oversight all needed tightening, and that coordination with other agencies, such as the State Department and the CIA, needed strengthening, according to officials.

The review also found that while there were cases in which fictitious information was pushed by the military, they were the result of inadequate oversight of contractors and personnel training — not systemic problems, officials said.

Pentagon leadership did little with the review, two officials said, before Graphika and Stanford published their report on Aug. 24, which elicited a flurry of news coverage and questions for the military.

The State Department and CIA have been perturbed by the military’s use of clandestine tactics. Officials at State have admonished the Defense Department, “Hey, don’t amplify our policies using fake personas, because we don’t want to be seen as creating false grass-roots efforts,” the first defense official said.

One diplomat put it this way: “Generally speaking, we shouldn’t be employing the same kind of tactics that our adversaries are using because the bottom line is we have the moral high ground. We are a society that is built on a certain set of values. We promote those values around the world and when we use tactics like those, it just undermines our argument about who we are.”

Psychological operations to promote U.S. narratives overseas are nothing new in the military, but the popularity of Western social media across the globe has led to an expansion of tactics, including the use of artificial personas and images sometimes called “deep fakes.” The logic is that views expressed by what appears to be, say, an Afghan woman or an Iranian student might be more persuasive than if they were openly pushed by the U.S. government.


The majority of the military’s influence operations are overt, promoting U.S. policies in the Middle East, Asia and elsewhere under its own name, officials said. And there are valid reasons to use clandestine tactics, such as trying to infiltrate a closed terrorist chat group, they said.

A key issue for senior policymakers now is determining whether the military’s execution of clandestine influence operations is delivering results. “Is the juice worth the squeeze? Does our approach really have the potential for the return on investment we hoped or is it just causing more challenges?” one person familiar with the debate said.

The report by Graphika and Stanford suggests that the clandestine activity did not have much impact. It noted that the “vast majority of posts and tweets” reviewed received “no more than a handful of likes or retweets,” and only 19 percent of the concocted accounts had more than 1,000 followers. “Tellingly,” the report stated, “the two most-followed assets in the data provided by Twitter were overt accounts that publicly declared a connection to the U.S. military.”

Clandestine influence operations have a role in support of military operations, but it should be a narrow one with “intrusive oversight” by military and civilian leadership, said Michael Lumpkin, a former senior Pentagon official handling information operations policy and a former head of the State Department’s Global Engagement Center. “Otherwise, we risk making more enemies than friends.”

Alice Crites contributed to this report.

