The United States Special Operations Command (SOCOM) is reportedly seeking to use deepfakes as part of its psychological operations, or “PSYOPs.” Deepfakes are computer-generated images, videos, or audio files designed to look and sound like real people. While deepfakes have been used for entertainment in recent years, their prospective use in PSYOPs is raising concerns about potential dangers.
According to an article by Defense One, SOCOM is seeking technologies that can “create, modify, or manipulate information” to support its psychological operations. One potential use for deepfakes would be to create false personas or misleading information to deceive an adversary or influence its behavior.
The Pentagon is also reportedly hard at work countering the foreign deepfake threat. According to a 2018 news report, the Defense Advanced Research Projects Agency (DARPA), the military’s tech research division, has spent tens of millions of dollars developing methods to detect deepfaked imagery. Similar efforts are underway throughout the Department of Defense.
Deepfakes have surfaced more recently during the Russian invasion of Ukraine, with fabricated videos appearing to show each country's president announcing a surrender.
Proponents of the use of deepfakes in PSYOPs argue that they could be a valuable tool for military operations, providing a way to disseminate disinformation without risking the lives of soldiers. They suggest that deepfakes could be used to create fake news reports, social media posts, and even audio recordings that would be difficult to distinguish from real ones.
However, others have raised concerns about the potential dangers of using deepfakes in this way. Experts warn that the use of deepfakes in PSYOPs could undermine public trust in information and exacerbate the already significant problem of disinformation.
“When it comes to disinformation, the Pentagon should not be fighting fire with fire,” Chris Meserole, head of the Brookings Institution’s Artificial Intelligence and Emerging Technology Initiative, told The Intercept. “At a time when digital propaganda is on the rise globally, the U.S. should be doing everything it can to strengthen democracy by building support for shared notions of truth and reality. Deepfakes do the opposite. By casting doubt on the credibility of all content and information, whether real or synthetic, they ultimately erode the foundation of democracy itself.”
Moreover, the use of deepfakes in military operations could have unintended consequences. If an adversary discovers that it has been tricked, the result could be a loss of trust and an escalation of tensions between nations. In addition, the use of deepfakes could be seen as running afoul of international norms against spreading false information in wartime.
There are also ethical concerns surrounding the use of deepfakes in PSYOPs. Creating false personas or manipulating information in this way could be seen as a form of deception or propaganda, which could violate principles of transparency and accountability.
While the United States military has not yet made any official statements on the use of deepfakes in PSYOPs, some experts believe that it is only a matter of time before this technology is incorporated into military operations. As such, there is a growing need for policymakers and military leaders to develop guidelines and regulations that address the potential risks and benefits of using deepfakes in warfare.
One potential solution could be to develop international norms around the use of deepfakes in warfare, similar to the Geneva Conventions, which govern the conduct of war. Such norms could include a prohibition on the use of deepfakes to spread false information, as well as guidelines for how to verify the authenticity of information in a conflict zone.
Overall, the use of deepfakes in PSYOPs is a complex issue with far-reaching implications. While their potential for deception and manipulation is cause for concern, there may also be legitimate uses for these technologies.
As the technology for creating deepfakes continues to advance, its use in military operations is likely to become more prevalent. It falls to policymakers and military leaders to develop responsible guidelines for its use and to mitigate the risks that come with it.
Sources:
Danielle Citron and Robert Chesney, “The Use of Deepfakes in Military Operations: Risks and Benefits,” Council on Foreign Relations, September 2021.
David Shedd, “Deepfakes and the Future of Warfare,” The Cipher Brief, September 2021.
Defense One, “SOCOM Wants to Use Deepfakes for Psychological Operations”: https://www.defenseone.com/technology/2022/03/socom-wants-use-deepfakes-psychological-operations/372559/
The Intercept, “The U.S. Special Forces Want To Use Deepfakes For Psy-Ops”: https://theintercept.com/2023/03/06/pentagon-socom-deepfake-propaganda/
Michael Price is a founder and editor of ThinkCivics. He has been writing about politics, government, and culture for over a decade. He holds a BA in Political Science and a Master's in Public Administration.