Social Media, AI and Propaganda Warfare: Ugly Face of Modern Warfare

Shahzad Masood Roomi

“Truth is the first casualty in war” – Aeschylus

Pakistan has been in a state of war for the last two decades. The string of geopolitical events triggered in the region after 9/11 pushed Pakistan into a vicious cycle of violence and instability. During this period, apart from the religiously motivated Tehreek-e-Taliban Pakistan (TTP), some dormant threats were also resurrected from hibernation, such as the ethno-linguistic insurgency in Baluchistan, a province that had otherwise been quiet on the internal security front since the 1970s. Apart from the usual means of violence, these organizations acquired sophistication in the art of propaganda and misinformation. Social media and related emerging technologies came to their aid in this psy-op against the security forces of Pakistan.

It is no secret that social media has become a powerful weapon of warfare in the 21st century. It is not only a platform for communication, entertainment and information, but also a tool for manipulation, deception and influence. This is exactly how the TTP and the Balochistan Liberation Army (BLA) have been deploying it: social media is their biggest tool for recruitment, disinformation psy-ops and anti-state propaganda.

In modern warfare, political actors, both state and non-state, use social media to spread propaganda and disinformation, to shape public opinion, to sow doubt and division, and to undermine democracy and human rights. This phenomenon became visible in Pakistan when India launched multiple fake news outlets around the world to malign Pakistan. These outlets also generated vast amounts of propaganda material on the internet, which was disseminated to the Pakistani masses, particularly young minds, through social media.

In warfare and statecraft, propaganda is not a new phenomenon, but social media has enabled it to reach unprecedented levels of sophistication and scale. With the emergence of artificial intelligence (AI), propaganda can be generated, distributed and tailored to specific audiences and contexts with minimal human intervention and oversight. AI can also be used to create fake or manipulated content, such as videos, images and texts, that is indistinguishable from reality; these techniques are known as generative AI and deepfakes. These are not mere assumptions: there have been documented cases of various actors combining AI and social media in different parts of the world.

Some examples of how social media and AI are used for propaganda warfare are:

– In 2023, Venezuelan state media outlets used Synthesia, a company that produces custom deepfakes, to create AI-generated videos of news anchors from a nonexistent international English-language channel, spreading pro-government messages.

– In the United States, AI-manipulated videos and images of political leaders circulated on social media, depicting President Biden making transphobic comments and Donald Trump hugging Anthony Fauci.

– In 2023, the U.S. government waged psychological warfare on its own population, adopting fake social media identities and AI-created profile pictures to surveil, target and capture potential suspects.

– In 2023, Ukraine became a living lab for AI warfare, as Russian-backed separatists used AI to generate disinformation and propaganda on social media platforms, targeting Ukrainian soldiers and civilians.

– In 2023, ISIS used AI to boost its online recruitment efforts, creating personalized messages and content for potential sympathizers and recruits.

These examples show how social media and AI are reshaping propaganda and disinformation operations in modern warfare. They pose serious challenges for democracy, security and human rights. They also raise ethical and legal questions about the responsibility and accountability of the actors involved, as well as the regulation and verification of online content.

In the specific context of Pakistan, AI is not yet being used extensively by anti-state organizations on social media, but as social media platforms have begun offering AI-enabled tools, it is only a matter of time before we witness a manifold increase in anti-state propaganda, both in frequency and sophistication. The Pakistani state needs to prepare itself and the nation to be aware of this new era of falsehood.

To counter these threats, government, civil society and the private sector need to work together to develop effective strategies and solutions. Some possible measures are:

– Enhancing digital literacy and critical thinking skills among the public, especially the youth, to help them identify and resist propaganda and disinformation.

– Pakistan must raise its voice through diplomatic channels for the transparency and accountability of social media platforms and AI developers, requiring them to disclose their sources, methods and algorithms, and to monitor and flag harmful or misleading content.

– Developing technical tools and standards for detecting and verifying online content, such as digital watermarks, blockchain-based authentication and independent fact-checking organizations (a minimal sketch of such verification appears after this list). The government must also encourage and support the private sector to invest in and introduce new social media platforms whose infrastructure is hosted within Pakistan.

– Pakistan’s foreign policy must strive to strengthen international cooperation and dialogue among stakeholders by sharing best practices, exchanging information and building trust. China can help Pakistan in this area.
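
To make the content-verification point above more concrete, the sketch below shows one minimal approach, assuming a publisher releases a SHA-256 hash of the original media file together with an HMAC signature over that hash, and that the verifying party (for example, a fact-checking desk) holds the shared key. The file names, keys and function names are hypothetical illustrations, not a reference to any particular standard or platform.

```python
# Minimal illustrative sketch (not a production system): check that a media file
# matches a provenance record published by its original source.
# Assumption: the publisher shares the SHA-256 hash of the original file plus an
# HMAC-SHA256 signature over that hash, computed with a key the verifier also holds.

import hashlib
import hmac


def file_sha256(path: str) -> str:
    """Compute the SHA-256 hash of a file, reading it in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()


def verify_provenance(path: str, published_hash: str,
                      signature: str, shared_key: bytes) -> bool:
    """Return True only if the file hash matches the published record
    and the record's signature is valid for the shared key."""
    actual_hash = file_sha256(path)
    expected_sig = hmac.new(shared_key, published_hash.encode(),
                            hashlib.sha256).hexdigest()
    return (hmac.compare_digest(actual_hash, published_hash)
            and hmac.compare_digest(expected_sig, signature))


# Hypothetical usage: a fact-checking desk verifies a circulating video clip
# against the provenance record released by the original broadcaster.
# ok = verify_provenance("clip.mp4", record_hash, record_signature, b"shared-secret")
```

In practice, public-key signatures and open provenance standards would replace the shared key, but the core idea stays the same: a file is trusted only when its hash matches a record signed by the original source.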

Social media and AI have transformed the nature of modern warfare. They have also created new opportunities for dialogue, participation and empowerment. It is up to us to use them wisely and responsibly.
