The third episode of ‘Radio Resilience. Conversations about our ability to adapt to change’ is about resilience against disinformation and foreign interference. For this exchange, held on 3 November 2022, we were joined by:

  • Beata Patasova, programme officer, Engagements Section, NATO
  • Caroline Orr-Bueno, behavioural scientist and postdoc, University of Maryland
  • Liubov Tsybulska, founder of Ukraine StratCom Centre
  • Lutz Güllner, Head of StratCom, European Commission
  • Alice Stollmeyer, Executive Director, Defend Democracy (moderator)

You can listen to the conversation here, and read the transcript below.

Alice:

Welcome, everyone. It’s the third of November already. And this is the third episode of Radio Resilience. So welcome.

Resilience broadly means: our ability to adapt and respond to change. Over the course of 10 ‘Twitter Spaces’, we will focus on what is called ‘societal resilience’, in particular in the current context of Russia’s war against Ukraine – and against Western democracies in general. Radio Resilience is a Defend Democracy project and is kindly supported by NATO’s Public Diplomacy Division.

But before we kick off, let me briefly introduce myself and our organisation. I’m Alice Stollmeyer, the founder & Executive Director of Defend Democracy, a non-partisan civil society organisation working to defend and strengthen democracy against foreign, domestic and technological threats.

The first episode of Radio Resilience was a general introduction to the theme of resilience, especially to so-called hybrid threats. Hybrid threats combine military and non-military as well as covert and overt means, including disinformation, cyberattacks, economic pressure, the deployment of irregular armed groups and the use of regular forces. So, basically what Russia has been doing since at least 2008.

The idea of #RadioResilience is to help increase our resilience — by better understanding what these hybrid challenges are, and also by learning about the actions and measures taken by the EU, by NATO and their member states; about the importance of working on resilience together; and by giving some hope and moral support in challenging times.

Our next conversations are about some of these ‘hybrid’ challenges more concretely. What are they, and what can we do about them? Today’s topic is information war, or information manipulation, especially in the context of Russia’s war on Ukraine and democracy. We have four speakers, and a very international panel today. Welcome, Beata, Caroline, Liubov and Lutz!

Let’s get some clarity on the topic we are talking about. Because not everyone means the same thing when they use words like disinformation, or bots, or information war. And of course, when we don’t mean the same thing by certain words, conversations can get quite confusing. So as the context of this series is resilience to hybrid threats, perhaps Beata and Lutz, you could start from the NATO and EU perspective.

Lutz:

So if Beata allows, I would kick off with just a few thoughts that are so important when we talk about hybrid threats. I would say ‘hybrid threats’ is the most ill-defined term that we have at the moment. It’s constantly changing. While we have an understanding that this is an issue that we need to deal with, and we certainly are putting in place various elements, it is still a work in progress.

One important element of hybrid threats, besides the kinetic part or the cyber part, is the cognitive part – the very deliberate activities, by either state actors or non-state actors, to influence, in whatever shape or form, the information environment. And we’re using many different words to describe one phenomenon from very different perspectives.

Disinformation may be the most well-known term because everybody’s using it. However, I would argue that we need to be more precise, because disinformation is not clearly defined and not clearly definable. We need to distinguish much more clearly between misinformation – for example sloppy reporting and non-intentional false information – disinformation as a domestic or even economic approach and strategy, and disinformation as a security issue, as a foreign policy issue, which is the deliberate, coordinated and systematic activity of, usually, a state actor.

Against this background, just for the sake of clarity, we have tried to condense this in one term, not for academic pleasure, but to identify the real issue before we develop the responses to that, also in terms of resilience, and that term is FIMI. The F is for Foreign, the I-M is for Information Manipulation and the I is for Interference.

Information manipulation is a better term, because disinformation is not just about content that might be wrong, or true or false, black or white. We limit the interpretation of the term disinformation if we only look at content. That’s why, when looking at information manipulation, our approach is also to look at the behaviour behind it – at what strategies, tactics, techniques and procedures are being used to manipulate, to use this as a strategic tool. We focus on the foreign policy side, not only because we’re a foreign policy shop, but because we see different forms and shapes of this. Of course, we cannot totally distinguish foreign from domestic, but I think it’s a good starting point.

And last but not least, we also included in FIMI the last I, which is for Interference. What we increasingly see is that actors use combinations of different tools to interfere in the communication space, in the information environment. Sometimes with cyber elements, sometimes with other forms that are not about narrative setting but rather the suppression of voices, for example. So this is our starting point – if we want to develop policy responses, we first need to be 100% clear about what the problem is. And it starts with terminology.

Alice:

Thank you Lutz, it makes sense that if we want to solve a problem, we need to get clarity on what the problem exactly is. Beata, would you like to come in from the NATO perspective?

Beata:

Yes, indeed, I agree with Lutz and with you that definitions are important, because we cannot fight what we cannot define. At NATO, we understand hostile information activities as activities that aim to influence audiences, primarily through the use of propaganda, disinformation, or a combination of the two. So the catch-all term that I would use is hostile information activities.

And then if we dig deeper, we can talk about hostile narratives – an overarching story that is designed to influence audiences in a way that negatively impacts national or organisational unity and cohesion, or undermines policy and security.

Meanwhile, propaganda is information that is biased or misleading in nature, and is generally used to promote a political cause or point of view. And then there is disinformation: straight-up false information, which is spread with the deliberate intent to mislead. So as Lutz was saying, the intent is key here. While misinformation is also false information, there is no intent to mislead. And we have been dealing with disinformation for many decades.

In fact, NATO has been a target, first of Soviet propaganda, and now of Russian propaganda and disinformation. We first set up our information office in 1950, less than a year after the Washington Treaty that established NATO. The understanding has always been there that we need to explain our purpose and priorities to audiences worldwide, and NATO, especially today, is a big target for Russian propaganda and disinformation. It has been on the rise, especially since 2014 with Russia’s illegal annexation of Crimea. Since the full-scale invasion of Ukraine on the 24th of February, we see even more Russian propaganda and disinformation targeting NATO.

What is important is that we have to take this seriously because, as you know, Russia has been ramping up anti-Ukrainian disinformation and propaganda for more than eight years. And many observers watched Russian propaganda TV shows thinking, “Oh, that’s just funny, no one believes that, this is not important.” Even analysts were making fun of it as something ridiculous, because the statements being made were not serious. But in fact, this was a precursor for what we see right now.

So hostile information activities, disinformation, and propaganda often serve as a foundation for something kinetic, for something real – they can lead to a war, they can indicate the start of an aggression. So I think it’s very important to take it seriously.

Alice:

I agree, Beata, that online activity can be a precursor of offline activity. And I think no one is in a better position to tell us more about that than Liubov. Liubov, can you come in from Ukraine?

Liubov:

Sure. Firstly, I fully agree with my colleagues. I think that, yes, there is a lack of clarity around these issues. And disinformation is a very misleading word, especially for the general public. I would rather call it malign information, or hostile information activity, as Beata said.

When my organisation conducted a study in 2017, we called it ‘Image of Europe in Russian media’. We analysed all mentions of the EU countries on the three main Russian TV channels, fully financed and fully orchestrated by the Kremlin. And we expected to see a lot of pure fakes. But it turned out that there were not that many fakes, not that much false information. Rather, it was a huge amount of thoroughly tailored narratives. And a narrative might contain just a small amount of falsities, of lies. It’s more about speculation, manipulation, generalisation, and other techniques. So yes, we have to explain these differences to the general public.

In Ukraine, we see clearly that it leads to the polarisation of society. Especially since the full-scale invasion started, we see that Russian disinformation or Russian propaganda aims at demoralising society, not just in Ukraine, but also in Western societies – spreading fatigue, the narrative that the war might take too long, that it’s better to shift society’s attention and focus to something else.

So yes, indeed, we have to be very clear, when we speak about disinformation and misinformation, especially for broader audiences.

Alice:

Thank you, Liubov. Polarisation is definitely not just a problem in Ukraine. Caroline, would you mind coming in here?

Caroline:

Certainly, and I would like to build on the points that the previous guests just made, which I think are very important. The issues of disinformation and misinformation rose to public consciousness around the 2016 election, and certainly became a topic of research in many fields of study. And I think as a result there was an urgency to build this field, this body of evidence and of science.

There wasn’t necessarily time to do some of the preparatory work that would usually go into establishing a field, as you would see in other scientific fields. I think there was a feeling that there’s an acute crisis and something needs to be done, so we’re going to rush in and do this research. And we skipped over the work of establishing common terminology, of establishing how to measure some of these things that we’re talking about, how to operationalise them, how to make sure that when I say a word, it’s being understood the same way by the listener.

Yet making sure that we all know what we’re referring to when we use a certain term is the absolute foundation of a scientific field and of scientific inquiry, and I don’t think that was necessarily done here.

For those of us who study this, I think it’s worth pausing and considering going back and doing the foundational work so that, moving forward, we have that common ground. We can establish metrics for when something is misinformation or disinformation. A lot of that right now depends on an analysis of intentions. And that’s something that is not readily accessible: it’s hard to get into somebody’s mind and see what their intentions are. That makes it a difficult concept to measure. Establishing guardrails and guidelines would help us a lot in our work. It would help us know when we are making progress, or when we’re making something worse, because as it is right now, we really don’t have a way to assess that.

Alice:

Thank you very much, Caroline. I think there’s one more term that I would really like to dive into before we touch on other topics. Because, in my view, one of the concepts that is very confusing is ‘information war’. What do we mean by that? And is it specific to war situations? Or is it conducted anytime, any place? And also, does it even make sense to ask “Who is winning the information war?” Or is that comparing apples and pears? Who would like to comment?

Caroline:

I do think the use of the term warfare isn’t necessarily helpful. It implies that it’s one side versus another, or that there’s always a winner, when a lot of the issues we’re dealing with are tricky and don’t have defined sides. Determining a singular truth or a winner is not necessarily possible in all situations. Using the term can be counterproductive because it tends to encourage a more warlike mentality, where people feel they have to take sides. And once they join a particular side, they start to accept information from others on their side and reject information from the other side, instead of looking at the information for what it is and assessing it impartially. That sort of wartime mentality makes people let down their cognitive defences, so to speak.

Lutz:

I would agree with this. When we look at malign, coordinated activities, we see that many different techniques are being used, and it’s a permanently evolving field as well. The term ‘information war’ gives you the feeling that this is a specific situation: it can be fought, it can be won, it can be lost.

The much more structural problem is that, increasingly, international actors, and Russia in particular, are using this type of manipulation to achieve their strategic aims. This may happen below the level of a war or even be linked to a war-like situation. So we shouldn’t use terms that may sound sexy or interesting to write about, because they distract from the nature of the challenge that we’re facing.

The structural issue that many people see as merely a question of narratives, of communication, is in fact a serious policy issue that we need to deal with as security policymakers, but also as foreign policymakers.

Alice:

Great, thank you, Lutz. I know that the EU is coming up with a new package of measures called ‘Defence of Democracy’, which will look into the security and foreign policy aspects of foreign hostile information activities. Beata, would you also like to comment on this topic?

Beata:

I agree that using the term ‘war’ is not always useful in this context. At the same time, when we look at the war that Russia is waging on Ukraine right now, we see that Ukraine is defending itself on land, in the air and at sea, but also in the information space. So in this case, I would argue that using the term ‘information war’ is in fact suitable, because it actually does go hand in hand with the real war on the ground.

Who is winning the information war, and who’s losing, depends on the perspective and the audience. We can all agree that Russian propaganda is aimed at Russia’s own internal audiences, which unfortunately are very difficult for us to address. Even so, at NATO, for example, we have an effort aimed at communicating to Russian-speaking audiences everywhere that we call ‘NATO po Russki’. We have a social media presence, for example on Telegram, where we try to address Russian-speaking audiences everywhere.

From the perspective of this panel, it probably looks like Russia is losing the information war, because the information that I consume shows me that Ukraine is performing very well on the ground, that Ukrainian people are very driven. But if you ask Margarita Simonyan, Vladimir Solovyov or some other Russian propagandists, they will tell you that they’re winning this war. So it depends on the perspective, on who is doing the talking and which audiences are listening.

Alice:

Great points. Liubov, would you like to come in?

Liubov:

I would like to add that, according to Russian military strategic documents, information operations are conducted regardless of the kind of relationship Russia has with a given country. For them, there does not necessarily have to be a war to conduct information operations.

Regarding winning or losing an information war: in Europe, for example in BBC reporting, Ukraine’s narratives mostly prevail – people support Ukraine and understand what our fight is about. But in countries in the Global South, and specifically in Africa, Russia’s narratives have a very solid presence. So it is a process, an everyday battle, and we should approach it as a process. We should remember that we are never done: if we win a battle today, tomorrow there will be another one.

Alice:

It’s not a sprint but a marathon, as they say. So let’s move on to the next topic. In the first episode of Radio Resilience, I mentioned that disinformation seems to add an extra layer to Russia’s hybrid attacks. So the Kremlin uses energy, food, and migration as weapons against Ukraine and the West, and on top of that blatantly lies about weaponising these things? Is that the way to look at it?

Caroline:

I think it does add a layer of complexity. And we are already dealing with a variety of emerging and evolving threats and challenges. So we have this struggle of who is a reliable source, what [information] is reliable, and who we should be listening to. This is not a one-and-done kind of issue; it’s something that changes every day and requires staying on top of. And the firehose of myths and disinformation and propaganda certainly makes that challenge a lot more complex, because it’s hard to get an accurate understanding of what the challenge even is on a given day.

Alice:

Thank you, Caroline. Would anyone else like to comment on this topic?

Liubov:

It is clear these days that any issue might be weaponised by an aggressor, whether it’s security, food security, energy or migration. So basically, this is the thing about hybrid threats: any gap, any crack, any mistake, anything which is really important, might be used by the aggressor against you.

Alice:

As I used to say about trolls, anything can and will be used against you. What Russia is doing is the same but on a larger scale.

Lutz:

We need to be careful not to miss the forest for the trees. We can’t draw conclusions based only on the ways of trolling, or the kinds of situations, that have occurred before. I think we really need to understand how the information environment – everything that relates to cognitive [security], how people consume information, their references, etc. – is being targeted as a strategic field to support strategic aims.

Many players worldwide are recognising this. It may take some resources and knowledge, but in the end, it’s not that expensive. But building up this knowledge is really crucial. So is understanding the broader information environment, the context: how sometimes social media are being leveraged, sometimes traditional media, sometimes no media at all. Sometimes it’s other factors, like working directly with or influencing NGOs or academic structures, etc. These are all parts of it.

If we don’t understand this and also design our responses accordingly, it will not work. We need to focus on the obligations, the necessary kinds of actions, that some big social platforms need to take. But this is just one of many things that need to happen.

Resilience cannot be built in just one area or regarding just one issue; it requires addressing many different areas at the same time. We can’t allow ourselves easy answers to quite a complex issue. Media literacy campaigns are certainly one of the many things that we need to do, but they don’t solve everything. In the end, we need to think holistically. And that’s why every conversation like this one is so important – to bring the actors together, because no single organisation can do this alone. It takes the famous whole-of-society approach.

Beata:

I’d like to add that authoritarian foreign actors often exploit local issues to further Russia’s or China’s political goals. For example, the building of a nuclear power plant may seem like a very local issue for a particular town. But we have seen cases where Russian propaganda actors would pick up this issue and communicate to local people about it. You wouldn’t even know that this is coming from Russia; they could infiltrate, for example, a local Facebook group for dog walkers in a certain town. These Russian actors would then use that group and these very local issues to further Russian propaganda goals. So indeed, a holistic approach is very important. A lot is happening below our radar, things that are happening locally that are not visible to us, but that can be detrimental to democracies and to the international rules-based order.

Alice:

Thank you for your comments. We do need to look at the bigger picture. As Lutz already mentioned, we’re not just talking about foreign state actors here; there are also domestic actors. And there is definitely also a technological aspect to the overall problem of hostile information activities, or foreign information manipulation and interference. So let’s zoom in on the technological aspect.

What is the role of digital technologies, like social media and search engines in this overall hybrid threat of disinformation? And more specifically, what’s the potential impact of Elon Musk’s takeover of Twitter on disinformation, especially in the context of Russia’s war on Ukraine?

Caroline:

Well, for one thing, there’s great variability between the almost infinite results from different search engines. The algorithms that surface certain results for certain keywords are opaque. So there’s no way to put any guardrails on that, no way to have any accountability there, when we don’t even know how the technology is working.

There are opportunities there, as well as challenges. But there’s a bridge that needs to be built between the two, because, for example, I lack the technological expertise to do a better search on a search engine or to find a better way to visually represent information. As a social scientist, I don’t necessarily have the technological know-how, and so we’re coming back to the issue of terminology and language. We’re speaking different languages, and building that bridge has proven almost impossible. But I think if we could figure out a way to do that, it wouldn’t get rid of the challenges, but it would at least make them more manageable. And again, it would give the user some possibilities for establishing guardrails and guidelines, and possibilities for accountability.

Alice:

Thank you so much, Caroline. Would any of the other speakers like to comment on this?

Lutz:

Besides social platforms, the problem is also about traditional media, and about other forms of digital information. Some of these obscure websites that still exist have a direct link to the Russian security services and are being leveraged. But of course, social media companies are important platforms where things are being shared, where strategies are being implemented, and they have this amplifying effect.

I will spare you the debate about what regulations we need. As most listeners know, here in the EU we have the Digital Services Act as well as the Code of Practice: stringent rules, plus self-binding rules for the social media companies, or platforms.

Two things are key here. The first is transparency: we need to know what is happening, we need to have a better understanding, researchers need to have access. The second is accountability. Whether they want to or not, social media platforms have to recognise that they are important players; they have an obligation to put in the right resources. Now, as soon as the Digital Services Act is fully implemented and the Code of Practice is fully functional, we will see how these rules are working. Personally, I believe we should not overburden these companies, but let’s remind them that they play a hugely important role in this. And hopefully we’ll see the EU experience of transparency and accountability spill over to other parts of the world. I think it’s one piece of the puzzle that we need to put in place.

Alice:

Thank you, Lutz. In Commissioner Breton’s brilliant words: “In Europe, the bird will fly by our EU rules.” Let’s move on. How can we become more resilient against foreign and also domestic information manipulation? What can democracies do to tackle this as a hybrid threat? And let’s break this question up into parts: what can governments do? What can the private sector do? And what can we, as citizens and as civil society, do?

Liubov:

I have a very vivid example. In Ukraine right now, all of society – the government, businesses and the private sector – all parts are united. That’s why our communications are so effective. It is very important to understand that everybody can do something. When the full-scale invasion started, I personally got so many requests from different people. People were texting me: “What can I do? How can I help? I have a laptop, I have my phone, I can fight Russian disinformation, I can produce something.” And we started creating content.

The crucial thing is that our government defence agencies, first of all, expressed that they trusted us, and they were using this content. And the Russians never anticipated this whole-of-society approach. It was a very, very solid and united response. Even now, after eight months of war, there is not a single enterprise, not a single organisation in Ukraine, that isn’t helping to repel the Russians or to support our military. Either they’re volunteering and helping our soldiers, or they’re producing content, or they’re helping the government with other issues. So here we have a very vivid example that a holistic, whole-of-society approach actually works.

This shouldn’t be just the government’s burden and the government’s task. Obviously any government should have a very well-built strategic communication system. But as you can see in our case, all efforts are useful, and especially when you are under attack, it becomes obvious how united we can be.

Alice:

Thank you, Liubov. We can learn great lessons from the incredible resilience and unity of the people of Ukraine.

Beata:

As Liubov was saying, Ukraine has demonstrated incredible resilience against disinformation. I’d like to add that from our side, as an international institution, it’s important not only to focus on countering disinformation, but also to engage in proactive communications. We cannot always be chasing disinformation and only reacting to false statements or hostile narratives.

What is also important is that we develop our own narratives that are engaging, that people want to respond to, and that are convincing. Focusing on being proactive and engaging with civil society is a big part of what we do in NATO’s Public Diplomacy Division. We engage with the civil societies of NATO allies, but also of partner countries.

Finally, it’s also important to work with others – with our partners, with other international organisations – to share information and practices. If we’re united in this goal, then we stand a chance to actually counter disinformation, counter hostile narratives, and make sure that democracies and our way of life prevail.

Alice:

Thanks so much, Beata. Caroline or Lutz, would you like to chime in on resilience, and on how we can increase our resilience?

Lutz:

Liubov gave the best example of how a whole-of-society effort works. It is both the government’s job and civil society’s job. I think there are four aspects we need to consider. Firstly, we need to have good systems that allow us to detect and analyse disinformation or FIMI campaigns, because if we don’t know what’s going on, if we don’t expose it, it’s very difficult to deal with. Secondly, resilience as such: we need to strengthen our own fabric to deal with strategic communications. Organisations like [Defend Democracy] are crucial in this field. We need to build this societal fabric. The third element is of course regulation as protection – the EU’s Digital Services Act and things like that. And last but not least, we also need to address the actors, and the activities of the actors, head on. Sanctioning certain behaviours is crucially important. It will only work if we can bring these four aspects together at the same time, and I think we’re on the right track. But as Liubov pointed out, having all of society mobilised is still the key to everything.

Alice:

Thank you, Lutz. At Defend Democracy we try our best to help build societal resilience. Now, Caroline, I know that you are a behavioural scientist working on cognitive security. So you probably have something to add to this very question of how we increase resilience against this hybrid threat of disinformation and information manipulation?

Caroline:

Cognitive security is an emerging field, and hopefully it will be built on a proper foundation, with common terminology and ways to establish the shared understanding that I think was lacking in the field that became known as misinformation studies.

We need to approach this as not just an individual challenge, not just a governmental challenge, but something that requires action at multiple levels.

Going back to our previous topic of the use of the term warfare: one other reason that I tend to shy away from that term is because it implies that a battle is won and then it’s over. In reality, this is an ongoing challenge, something we have to build up resilience to for the long term, not just for one specific battle, because it’s not going anywhere.

Alice:

Thank you, Caroline, and thank you to all speakers. Thank you to all listeners. Let’s wrap this up with a final short remark.

Beata:

Thank you so much for this discussion. It was a real pleasure. And it’s very important that we address this issue, and organisations like [Defend Democracy] are very valuable.

Caroline:

I would also like to express my thanks to Alice and [Defend Democracy] for creating the space to have these incredibly important conversations. And I very much appreciate all the hard work that has been put into creating this space for us to do important work.

Lutz:

I just wanted to add one thing. I often get the question whether, by addressing disinformation, you automatically reduce freedom of speech. I think that by addressing information manipulation activities, these interferences in our conversations and in our public space, we actually, in the end, protect freedom of speech. We need to turn it around and see that addressing disinformation is not necessarily at the expense of freedom of expression, if we do it right. And I think that is a big challenge.

Alice:

I fully agree with that, Lutz. If we don’t somehow deal with information manipulation, we fail to protect the freedom of speech of actual individuals. But we will probably need a whole new episode of Radio Resilience to discuss just that one topic.

So let me close this Space today by saying that our next conversation will very likely be on Friday the 18th of November, probably at more or less the same time. Keep an eye on our Twitter for details about the next episode. Thank you to all our speakers, thank you to our listeners. And keep up the good work, all of you who are helping to defend and strengthen democracy, because we definitely need the whole of society.


Stay tuned for the next ‘Radio Resilience’ via our social media. Did you appreciate this episode? Please take a moment for a survey from NATO’s Public Diplomacy Division.