“Russia will win the information war in the long run”

Jean-Christophe Boucher, assistant professor of political science at the University of Calgary, says that Russian communication has changed a great deal since the start of the all-out war against Ukraine: Russia no longer relies on proxies and troll farms but openly uses the channels of the state apparatus, and it communicates in a coordinated, professional way. We had the opportunity to interview Professor Boucher about foreign influence campaigns at the Embassy of Canada in Budapest.

Photo by Gábor Bankó / 444

We are entering the third year of the Russo-Ukrainian war, but you have been researching the Russian information space for a long time now, mainly using big data. How has the focus of your research changed over the past decade?

We first looked at Ukrainian mobilization on social media in 2014, during the Crimean crisis. Even then it was visible that the Russians were active, but their activity was small and embryonic, not at all as professional as it is now. Then in 2022, right before the war, we started collecting data on Twitter again, when it was already clear that the Russians were preparing to cross the Ukrainian border. It was interesting to see how much faster the digital information space was moving than the news: there were already signs on social media that something was happening before the BBC had even reported the events. We then used the data to produce a more detailed analysis specifically for Canada.

What was the result?

We suspected that in Canada, as in many other places around the world, the far-right ecosystem would be used by the Russians to spread their disinformation. This has been partly true, but at the same time the extremists on the other side have also come into the picture. One important difference between the far right and the far left has emerged, however:

while the far left often has a direct relationship with the Russian state apparatus and its organs, such as RT (the state media outlet Russia Today), this was less the case on the far right. There, the Russian connection showed up in the amplification of existing narratives.

On the far right, therefore, the Russians did not so much control the content of the messages as increase their reach. Whenever someone in Canada criticized the mainstream media, Trudeau, or the government's support for Ukraine, all of this essentially reflected the Russian position.

Has the situation changed in the last two years?

When I look at the information space, the same actors are defining public discourse in Canada today as at the beginning of the war, although the far left is currently occupied with Iran and the Palestinian cause. These groups have been campaigning for peace, which for them means ending the war by stopping support for Ukraine.

Of course, this would also mean the loss of Ukraine – an interesting form of “peace”.

The far right is also increasingly critical of Canada's support for Ukraine. Whereas in 2022 there was no difference between Liberals and Conservatives on this issue, with everyone agreeing that we would send as much money and as many weapons as we could, this is no longer true.

In essence, what we see is that the Russians are winning the information war in the long run, especially on the political right.

What leads to this conclusion?

First of all, many argue that one of Russia's strategic goals is to sow chaos, and that to do so they try everything they can in public discourse on a trial-and-error basis – like throwing spaghetti at the wall and seeing what sticks. I think that is a deep misunderstanding. The Russians are not that chaotic; they are strategic. At the same time, their arsenal has changed a lot since the start of the all-out war.

Between 2014 and 2020, the dissemination of disinformation relied more on proxies and covert operations, through bodies such as the Internet Research Agency, known for its troll farm in St Petersburg, the Federal Security Service (FSB), which is responsible for internal security and counter-intelligence, and Russia's military intelligence agency, the GRU. It was a largely decentralized and porous system: the Wagner Group, for example, was also deployed for information operations, but it was never very tightly controlled. What we have seen in the data since the war in Ukraine is the professionalization of Russian information strategy. From Telegram channels to the Foreign Ministry to Lavrov to Russian embassies, every state actor has become part of the information operations.

In a recent study, you examined official communications from Russian embassies on Telegram. What changes have you noticed on these channels?

It is important to note that Russian embassies were not active on this platform before the war.

This leads me to conclude that since going to war with Ukraine, the Russian state is no longer relying mainly on proxies and troll farms, but on state control mechanisms.

What used to be hybrid warfare has become overt warfare that uses the information space in a much more directed way. An example of this is the Russian embassies' communication: they maintain 130 Telegram channels worldwide. In Africa, some of these embassy channels cover several countries, while within Europe the interesting exceptions are Poland and the Czech Republic, where no trace of them was found.

Three of the 130 Russian embassy Telegram channels. From left to right: Hungary, Cambodia, and the US. Source: Telegram.

Our research was based on the assumption that embassy communication reveals a lot about how the Russian state thinks about its place in the world, the narratives it wants to spread to the West, how it manages its global influence, and how well it understands its audience. We also wanted to know how unified Russia's understanding of the world is: if there is no difference in communication between countries and all the embassies say the same thing at the same time, it means that someone in Moscow has the password to all the accounts and can send information out from the center at the touch of a button.

Did you find any discrepancies?

On the one hand, we found that embassy channels communicate consistently, pushing the same strategic, operational, and tactical narratives everywhere. On the other hand, the system allows for a certain flexibility: they tailor their messages from country to country, which shows that they are not just working from a broad template but also know their target audience, or at least whom they want to address.

The Russian Embassy in Hungary, for example, is an interesting exception. While in the USA and the United Kingdom these accounts usually emphasize the strategic rivalry with Russia, there is nothing like that on the Hungarian Telegram channel. Instead of confrontation, they focus on bilateral relations through shared issues such as the Paks nuclear power plant or grain. Another important difference is that NATO and other strategic issues are not discussed at all, and the level of disinformation is minimal. This is in stark contrast to what we found on the Telegram page of the Russian embassy in Ukraine, where all the classic Russian narratives are paraded, from bio labs to “denazification”. The communication on the Ukrainian channel is, of course, only in Russian, while the Hungarian channel is bilingual.

Source: The official Telegram channel of the Russian Embassy in Hungary

How can such an analysis be useful? What do we gain by knowing how Russian embassies communicate through their official channels?

If we know the intentions of the Russians, we can develop our counter-narratives. At the moment, most Western countries do not seriously engage with the Russians in the information space. While support for Ukraine is declining worldwide, the real problem is that we are not taking a stand against Russia on this front. To do this, we need to be clear about what we want to convey and link well-defined objectives to military operations. That is the essence of strategic communication. Fact-checking does not help here, because it is a reactive strategy; in the information space, it is the offense that has the advantage.

Photo by Gábor Bankó/444

What factors can help foreign influence to take hold?

The information environment is not properly regulated, the platforms are not willing to do anything about it, and the Russians are using it to their advantage. In some ways, it can be argued that the system as it stands cannot be defended. In the Western world, we do not have the tools to deal with these problems: we are in favor of freedom of expression because we believe it is one of the pillars of democracy. But when freedom of expression is used against us, we face an almost insoluble dilemma. We do not want to censor people or tell them what to say or think, so we are somewhat stuck defending them.

And I think one of the ways to go at this is to limit the amplification of opinions, not speech itself.

Every citizen should have the capacity to say whatever they want to say. If they want to be pro-Russian, they can be pro-Russian, and that's their right. However, we want to limit the capacity of foreign or domestic actors to amplify those messages in ways that shape attitudes and behavior. I think that's the important part. When my father or my grandmother sends me disinformation about which hockey team is the best in Canada, or those kinds of things, they're not trying to influence the information space at scale. They're just trying to get me to cheer for another hockey team, which I will never do. At the moment, however, we are simply allowing certain players to game the platforms.

To fix the problem, the economic model has to change, and the only way to do that is simply not to amplify political discourse. You can say what you want, but you cannot artificially increase your reach and thereby gain an undue advantage over others.

For example, we don't care if people are anti-vaxxers. But we do care if they run ads about being anti-vaxxers and pay for more people to see them.

Currently, the system can be gamed in several ways. For example, many far-right groups and influencers who run alternative media platforms live off micro-donations, and it is not known who is funding them. Meanwhile, they obscure the fact that they are paid not directly but through a platform such as Substack or Amazon. Disinformation can be good business; many people do it to make money. Take Tucker Carlson, for example: he did not go to Russia because he was bought off by the Kremlin. He doesn't need to be. He makes millions of dollars just from views, advertising revenue, donations of $50 or $100, T-shirt sales, and so on. So it's a business model, right? We would not restrict freedom of speech, but I think we should restrict your ability to make money off spreading disinformation.

Photo by Gábor Bankó / 444

Many people attack you for your work. How do you deal with this?

We're looking at any actors trying to influence the information space and acting against Canada or the national interest. So we're looking at foreign actors like Russia, China, and Iran, as well as domestic and non-state actors like the far right or various terrorist groups, and we're building projects around those. Of course, that goes against their interests. So I'm regularly sued by the far right for my work on Russia. But they're also unhappy with the work I do on their anti-immigration stance, on nativism, on the Freedom Convoy. They hate that we work on them and expose what they do, and they try to intimidate and bully people. I find this very ironic, right? They're always the first to invoke freedom of speech, but they have a hard time accepting criticism when people use free speech to criticize them. I guess they love freedom so much that they can't stand the idea of sharing it with other people.

Cover photo by Gábor Bankó / 444

Translated by Benedek Totth

Lakmusz