What is Russian disinformation? How do you debunk fake news in a face-to-face conversation? Where should we be most wary of fake news? We discuss these questions with mgr Adam Majchrzak, an assistant at the Institute of Media, Journalism and Social Communication, who is also associated with the Demagog Association.
If you are interested in the topic of disinformation, enter the fact-checking competition 'Detector'. This year, paid internships and a study visit to Brussels are up for grabs.
Adam Majchrzak, photo by Marcel Jakubowski
Marcel Jakubowski - This year is an election year. More than half of the world's people live in countries where elections will be held. Should we expect Russian interference in these processes?
Mgr Adam Majchrzak: - We should, but the intensity of this phenomenon will vary depending on where the elections are held and on the geopolitical alignments. On the one hand, Russia, through covert interference in democratic processes, may support politicians and groups that are more friendly to it. On the other hand, it may seek to destabilise the situation in countries that are unfavourable to it, activities we can understand as part of hybrid warfare and the strengthening of soft power. However, the situation related to the war in Ukraine shows that elections are not the only opportunity to interfere in the functioning of Western states, and the tools of influence used can vary.
- What tools does Russia have when it comes to disinformation?
- We can point to several areas. The US Global Engagement Center distinguishes five pillars of Russian disinformation and propaganda. Russia can exert influence through official state messaging, Russian media aimed at the domestic and foreign markets, and proxy sources - that is, sources seemingly independent of the state. There are also aggressive messages on social media and, on the technical side, the use of cyberspace, including but not limited to hacking websites and creating hoaxes. In each of these areas, Russia can exploit various disinformation narratives that have already appeared in the public space and craft a message that targets a specific audience. The techniques themselves also vary. Online bots, for instance, have been used many times in the past. A growing threat is AI and the ability to create artificial identities. Deepfakes are starting to play an increasingly important role in disinformation. For example, as early as 2022, after the Russian attack, fake footage emerged of President Zelensky supposedly announcing his surrender. Later, the image of the Ukrainian Prime Minister was used in an attempt to influence relations with Turkey's Bayraktar. One can assume that there will be more such cases - a kind of testing ground.
- What did Russian propaganda consist of at the beginning of the war in Ukraine? How has it changed over these two years?
- At the beginning of the war, the effort was to justify the invasion. At that time, we were confronted with narratives that are no longer talked about in Poland: Ukrainians were called Nazis, there was false information about Ukraine housing biological weapons laboratories that Russia supposedly had to defend itself against, it was claimed that Ukraine was terrorising and attacking the people of Donbas, and so on. Now we see narratives geared towards discouraging Europe from helping Ukraine. Various messages about Ukrainian products and xenophobic narratives are being created to turn the audience against Ukrainian society. For Russia, effective disinformation and propaganda involve a constant process of learning from mistakes. Sometimes it seems that we should not bother with a given piece of fake news because it comes off as ridiculous or even downright stupid. However, we should keep in mind that this is how it becomes clear which groups are more or less susceptible to certain messages. In other words, the Russians are watching how the public reacts to crafted material.
- One of the first major Russian disinformation affairs was certainly the interference in the 2016 US presidential election. Have we learned anything in these eight years?
- It seems that for many years the problem of Russian disinformation was somewhat ignored. However, this does not mean that nothing is being done to combat the spread of false messages in general. Educational activities and information campaigns are being created all the time, and fact-checking initiatives are developing that try to teach people how to verify facts. Verification competence can increase, but one must also take into account that not everyone has the time to verify all the information that reaches them on a daily basis. There are certain periods - war, pandemic, or natural disaster, among others - when information chaos increases. Such situations heighten susceptibility to misinformation, as people caught up in the information rush want to find out as much as possible as quickly as possible; they want to reassure themselves and understand the reality around them. The effect can vary - if too much false information reaches us, we will continue to walk in the dark. Unfortunately, a lot of this has happened in recent times, and societies and governments have not always been able to cope with it.
- As the Demagog Association, you regularly report on false information. Would you say that these materials on average get more or fewer views than they used to?
- It's hard to say whether such materials have become more popular, but it's important to note that online activity increased during the pandemic, and we are using social media more and more. People are more likely to respond to polarising material because social media algorithms promote posts that evoke extreme emotions, such as hatred, but also extreme joy. Wherever simple, strong emotions are present, we are more eager to react to such material. On the web, this is not difficult, as everyone has access to many means of expression and can often use them in quite extreme ways - as some seem to do - without any consequences.
- Are you at the Demagog Association able to get to all the fake news?
- We are certainly not able to verify all fake news; there are thousands of items. We focus on what can be the most damaging, what undermines the public's trust in scientific institutions, and what could lead people to harm themselves or others. We certainly won't get to every type of misinformation. Quite often fake news is created by mistake: someone accidentally shares or distorts something, and we are suddenly dealing with constantly evolving false information. Fake news is a living organism that is difficult to control, but everyone can develop the critical thinking needed to defend themselves against it.
- How do we act when our interlocutor starts giving us false information? Is telling the truth to someone's face an adequate and effective response?
- First of all, we need to ask ourselves what kind of beliefs we are dealing with. Problems with information fall into three types. The first is disinformation, which is a methodical effort to mislead someone. We also have misinformation, which is when someone spreads false information unknowingly. And the third type is malinformation, which is a message based on the truth that is intended to cause harm. I think we come into contact most often with misinformation. If we know that our friends or family members have been fooled by such a message, we should be guided by understanding and empathy and show that there are arguments that do not support such information. Above all, let us not attack - after all, quarrels divide, not unite. On the Demagog training platform, you can find a course on how to talk to people who believe in fake news.
- Where should we be most wary of fake news?
- Fake news comes up everywhere: in conversations with friends, in statements by politicians, or in the traditional media. However, it is most often found on social media, because hardly anyone controls who posts there and what they post. The moment we click the 'publish' button, we can put whatever we want online. Everyone assumes that, at most, it will be moderated later. It is also difficult to determine whether the author of a social media post has any competence in a given area or is just relying on intuition and gut feeling. It is always worthwhile to consider the sources of such information, think about whether it arouses extreme emotions, and try to read the author's intentions.
- How do you assess the attempts of companies to moderate fake news on social media?
- We will certainly never get rid of the problem of fake news, because it is an integral part of human history. Speaking for myself: a big yes to freedom of speech and a firm no to the spread of disinformation. At the same time, I think that the big platforms continue to neglect this problem and introduce certain solutions 'out of the blue'. What is really needed is to step up the fight against fake news and disinformation. Each platform has a different approach. Facebook sends a brief message to users stating that a piece of information has been verified by fact-checking organisations. X.com, on the other hand, has introduced Community Notes, through which certain users of the platform can add broader context to a piece of information or point out that it is false, although this is far from ideal and is not fact-checking in the sense of citing all the necessary, reliable sources. All this is still not enough - the big platforms should aim, among other things, to curb more effectively the activity of suspicious accounts that operate in networks of online bots or online scammers. They should also target those who profit financially from spreading disinformation and those who spread hatred through false information.
- One of the actions that supports information literacy in society is the 'DETECTOR' competition run by the Institute of Media, Journalism and Social Communication. Could you tell us a bit about what this initiative is about?
- It's an initiative created in collaboration between the Institute and the Demagog Association. The main objective is to find young fact-checking talent among first- and second-cycle students. In the first stage, participants prepare material that debunks false information in a free and creative way: an article, podcast, film, infographic, or another journalistic form. Submissions can be sent in until 31 March. Those who make it to the next stages of the competition will take part in workshops on verifying information and try their hand at a fact-checking test. The prizes to be won this year include paid internships at, among others, the Demagog Association, the Cyberdefence24 editorial office and the Pulsar project, as well as a trip to Brussels at the invitation of the European Commission. In addition, we are organising a series of free classes for students and a webinar to promote the campaign. I strongly encourage you to participate and to follow the competition on the social media of the Institute of Media, Journalism and Social Communication and of the main partner, the Demagog Association.