Over the last ten years, social media has become the predominant way to read and share news, at unprecedented speed. While this can be seen as an improvement, it also comes with serious risks that social media users themselves have the power to fight.

In an interview with BlastingTalks, George Loukas, coordinator of Eunomia and professor at the University of Greenwich, tells us more about the project, a user-oriented, open-source platform that helps social media users fight misinformation, and explains how journalists can and should help. Eunomia is currently funded by the European Union through the EU Horizon 2020 Research Program.

Eunomia is a project that aims to develop a decentralised toolkit to help social media users fight misinformation.

As the project coordinator, what was the impetus behind this program?

Eunomia was conceived in response to a European Commission call for innovative solutions to tackle misinformation. There are many ways to do this, and usually there is some form of intermediary: for example, a third party or fact-checkers who tell you what they have identified as true or false. It might also be a technology, such as artificial intelligence used to flag something as misinformation, or, more commonly nowadays, the social media platform itself doing it.

From my side, I don't like being told what to believe, and I suspect that I am not the only one. That was my motivation. Eunomia follows an approach that lets users decide for themselves what is misinformation and what is not.

What is the aim behind the Eunomia project?

Our experience from other areas with large-scale problems that involve users is that if they don't believe in it, or if they don't consider themselves part of the problem and part of the solution, they're not going to do anything about it.

So unless people consider this to be their problem, they will not do anything about it, no matter how many technologies you give them. That's what Eunomia does: it makes it much easier for someone to see themselves not only as part of the problem, but also as part of the solution.

Your book ‘Cyber-Physical Attacks’ shows how cyberspace can affect physical space.

Would you say that social media has this kind of influence on the physical world? Could you give us an example?

You are right that the physical impact of cyber attacks has become more obvious because of the Internet of Things: we see things that are physical and connected to the internet. What people don't see is that social media is almost the same. If you believe, for example, in a false treatment for COVID-19, then that will have a direct physical impact on you.

What would be the ideal “hygiene routine” when reading the news?

It's difficult to find an ideal one, and it is likely different for everyone. We haven't done enough research to know which routine suits whom. Instead, we have identified guidelines that have significant scientific evidence behind them, for example about the coronavirus.

There is a lot of advice you might get from experts, but some of it is the personal opinion of an expert, without a study behind it. So we try to find the advice that does have studies behind it.

I would say that perhaps the most powerful way to fight misinformation is not to share something unless you have read it. It sounds obvious, but it isn't at all. When we share something online, we don't necessarily do it because we have read it; we share it because it supports our opinion. That's what really spreads misinformation: the speed. We feel that we gain something in our social circle by sharing it, or by being the first to tell other people about it.

As you wrote on Eunomia’s blog, ‘Misinformation travels faster than reliable information’.

Could you explain why?

New information is what makes something interesting to share. It's interesting for you to share it with someone because it looks like you are in the know, the one who knew before the rest. So it creates the conditions for spreading much faster than information that can be considered reliable. The numbers can be staggering.

Information from experts is not necessarily the most popular on social media, as opposed to opinions from non-experts, which spread rapidly online. How can experts turn the tables in this situation?

Experts don’t necessarily know how to make the news go viral or even how to make it easy to understand. An expert is not an expert because they make things easy to understand.

They are experts because they know better than everyone else in a specific, narrow area of science. What they can do is say, “yes,” every time a journalist asks them for an interview. The journalist is the one who knows how to translate this into something that's easier to understand and more interesting.

How can journalists better tackle the infodemic?

There's one thing they shouldn't do, which is use fear as a tactic for selling more. It's been pretty obvious for a number of years now that fear doesn't work. It might work in the very short term, but not in the long term, and it backfires badly. For most people who are very skeptical about vaccinations, the reason is that they don't like being told what to be afraid of.

That emanates to a large extent from fear. And fear, of course, is not generated by journalists, but it is amplified by them. I would suggest that a more balanced presentation of the latest news would probably be better in the long term. COVID-19, for example, is a long-term issue. It might be better to trust people by giving them more information rather than making them feel afraid.

You offer a novel mechanism for discovering “post-based” information cascades, showing the earliest original post to highlight how the information has evolved over time. At what points is information most often transformed or manipulated?

First of all, the same piece of news can be represented not only in very different ways, but also by different people with different objectives.

I would say the most striking example is when a legitimate photo used in one article is reused years later, in a completely different article, to support a completely different viewpoint, for example against immigration or topics of that kind. You might also have just a comma changing the meaning of an article completely: it is still true that this was said by that person at the time, but it was not meant that way. Our tool, which looks for similar posts from the past across what we call the Eunomia ecosystem, helps identify that similarity. It doesn't say, “this is untrustworthy,” or anything like that. It just says, “three years ago, the same photo was used.” It is social media users, who are intelligent people, who can make the decision for themselves.

If you just tell them about this, users can then decide what to do about it and whether they should really share it.
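To illustrate the general idea of surfacing when a photo has been seen before, here is a minimal, hypothetical sketch in Python. It is not Eunomia's actual implementation: it simply compares a new post's photo against an archive of earlier posts using a perceptual hash and reports the earliest match, leaving the sharing decision to the user. The imagehash and Pillow packages, the archive structure, file names, and the distance threshold are all illustrative assumptions.

```python
# Hypothetical sketch of flagging reused images, NOT the Eunomia implementation:
# compare a new post's photo against photos from earlier posts using a
# perceptual hash and surface the earliest visually similar one.
# Assumes the third-party `imagehash` and `Pillow` packages are installed
# and that the listed image files exist.

from datetime import datetime
from PIL import Image
import imagehash

# Illustrative archive of previously seen posts: (publication date, image file)
archive = [
    (datetime(2017, 6, 1), "old_article_photo.jpg"),
    (datetime(2019, 3, 12), "unrelated_photo.jpg"),
]

def earliest_reuse(new_image_path, archive, max_distance=8):
    """Return the date of the earliest archived photo that looks like the new one."""
    new_hash = imagehash.phash(Image.open(new_image_path))
    matches = []
    for published_at, path in archive:
        old_hash = imagehash.phash(Image.open(path))
        # Hamming distance between perceptual hashes: small means visually similar
        if new_hash - old_hash <= max_distance:
            matches.append(published_at)
    return min(matches) if matches else None

first_seen = earliest_reuse("new_post_photo.jpg", archive)
if first_seen:
    # Like the tool described above, this only surfaces the fact; the user decides.
    print(f"A very similar photo was first posted on {first_seen:%d %B %Y}.")
```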

The World Health Organization has described the wave of misinformation around the coronavirus pandemic as an infodemic. How powerful is the infodemic in affecting the course of the virus?

It is really difficult to put a number on the actual impact, for the very simple reason that it is difficult to put a number on the pandemic itself. Even there we have difficulties with basics such as the rate of infection, so it's even more difficult to evaluate the factors that contribute to the changes. In August, there was a publication showing that around 800 people across the world had died almost directly because of misinformation, by following false medical treatment advice.

And that was just the tip of the iceberg. Clearly it's very different now, because that was back in August, with data that was probably even older than that. We're talking about a very high number of people who have been affected.

Do you think there could be a before and after COVID-19 in the digital world, particularly for social media? Is it a turning point for the digital world?

There is no doubt that it has accelerated things that were happening already. It was already very obvious that social media played a more important role than perhaps most people would think, because it felt like a form of entertainment. Now, for many people it is the only real interaction, especially if you're under lockdown. That means that if a piece of information travels through social media, it will have a number of different impacts on the future.

For example, when people realize how important social media is for misinformation, I suspect that there will be heavy regulation coming.

Over the next few weeks or months, different social media platforms will try different things. That's a very good thing, because some of these measures will actually work. But we have our doubts about how much the social media platforms themselves are really doing. Then, as I said, I would expect very heavy regulation in the next few years. Eunomia would rather give the power to the people.

Did the pandemic affect or reshape the mission of Eunomia?

I don't think it was affected negatively in terms of productivity. On the contrary, we had a lot more focus because when there is a very clear threat to everyone, there is a very clear focal point.

It's easier for people to work together towards a common goal. It was also easier, at a technical level, to change the user interface into something that people could actually use against misinformation. It also accelerated adoption: a few months ago, we were not planning to open the platform publicly, but now we have, and we have the second pilot planned for the next few weeks. So it also accelerated development within the project.

What is your vision for the social media world in the next ten years?

We will expand social media into areas where we aren't currently using it. For example, devices, machines or systems will start becoming part of the social media ecosystem. I expect the car, for example, to be one more user on social media platforms. We will use this paradigm for communication that has nothing to do with social media today, simply because it's so natural.