Inside the everyday Facebook networks where far-right ideas grow | The far right

Far-right ideas are gaining ground. They are thriving in online worlds which are not easily visible to large swathes of the public but which spread misinformation that has real-world consequences.

Take last year. In the summer of 2024, riots broke out across parts of the UK, fuelled by misinformation that spread on social media.

The violent disorder was primarily aimed at asylum seekers and Muslims, including an incident of rioters setting fire to a hotel housing asylum seekers.

The rioting was as surprising as it was appalling. It was mostly carried out by local people who were not members of formal far-right organisations. Some rejected the far-right label, carrying banners that read: “We’re not far-right, we’re just right.”

What we found was a community bound together by a deep distrust of government and its institutions, whose members trade in anti-immigrant sentiment, nativism, conspiracy and misinformation.

Experts say their posts include content that is far-right and extremist, and that such online spaces can play a role in radicalisation.

Membership of one of these groups does not signify any wrongdoing – for example, we found a small number of posts calling out racism – and there will inevitably be users who joined these groups through curiosity or because their friends had.

However, experts to whom we showed our findings said some comments posted in the groups were far-right in their nature. They pointed to a worrying normalisation of far-right ideologies on Facebook, still the world’s largest online social media platform.

Here is an overview of the key themes observed in the groups from their inception to mid-May 2025, alongside expert analysis of the psychological processes driving this content.

1. Distrust of mainstream institutions

Many of the group members expressed support for Reform UK, with a sizeable proportion of the posts – about two in five of the sample – voicing widespread criticism of government institutions and figures, including anti-establishment and populist rhetoric.

In many of the posts, this included the use of contemptuous, dehumanising or villainising language. Ingrained in these posts was a deep-seated distrust of mainstream politics and various mechanisms of the state: Labour and the Conservatives were “traitors”, “treacherous”, “scum”; the police and the judiciary “two-tier”; the media “controlled” – even the RNLI was villainised as a “taxi service” for “illegals”.

FB_POSTS_1

Anki Deo of the advocacy group Hope Not Hate said the main beneficiary of the loss of trust in institutions was Reform. She said the party was part of a wider ecosystem – which included GB News, social media influencers, far-right political groups and others with non-electoral aims – all of which drove anti-establishment rhetoric.

This in turn spilled into these Facebook groups, “filling the gap” for “people who are disillusioned with their usual parties and general politics”.

She said the Guardian’s deep-dive analysis of three of the groups in the network pointed to a worrying online environment: “Public Facebook groups like those used for the data sample contain a mixture of political discussion alongside extremely harmful language.”

Sander van der Linden, a professor of social psychology at Cambridge University, who has researched the impact of conspiracy and misinformation and its connection to extremism, said the undermining of democratic institutions on display in these groups echoed fascist methods used throughout history.

“Generally, fascists and extremists are trying to undermine institutions of truth, facts and education because that is what’s standing in their way – an informed citizenry,” he said.

However, he said the people reading these posts – or even posting themselves – were unlikely to be aware that attacking the credibility of mainstream institutions was a tactic associated with the current far right.

“Regular people interacting with this content often don’t know that they’re part of some playbook or agenda. Political elites and opinion leaders such as Farage, [the far-right party] Homeland, or Tommy Robinson do take a page out of the fascist playbook and are using it to dupe regular people into engaging with their rhetoric. So they come up with a narrative such as ‘the mainstream media is lying to you’, or ‘scientific institutions are looking to censor people’.

“And then they try to get regular people to support, amplify and engage with those topics.”

Dr Julia Ebner, a radicalisation researcher at the Institute for Strategic Dialogue and an expert on online radicalisation, said it was worrying that populist parties, including Reform, were increasingly using rhetoric that fuelled some of the key elements of online radicalisation – whether the undermining of institutions, out-group demonisation, ideas of existential threat or the emphasis on a victimised in-group of patriots.

“If Reform UK doesn’t want to be considered far-right it has to make a better effort to distance itself from the most extreme voices supporting its political agenda. Likewise, many Reform UK voters might not like to be associated with far-right ideas but it is important that they are aware of the psychological elements driving all forms of extremism,” she said.

2. The scapegoating of immigrants

Immigration has become one of the most important issues for British voters, with two-thirds of adults polled in May saying the total number of people entering the UK was too high.

Posts in the network about immigration were commonplace, making up about one in seven of the entries captured in our sample.

Our analysis differentiated between criticism of government policy or migration statistics and demeaning, dehumanising, generalised or out-and-out racist posts.

One in 10 fell into this latter category.

FB_POSTS_3

The pattern of demonisation in the Facebook groups was noticeable in our analysis, with some posts portraying migrants as dangerous, deceitful, criminal or culturally incompatible.

Others were more subtle, using insinuations and generalisations about “military-aged men” and “grooming gangs” – both commonly used in these forums as coded language for an inherent threat – or the broad-stroke labelling of all immigrants as “illegals” or an “invasion”.

Ebner said participation in this kind of discourse played a role in radicalisation. “Immigrants are often the scapegoats, especially in far-right extremist communities [where they are] systematically demonised and dehumanised.”

The issue, she said, was that the gradual condoning of or justification for extremist ideologies – including the demonising and dehumanising language used about immigrants seen here – could form part of a “toxic cocktail”, resulting in riots or violence against groups or individuals.

Deo, of Hope Not Hate, noted that while many people had legitimate grievances, such as the cost of living crisis or the decline of their local high street, the scapegoating of immigration was not legitimate.

“The way in which they’re engaging [with these issues] is rarely in and of itself – it’s always in contrast to another group. The conversation around the welfare system, for example, which is legitimate, is very often based on this premise of mistruth, which is that people who immigrate to the UK can just instantly access benefits.”

While overt racism was rare in these groups – as is to be expected in a Facebook-moderated space with guidelines about what constitutes hate content – it wasn’t wholly absent: immigrants were variously described as “criminal”, “parasites”, “primitive”, “scum”, “lice” or having “dinosaur” beliefs or values.

Some of these posts were still accessible on the site in mid-May, months or years after being posted, despite Meta’s community standards and the groups’ own stated rules.

3. ‘White British people are fed up’

Alongside anti-immigrant sentiment, there was a strong vein of nativism, with the posters tending to cling to a common in-group identity – that of being “indigenous”, “British”, “white” and “Christian” – which they perceived to be under threat, saying Britons were now “second-class citizens”.

One in 25 posts in our sample fell into this bucket. Deo said the implication in this grouping was that “white, British or English people are the pure group and everyone else is other”.

FB_POSTS_9

The repetitive nature of this messaging can be particularly harmful, according to Van der Linden. He said that the psychological mechanism at play here, when this kind of post is frequently appearing in someone’s social media feed, was one of “illusory truth”.

“The more often you hear something, the more likely your brain is to think that it’s true. So the more you keep hearing that immigrants are replacing the white population, the more it starts to feel like that is something that must be true. There’s social reinforcement.”

Ebner said that the posts in this group included a perceived “existential threat” to the in-group, as well as some signs of “identity fusion” – feelings of oneness with the in-group.

“When this sense of very strong pro-group commitment is activated in combination with a sense of existential threat from a demonised out-group, that’s when extremist ideologies can become dangerous,” she said.

“This type of identity fusion with an in-group, in this case among people who perceive themselves as a victimised in-group of patriots or white Brits, can then be the final ingredient that tips someone over into the violent spectrum; into being willing to do anything on behalf of the group to save them from a perceived threat from a demonised out-group.”

4. ‘I’m not far-right … I’m just right’

Although there was some condemnation of last year’s riots in the groups analysed, we also found hundreds of posts supporting what the commenters saw as legitimate protest or – in the case of those charged with online offences – freedom of speech.

Many of these posts were contemporaneous to the unrest in July and August 2024, as the riots unfolded and initial sentencing took place.

FB_POSTS_HERE

They included posts decrying the arrest of Lucy Connolly, a childminder and the wife of a Conservative councillor from Northampton. She was jailed for 31 months in October after calling for hotels housing asylum seekers to be torched.

Deo said this reflected views in other online spaces where people “saw the riots as a watershed moment where their beliefs were finally being recognised” and portrayed the online offenders as “ordinary people expressing legitimate concerns”.

She said this was true of online and offline groupings: “Many of those who participated in the riots do not view themselves as far-right, racist or even anti-immigration. The majority will not be part of any far-right organisation or group.

“The line of what makes someone a member of the far right is increasingly blurred: whereas previously someone might have had to join an organisation, now people can participate or just observe through online groups, dipping in and out.”

One distinguishing element of the patterns of behaviour we found was the online environment in which they were taking place. Ebner said that while the narratives around existential threats, dehumanisation of out-groups and nativism were similar to what had been seen throughout far-right history, what was new here was that the online spaces amplified a lot of these dynamics.

“It’s the algorithmic amplification – the speed at which people can end up in a radicalisation engine – which is different. There are also elements related to new technologies such as fabricated videos or deepfakes, as well as automation through chatbots.”

She said spaces such as these Facebook groups “definitely play a role in radicalisation of individuals”.

“There is a sense that a lot of this rhetoric has been normalised to an extent that they no longer have to fear legal consequences or societal pushback,” she said, adding that this had only been strengthened by Meta’s announcement this year that it was reversing its takedown policies on certain content.

“[Extremist] rhetoric is welcome again because we’ve seen a reversal of removal policies and takedown guidelines that were in place,” Ebner said. “Not all of the users in the groups analysed by the Guardian show the same degree of radicalisation and extremist ideologies but there are definitely some users who showed clear signs of extremism,” she said of our sample.

Van der Linden said the unprecedented reach of social media meant the dynamics around this rhetoric were playing out differently to how they had in the past.

“You may be able to find two people in your neighbourhood who feel the same way, but now you can connect with thousands of individuals who feel the same way as you in a matter of seconds, which leads to a misperception about what the consensus is in society,” he said.

The way the rhetoric spilled over into offline violence was also distinct to other periods in history, he said. “Obviously, people have been sitting in rooms saying nasty things about other groups of people for a very long time. But whereas before you would have to commit violence in person, now there’s a buildup that happens online, which was just much more difficult without the internet.”

5. ‘Entry points’ for deeper conspiracies

One stark theme we have uncovered in the analysis is the prevalence of conspiracy theories and misinformation, which account for about one in 20 posts in the network.

Denial of the climate crisis featured heavily, as did replacement theory – referred to elsewhere as the “great reset” – and the view that the World Economic Forum or other “shadowy” elite bodies were dictating policy in the UK and elsewhere.

FB_POSTS_4

These far-fetched ideas are able to take root in the atmosphere of confirmation bias that already pervades these groups.

Van der Linden said the atmosphere of mutual agreement on issues such as immigration could be hijacked by more extreme “leaders” in the groups, who seeded conspiracies where people were already looking for answers to their more mainstream societal questions.

“When people feel that they’re connecting with others who have the same idea, it’s a community, but then when the community starts to have an agenda and leaders emerge it becomes a structure,” he said.

Deo agreed: “In some cases, the content about immigration is an entry point. From then onwards, people are exposed to a whole range of conspiracy beliefs and ideas.”

Sara Wilford, a lead researcher on the EU-funded research project Smidge (Social Media Narratives: Addressing Extremism in Middle Age) said that many conspiracies and misinformation posts contained some truth but were cynically distorted to exaggerate or exclude certain facts.

“I think our biggest problem as a society is that there are so many people who are taking that information and legitimising it,” she said, adding that conspiracy and misinformation were often spread by grassroots amplification originated by “bad actors” with large social media followings.

Van der Linden pointed to Pizzagate, the conspiracy theory that evolved into QAnon, as an example of how conspiracy theories could galvanise movements.

“When you have enough angry people and you get them together and you direct that anger towards a singular goal, for example immigrants, that can be [dangerous] – and that’s what led to the Southport [and subsequent] riots.”

***
The combined membership of the groups across the network stood at 611,289 as recorded by the Guardian’s methodology on 29 July 2025; however, this figure almost certainly includes double counting, as individuals can be members of more than one group.

Additional reporting by Olivia Lee and Carmen Aguilar García.

The full methodology including our use of OpenAI’s API can be found here. The Guardian’s generative AI principles can be found here.

