Government ‘invasively’ targeted digital users using ‘offensive stereotypes’



Government ads have in recent years used ‘problematic’ proxy characteristics to target individuals based on protected characteristics such as race, ethnicity, religion, and gender.




The UK Government has utilised tactics variously described as “invasive,” “problematic,” “obviously harmful,” and “unethical” to target individuals for advertising on digital platforms, according to a new study by the Scottish Institute for Policing Research (SIPR).


The research found evidence of invasive targeting of populations including Muslim and Black communities in the UK as well as extremely vulnerable refugees seeking to immigrate to the UK from abroad. Individual interest and fine-detail location data were regularly used as proxies for protected characteristics, such as race, ethnicity, and sexual orientation, which are barred from being used to target individuals on platforms like Facebook.


The study, published today, examined more than 12,000 campaigns run by different kinds of law enforcement bodies and their partners between 2020 and 2023, including government departments and agencies, security services, charity partners, and local police forces. It examined data released voluntarily to researchers by Meta, the parent company of Facebook and Instagram, through its Ad Library.


It has found:


  • Government ad campaigns have adopted “extremely invasive targeting” via proxies for protected characteristics, such as low-level postcode location data in and outside the UK

  • Proxy interests for non-White people include “Afro-textured hair” and “Bangladesh cricket”, while an interest in comedians such as Ricky Gervais is used as a signifier for people with far-right leanings

  • Police Scotland has used “fine-detail location and interest-based targeting” for digital marketing campaigns and increasingly employs influencers on social media to deliver community-relevant information and pro-police messaging

  • While the data for this investigation came from Meta, researchers say it is likely that all major digital platforms are being used in this way.


The report reads: “Paradoxically, as platforms have refined or in some cases removed the ability to target people by ethnicity and religion directly, this appears to have caused many law enforcement actors to adapt by adopting extremely invasive targeting via proxies for these protected characteristics, such as low-level postcode location data or microtargeting via interests and behaviours.”


Detecting ‘how much you’re online during Ramadan’


One example of such a campaign saw the Government target interests in Afro-textured hair, Bangladesh Cricket, Canadian poet Boonaa Mohammed, and Eid al-Fitr as stand-ins for Islamic and/or South Asian characteristics. The ad was for a job support scheme.


Another ad for an intelligence services recruitment campaign, which ran for one day in March 2021, featured predominantly young Black British people in the creative content and targeted specific areas within cities with a large Black community.




Dr Ben Collier, a lecturer in Digital Methods at the University of Edinburgh and co-author of the study, explained to The Media Leader the trouble the Government faces in trying to reach specific people with advertising. “The issue is if the only way you can target young Black people in London, for example, is by using these wild additional categories, it means that if you’re a young Black person in London, and you don’t listen to one of five bands, or read one of the five publications that they’ve chosen for the proxies, you won’t see the ad, unless you conform to the stereotype the government has of you culturally.”


Dr James Stewart, another co-author of the study and a lecturer in Science and Technology and Innovation Studies at the University of Edinburgh, added that the Government has a duty to communicate with all of its citizens, but cautioned that it must do so in appropriate and ethical ways.


“The Government should not be targeting ads based on the platform detecting how much you’re online during Ramadan,” said Collier. “That’s really invasive.” Nor should it be using granular postcode data to directly target ethnic minorities, the co-authors said.


Researchers found further examples of microtargeting for ad campaigns aimed at counter-radicalisation of potentially far-right individuals. For example, an ad from an unnamed charity carrying the message “You can leave the far right. It’s not easy but we’re here to help you every step of the way” was served to young people with interests including first-person shooter video games and excluding people with interests like “feminism” and “equal rights” to aim the ad away from those with progressive sensibilities.
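
To picture the mechanics being described, the sketch below gives a deliberately simplified, hypothetical audience specification and a toy eligibility check. The field names only loosely echo ad-platform targeting specs rather than any real schema, and every value is invented for illustration rather than taken from the campaigns in the study.

```python
# Hypothetical, simplified audience specification. The structure loosely
# echoes ad-platform targeting specs but is NOT a real API schema, and
# every value is invented for illustration, not taken from the study.
counter_radicalisation_audience = {
    "geo_locations": {
        # fine-grained location proxies: postcode districts rather than whole cities
        "postcode_districts": ["AB1 2", "CD3 4"],   # placeholder districts
    },
    "age_range": (18, 30),
    "include_interests": [
        "first-person shooter video games",          # interests used to find the profiled group
    ],
    "exclude_interests": [
        "feminism",                                  # exclusions used to steer the ad away
        "equal rights",                              # from people with progressive interests
    ],
}

def would_be_served(user_interests: set[str], spec: dict) -> bool:
    """Toy eligibility check: a user sees the ad only if they match at least
    one included interest and none of the excluded ones. A crude model of
    the include/exclude proxy targeting described in the report."""
    included = any(i in user_interests for i in spec["include_interests"])
    excluded = any(i in user_interests for i in spec["exclude_interests"])
    return included and not excluded

# Example: a user interested in FPS games but also in feminism is excluded.
print(would_be_served({"first-person shooter video games", "feminism"},
                      counter_radicalisation_audience))   # -> False
```

The toy check also makes Collier’s earlier point concrete: someone in the intended audience who does not happen to share the chosen proxy interests, or who matches an exclusion, simply never sees the ad.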


Other ads used interest in certain comedians as a proxy for social conservatism. “We found a bunch of counter-radicalisation ads, so ads targeting people who are thinking of joining the far right or currently in the far right,” said Collier. “But to find that community, they’ve used ‘likes Jason Manford’ and ‘likes Ricky Gervais,’ and ‘likes Peter Kay,’ these comedians, which is kind of a working class, British stereotype.”


Such advertising practices have significant national security implications. As the Cambridge Analytica scandal surrounding the 2016 US election made clear, foreign governments will attempt to influence individuals’ beliefs via social media, including through marketing practices. Beyond Meta’s improved transparency, there are still few substantial guardrails in place to deter foreign governments from doing so.


Perhaps the worst offender when it comes to targeting, however, at least according to Collier and Stewart, is a campaign run by the Home Office targeting potential immigrants to the UK.


The campaign, called ‘Migrants on the Move,’ included ads written in Arabic, Kurdish, Pashto, Farsi, and Vietnamese targeting people in Northern France and Belgium who are seeking asylum in the UK. The aim is clearly to nudge them away from attempting to cross the English Channel in a small boat, communicating the severe risks associated with travelling to the UK via clandestine methods.


The ads “explicitly use a fear-based approach,” according to the study, including graphic depictions of boats sinking at sea, dogs searching trucks, and military-style drones. Text under the images asserts that refugees could be imprisoned for trying to smuggle people into the country, that drones and dogs will be used to find them, and that immigrants are likely to die in the Channel. The targeting was described as “extremely invasive”, which is not just ethically problematic but potentially illegal, given differences in data privacy and advertising laws in countries outside the UK.



Campaign text: “There are large ships in the ocean, which can be deadly for small ships. Do not take this risk.”

 


As Collier notes: “This is a really horrible fear-based campaign targeting refugees in France and in Belgium, and obviously, in France, it is absolutely illegal for you to target anything, or even collect data on ethnicity. And so I think the issues with the legality of using this stuff internationally is pretty questionable as well.”


It is also worth noting that such ad campaigns do not even appear to be effective. For example, ‘Migrants on the Move’ was delivered to people in Mexico, Riyadh, and Jordan, because the proxy characteristics it was using to target individuals inadvertently hit an audience of holidaymakers or business travellers in the region. “It’s obviously a horrible campaign,” said Collier, “but it’s also incompetent.”


Who’s more responsible: Government or marketing agencies?


Regarding the various campaigns discussed in the report, a Government spokesperson told The Media Leader: “Government campaigns are always designed to effectively reach key audiences and ensure value for money for the taxpayer.”


They continued: “The advertising channels are selected based on their ability to engage with audiences at a national, regional and local level.”


Collier noted that private sector consultancies and advertising agencies have certainly helped enable such targeting practices, but the researchers were not able to conclude the extent to which targeting data and decision-making comes from third-party agencies or Government employees themselves.


“We just don’t know yet,” said Stewart. “We have suspicions that there are some cases where you’d say this is definitely someone who works for the Council, who for example is probably a Muslim who works with Muslim communities, and they made this choice of names. Whereas in another case, that looks like it’s from somebody who’s just kind of making this up.”


He noted that for some campaigns, for example those related to counterterrorism efforts, the Government is likely to provide advertisers with a list of postcodes based on intelligence about which areas of the country, down to a fine level of detail, are most at risk. Conversely, consultants or agencies may take a Government campaign idea and figure out the best way to target audiences using their own, more classical research approach.


The Government is the largest advertiser in the country. Agencies named by researchers as having been involved in some of the campaigns include Omnicom Media Group agency MG OMD (whose specialist division OmniGov handles the UK Government’s media-buying activity) and Publicis Groupe creative shop Saatchi & Saatchi.


A spokesperson for Manning Gottlieb OMD declined to comment. A spokesperson for Saatchi & Saatchi did not immediately respond to a request for comment.


‘Only scratching the surface’


A spokesperson for Meta told The Media Leader that in 2022 the company removed “many of the targeting options highlighted in this research.”


“We’ve made significant progress over the past few years to address this issue, including routinely reviewing, updating and removing targeting options to simplify our system and reduce the potential for abuse,” the spokesperson continued. “To increase transparency, we also show campaign targeting selections for social issues, electoral and political ads in our Ad Library and Researcher Platform.”


“I am not usually one to say nice things about big surveillance advertising companies, but you have to say Meta has done a really good job of making this stuff transparent,” said Collier. “It’s patchy, it’s not great, but if you do qualitative research on it, it gives you loads of really transparent data that we think the Government itself should be making public about these campaigns.”
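
For a sense of what that transparency looks like in practice, a query against Meta’s public Ad Library API might resemble the sketch below. This is an illustrative example rather than the researchers’ pipeline: the API version, search term and field list are assumptions, and access requires an identity-verified developer account.

```python
# Illustrative sketch only, not the researchers' pipeline. The API version,
# field names and search term are assumptions based on Meta's public
# Ad Library API documentation and may have changed.
import requests

AD_ARCHIVE_URL = "https://graph.facebook.com/v19.0/ads_archive"

params = {
    "access_token": "YOUR_ACCESS_TOKEN",     # requires identity-verified developer access
    "ad_reached_countries": '["GB"]',        # ads delivered in the UK
    "search_terms": "Home Office",           # hypothetical search phrase
    "ad_active_status": "ALL",
    "fields": ",".join([
        "page_name",
        "ad_creative_bodies",
        "ad_delivery_start_time",
        "delivery_by_region",                # regional breakdown of delivery
        "demographic_distribution",          # age and gender breakdown of reach
    ]),
    "limit": 100,
}

response = requests.get(AD_ARCHIVE_URL, params=params, timeout=30)
response.raise_for_status()

for ad in response.json().get("data", []):
    print(ad.get("page_name"), ad.get("ad_delivery_start_time"))
```

Targeting selections of the kind discussed in the study are exposed separately for social issue, electoral and political ads through Meta’s Ad Library and Researcher Platform, as the company’s spokesperson notes above.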


Collier and Stewart are both confident that these types of targeting practices are not Meta-specific. “We know this is happening on other social networks,” said Collier, who pointed out that the National Crime Agency, the lead UK Government agency against organised crime, purchases most of its ads on Google. “This is only scratching the surface.”


In ‘murky territory’ with advanced police marketing strategies


Relatedly, the study also examined in detail the use of increasingly sophisticated marketing tactics by Scottish police forces.


Since 2018, Police Scotland has had a team dedicated to strategic communications and marketing campaigns, through which it has developed methods for crime prevention and pro-police messaging. The study notes that its communications strategy incorporates “novel techniques and tools […] particularly the use of behaviour change theory and digital targeting and segmentation infrastructures.”



Such tactics go beyond more standard “information” campaigns, which simply tell or ask the public to do something. The campaigns examined instead often use “perpetrator-focused adverts” to deliver targeted messages to those profiled as “at risk” of offending. The study notes that such digital targeting is mostly used at the broad demographic level (i.e. age and gender), but it found evidence of “some use of fine-detail location and interest-based targeting,” effectively amounting to profiling.


Beyond digital targeting, campaigns have increasingly made use of social media influencers, typically well-known online public figures, especially ones with legitimacy with and knowledge of the targeted communities.


Collier commented: “It gets into some quite murky territory. Especially because it’s one step lower than, like, paying Andy Murray to support your healthy living campaign. It’s slightly closer to that peer level.”


While such campaigns are largely developed in-house by professional police communications specialists, dubbed “Influence officers” by the researchers, media buying and some campaign development have been conducted with commercial partners.


The study concludes that there is a need for increased transparency, regulation and oversight, both to better understand the digital targeting practices undertaken by those in power and to protect citizens from potential invasions of privacy.





