Biometrics: The New Tool of the Hostile Environment
Sam Bright and Sian Norris explore the Home Office’s plan to vastly extend the scope of its immigration data collection
The Home Office gave a presentation late last year that offered a glimpse into the future of immigration policy, and the monitoring of migrants.
Speaking to the Biometrics and Forensics Ethics Group – an advisory non-departmental public body – the Home Office said that “future biometrics policy would require facial images and fingerprints to be captured from all foreign nationals subject to immigration controls coming to the UK”. This would come into force, the Home Office said, once the technology was ready.
This is a vast undertaking – the wholesale monitoring of migrants entering the UK – and one that has profound implications for privacy and individual rights.
Experts at Privacy International told Byline Times that the Home Office’s policy – part of its aspiration to “become digital by design” – is a “reflection of a global trend which raises significant concerns”.
In particular, “there is a risk that this data could not only be used to monitor people but, integrated with other data, used to check against other information gathered via different means and shared both internally and externally,” Privacy International said.
Indeed, the mass gathering and sharing of data has become commonplace among public authorities in the UK and beyond – often used to clamp down on perceived offenders.
In 2014, the big data firm Palantir was awarded a $41 million contract by US Immigration and Customs Enforcement (ICE) to build and maintain an intelligence system to identify and deport undocumented immigrants. The system collates data from multiple federal and private law enforcement agencies, each of which might have fragments of information on these individuals.
Forbes consequently reported in January that Palantir had “helped turbocharge the Trump administration’s crackdown on immigrants”.
A similar, draconian system is feared in the UK, with experts suggesting that migrants could be pushed to the margins of society – afraid of using state services because they may be subject to mass data gathering by Government authorities.
In fact, the Home Office explicitly says that its biometrics data may be used “to check whether an individual has applied for or obtained a service or product which they are not legally entitled to receive… this could include access to a public benefit or service, such as local authority housing and housing benefits”.
Therefore, its biometrics proposals must be seen firmly in the context of the “hostile environment for migrants”, Privacy International experts told Byline Times. “Data collection should be justified and subject to oversight and strict protections to limit how it is used”, they added.
The equality impact assessment on the digitisation of the immigration system has yet to be published.
While serving as Home Secretary from 2010 to 2016, Theresa May implemented a raft of immigration policies designed to create what she called a “hostile environment” for people migrating to the UK. This culminated in the wrongful detention and deportation of dozens of British citizens who belonged to the post-war ‘Windrush’ generation. This abuse of state power, and the persecution of people who had done nothing wrong, still causes institutional distrust among migrant communities – an instinct that is crucial to understanding the Government’s data reforms.
For example, the hostile environment requires employers, landlords, private sector workers, NHS staff, and other public servants to check a person’s immigration status before offering them a job, housing, healthcare, or other forms of state support. It also allows for data-sharing between institutions – for example, a rape victim who sought help from a sexual assault centre was referred to the Home Office due to her irregular immigration status.
Digital Dangers
In a written submission to the Parliamentary Committee on Human Rights, the Joint Council for the Welfare of Immigrants (JCWI) expressed concern about increasing the powers to obtain biometric information, saying that the proposals “significantly [expand] the class of individual from whom information may be taken without reference to the purpose of information gathering”. This, it said, could be incompatible with human rights legislation.
The digitisation of the hostile environment has been trialled through the EU Settlement Scheme, with people holding a ‘digital only’ status – meaning that their status can be verified online but they have no physical documentation proving it. The digitisation process will be expanded to Hong Kong BNO passport holders entering through the new Hong Kong visa route, who have a biometric passport, and will gradually be phased in for the entire migrant population.
According to the JCWI, “where digital border surveillance systems, and technologies being applied to welfare, housing, and other services, come together, there is a serious risk for all migrants as regards privacy, data security and safe access to vital amenities”.
In fact, fears around data-sharing have resulted in people with insecure or irregular immigration status, such as undocumented migrant people, avoiding accessing healthcare – even during the pandemic. The result, according to a report into Filipino migrants’ experiences of Coronavirus, was people dying from the virus at home, too scared to seek treatment.
The hostile environment also pushes people into insecure and exploitative work in the dark economy – dangerous at the best of times, but deadly during a pandemic, when workers were prevented from isolating or taking time off sick.
The digitisation of the hostile environment, the JCWI explains, will “make it easier for the Government to increase surveillance – and subsequently criminalise and punish – migrants, who have no choice but to interact with public services on a daily basis”.
There are also concerns about the barriers that may prevent people from opting out of the biometrics system. The Home Office says that fingerprints are retained for 15 years, unless the individual is granted citizenship (in which case the records are deleted sooner), or they are considered an immigration ‘risk’ (in which case the data may be retained for longer). Facial images, meanwhile, are only deleted when “retention is no longer necessary for use in connection with a function under the Immigration Acts or in relation to nationality”.
Individuals can request that their data is deleted, but they must prove that they satisfy the above requirements – submitting relevant documents and chasing officials. And, even at the end of the process, the Government may decide that it has a “legitimate need to continue to keep or use their data”.
This route is highly unlikely to be pursued by people at the margins of society – especially those with an instinctive distrust of public authorities – meaning that few are likely to challenge the Government’s data dominance.
Concerns have also been raised about the Home Office’s track record of managing sensitive information, with the Public Accounts Committee already criticising the department for presiding over a “litany of failure” in its digital border programme. Errors can even lead to people being wrongfully denied access to healthcare or the legal right to work.
“It is a matter of fact that the Home Office has an abysmal record on delivering on IT projects despite its huge budget,” Privacy International experts told Byline Times. “We have estimated that annual expenditure exceeds £2 billion. There is a risk that this will be yet another example of the department throwing money at arms and tech companies for shiny new features which never materialise.”
The Home Office has been approached for comment.