UK commits to redesign visa streaming algorithm after challenge to ‘racist’ tool – TechCrunch

The UK government is suspending the use of an algorithm used to stream visa applications, amid concerns that the technology bakes in bias and racism.

The tool had been the target of a legal challenge. The Joint Council for the Welfare of Immigrants (JCWI) and campaigning law firm Foxglove had asked a court to declare the visa application streaming algorithm unlawful and to order a halt to its use, pending a judicial review.

The legal action did not run its full course, but it appears to have forced the Home Office’s hand, as it has committed to redesigning the system.

A Home Office spokesperson confirmed to us that use of the algorithm will be suspended from August 7, sending us this statement via email: “We have been reviewing how the visa application streaming tool operates and will be redesigning our processes to make them more streamlined and secure.”

The government has nonetheless denied the allegations: in a letter sent Friday it said the suspension should not be read as meaning that the “[Secretary of State] accepts the allegations in the form of your claim [i.e. around unconscious bias and the use of nationality as a criteria in the streaming process]”.

The Home Office letter also claimed the department had already stopped using the streaming tool in “many types of application”. It added that the tool would be redesigned “with an open mind to consider the concerns you have raised”.

The redesign is due to be completed by the fall, and the Home Office says an interim process will be put in place in the meantime which will not use nationality as a criterion.

The JCWI has claimed the outcome as a victory, describing the system it challenged as a “shadowy, computer-driven” process for sifting visa applicants. “Today’s victory represents the UK’s first successful court challenge to an algorithmic decision system,” it writes. “We had asked the court to declare the streaming algorithm unlawful, and to order a halt to its use to assess visa applications. The Home Office’s decision effectively concedes the claim.”

The department did not answer a number of questions we put to it about the algorithm and its design process, including whether it sought legal advice before implementing the technology and whether it believes the system complied with the UK’s Equality Act.

The Home Office statement added: “We do not accept the allegations made in the Joint Council for the Welfare of Immigrants’ judicial review claim, and whilst legal proceedings are still ongoing it would not be appropriate for the department to comment further.”

The JCWI had been challenging the Home Office’s use of the algorithm since 2015, a “traffic light system” used to grade every entry visa application to the UK.

“The tool, which the Home Office described as a digital ‘streaming tool’, assigned applicants a red, amber or green risk rating. Once assigned by the algorithm, this rating played a major role in determining the outcome of the visa application,” the JCWI writes, dubbing the tool racist in view of its treatment of certain nationalities.

“The visa algorithm discriminated on the basis of nationality, by design. Applicants holding ‘suspect’ nationalities received a higher risk score. Their applications received intensive scrutiny by Home Office officials, were approached with more scepticism, took longer to determine, and were much more likely to be refused.

“We argued that this was racial discrimination and breached the Equality Act 2010,” it adds. “The streaming tool was opaque. Aside from admitting the existence of a secret list of suspect nationalities, the Home Office refused to provide meaningful information about the algorithm. It remains unclear what other factors were used to grade applications.”
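Based only on those public details, a minimal, hypothetical sketch of how such a traffic light streaming rule might look in code is below. Every name and threshold here is invented for illustration; the real list of “suspect” nationalities and the other grading factors were never disclosed.

```python
# Hypothetical reconstruction for illustration only, not the Home Office's
# actual code. The real criteria, list contents and thresholds are unknown.
from enum import Enum

class Rating(Enum):
    GREEN = "green"
    AMBER = "amber"
    RED = "red"

# The JCWI says the Home Office admitted a secret list of "suspect"
# nationalities existed; its contents were never published, so this
# placeholder stays empty.
SUSPECT_NATIONALITIES: set = set()

def stream_application(nationality: str, other_risk_score: float) -> Rating:
    """Assign a red/amber/green rating, using nationality as a criterion,
    the design feature the JCWI challenged as discriminatory."""
    if nationality in SUSPECT_NATIONALITIES:
        return Rating.RED  # routed to intensive scrutiny, per the JCWI
    if other_risk_score > 0.5:  # stand-in for the undisclosed other factors
        return Rating.AMBER
    return Rating.GREEN
```

Even in this toy form, the structural problem the JCWI describes is visible: the nationality check dominates every other input to the rating.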

Since 2012, the Home Office has openly operated an immigration policy known as the “hostile environment”, applying administrative and legislative processes intended to make it as hard as possible for people to stay in the UK.

This policy has produced a number of human rights scandals. (We have also covered its impact on the local tech sector, telling the story of one UK startup’s visa nightmare last year.) So layering automation on top of an already highly problematic policy looks like a formula for being taken to court.

Indeed, the JCWI’s concern around the streaming tool was exactly that it was being used to automate the racism and discrimination many argue underpin the Home Office’s “hostile environment” policy. In other words, if the policy itself is racist, any algorithm that picks up its logic will reflect that.

“The Home Office’s own independent review of the Windrush scandal found that it was oblivious to the racist assumptions and systems it operates,” said Chai Patel, JCWI’s legal policy director. “This streaming tool took decades of institutionally racist practices, such as targeting particular nationalities for immigration raids, and turned them into software. The immigration system needs to be rebuilt from the ground up to monitor for such bias and to root it out.”

“We’re delighted the Home Office has seen sense and scrapped the streaming tool. Racist feedback loops meant that what should have been a fair migration process was, in practice, just ‘speedy boarding for white people’,” added Cori Crider, founder and director of Foxglove. “What we need is democracy, not government by algorithm. Before any further system is rolled out, let’s ask experts and the public whether automation is appropriate at all, and how historic biases can be spotted and dug out at the roots.”

In its letter to Foxglove, the government has committed to carrying out an Equality Impact Assessment and a Data Protection Impact Assessment for the interim process it will switch to from August 7 – when it writes that it will use “person-centric attributes (such as evidence of previous travel)” to help sift some visa applications, further pledging that “nationality will not be used”.

Certain types of applications will be removed from the sifting process altogether during this period.

“The intent is that the redesign will be completed as quickly as possible, and at the latest by October 30, 2020,” the letter adds.

Asked for her thoughts on what a legally acceptable visa streaming algorithm might look like, internet law expert Lilian Edwards told TechCrunch: “It’s a tough one. I am not enough of an immigration lawyer to know whether the original criteria applied to suspect nationalities would have been illegal anyway, even if not applied via a sorting algorithm. If yes, then clearly a next-generation algorithm should aspire to discriminate only on legally acceptable grounds.”

“The problem, as we all know, is that machine learning can reconstruct illegal criteria, though there are now well-known techniques for avoiding that.”
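Edwards’ point about reconstruction can be shown in a few lines of code. The sketch below is entirely synthetic and illustrative, assuming only the standard NumPy and scikit-learn APIs; it shows how a model trained on historical refusal decisions can still split applicants along a dropped nationality attribute via a correlated stand-in feature.

```python
# Synthetic, illustrative data only: shows how a model can rediscover a
# dropped protected attribute through a correlated proxy feature.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000

# Hypothetical protected attribute (1 = nationality on the "suspect" list).
nationality = rng.integers(0, 2, n)

# A seemingly neutral feature that happens to correlate with nationality,
# e.g. which visa application centre was used (invented for illustration).
proxy = nationality * 0.8 + rng.normal(0.0, 0.3, n)

# Historical labels produced by a biased process: refusals driven largely
# by nationality, which is what training data from such a system encodes.
refused = (nationality + rng.normal(0.0, 0.5, n)) > 0.7

# Train WITHOUT the nationality column; only the proxy remains.
model = LogisticRegression().fit(proxy.reshape(-1, 1), refused)

# The model's predicted refusal rate still splits sharply by the
# attribute that was supposedly removed.
predictions = model.predict(proxy.reshape(-1, 1))
for group in (0, 1):
    rate = predictions[nationality == group].mean()
    print(f"Predicted refusal rate for nationality group {group}: {rate:.2f}")
```

Dropping the nationality column from the interim process does not by itself remove this effect, which is the gap the “well-known techniques” she mentions are designed to close.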

“You could say the algorithmic system at least did us a favour by making the operation of illegal criteria explicit, where they might otherwise have been buried informally in the decisions of individual immigration officers. And indeed one argument for such systems is their consistency and non-arbitrariness. So it’s a tough one,” she added.

Earlier this year, the Dutch government was ordered to halt its use of an algorithmic risk-scoring system for predicting which social security claimants might commit benefits or tax fraud, after a local court found it violated human rights law.

In another interesting case, a group of UK Uber drivers is challenging the legality of the gig platform’s algorithmic management of them under Europe’s data protection framework.




