Two internet providers are tracking and collecting records of the websites visited by their customers as part of a secretive Home Office trial, designed to work out whether a national bulk surveillance system would be useful for national security and law enforcement.
Details about the data collection experiment are limited, emerging via an obscure regulatory disclosure and a report in Wired, prompting campaigners to warn of a lack of transparency over data being “hoovered up into a surveillance net”.
Under the two trials, the Home Office is working with the National Crime Agency to harvest “internet connection records” (ICRs) – information about which websites a customer visited, when they did so and how much data they downloaded.
The metadata, as it is known, does not detail the specific pages visited on a given website, such as theguardian.com. But it can nevertheless point to a lot of personal information about an individual. That could include health or financial information, revealed because a browser visits a certain site or category of site frequently.
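The Home Office has not published the format of the records being collected. Purely as an illustration of how category-level metadata alone can hint at sensitive traits, the following Python sketch assumes hypothetical field names and an invented domain-to-category mapping:

```python
from collections import Counter
from dataclasses import dataclass
from datetime import datetime

# Hypothetical shape of an internet connection record (ICR); the real
# format used in the trials has not been published.
@dataclass
class ConnectionRecord:
    customer_id: str
    domain: str          # the site visited, but not the specific page
    timestamp: datetime  # when the connection was made
    bytes_down: int      # how much data was downloaded

# Invented mapping from domains to sensitive categories, for illustration.
SENSITIVE_CATEGORIES = {
    "nhs.uk": "health",
    "diabetes.org.uk": "health",
    "stepchange.org": "debt advice",
}

def infer_interests(records: list[ConnectionRecord]) -> Counter:
    """Count how often a customer's browsing falls into sensitive categories.

    No page content is recorded, yet frequent visits to a category of
    site can still hint at a person's health or financial circumstances.
    """
    counts: Counter = Counter()
    for record in records:
        category = SENSITIVE_CATEGORIES.get(record.domain)
        if category is not None:
            counts[category] += 1
    return counts

# Two visits to health-related sites already suggest a health concern.
records = [
    ConnectionRecord("c1", "nhs.uk", datetime(2019, 7, 1, 9, 0), 120_000),
    ConnectionRecord("c1", "diabetes.org.uk", datetime(2019, 7, 2, 21, 30), 80_000),
]
print(infer_interests(records))  # Counter({'health': 2})
```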
The Open Rights Group, a digital campaign group, complained that details about the Home Office trial were insufficient.
“This is a fairly staggering lack of transparency around mass data collection and retention,” said policy manager Heather Burns.
“We should have the right to not have every single click of what we do online hoovered up into a surveillance net on the assumption that there might be criminal activity taking place,” she added.
Home Office sources indicated that it was taking advantage of powers in the Investigatory Powers Act 2016 to test what data can be acquired, how useful it is in practice, and how it might be used in investigations.
The act was described as giving the UK government “the most extreme surveillance in the history of western democracy” by whistleblower Edward Snowden when it passed into law nearly five years ago.
It aimed to legalise a range of snooping and hacking tools for the security services that was unmatched at the time by any other country in western Europe or even the US, and followed Snowden’s revelations of the scale of mass surveillance that UK and US intelligence agencies were already undertaking.
The first trial began in July 2019 after the Home Office “retention notice” request was approved by the Investigatory Powers Commissioner’s Office (Ipco) and related to an unnamed internet provider. A second trial relating to another phone company began in October of the same year, according to the regulator’s annual report.
Quick guide: The use and abuse of algorithms
Finance
Algorithms are widely used to accept and reject applications for loans and other financial products, and egregious discrimination is widely thought to occur. In 2019, for example, Apple co-founder Steve Wozniak found that when he applied for an Apple Card he was offered a credit limit 10 times that of his wife, although they shared various bank accounts and other credit cards. Apple’s partner for the card, Goldman Sachs, denied making decisions based on gender.
Policing
Software is used to allocate policing resources on the ground and to predict how likely an individual is to commit, or be a victim of, a crime. Last year, a Liberty study found that at least 14 UK police forces have plans to use crime prediction software. Such software is criticised for creating self-fulfilling crime patterns, ie sending officers to areas where crimes have occurred before, and for the discriminatory profiling of ethnic minorities and low-income communities.
Social work
Local councils use “predictive analytics” to highlight particular families for the attention of child services. A 2018 Guardian investigation found that Hackney, Thurrock, Newham, Bristol and Brent councils were developing predictive systems either internally or by hiring private software companies. Critics warn that, aside from concerns about the vast amounts of sensitive data they contain, these systems incorporate the biases of their designers and risk perpetuating stereotypes.
Job applications
Automated systems are increasingly used by recruiters to whittle down pools of jobseekers, invigilate online tests and even interview candidates. Software scans CVs for keywords and generates a score for each applicant (a minimal sketch of this scoring stage follows the guide). Higher-scoring candidates may be asked to perform online personality and skills tests, and ultimately the first round of interviews may be carried out by bots that use software to analyse facial features, word choices and vocal indicators to decide whether a candidate advances. Each of these stages is based on dubious science and may discriminate against certain traits or communities; such systems learn bias and tend to favour the already advantaged.
Offending
Algorithms that assess a criminal’s chances of reoffending are widely used in the US. A ProPublica investigation of the Compas recidivism software found that black defendants were often predicted to be at a higher risk of reoffending than they actually were, while white defendants were often predicted to be less risky than they were. In the UK, Durham police force has developed the Harm Assessment Risk Tool (Hart) to predict whether suspects are at risk of reoffending. The force has refused to reveal the code and data upon which the software makes its recommendations.
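As a minimal sketch of the CV keyword-scoring stage described in the job applications entry above – the keywords, weights and threshold here are invented, and real recruitment systems are proprietary and far more complex:

```python
# Invented keyword weights; a real system's vocabulary is not public.
KEYWORD_WEIGHTS = {
    "python": 3,
    "project management": 2,
    "agile": 1,
}

def score_cv(cv_text: str) -> int:
    """Sum the weights of the keywords found in a CV (case-insensitive)."""
    text = cv_text.lower()
    return sum(weight for keyword, weight in KEYWORD_WEIGHTS.items()
               if keyword in text)

def shortlist(applicants: dict[str, str], threshold: int = 3) -> list[str]:
    """Return the applicants whose keyword score meets the threshold.

    Matching on surface keywords penalises candidates who describe the
    same skills in different words, which is one way such systems end
    up favouring applicants who resemble an existing template.
    """
    return [name for name, cv in applicants.items()
            if score_cv(cv) >= threshold]

# The second CV describes similar experience but misses the exact
# keywords, so it is filtered out before a human ever sees it.
cvs = {
    "A": "Led agile teams; Python developer with project management skills.",
    "B": "Ran software teams and built tools in a scripting language.",
}
print(shortlist(cvs))  # ['A']
```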
A spokeswoman for Ipco said the trial was continuing, and part of its purpose was to ensure that the collection of metadata was “necessary and proportionate”.
She added: “Once a full assessment of the trial has been carried out, a decision will be made on whether there is a case for national rollout.”
The National Crime Agency confirmed it was working with the government on the trial.
“We are supporting the Home Office-sponsored trial of internet connection record capability to determine the technical, operational, legal and policy considerations associated with delivery of this capability,” the police agency said.