
The Future of Biometric Technology for Policing and Law Enforcement
Informing UK Regulation

Sam Stockwell, Megan Hughes, Carolyn Ashurst and Nóra Ní Loideáin

March 2024

Contents

About CETaS
Acknowledgements
Executive Summary
Recommendations
Introduction
1. Definitions and Taxonomies
1.1. Defining biometrics: limits and issues
1.2. Understanding biometrics as a spectrum
2. Opportunities and Benefits for Public Safety
2.1. Current applications
2.2. Future trends and applications
3. System Risks and Challenges
3.1. Reliability: Performance, bias, and scientific validity
3.2. Concerns around data collection and sharing
3.3. Appropriate use and impacts to civil liberties
3.4. Trust and transparency
3.5. Challenges for evaluation
4. Legal Risks and Challenges
4.1. Overview and limitations of UK biometrics regulation
4.2. Overview and limitations of UK biometrics policy
5. Public Attitudes to Biometrics
5.1. Comfort and trust vary by application and organisation
5.2. High levels of concern shown for a wide range of risks
5.3. A strong desire for explicit regulation and bans on certain use cases
5.4. Most people perceive benefits will outweigh concerns
6. Alternative Policy and Regulatory Options
6.1. Insights from the EU AI Act
6.2. Improving biometric governance and oversight
About the Authors
Appendix 1. CETaS Public Opinion Survey

About CETaS

The Centre for Emerging Technology and Security (CETaS) is a research centre based at The Alan Turing Institute, the UK's national institute for data science and artificial intelligence. The Centre's mission is to inform UK security policy through evidence-based, interdisciplinary research on emerging technology issues. Connect with CETaS at cetas.turing.ac.uk. This research was supported by The Alan Turing Institute's Defence and Security Programme. All views expressed in this report are those of the authors, and do not necessarily represent the views of The Alan Turing Institute or any other organisation.

Acknowledgements

The authors are grateful to all those who took part in a research interview or workshop for this project, without whom the research would not have been possible. The authors are also very grateful to Fraser Sampson at Sheffield Hallam University for their Strategy Advisor role on the project, as well as to Marion Oswald, Lindsey Chiswick, Campbell Cowie, Matthew Boakes, Malak Sadek and Chris for their valuable feedback on an earlier version of this report. Design for this report was led by Michelle Wronski.

This work is licensed under the terms of the Creative Commons Attribution License 4.0, which permits unrestricted use, provided the original authors and source are credited. The license is available at: https://creativecommons.org/licenses/by-nc-sa/4.0/legalcode.

Cite this work as: Sam Stockwell, Megan Hughes, Carolyn Ashurst and Nóra Ní Loideáin, "The Future of Biometric Technology for Policing and Law Enforcement: Informing UK Regulation," CETaS Research Reports (March 2024).

Executive Summary

This CETaS Research Report explores the future of biometric technology for UK policing and law enforcement. It analyses technical trends and highlights where new regulatory measures may be required to facilitate responsible uses and restrictions of these systems for public safety purposes. The study findings are informed by a diverse range of views, gathered from the existing literature, research interviews with 35 experts, and a workshop with policing, government and regulator officials. This study also incorporates the most up-to-date public survey on biometric systems, which considers future technological developments and their regulatory implications, involving a nationally representative sample of 662 members of the UK population.

There is ongoing disagreement over definitions of biometric technology. Current discourse tends to define biometric technology as systems used for the purposes of uniquely identifying an individual or verifying their identity. However, this definition is outdated and does not account for the full range of biometrics-based systems now available. Therefore, this report argues for a broader conceptualisation of biometric technology, to account for the emergence of new inferential and classification biometrics-based systems such as age estimation, emotion recognition, and demographic-based categorisation.

Biometric systems offer new benefits for tackling crime by uniquely identifying individuals with a high degree of confidence, which in certain circumstances, such as crowded public places, would be extremely challenging for human operators to achieve manually. In some cases, biometrics might protect against the errors of human judgement. There is also growing demand among consumers and industry for more secure ways to protect personal data, due to increasingly sophisticated cybersecurity threats and identity fraud techniques. However, despite their many benefits, critics have argued that emerging biometric systems are prone to technical flaws, pose risks to human rights and in some cases lack scientific validity. This has led to calls for certain biometric systems to be heavily regulated or outright banned. For instance, the EU's forthcoming AI Act is likely to implement a general ban on live facial recognition technology with exemptions for specific policing and law enforcement use cases, as a means of attempting to balance benefits and risks. With discussions ongoing in the UK over how best to regulate AI technology, public attention will also turn to what future regulatory model the UK will pursue for emerging biometric systems.

Our study shows that, in the next 5-10 years, the types of biometric systems and data available are likely to broaden dramatically. Moving beyond the prevailing purposes of uniquely identifying or verifying individuals, the same technology may be used to make inferences about someone's behaviour or emotional state, or to classify them into demographic groups, despite significant concerns over the scientific validity and potential benefits of such use cases. New developments such as frictionless biometric formats that require little or no physical contact, and multimodal systems which combine multiple biometric data sources, could also improve the reliability of biometric systems and, in turn, enhance law enforcement capabilities.

This research reinforces existing evidence that the UK's legal framework for biometrics is inadequate and in need of reform. Current laws are failing to keep pace with changes to biometric technology, which risks undermining public confidence and trust in these systems. Most notably, the current legal framework does not adequately distinguish between tried and tested, scientifically valid biometric systems (such as fingerprint identification, DNA analysis and facial matching) and novel, often untested inferential or classificatory systems such as age estimation, emotion recognition and gait analysis.

Nevertheless, the public survey conducted for this project has demonstrated support for police and law enforcement use of biometric identification and verification systems such as live facial recognition, but not for the use of inferential systems such as polygraph testing or emotion recognition. In general, the UK public is marginally optimistic about the benefits that biometrics could provide for crime reduction. However, many respondents expressed anxiety over the adequacy of safeguards to protect individuals from a range of risks, such as data misuse and the discriminatory implications of certain emerging use cases.

1. 60% of respondents reported that they were comfortable with policing and law enforcement applications involving identification biometric systems (e.g. facial recognition to identify criminal suspects in crowded areas). In contrast, only 29% were comfortable with the use of inferential systems (e.g. polygraphs).
2. Respondents reported higher levels of trust in the use of biometric systems by public sector organisations such as police forces (79%) and the NHS (66%), as opposed to commercial entities, particularly employers (42%) and retailers (38%).
3. Most respondents (57%) were quite uncomfortable or very uncomfortable with biometric data sharing schemes between police forces and the private sector for public safety activities, such as tackling shoplifting.
4. Respondents suggested that most biometric use cases should be explicitly regulated rather than outright banned. In terms of outright bans, however, most respondents believed that the use of novel biometric systems in job interviews to assess performance (63%), and tracking student or employee engagement (60%), should both be banned.
5. Most respondents (53%) reported that the benefits of biometrics will outweigh the concerns (either greatly or marginally), whereas 24% thought that the concerns will outweigh the benefits (either greatly or marginally).

To reassure the general public, maximise the benefits of biometric technologies and protect individual rights, our recommendations emphasise the need to simplify the complex web of existing regulatory and policy measures; introduce new protections for emerging biometric systems; and standardise testing and evaluation procedures. This should be achieved through updating existing biometrics legislation and developing new codes of practice for certain data types and systems. Such measures should address the risks, harms and purpose of specific use cases, distinguishing between established technologies and novel use cases that may involve classification or inferential systems. Any new regulation or codes must apply consistent cross-sector standards for any systems used for public safety purposes, and establish mandatory requirements for independent system auditing and testing.

Recommendations

New legal definitions

1. Future legislation should introduce a new legal definition of biometrics-based data, to take into account how different types of biometric data may be used for purposes other than uniquely identifying individuals, such as inference or classification.

New codes of practice and guidance

2. The UK Equality and Human Rights Commission should issue a new code of practice for compliance with the public sector equality duty when using identification, classification or inferential biometric systems. The code should highlight protections needed to prevent group-level discrimination and profiling risks, recognising the prevailing focus on individual-level protections within current equality and human rights law.
3. The College of Policing should develop new Authorised Professional Practice (APP) for retrospective facial recognition (RFR), to address the specific risks that may arise from its use, given existing APP only covers live facial recognition (LFR).
4. The Home Office should issue new guidance on the appropriate collection, retention, use and deletion of biometric facial images and voice samples by police and law enforcement agencies. This will provide greater legal clarity to existing data protection regulations that fall outside of DNA and fingerprint samples.
5. The Police Digital Service (PDS) should issue new guidance for standardising third-party biometric system procurements, with reference to pre-defined system requirements and evaluation processes, to ensure that future system transfers adhere to minimum standards of accountability, transparency and technical performance.

Policing transparency

6. The National Police Chiefs' Council (NPCC) should establish a nationwide, public biometrics deployment register for police procurement and use of biometric systems. The UK Government's Algorithmic Transparency Recording Standard could provide a basis for such a register.
7. The NPCC should consult with academics, civil society organisations and regulators to establish standardised procedures for public communications campaigns around the use of biometric systems. These should go beyond mechanisms that aim to inform and towards those that actively involve affected communities, such as physical engagements, surveys or focus groups, to improve public confidence in future policing deployments.

Regulator responsibilities

8. The Information Commissioner's Office (ICO) should ensure that any market outreach for upcoming regulatory sandboxes to test biometric systems includes engagement with policing and law enforcement agencies, which will help to pre-empt and address system risks associated with promising innovations for public safety.
9. The ICO should also develop a new risk management framework to address the range of risks from biometric systems which fall outside of data protection legislation.

Future regulatory and policy measures

10. Alongside requirements in existing legislation and technical standards, any future regulatory or policy measures for biometric technologies must:
a. Identify the purpose, risks and harms of different use cases to inform appropriate levels of oversight and regulation.
b. Integrate a system-focused approach, including creating a list of specific use cases which could be amended based on new technical or legal developments. This should include banning scientifically untested use cases if the risks and potential harms to individuals are judged to be unacceptable.
c. Outline a consistent set of mandatory standards and accountability measures which all organisations must adhere to when using biometric systems for public safety activities, particularly given current legal ambiguities with private sector deployments.
d. Mandate auditing and evaluation procedures by an independent body, such as the National Physical Laboratory (NPL). These procedures could be based on existing ISO standards (e.g. ISO/IEC 19795-2:2007, ISO/IEC 19795-6:2012 and ISO/IEC 19795-1:2021) to ensure that any systems are certified to an appropriate standard before deployment.

11. Any changes to UK biometrics regulation should take into account the distinct legal frameworks of devolved administrations. This will help to apply a more consistent governance approach and reduce the risk that new measures conflict with separate biometric laws in Scotland or Northern Ireland.

Public deliberation

12. The CETaS survey data demonstrated significant variation in public attitudes, with particular concern over the harms that could arise from emerging biometric systems, mixed levels of trust in organisations and preferences for stronger regulation. Inclusive roundtable discussions should be organised by police forces or relevant government departments (e.g. the Home Office) on future biometric technologies to identify areas of positive feedback and concern.

Assessing proportionality

13. Organisations using biometric systems for policing or law enforcement should adopt clear frameworks for proportionality assessments, such as the CETaS framework for assessing proportionality of privacy intrusion of automated analytics.
14. Early-stage testing of biometric systems should seek to minimise potential negative impacts on the public, such as through using consenting volunteers, synthetic datasets and testing in controlled environments before moving to live testing with members of the public. The time period, purpose, impacts, and evaluation methodology of pilots should also be clearly articulated to regulators and the public.

Minimum system requirements

15. The NPL should work with the British Standards Institution (BSI)'s IST/44 biometrics committee to establish mandatory requirements that must be met in the design, deployment, and evaluation of biometric systems. These should include minimum error rates, demographic fairness requirements and human operator considerations across all environmental conditions.
16. The NPL should also test any early-stage biometric systems where there is a lack of consensus on their scientific evidence base. If such assessments cannot establish appropriate assurance, the system in question should be prohibited for use by policing and law enforcement agencies.

Testing and evaluation standards

17. BSI's IST/44, in consultation with other relevant standards bodies, should explore updating existing standards for the testing and evaluation of biometric systems to incorporate further sociotechnical considerations. These should include:
a. Potential consequences of system deployments beyond individual privacy, given the increased sensitivity and permanence of biometric data compared with other personal data types.
b. The role of human error and how to mitigate against it, owing to how the effectiveness of biometric systems can be influenced by human factors such as sensor positioning and data handling.
c. A requirement for testing and evaluation procedures to be conducted and periodically reviewed throughout the system lifecycle.

Introduction

What are biometrics?

The term biometrics is derived from the Greek words bio- (meaning life) and metric- (meaning measurement). The literal meaning therefore refers to the measurement of an individual's life characteristics.[1] In practice, the term is most commonly associated with types of samples, data and systems which together link back to a specific individual based on certain characteristics. More recently, the term has also become associated with data and systems aiming to predict certain attributes (such as age) or states (such as alertness) of individuals.

Historically, systems which utilised biometric data worked by obtaining physical material from an individual (e.g. a fingerprint mark) that human experts analysed to extract unique features such as their specific fingerprint ridges. One method would then involve manually comparing these features against a pre-existing sample of a person to determine if they are who they say they are (verification). This process was relatively straightforward and resulted in few errors. Another method involved comparing these features against a database of multiple different samples, to determine if they match to a specific person on that database (identification).[2] Unlike verification, this latter process was more challenging given the number of comparisons, therefore risking higher error rates.[3]

Biometric systems convert the relevant features into a biometric template, which stores the necessary information in a convenient form for comparison (see Figure 1). The term biometric data is often reserved to refer to these resultant features or templates rather than the initial sample, particularly in law. The process of converting a sample (which may be physical or digital) into templates is not necessarily immediate and may in itself be subject to errors and uncertainties.[4]

[1] "What are biometrics?," Scottish Biometrics Commissioner, https://www.biometricscommissioner.scot/biometrics/what-are-biometrics.
[2] "What is a biometric system, and how to secure it," Veridium, 19 July 2018, https://
[3] Government Office for Science, Biometrics: A Guide (2018), 4, https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/715925/biometrics_final.pdf; Interview with industry representative, 21 November 2023.
[4] Catherine Jasserand, "Experiments with Facial Recognition Technologies in Public Spaces: In Search of an EU Governance Framework," in Handbook on the Politics and Governance of Big Data and Artificial Intelligence, ed. Andrej Zwitter and Oskar J. Gstrein (Cheltenham: Edward Elgar Publishing, 2023), 5, https://

Figure 1: Stages of biometric data collection and processing
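The verification and identification processes described above can be made concrete with a short sketch. The feature vectors, the cosine-similarity measure and the 0.8 threshold below are illustrative assumptions rather than details taken from this report; real systems derive templates from samples such as fingerprint ridges or facial geometry.

```python
# Illustrative sketch of 1:1 verification vs 1:N identification over biometric
# templates. Templates are modelled as plain feature vectors; the similarity
# measure and threshold are assumptions chosen only for demonstration.
import math

def cosine_similarity(a, b):
    """Cosine similarity between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def verify(probe, enrolled_template, threshold=0.8):
    """1:1 match: is the probe the person they claim to be?"""
    return cosine_similarity(probe, enrolled_template) >= threshold

def identify(probe, database, threshold=0.8):
    """1:N match: return the best-matching identity in the database, if any.
    Every extra comparison is another chance of a false match, which is why
    identification typically carries higher error margins than verification."""
    best_id, best_score = None, 0.0
    for identity, template in database.items():
        score = cosine_similarity(probe, template)
        if score > best_score:
            best_id, best_score = identity, score
    return (best_id, best_score) if best_score >= threshold else (None, best_score)

# Hypothetical enrolled templates.
database = {
    "subject_A": [0.9, 0.1, 0.3],
    "subject_B": [0.2, 0.8, 0.5],
}
probe = [0.88, 0.15, 0.28]
print(verify(probe, database["subject_A"]))  # 1:1 check against a claimed identity
print(identify(probe, database))             # 1:N search across the whole database
```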

Digital technologies such as artificial intelligence (AI) have fundamentally changed biometric processing. AI can be understood as machines that perform tasks that would ordinarily require human brainpower to accomplish. Certain novel systems incorporating AI can now capture samples from individuals without needing the direct involvement of the subject, such as remote facial recognition (FR) systems which process images of individual faces. Algorithms can also extract and compare key features more quickly, accurately, securely and against a larger database of other templates than human analysts, speeding up potential matches and reducing error rates.[5] For instance, a biometric system using soft computing methods was found to have an error rate of 0.18% in 2022.[6]

AI has also created new possibilities to use biometric data for purposes other than verification or identification. New systems have been developed that use statistical correlations between biometric characteristics and certain traits with the aim of classifying individuals into different demographic categories (e.g. age or ethnicity), or to infer emotions and psychological states. These systems have proved controversial due to concerns over their scientific validity and ethical implications. The applications of these systems could therefore, if embedded within society, lead to people being negatively impacted.[7]

[5] Yash Rawat et al., "The Role of Artificial Intelligence in Biometrics," in 2023 2nd International Conference on Edge Computing and Applications (ICECAA) (19-21 July 2023): 622-626, DOI: 10.1109/ICECAA58104.2023.10212224.
[6] Vani Rajasekar et al., "Enhanced multimodal biometric recognition approach for smart cities based on an optimized fuzzy genetic algorithm," Scientific Reports 12, no. 662 (January 2022), DOI: 10.1038/s41598-021-04652-3.
[7] Matthew Ryder QC, The Ryder Review: Independent legal review of the governance of biometric data in England and Wales (Ada Lovelace Institute: June 2022), 3-7, https://www.adalovelaceinstitute.org/wp-content/uploads/2022/06/The-Ryder-Review-Independent-legal-review-of-the-governance-of-biometric-data-in-England-and-Wales-Ada-Lovelace-Institute-June-2022.pdf.

Figure 2: Examples of established biometric data types

Figure 3: Examples of contested biometric data types

As discussed in the following sections, current discourse tends to define biometric technology as systems used for the purposes of uniquely identifying an individual or verifying their identity. However, this definition is outdated and does not account for the full range of biometrics-based systems now available. Therefore, this report argues for a broader conceptualisation, adopting the following definition of biometric technologies:

Computer-based systems which collect and process physiological data or behavioural data. This data can be used for numerous purposes, for instance to identify an individual, verify their identity, categorise them into different groups, or make inferences about their psychological or emotional states.

We recognise that this expanded categorisation goes beyond existing legal definitions of biometric data. However, as will be shown, this broader conceptualisation is essential to cover the full range of new risks and regulatory considerations arising from new and emerging biometrics-based systems.

Why are biometrics important?

The collection, analysis and sharing of our biometric data has now become an everyday occurrence, from enabling individuals' identities to be verified at borders to providing convenient access to personal devices such as mobile phones. For police and law enforcement agencies, biometric data is used routinely to identify suspects present at crime scenes through DNA or fingerprint analysis, and increasingly in crowded public spaces using live facial recognition.

New generative AI tools that produce highly realistic artificial content have already demonstrated the vulnerability of traditional cybersecurity techniques. Across 2023, there was a 704% increase in face swaps, a type of deepfake (or fake content designed to look real) that can be used to trick biometric systems for fraudulent activities.[8] Given recent evidence showing poor human detection rates against these system attacks, new security risks may therefore arise by opting not to use biometric measures.[9]

[8] "New Threat Intelligence Report Exposes the Impact of Generative AI on Remote Identity Verification," iProov, Press Release, 7 February 2024, https://
[9] Matthew Groh et al., "Deepfake detection by human crowds, machines, and machine-informed crowds," Proceedings of the National Academy of Sciences 119, no. 1 (2022), https://doi.org/10.1073/pnas.2110013119.

There are also potential consequences of human error when biometric samples are involved in legal cases. The so-called Prosecutor's Fallacy is where the claimed probability of a sample (e.g. DNA) being matched to an accused person in court is taken to be absolute, while neglecting how contextual factors such as margins of error and sample size can mean that there is still a likelihood of false association.[10]
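To illustrate why a small random-match probability does not by itself establish guilt, the arithmetic below works through a hypothetical example; the figures are invented purely for demonstration and are not drawn from this report or any real case.

```python
# Illustrative base-rate arithmetic behind the Prosecutor's Fallacy.
# All numbers are hypothetical and chosen only to make the point visible.
random_match_probability = 1e-6   # P(match | innocent): chance an unrelated person matches
population = 5_000_000            # plausible pool of alternative suspects

# Expected number of innocent people in the pool who would also match:
expected_innocent_matches = random_match_probability * population  # = 5.0

# If the true source is, a priori, equally likely to be any candidate, then given
# one matching defendant the probability that they are the source is roughly:
p_source_given_match = 1 / (1 + expected_innocent_matches)  # ~0.17, not 0.999999

print(expected_innocent_matches, round(p_source_given_match, 2))
```

The fallacy lies in quoting the first number (one in a million) as if it were the probability of innocence, when the relevant quantity depends on how many other people could plausibly have produced the same match.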

Nevertheless, biometric systems themselves also present numerous risks. Some of these relate to reliability concerns. Biometrics typically involve comparative analysis of one sample against another (or many others). They also exploit statistical correlations between biometric traits and characteristics. Due to the probabilistic nature of statistical analysis, there is a risk of false positives and false negatives, whether as a result of poor quality data or human error.[11] Equally, behavioural inferences (such as detecting emotions) are likely to be highly unreliable due to a lack of scientific validity in the research of inferential systems.

Other risks relate to human rights and proportionality, owing to the unique properties of biometric data compared to other forms of data used to identify individuals. This type of personal data can reveal other sensitive information such as ancestry, health, or demographics. Given this, there are important considerations over the intrusiveness into an individual's privacy when biometric data is collected. This includes scenarios where police forces are capturing facial images of civilians walking past a system within a crowded public area as part of an operation, who may be unaware of this processing taking place. Possession of biometric data can also enable more effective physical surveillance, due to the fact that the information processed is inseparable from a person (e.g. their facial features or fingerprints). As a result, a sense of pervasive monitoring in public could erode individuals' rights of free assembly or expression due to fears over the potential consequences.[12]

Alongside these problems, the UK's existing biometrics legal framework, which consists of a complex web of overlapping equality, human rights, data protection and police powers legislation, has also been criticised as insufficient and unclear.[13] As far back as 2016, the then UK Biometrics Commissioner, who was responsible for overseeing the use and retention of biometric data, raised similar concerns in his annual report, stressing the lack of clarity over future governance arrangements.[14] The publication of the Ryder Review in 2022 further articulated the need for new legal measures to be considered.[15] Despite these mounting concerns, there has been a lack of corresponding progress in changes to regulation and policy, as the law falls further behind innovation.

[10] David Spiegelhalter and Anthony Masters, "Covid, false positives and conditional probabilities," The Guardian, 25 April 2021, https://
[11] Government Office for Science (2018), 4-5.
[12] Jasserand, 2023, 21.
[13] Brian Plastow, "Is Scotland sleepwalking towards its place within a UK surveillance state in 2024?," 8 January 2024, https://www.biometricscommissioner.scot/media/uhbowbhn/sbc-opinion-piece-january-2024.pdf; Interview with government representative, 10 October 2023.
[14] Paul Wiles, Annual Report 2016 (Commissioner for the Retention and Use of Biometric Material: March 2017), https://assets.publishing.service.gov.uk/media/5a74eb86ed915d3c7d528fb5/CCS207_CCS0917991760-1_Biometrics_Commissioner_ARA_Accessible.pdf.
[15] Matthew Ryder QC (2022), 3-7.

Methodology and structure

Within this context, CETaS conducted a research project on the future of biometric technology for UK policing and law enforcement. This study aims to move beyond the current debates that specifically concentrate on FR technology, instead considering the full range of biometric technologies currently available and on the near-term horizon.

Research questions

RQ1: What is the state of the art in relation to emerging biometric technologies with potential security implications?
RQ2: Where are the current gaps in existing biometric policy and regulation? Is a regime of safeguards for the retention of specific biometric data still fit for purpose, as opposed to a system that primarily regulates the use of biometric technologies?
RQ3: What regulatory and policy actions are needed to ensure that emerging developments in this area are adequately covered for policing and law enforcement usage?
RQ4: What distinctions should be drawn between the use of biometrics by public and private sector organisations when considering the potential for intrusion and the safeguards required?

Methods

1. Literature review: the research team analysed relevant biometric regulatory and policy measures in the UK, policing and law enforcement trends, use cases and risks from future developments in biometric technology.
2. Interviews: semi-structured, anonymised interviews between August and November 2023 with 35 participants across academia, civil society organisations, UK government, policing and industry. Participants were identified through a purposive sampling strategy to ensure informed responses.
3. Workshop: an invitation-only workshop held in January 2024 entitled "Stress-Testing Future Biometrics Policy Options for Policing and Law Enforcement". This session gathered 12 experts from law enforcement, regulators and other relevant areas of government, to review different potential regulatory approaches for biometrics.
4. Survey: to investigate public perceptions towards biometric technologies, CETaS conducted a survey with a representative sample of 662 UK-based respondents. Key themes from the survey can be found in Section 5, while more detail on the methodology and results can be found in Appendix 1.
5. Freedom of Information (FOI) requests: CETaS submitted a series of FOI requests to the Home Office in November 2023. These were based on emergent findings from the literature. The results can be found in Sections 2 and 3.

Limitations

1. This study is focused on the use of biometric systems within a policing and law enforcement context. That is, when the technology is used by public bodies or private sector organisations acting on behalf of the state for activities designed to protect the public.
2. This study does not explore the regulation of covert uses of biometric systems by policing and law enforcement, which fall under different legislative frameworks. As such, the study is solely concerned with overt deployments of this technology.
3. The focus of this report is on data-driven biometric technology. Therefore the report does not explore non-digital biometric samples taken and retained by police and law enforcement agencies.
4. While the report does cover technical trends in the development of biometric systems, particularly those that have implications for policy and legislation, detailed analysis of state-of-the-art adversarial attacks on biometric systems is out of scope.

Structure

This report is structured as follows. Section 1 explores the definition and scope of the terms biometric data and biometric systems, summarising disagreements between experts in the field. Section 2 provides an overview of the opportunities and benefits of biometric systems, exploring both current applications and future trends. Section 3 discusses the risks and challenges posed by biometric systems, such as demographic bias and human rights considerations. Section 4 provides an overview of existing legal and policy frameworks for biometric technologies and their limitations. Section 5 details the key findings from the public survey, while Section 6 concludes with a discussion of alternative regulatory and policy measures which the UK could adopt to improve governance and oversight.

1. Definitions and Taxonomies

This section introduces our understanding of biometrics, where disagreement lies between stakeholders on what should be included under this concept, and the need to move towards understanding biometrics as a spectrum rather than a discrete category of technologies.

1.1 Defining biometrics: limits and issues

Within the existing literature, many taxonomies are used to categorise different types of biometrics. The most common distinctions include the following (although specific definitions often vary).

- Soft vs hard biometric data. Hard biometrics are integral characteristics of a person which can be used for uniquely identifying them to a high degree of confidence, such as fingerprints and DNA. Conversely, soft biometrics relate to learned characteristics (e.g. gait/walking patterns) and non-unique integral characteristics (e.g. eye colour). While learned characteristics are weaker identifiers for matching back to a specific person, non-unique integral characteristics could be used in addition to other biometric samples to increase the accuracy of identifying a specific person.[16]
- Physiological vs behavioural biometric data. The former rely on physical characteristics, including someone's face or fingerprints, while the latter relate to patterns in behavioural characteristics of the human body, such as keyboard strokes or gait.[17]
- Direct vs remote biometric systems. Direct biometric systems rely on physical contact with the subject. These include traditional biometric data types, like requiring someone to present their fingerprint. Remote systems, however, can capture the necessary information from remote sensors and processes. For instance, face and gait recognition do not need the active engagement of an individual.[18]

[16] Government Office for Science (2018), 3.
[17] Christiane Wendehorst & Yannic Duller, Biometric Recognition and Behavioural Detection: Assessing the ethical aspects of biometric recognition and behavioural detection techniques with a focus on their current and future use in public spaces (European Parliament's Policy Department for Individuals' Rights and Constitutional Affairs: August 2021), 67-68, https://www.europarl.europa.eu/RegData/etudes/STUD/2021/696968/IPOL_STU(2021)696968_EN.pdf.
[18] "Remote biometric identification: a technical & legal guide," EDRi, 23 January 2023, https://edri.org/our-work/remote-biometric-identification-a-technical-legal-guide/.

Table 1: Common distinctions of biometric data types

Biometric data | Hard or soft | Physiological or behavioural | Direct or remote
DNA | Hard | Physiological | Direct
Fingerprint | Hard | Physiological | Direct
Face | Hard | Physiological | Remote
Voice | Soft | Behavioural | Remote
Gait | Soft | Behavioural | Remote
Facial expressions | Soft | Behavioural | Remote
Keystroke | Soft | Behavioural | Remote

Experts continue to disagree over these definitions, and what should be interpreted as falling under the term biometrics.[19] This includes whether emergent applications should be included (e.g. those using physiological and behavioural data to classify individuals into categories rather than identify them).[20] Existing legal frameworks in the UK and EU define biometric data in a narrow sense, where it is only considered as such when it involves the unique identification of a person.[21] However, several interviewees and legal experts have criticised this current framing on the basis that, when originally drafted, these regulations did not account for how the integration of AI has led to the rise of new functions for biometrics (such as classification and inferential uses) that fall outside existing frameworks and may pose a risk to human rights.[22] Some therefore advocate for expanding the definition of biometrics, to ensure that such additional applications are scrutinised and explicitly regulated.

In contrast, some interviewees believed that expanding the legal concept of biometric data or systems would have more negative implications than retaining the existing one. Widening the definition to include some technologies that are widely believed to lack scientific validity may damage the reputation of more reliable systems and vendors.[23] For instance, controversial emotion recognition systems could then be permitted for integration into applications, on the basis of being legally defined as a form of biometrics. Yet any negative impacts on individuals resulting from such integration could lead to an erosion in trust with other biometric systems (e.g. verification models) which offer security benefits.[24]

[19] Interview with academic representative, 25 September 2023; Interview with academic representative, 28 September 2023; Interview with academic representative, 10 October 2023; Interview with industry representatives, 12 October 2023.
[20] For an example in the literature of a more expansive concept of biometrics, see: https://www.adalovelaceinstitute.org/wp-content/uploads/2022/06/Countermeasures-the-need-for-new-legislation-to-govern-biometric-technologies-in-the-UK-Ada-Lovelace-Institute-June-2022.pdf, 16-20.
[21] "What is special category data?," ICO, https://ico.org.uk/for-organisations/uk-gdpr-guidance-and-resources/lawful-basis/special-category-data/what-is-special-category-data/#scd4.
[22] Interview with academic representative, 25 September 2023; Interview with academic representative, 28 September 2023; Interview with academic representative, 10 October 2023; Matthew Ryder QC (2022), 3-7; ICO, Biometrics: Foresight (October 2022), 8-11, https://ico.org.uk/media/about-the-ico/documents/4021971/biometrics-foresight-report.pdf.
[23] Interview with industry representatives, 12 October 2023.
[24] Gloria Fuster and Michalina Peeters, Person identification, human rights and ethical principles: Rethinking biometrics in the era of artificial intelligence (Panel for the Future of Science and Technology: December 2021), 20-21, https://www.europarl.europa.eu/RegData/etudes/STUD/2021/697191/EPRS_STU(2021)697191_EN.pdf.

1.2. Understanding biometrics as a spectrum

Ongoing disagreements over fundamental definitions of biometric technologies have undermined the ability to revise existing regulatory frameworks, ensuring they remain up to date and address new issues that arise.[25] A large portion of interviewees therefore believed that it was better to understand biometric technologies as a spectrum.[26]

A spectrum of biometric systems

There is a need to clarify the different types of biometric systems that have emerged in recent years, to facilitate more precise discussion about the implications and risks associated with distinct uses. The taxonomy outlined in Table 2 was developed from these insights.

Table 2: Taxonomy of biometric systems

Biometric system | Purpose | Relative degree of public acceptability[27]
Verification | Uses a 1:1 match to determine if an individual is who they say they are | High. Typically requires active user involvement, and the match is only compared against a single data source, so may have lower margins of error
Identification | Uses a 1:N (one-to-many) match to determine if an individual corresponds to a member of a pre-existing database of multiple biometric samples | Medium. Typically does not require users' direct involvement, and data comparisons against a larger database may increase the margin of error
Classification | Used to classify individuals into different groups | Low. Includes some systems that lack scientific validity, as well as some that raise significant ethical concerns (e.g. racial profiling)
Inferential | Used to make inferences about someone's psychological or emotional state | Very low. Includes some systems that lack scientific validity, as well as some with potentially malicious uses (e.g. coercive manipulation)

[25] Interview with academic representative, 25 September 2023.
[26] Interview with government representative, 27 October 2023; Interview with government representative, 31 October 2023; Interview with government representative, 2 November 2023; Interview with academic representative, 2 November 2023.
[27] Based on the results of the CETaS public survey (see Section 5) and interview analysis.

The spectrum of biometric data

As with the subtleties between biometric systems, similar analysis is applicable to biometric data. Biometric modalities are sometimes assumed to involve one key feature that is extracted (e.g. geometric measurements from a person's face to verify their identity). In reality, these modalities can contain a diversity of data types. From a person's face, other data features such as wrinkles or eye shape could be extracted for classification purposes such as estimating that person's age.

Current legislation typically covers some of these processes, in that only data which is used for identification or verification purposes is considered "biometric". While other "special category" data as defined in UK GDPR could protect against certain use cases (such as when the information could reveal one's ethnic origins), this may not address inferences made about one's behaviour or emotions, which could still create significant risks to human rights. This is because they operate in a fuzzy zone within the realm of personal data, where although the data may not always be linked back to a specific person, individuals cannot easily change learned or inner traits compared to other forms of personal data.[28] A key finding from the survey was the level of concern over emotion recognition systems, due to the belief that emotions were highly sensitive information to an individual. As such, more nuance may be needed to ensure that stricter protections are in place, such as by defining a new legal term for biometric-based data.[29] Table 3 shows how this could work in practice, with a hypothetical example of a police officer analysing a voice recording to assist in catching a suspect who fled a crime scene.

Table 3: Biometric data vs. biometric-based data

Data category | Summary
Biometric data | A police officer uses an AI system to capture a voiceprint of a fleeing suspect from a voice recording which is unique to that individual.
Biometric-based data | A police officer uses an AI system to assess the likely demographic profile of a fleeing suspect (e.g. age group, gender and native vs. non-native English speaker) from certain elements of their voice recording, such as pitch and tone.

[28] Wendehorst & Duller (2021), 68.
[29] Ibid., 69.
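The Table 3 distinction can also be sketched in code terms, as below; the class names, fields and placeholder values are hypothetical and do not describe any specific system referenced in this report.

```python
# Hypothetical sketch of the Table 3 distinction for a single voice recording.
# Names and values are invented for illustration only.
from dataclasses import dataclass

@dataclass
class BiometricData:
    """A voiceprint template intended to uniquely identify one individual."""
    voiceprint: list[float]          # speaker-embedding vector

@dataclass
class BiometricBasedData:
    """Inferred or classificatory attributes derived from the same recording."""
    estimated_age_band: str          # e.g. "25-34"
    inferred_gender: str
    native_speaker_likelihood: float # probabilistic estimate, not a unique identifier

def extract_voiceprint(recording: bytes) -> BiometricData:
    # Placeholder: a real system would compute a speaker embedding here.
    return BiometricData(voiceprint=[0.12, -0.40, 0.77])

def infer_profile(recording: bytes) -> BiometricBasedData:
    # Placeholder: a real system would classify pitch, tone and accent features.
    return BiometricBasedData("25-34", "male", 0.62)

recording = b"...raw audio bytes..."
print(extract_voiceprint(recording))  # biometric data: links back to one person
print(infer_profile(recording))       # biometric-based data: group-level inference
```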

Taken together, the range of biometric systems and data types described above may lead to new applications in different sectors (see below). Although certain use cases could be beneficial, others may pose risks to individuals (e.g. job screening based on perceived emotions discriminating against neurodivergent candidates).

Table 4: Current and future applications of biometric systems and data

Use case | Biometric system | Modality and feature
Age verification for alcohol purchases at supermarket self-checkouts[30] | Classification | Face (geometry and features)
Targeted demographic-based advertising (e.g. age and gender-specific) | Classification | Face (geometry and features)
Pain monitoring for non-verbal patients[31] | Inferential | Eyes (pupil dilation); heart (pulse); respiratory system (breathing rate)
Online job interview candidate screening based on perceived emotions[32] | Inferential | Face (facial expression); eyes (eye movement)

[30] "Asda to trial digital ID at self-checkouts," Asda, 31 January 2022, https://
[31] Will Knight, "Job Screening Service Halts Facial Analysis of Applicants," WIRED, 1 December 2021, https://
[32] Katina Michael et al., "Biometrics and AI Bias," IEEE Transactions on Technology and Society 3, no. 1 (March 2022): 1-8, DOI: 10.1109/TTS.2022.3156405.

2. Opportunities and Benefits for Public Safety

This section provides an overview of current and potential future applications of biometric systems for policing and law enforcement. Broader technological and societal trends relating to the development of biometric technologies are also discussed.

2.1. Current applications

It is important to note that, particularly within a UK context, the roles of police and law enforcement are distinct. The former relate to the 45 territorial and 3 special police forces which are responsible for a broad range of policing duties in specific regions. The latter relate to agencies with non-regional duties and specific areas of remit, such as the National Crime Agency (NCA) and UK Border Force. While both types of organisations may deploy similar biometric systems, they are often used for different purposes.

FOI requests obtained by CETaS and further analysis highlight some of the existing biometric data types used by police and law enforcement, as well as the scale of data samples available for tackling crime:

- As of October 2023, there were 16,572,608 custody facial images held on the Police National Database (PND).[33] It is highly likely that images of individuals who were never charged or were cleared of all charges remain stored on the PND indefinitely.[34]
- As of 31st March 2022, there were 870,705 subject profile records and 685,063 crime scene DNA profile records held on the National DNA Database (NDNAD).[35]
- As of 31st March 2022, there were 27,168,685 fingerprint forms relating to 8,562,878 individuals and 2,009,989 unidentified crime scene marks held on the National Fingerprint Database and National Automated Fingerprint Identification System (now known collectively as IDENT 1).[36]

[33] Freedom of Information request submitted to the Home Office by CETaS on 24 November 2023.
[34] Cahal Milmo and Mark Wilding, "Hundreds of thousands of innocent people on police databases as forces expand use of facial recognition tech," iNews, 29 September 2023, https://inews.co.uk/news/police-secretive-facial-recognition-database-millions-innocent-people-2635445.
[35] Ben Snuggs, Forensic Information Databases Strategy Board Annual Report: April 2021-March 2022 (Home Office and National Police Chiefs' Council: May 2023), 9, https://assets.publishing.service.gov.uk/media/646c7fe6a726f60013cebc09/Forensic_Information_Databases_Strategy_Board_Annual_Report_2021-22_Web_Accessible_002_.pdf.
[36] Ibid.

The main examples of how law enforcement currently use biometrics include:

- Border security and immigration: UK Border Force collects fingerprints and facial images to support the UK immigration system, such as when verifying the identity of an individual seeking to enter the UK who cannot produce a document establishing identity, nationality or citizenship.[37]
- Domestic security: fingerprint scanners are currently being rolled out by the Home Office to replace the use of ankle tags to monitor individuals facing deportation.[38]
- Air and rail travel: FR technology is also used in ePassport gates available at some British air and rail ports, while trials of passport-free e-gates using FR systems are due to begin in 2024.[39]

By contrast, UK policing currently uses biometrics in two ways:

- DNA and fingerprints can be used for the purpose of evidential forensics, such as informing evidence during a court trial as to whether someone was present at a crime scene.
- Other biometrics can be used for intelligence and crime prevention purposes. Suspects or wanted individuals may have their images compared against CCTV or police watchlists to help determine who the person is and assist in narrowing down potential geographical areas where they may be present based on recent footage.[40] DNA and fingerprints can also be collected from arrested persons (and crime scenes) to confirm an individual's identity for investigations.[41]

The use of the polygraph is discussed further below.

Some of the most prominent biometric technologies in use by policing and law enforcement agencies include mobile fingerprint scanners and FR systems. Pronto mobile devices, first deployed by West Yorkshire Police in 2018, enable officers to check fingerprints from the field against national database records in real time.[42] These increase the efficiency of verification and identification processes, which previously would have necessitated arresting and holding an individual in custody while conducting hours of enquiries.[43]

[37] HM Government, Biometric enrolment: policy guidance (Home Office: February 2024), 4, https://assets.publishing.service.gov.uk/media/653a41a9e6c9680014aa9b62/Biometric+information+-+enrolment.pdf.
[38] Nicola Kelly, "UK plans GPS tracking of potential deportees by fingerprint scanners," The Guardian, 13 January 2023, https://
[39] Ben Clatworthy, "No passports needed under Border Force e-gate plan," The Times, 1 January 2024, https://www.thetimes.co.uk/article/uk-flights-passports-border-force-queues-szdd39c5x.
[40] HM Government, Biometrics Strategy: Better public services, maintaining public trust (Home Office: June 2018), 12, https://assets.publishing.service.gov.uk/media/5b34f69c40f0b60b107a4a80/Home_Office_Biometrics_Strategy_-_2018-06-28.pdf.
[41] Ibid.

There are three types of FR used by UK police forces:[44]

- Operator Initiated Facial Recognition (OIFR): use of a mobile application to check a person of interest's identity against the police national database without having to take them into custody.
- Retrospective Facial Recognition (RFR): use of images supplied after an event or incident for comparison against national custody images of individuals on the PND.
- Live Facial Recognition (LFR): use of live video capture to identify a person of interest in real time.

Currently, police use of LFR has been limited and restricted to a small number of forces including the Metropolitan Police Service (MPS), South Wales Police (SWP) and Essex Police. However, any new regulatory changes or operational requirements could lead to greater uptake across forces. Indeed, the MPS is already working with a group of large retailers on the Pegasus initiative, where the police are analysing CCTV images and using RFR technology to identify prolific shoplifters.[45]

Police forces do not currently use novel classification and inferential biometrics, but polygraph technology (also known as the lie detector) is used by the Probation Service to monitor people convicted of sexual, domestic abuse and terrorism-related offences, ensuring they comply with harm prevention orders.[46] Nevertheless, recent analysis has found that individuals arrested on suspicion of sex offences have also been subjected to the polygraph, despite not having a conviction, which represents a potential misuse of policing powers.[47]

[42] Motorola Solutions, "Pronto Biometrics Fingerprint Identification," Motorola Solutions, 2018, https://
[43] Home Office (2018), 6-8.
[44] Home Office, "Police use of Facial Recognition: Factsheet," Home Office in the media blog, 29 October 2023, https://homeofficemedia.blog.gov.uk/2023/10/29/police-use-of-facial-recognition-factsheet/.
[45] Divya Talwar and Eleanor Layhe, "Small shops call for aid to tackle brazen shoplifters," BBC News, 16 September 2023, https://www.bbc.co.uk/news/uk-66819837.
[46] Cahal Milmo and Mark Wilding, "Police forces may be exceeding powers in the use of lie detectors," iNews, 1 January 2024, https://inews.co.uk/news/uk-police-forces-expanding-lie-detector-tests-2822226.
[47] Ibid.

2.2. Future trends and applications

Technological trends

Advances in biometric technologies are likely to proliferate out to 2030 and beyond due to innovation driven by a range of factors, from consumer demand to increasingly sophisticated adversarial attacks on systems. Some of these advances will relate to new forms of biometric processing.

Table 5: Future biometric formats

Format trend | Summary
Cancellable biometrics | Biometric templates are transformed in such a way that, if they are compromised, the original feature cannot be determined. A new version would then need to be reissued (akin to resetting a password).[48]
Multi-modal biometrics | These systems collect data from several biometric modalities (such as keystroke analysis alongside fingerprint recognition) or use a single modality to extract multiple forms of data (e.g. extracting gaze estimation and pupil diameter data from a single eye image).[49] This trend may help mitigate some of the existing problems associated with individual systems, such as sensor accuracy.[50]
Frictionless biometrics | Where little (or no) physical contact or pausing is required to gather biometric data.[51] This will offer enhanced efficiency and convenience, but the ability to use these systems at a distance without requiring someone's awareness is likely to prompt questions surrounding consent for data processing.[52]

[48] Interview with industry representative, 29 September 2023.
[49] ICO, Biometrics: insight (ICO: October 2022), 6, https://ico.org.uk/media/about-the-ico/documents/4021972/biometrics-insight-report.pdf.
[50] Interview with academic representative, 22 September 2023.
[51] ICO, Biometrics: insight (ICO: October 2022), 5, https://ico.org.uk/media/about-the-ico/documents/4021972/biometrics-insight-report.pdf.
[52] ICO, Biometrics: foresight (ICO: October 2022), 5, https://ico.org.uk/media/about-the-ico/documents/4021971/biometrics-foresight-report.pdf.
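The cancellable-template idea in Table 5 can be sketched as a keyed, revocable transformation of the raw template: matching is performed in the transformed space, and a compromised reference is revoked by re-enrolling with a new key. The random-projection transform and parameter sizes below are illustrative assumptions only, not a production-grade scheme or one endorsed by this report.

```python
# Minimal sketch of cancellable biometrics: store only a key-seeded
# transformation of the template, never the raw template itself.
import random

def cancellable_transform(template, user_key, output_dim=8):
    """Project the template through a random matrix seeded by the user's key.
    If the stored reference leaks, a new key is issued and the old reference
    no longer matches anything derived with the new key."""
    rng = random.Random(user_key)
    transformed = []
    for _ in range(output_dim):
        weights = [rng.uniform(-1, 1) for _ in template]
        transformed.append(sum(w * x for w, x in zip(weights, template)))
    return transformed

template = [0.9, 0.1, 0.3, 0.7]  # hypothetical feature vector
stored_v1 = cancellable_transform(template, user_key="enrolment-key-1")
# After a suspected compromise, re-enrol with a fresh key ("reset the password"):
stored_v2 = cancellable_transform(template, user_key="enrolment-key-2")
print(stored_v1 != stored_v2)  # True: the revoked reference is useless for matching
```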

Alongside these new formats, there will be changes related to both the wider design and performance capabilities of biometric systems. Further advancements in deep learning could enable improved FR in challenging conditions, such as reduced visibility and low-resolution cameras,[53] as well as with masked faces.[54] This could allow for more flexible deployment of the technology, without compromising accuracy.[55] Biometric systems could also become more accessible in the next decade as hardware miniaturisation allows for heightened portability, while software will increasingly be integrated into mobile phones and drones.[56]

As the private sector provides increased access to personal surveillance systems through the consumer market, the volume of potential biometric data could increase exponentially. This phenomenon is already underway; one in five British households uses doorbell cameras[57] and a quarter of British drivers use dashcams.[58] One interviewee believed that the abundance of images and videos shared online will contribute to this trend, and that biometric information can no longer be considered confidential.[59] With advances in AI lowering the barrier to entry for forging identities, veracity (the accuracy and reliability of biometric information used for verification) will play a key role in the future.[60]

187、ns were posited as potentially useful and worthy of further consideration by research participants.Three-dimensional(3D)FR or accurate iris recognition under imperfect conditions could be useful for identifying or verifying a known individual.61 Contactless fingerprint verification could be valuable

188、 for border control applications,as well as for the identification of missing 53 Katina Michel et al.,(2022).54 Marta Gomez-Barrero et al.,“Biometrics in the Era of COVID-19:Challenges and Opportunities,”IEEE Transactionss on Technology and Society 3,no.4(December 2022):307-322,DOI:10.1109/TTS.2022.

189、3203571.55 Interview with government representative,24 October 2023.56 Interview with industry representatives,31October 2023;Interview with academic representative,22 September 2023;Cahal Milmo and Mark Wilding,“Police across UK equipped with live facial recognition bodycams,”iNews,25 November 2023

190、,https:/inews.co.uk/news/police-uk-live-facial-recognition-bodycams-2775720.57 Dominic Penna,“More households install alarms and doorbell cameras over crime fears,”The Telegraph,1 May 2023,https:/www.telegraph.co.uk/news/2023/05/01/people-turn-to-diy-security-amid-crime-fears/.58 RAC,“New app could

191、soon turn every car into a speed camera and report traffic offences at the touch of a button,”RAC Drive News,20 March 2023,https:/www.rac.co.uk/drive/news/driving-tech/new-app-could-soon-turn-every-car-into-a-speed-camera/.59 Interview with industry representative,21 November 2023.60 Interview with

192、policing representative,6 October 2023.61 Interview with industry representative,12 October 2023.Sam Stockwell,Megan Hughes,Carolyn Ashurst and Nra N Loidein 29 persons.62 Emergent systems such as gait analysis and gaze estimation(which predicts where a person is looking)are likely to see greater mu

193、lti-sectoral use.63 For instance,future gait analysis systems could indicate whether someone is concealing a weapon,64 while gaze estimation could be used to monitor individual behaviour considered to be suspicious for instance to detect whether they are conducting hostile reconnaissance.65 One inte

194、rviewee suggested it may actually be unethical not to use new biometric systems especially if they are shown to better protect vulnerable individuals compared to existing tools and humans.66 Voice recognition technology in particular was mentioned as being currently underutilised for public safety p

195、urposes.67 The large volume of available audio samples on video and social media platforms is an enabling factor behind the fast-paced development of voice recognition systems.68 INTERPOLs global voice database(SiiP)is their third-largest biometric database,demonstrating the utility of this particul

196、ar modality.69 Figure 4:Potential future biometric trends for policing and law enforcement 62 Interview with policing representative,6 October 2023.63 Interview with industry representative,12 October 2023.64 Interview with industry representative,31 October 2023.65 Virginio Cantoni et al.,“Gaze-bas

197、ed biometrics:An introduction to forensic applications,”Pattern Recognition Letters 113,(October 2018):54-57,https:/doi.org/10.1016/j.patrec.2016.12.006.66 Interview with industry representative 2,12 October 2023.67 Interview with policing representative,6 October 2023.68 Fieke Jansen,Javier Snchez-

198、Monedero,and Lina Dencik,Biometric identity systems in law enforcement and the politics of(voice)recognition:The case of SiiP.Big Data&Society 8,no.2(2021),DOI:205395.69 Ibid.The Future of Biometric Technology for Policing and Law Enforcement:Informing UK Regulation 30 3.System Risks and
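Returning to the multi-modal trend flagged in Table 5, the sketch below shows one common way such systems are combined: score-level fusion, where each modality produces a normalised match score and a weighted sum drives the final decision. The scores, weights and acceptance threshold here are invented for illustration and do not reflect any operational system.

```python
def fuse_scores(scores: dict[str, float], weights: dict[str, float]) -> float:
    """Weighted score-level fusion over whatever modalities are present for this probe."""
    total = sum(weights[m] for m in scores)
    return sum(weights[m] * scores[m] for m in scores) / total

# Hypothetical normalised match scores from two independent subsystems for one probe.
scores = {"face": 0.62, "voice": 0.88}
# Hypothetical weights, e.g. derived from each modality's measured reliability.
weights = {"face": 0.5, "voice": 0.5}

fused = fuse_scores(scores, weights)
print(f"fused score = {fused:.2f}, accept = {fused >= 0.70}")
# Here a borderline face score is reinforced by a strong voice score, which is one way
# multi-modal systems can mitigate the weaknesses of an individual sensor or modality.
```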

3. System Risks and Challenges

This section presents risks and challenges associated with the use of biometric systems for policing and law enforcement.

3.1. Reliability: Performance, bias, and scientific validity

The impact of environmental factors on performance

Environmental factors, such as lighting conditions or image quality, can significantly impact the performance of biometric systems. These include factors related to how data subjects interact with the system (e.g. finger placement for fingerprint systems),70 how practitioners operate the system (e.g. the placement of cameras),71 and other external factors (e.g. lighting conditions).72 These environmental factors have implications for the real-world reliability of systems under different conditions. At worst, such errors in a law enforcement context could result in wrongful arrests (from false positives) or allow perpetrators to avoid apprehension (false negatives).73

It is also important to recognise how human operators themselves can be affected by environmental factors. Fatigue, distractions, multi-tasking commitments and time pressures could all undermine the optimal performance of biometric systems or exacerbate existing technical deficiencies.74

Demographic bias

Of particular concern is the risk of differences in system performance for different demographic groups.75 One participant described how fingerprints can be less reliable for the elderly, Asian females, or those undergoing chemotherapy, due to less well-defined fingerprint ridges.76 Increased awareness of demographic bias has resulted in efforts to collect more diverse training datasets, and more concerted efforts to measure performance for certain groups. However, these are typically restricted to a narrow range of dimensions (e.g. race or gender, while ignoring factors such as disability) and categories (comparing male versus female, while ignoring identities such as non-binary). Mitigating demographic bias is particularly critical in law enforcement (e.g. entry and freedom of movement), given the potential impact on individuals' human rights. Close attention should therefore be paid to how bias may enter into biometric systems, including within datasets or system outputs.77 Bias can also be introduced by humans-in-the-loop (operators using the system), which can be more difficult to test and detect.78 This should be taken into account during testing and evaluation processes to avoid potential discriminatory implications arising from the wider deployment setup.

70 Interview with academic representative, 22 September 2023.
71 Science, Innovation and Technology Committee, "Governance of artificial intelligence (AI) - Oral evidence," 24 May 2023, 28, https://committees.parliament.uk/oralevidence/13201/pdf/; Interview with government representative, 24 October 2023.
72 Interview with academic representative, 6 October 2023.
73 Madeleine Chang, Countermeasures: the need for new legislation to govern biometric technologies in the UK (Ada Lovelace Institute: June 2022), https://www.adalovelaceinstitute.org/wp-content/uploads/2022/06/Countermeasures-the-need-for-new-legislation-to-govern-biometric-technologies-in-the-UK-Ada-Lovelace-Institute-June-2022.pdf.
74 Interview with industry representative, 21 November 2023.
75 Pawel Drozdowski et al., "Demographic bias in biometrics: A survey on an emerging challenge," IEEE Transactions on Technology and Society 1, no. 2 (2020): 89-103; Chang (2022); Interview with industry representative, 12 October 2023; Interview with industry representative 2, 12 October 2023; Interview with industry representative 2, 20 October 2023.
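To make the idea of measuring performance for different groups concrete, the sketch below computes a false match rate (FMR) and false non-match rate (FNMR) per demographic group from labelled comparison trials. The records, group labels, threshold and sample sizes are invented purely for illustration; real evaluations (such as NIST's testing) use far larger and more carefully constructed datasets.

```python
from collections import defaultdict

def per_group_error_rates(trials, threshold=0.8):
    """Compute false match rate (FMR) and false non-match rate (FNMR) per demographic group.

    Each trial is (group, score, is_genuine): `score` is the comparison score and
    `is_genuine` is True when the probe and reference really are the same person.
    """
    counts = defaultdict(lambda: {"fm": 0, "impostor": 0, "fnm": 0, "genuine": 0})
    for group, score, is_genuine in trials:
        c = counts[group]
        if is_genuine:
            c["genuine"] += 1
            if score < threshold:       # genuine pair wrongly rejected
                c["fnm"] += 1
        else:
            c["impostor"] += 1
            if score >= threshold:      # impostor pair wrongly accepted
                c["fm"] += 1
    return {
        group: {
            "FMR": c["fm"] / c["impostor"] if c["impostor"] else None,
            "FNMR": c["fnm"] / c["genuine"] if c["genuine"] else None,
        }
        for group, c in counts.items()
    }

# Invented comparison records: (demographic group, comparison score, genuine pair?).
trials = [
    ("group_a", 0.91, True), ("group_a", 0.72, True),
    ("group_a", 0.40, False), ("group_a", 0.85, False),
    ("group_b", 0.95, True), ("group_b", 0.88, True),
    ("group_b", 0.30, False), ("group_b", 0.55, False),
]
print(per_group_error_rates(trials))
# With this toy data group_a shows FMR = FNMR = 0.5 while group_b shows 0.0 for both:
# the kind of gap that evaluation and procurement processes need to surface and explain.
```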

Questions over scientific validity

While the debate over identification systems has often focused on skewed datasets and environmental factors, critics of certain inferential systems have pointed to more fundamental limitations. Emotion detection, which aims to infer emotional states from facial images, speech and other characteristics, has been widely criticised as pseudoscience.79 Critics suggest there is a lack of evidence that external expressions can infer emotional states, and that how emotions are displayed depends on social and cultural contexts. Some particular classification systems have also been labelled as pseudoscience, such as those that predict age based on bone structures, or sexuality based on facial features.80 Consequently, police and law enforcement should be particularly cautious of the hype and overselling of these new and often unproven technologies and, in cases where the scientific evidence base is uncertain, prohibit the use of such systems due to the risks that may arise.81

Adversarial attacks

Although not a focus of this report, the increasing advancement of adversarial attacks is an ongoing risk. These attacks aim to trick biometric systems, for example to achieve false matches for verification (spoofing), or to avoid detection from identification systems (masking), such as through obscuring sensors.82 Progress in generative AI techniques, along with the increased availability of publicly available biometric data (e.g. facial images), has enabled increased performance in some attacks.83 Threats to system security are particularly important in a law enforcement context, for example actors attempting to spoof verification systems to gain unauthorised access to a system, device or premises.

76 Interview with industry representative 2, 12 October 2023; "Data protection requirements when using biometric data," ICO, https://ico.org.uk/for-organisations/uk-gdpr-guidance-and-resources/guidance-on-biometric-data/data-protection-requirements-when-using-biometric-data/.
77 Malak Sadek, Sam Stockwell and Marion Oswald, Evaluating the use of facial recognition in UK policing (The Alan Turing Institute: December 2023); Interview with industry representative 2, 12 October 2023; Interview with industry representative, 12 October 2023.
78 Ibid; Interview with industry representative, 21 November 2023.
79 Meredith Whittaker et al., AI Now report 2018 (AI Now Institute: December 2018), https://ainowinstitute.org/publication/ai-now-2018-report-2; Amba Kak and Sarah Myers West, 2023 Landscape: Confronting Tech Power (AI Now Institute: April 2023), https://ainowinstitute.org/2023-landscape; Lisa Feldman Barrett et al., "Emotional expressions reconsidered: Challenges to inferring emotion from human facial movements," Psychological Science in the Public Interest 20, no. 1 (2019): 1-68; Interview with industry representative, 26 September 2023; Interview with academic representative, 2 October 2023.
80 Interview with academic representative, 28 September 2023.

3.2. Concerns around data collection and sharing

AI-based systems often require very large training datasets. This has resulted in several controversies whereby biometric images or videos are scraped from the Internet without the explicit knowledge or consent of data subjects, who may not have expected their data to be used for such purposes.84 In 2022, Clearview AI was fined for breaching data protection laws by the ICO, which accused the company of unlawfully collecting Internet data on individuals from the UK and globally to create a FR database that was accessible for police departments.85

Leakage or theft of biometric data can be particularly harmful to affected individuals and organisations given its sensitive nature. Anyone can, for example, change a compromised password. However, since biometric information is encoded in an individual's body or behaviour, such information may be impossible to replace if compromised without additional protections. Genetic data can also contain information about relatives, posing a privacy threat beyond the individual in question.86

81 Interview with academic representative, 25 September 2023.
82 Interview with academic representative, 22 September 2023; Interview with academic representative, 6 October 2023; Interview with industry representative 2, 12 October 2023; Interview with academic representative, 22 September 2023.
83 Interview with industry representative, 21 November 2023.
84 Nicolas Kayser-Bril, "Face recognition data set of trans people still available online years after it was supposedly taken down," Algorithm Watch, 15 September 2022, https://algorithmwatch.org/en/dataset-face-recognition.
85 "ICO fines facial recognition database company Clearview AI Inc more than £7.5m and orders UK data to be deleted," ICO News and Blogs, 23 May 2022, https://ico.org.uk/about-the-ico/media-centre/news-and-blogs/2022/05/ico-fines-facial-recognition-database-company-clearview-ai-inc/.
86 Interview with academic representative, 2 November 2023.

3.3. Appropriate use and impacts to civil liberties

Many applications of biometric technologies have been criticised on ethical grounds, rather than performance. These relate to the use of systems in ways that can negatively impact human rights, such as during protests, or by oppressive regimes to identify minority groups, track journalists and for other authoritarian purposes.87 While many inferential systems (such as emotion detection) have been described as pseudoscientific, highly subjective and context-dependent, accurate inferential systems would also raise serious concerns. These systems could intrusively reveal intimate states, such information could be misused (e.g. for manipulation), and individuals may have little control over the signals used to acquire such predictions.88 Several classification systems that aim to predict demographic characteristics such as race and gender may also be problematic, given that these can be contextual, fluid, and political, with many individuals viewing themselves through a multicultural lens.89

Human rights considerations

Concerns have been raised that biometric systems can disproportionately interfere with human rights, including the right to respect for private life (Article 8 of the European Convention on Human Rights, ECHR), freedom of expression (Article 10 ECHR), and freedom of assembly and association (Article 11 ECHR). LFR in particular has been a topic of heated debate regarding its impact on civil liberties, especially given its increasing use in public spaces.90

Beyond these immediate impacts, some have highlighted ambiguity over how biometric systems could also lead to unpredictable impacts and harm in future. Even if systems are introduced in constrained settings under tight controls for a specific purpose, there is a risk of mission or function creep and the expansion of use over time.91 Indeed, one interviewee was concerned that discussions on FR had moved from static one-off uses towards their use in conjunction with body-worn cameras,92 which could contribute to a problematic normalisation of increased surveillance.93

Proportionality and consent

Since the use of biometric technologies can impact human rights, whether such rights should be infringed upon for policing or law enforcement requires a legal framework that meets the standards for the rule of law and the linked requirements of necessity and proportionality.94 Participants at a recent workshop hosted by The Alan Turing Institute and the Metropolitan Police highlighted that proportionality assessments could not be limited to either technical or legal evaluations, instead requiring consideration of policies, procedures and evaluation mechanisms.95 The CETaS Structured Framework for Assessing Proportionality of Privacy Intrusion of Automated Analytics provides one such framework for bringing these elements together, which aims to cover the entire lifecycle of analytic systems.96 To improve proportionality during system trials, UK policing should also ensure that all procedures include a range of testing options, from highly controlled environments towards operational conditions, moving along as the testing (and subsequent necessary improvements) provides assurance of the system's reliability. This was reflected in France's approach, where it deployed FR technology with consenting volunteers who were compared against a watchlist containing a combination of their own and AI-generated faces.97

3.4. Trust and transparency

The risks and sensitivities around biometrics point to the importance of public understanding and trust. However, building and maintaining this trust first necessitates adequate transparency. A recurring challenge mentioned by policing interviewees was how officers should communicate strategies to the public, particularly given that the black-box nature of many AI systems can make it difficult to understand how outputs are reached.98

In their view, failure to clearly explain the benefits and oversight of these systems to local communities can pose challenges in enabling wider deployment for public safety activities. Police representatives emphasised that engagement with the public should address the following areas: who is in control of the system, who is defining the boundaries of use, and what benefits do these systems provide?99 However, it is important to note that these methods primarily focus on informing the public and other stakeholders. More is needed to ensure meaningful engagement and accountability, such as collaboration with communities through surveys or focus groups to build buy-in.100

87 Richard Van Noorden, "The ethical questions that haunt facial-recognition research," Nature 587, no. 7834 (2020): 354-359; Conor Healy and Donald Maye, "Punishing Journalists – PRC Province's Latest Mass Surveillance Project, Won by Neusoft – Powered by Huawei," IPVM, 29 November 2021, https:/
88 Wendehorst & Duller (2021).
89 Nenad Tomasev et al., "Fairness for unobserved characteristics: Insights from technological impacts on queer communities," in Proceedings of the 2021 AAAI/ACM Conference on AI, Ethics, and Society (New York: Association for Computing Machinery, 2021), 254-265; Ruha Benjamin, "Race after technology," in Social Theory Re-Wired: New Connections to Classical and Contemporary Perspectives, ed. Wesley Longhofer and Daniel Winchester (Routledge, 2023), 405-415.
90 Chang (2022).
91 Zara Rahman, Paola Verhaert, and Carly Nyst, Biometrics in the humanitarian sector (Oxfam: 2018), https:/
92 Interview with civil society representative, 6 November 2023.
93 Sadek et al. (2023).
94 "Human Rights Act 1998," Legislation.gov.uk, https://www.legislation.gov.uk/ukpga/1998/42/contents; Sadek et al. (2023).
95 Sadek et al. (2023).
96 Ardi Janjeva, Muffy Calder and Marion Oswald, "Privacy Intrusion and National Security in the Age of AI: Assessing proportionality of automated analytics," CETaS Research Reports (May 2023).
97 Jasserand (2023), 12-19.
98 Interview with policing representative, 6 October 2023; Interview with policing representative, 26 October 2023.
99 Interview with policing representative, 6 October 2023; Interview with policing representative, 26 October 2023.
100 Ada Lovelace Institute, Participatory Data Stewardship (Ada Lovelace Institute: 2021), https://www.adalovelaceinstitute.org/report/participatory-data-stewardship/.

3.5. Challenges for evaluation

Understanding system performance, for example through accuracy or error rates, is vital to identifying the potential impacts of deployment.101 However, testing strategies employed by developers often differ, making interpretation and comparisons difficult.102 Testing is also often done in idealised settings, meaning that real-world performance is likely to be worse in practice.103 Understanding how performance may vary for different demographic groups is also essential. Evaluation in this respect has improved following work by scholars, advocacy groups and journalists to surface such risks, though many believe more needs to be done. For instance, NIST conducted one specific test which demonstrated how all of the algorithms assessed exhibited different levels of biased performance based on gender, race, and age groups.104

To understand whether a particular system is appropriate for use, evaluation needs to go beyond system performance measures. One interviewee suggested the need for four stages of evaluation: technology evaluation, scenario evaluation, operational evaluation and continuous evaluation, with particular emphasis on the last stage, which is not referenced strongly in existing technical standards on biometric system evaluation.105 Assurance is not only needed that the model in isolation performs as intended, but that the wider outcomes of the system deployed in context are understood.106 This includes accounting for how the system is used in practice, how operators act on its outputs and broader societal impacts, such as surveillance risks.

101 Interview with industry representative 2, 12 October 2023.
102 Hitoshi et al., "The future of biometrics technology: from face recognition to related applications," APSIPA Transactions on Signal and Information Processing 10 (2021): e9.
103 Jansen et al. (2021).
104 Anil K. Jain, Debayan Deb, and Joshua J. Engelsma, "Biometrics: Trust, but verify," IEEE Transactions on Biometrics, Behavior, and Identity Science 4, no. 3 (2021): 303-323.
105 Interview with industry representative, 29 September 2023.
106 Rosamund Powell and Marion Oswald, "Assurance of Third-Party AI Systems for UK National Security," CETaS Research Reports (January 2024), https://cetas.turing.ac.uk/publications/assurance-third-party-ai-systems-uk-national-security.
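The continuous-evaluation stage described above is the least standardised of the four, so the sketch below shows one very simple form it could take: tracking the rolling alert rate of a deployed identification system and flagging drift away from the rate measured during trials. The expected rate, tolerance and window size are invented values for illustration, not recommendations.

```python
import random
from collections import deque

class DeploymentMonitor:
    """Illustrative continuous-evaluation check: track the live alert rate of a deployed
    identification system and flag drift away from the rate measured during trials."""

    def __init__(self, expected_alert_rate: float, tolerance: float, window: int = 500):
        self.expected = expected_alert_rate   # rate observed in pre-deployment testing
        self.tolerance = tolerance            # acceptable absolute deviation
        self.recent = deque(maxlen=window)    # rolling window of recent outcomes

    def record(self, alert_raised: bool) -> None:
        self.recent.append(1 if alert_raised else 0)

    def drift_detected(self) -> bool:
        if len(self.recent) < self.recent.maxlen:
            return False                      # wait until the window is full
        observed = sum(self.recent) / len(self.recent)
        return abs(observed - self.expected) > self.tolerance

# Trials suggested roughly 2% of comparisons should raise an alert (invented figure).
monitor = DeploymentMonitor(expected_alert_rate=0.02, tolerance=0.03)
random.seed(0)
# Simulated live stream: the underlying alert rate creeps up from 2% to 8% partway
# through (e.g. a camera is re-sited or a watchlist grows), which a one-off
# pre-deployment test would not catch.
for i in range(5000):
    true_rate = 0.02 if i < 2500 else 0.08
    monitor.record(random.random() < true_rate)
    if monitor.drift_detected():
        print(f"Drift detected after {i + 1} events - trigger operational re-evaluation")
        break
```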

4. Legal Risks and Challenges

This section presents a summary of the current UK regulatory and policy framework governing police and law enforcement use of biometrics, as well as criticisms over gaps or risks that have emerged in recent years.

4.1. Overview and limitations of UK biometrics regulation

Overview of UK biometrics regulation

The UK's current biometrics regulatory framework comprises several pieces of legislation which stretch back decades. Instead of applying blanket coverage to biometric data or systems, different legal areas seek to govern biometrics in specific ways (see Figure 5). Yet as this section highlights, these various laws have been subject to criticisms over gaps and shortcomings.

Figure 5: Legal areas covering biometrics oversight in the UK

Limitations of UK biometric regulation: common law reliance

There have been several instances in the UK where different organisations unlawfully collected, retained, used or shared biometric data, or failed to apply biometric techniques correctly.107 These have had serious consequences, including miscarriages of justice (see Table 6). Recent announcements from the UK's senior policing minister on allowing police access to passport photos, alongside running FR searches against 50 million driving licenses, raise questions about the adequacy of measures in place for preventing future data misuse.108

Failure to keep pace with innovation has been exacerbated by confusion over where the legal boundaries lie with new biometric systems or data types. Despite the sheer amount of existing relevant legislation, many statutes were not enacted at a time when the capture, automated processing, and analysis of biometrics in real time from public spaces was feasible. As a result, legal rulings and other ad-hoc developments are being heavily relied upon to inform procedures and provide safeguards.109 While these developments have since improved practices, there are concerns over whether they can be extended to establish a clear nationwide basis for the long-term use of biometric systems.110 Although UK police forces have a core duty under common law to protect the public by detecting and preventing crime, these are broad powers and can sometimes be vague in terms of applying limits.111 The Bridges judgement (2020) in particular reflected how relying on common law powers did not give a specific legal basis authorising police use of FR systems.112

107 It is important to also note a recent non-UK legal ruling (Glukhin v. Russia), whereby the European Court of Human Rights ruled that the detaining of a protestor using FR systems following a demonstration violated the protestor's right to freedom of expression.
108 Tom Singleton, "Police access to passport photos risks public trust," BBC News, 4 October 2023, https://www.bbc.co.uk/news/technology-67004576; Joel R. McConvey, "UK bill would let police run facial recognition against all drivers licenses," Biometric Update, 21 December 2023, https:/
109 Nóra Ní Loideáin, "Lawfulness and Police Use of Facial Recognition in the UK: Article 8 ECHR and Bridges v South Wales Police," in Facial Recognition in the Modern State, ed. Rita Matulionyte and Monika Zalnieriute (Cambridge: Cambridge University Press, 2024), 19, https:/
110 Baroness Hamwee, "House of Lords Justice and Home Affairs Committee letter to the UK Home Secretary," 26 January 2024, https://committees.parliament.uk/publications/43080/documents/214371/default/.
111 Jennifer Brown, "Police powers: an introduction," House of Commons Library, 21 October 2021, https://commonslibrary.parliament.uk/research-briefings/cbp-8637/.
112 Ní Loideáin (2024), 17.

Table 6: Overview of major biometric legal incidents in the UK since the 1990s

- Miscarriages of justice. Examples: Stefan Kiszko's incorrect conviction (1992); Andrew Malkinson's incorrect conviction (2004). Both individuals were wrongfully imprisoned on the basis of incorrect biometric forensic samples used by the police, with advancements in techniques later acquitting them of committing any crimes.113
- Court rulings. Examples: S & Marper v United Kingdom (Grand Chamber) (2008); RMC & FJ v Commissioner of the MPS and Secretary of State for the Home Department (2012); Ed Bridges v South Wales Police (2020); Gaughran v the United Kingdom (2020). The UK Courts ruled, in several different legal cases, that the UK Government and policing officials had violated the law in relation to biometrics. This includes infringing on data protection law, as well as human rights law, with the deployment of LFR systems that breached privacy rights.114
- Database violations. Examples: Suprema (2019); HM Revenue and Customs (2019); Clearview AI (2022). Organisations had illegally collected, retained or used the biometric data of UK individuals.115

113 Evidence Based Justice Lab; Lauren Hirst and Tom Mullen, "Andrew Malkinson's rape conviction quashed after 20-year fight," BBC News, 26 July 2023, https://www.bbc.co.uk/news/uk-england-manchester-66310919.
114 "S and Marper v United Kingdom," Equality and Human Rights Commission, 8 June 2016, https:/; Buch, "Police retention of photographs unlawful, High Court rules," UK Human Rights Blog, 27 June 2012, https:/; Rees, "Facial recognition use by South Wales Police ruled unlawful," BBC News, 11 August 2020, https://www.bbc.co.uk/news/uk-wales-53734716.
115 Josh Taylor, "Major breach found in biometrics system used by banks, UK police and defence firms," The Guardian, 14 August 2019, https:/; Cox, "UK ICO challenges Clearview AI ruling," Pinsent Masons News, 21 November 2023, https:/

Concern over risk and regulator coverage

Historically, the primary focus of biometrics regulation has been on protecting an individual's personal data and their right to privacy, given the ability to uniquely identify an individual with a high degree of confidence using such data.116 However, there are other serious concerns over biometric systems among the public which fall outside of these considerations, such as the consequences for individuals if a system output is incorrect (e.g. wrongful arrest).117 Identification and classification techniques also raise group-level risks that may affect multiple individuals. The sensors on technology such as FR can capture data from their surroundings, beyond just the individual who is in primary view, which could therefore include unaware individuals walking past in the background. Moreover, existing equality and human rights laws are primarily focused on protecting the rights of the individual.118 As such, there may be ambiguity over how such legislation might apply to group-based classification systems, for instance a biometric system which categorises individuals into different demographic groups without necessarily identifying them.

The forthcoming Data Protection and Digital Information Bill is designed to simplify the UK's data protection rules, and includes amendments to the position of the Biometrics and Surveillance Camera Commissioner (BSCC), the only regulator with an explicit remit for biometrics in England and Wales.119 In abolishing the surveillance component of this role, oversight of biometrics is expected to be passed on to the ICO, which has remit over data protection.120 However, there are concerns over whether the ICO has sufficient resources and scope to cover the range of potential risks outlined above.121 The creation of a risk management framework could therefore help to alleviate these anxieties. The regulator does appear to be taking a forward-looking approach to the technology, through planning to use regulatory sandboxes to test innovations.122 Such experiments should include a focus on systems that may be used within law enforcement, which could help to pre-empt system risks with innovations that could be promising for public safety activities.

Ambiguous private sector oversight

Private sector use of biometric systems is growing.123 This trend has heightened concern that biometrics regulation predominantly applies to the public sector. Laws such as the Police and Criminal Evidence Act 1984 (PACE) and the Terrorism Act 2000 only regulate police or border security uses of biometric data. Yet even those broader in scope, like the Equality Act 2010, are only applicable to private organisations in limited circumstances (e.g. those performing public functions).124 Risks from this patchy oversight could materialise as some companies explore using biometric systems for tackling crime (e.g. identifying shoplifters).125 Given the low confidence in private companies using biometric systems,126 more is needed to increase public confidence in the safeguards in place.

116 Matthew Ryder QC (2022), 21.
117 See survey data in Section 5.
118 CETaS workshop, 22 January 2024.
119 Fussey & Webster (2023), 6-7.
120 Biometrics and Surveillance Camera Commissioner (2023).
121 Fussey & Webster (2023), 50-54.
122 "Our current areas of focus for the Regulatory Sandbox," ICO, https://ico.org.uk/for-organisations/advice-and-services/regulatory-sandbox/our-current-areas-of-focus-for-the-regulatory-sandbox/.
123 Jawahitha Sarabdeen, "Protection of the rights of the individual when using facial recognition technology," Heliyon 8, no. 3 (March 2022): 1-2, https://doi.org/10.1016/j.heliyon.2022.e09086.
124 ICO, "Data sharing: a code of practice," https://ico.org.uk/for-organisations/uk-gdpr-guidance-and-resources/data-sharing/data-sharing-a-code-of-practice/.
125 Josie Hannett and William McLennan, "Shoplifting: The small businesses using facial recognition cameras," BBC News, 11 October 2023, https://www.bbc.co.uk/news/uk-england-surrey-66982326.
126 See survey data in Section 5.

4.2. Overview and limitations of UK biometrics policy

Overview of UK biometrics policy

Policy acts as an additional layer to legislation in providing guidance for decision-making. Figure 6 provides an overview of relevant policy areas, while the rest of this section highlights corresponding gaps and limitations.

Figure 6: Policy areas covering biometrics in the UK

Limitations of UK biometrics policy: patchy codes of practice

The most prominent component of biometrics policy relates to codes of practice. These act as a set of either voluntary or legally enforceable guidelines that set minimum standards and promote compliance with the law. Codes can cover everything from the appropriate deployment of biometric systems to the correct processing and handling of biometric data. However, there are concerns that current codes are both outdated and insufficient. Existing codes focus almost exclusively on legal compliance when handling certain data types, like DNA and fingerprints.127 Policing and law enforcement are increasingly using FR technology that captures and stores facial images, while voice recognition is considered an area of future interest.128 The lack of tailored safeguards over these materials risks legal ambiguity which could be exploited.129 While some recent codes have been important in filling gaps in primary legislation, such as the College of Policing's APP on LFR use by UK police forces, they are equally limited in that they do not address other forms of FR (e.g. retrospective use). A new APP on retrospective FR would therefore be beneficial for legal clarity.130

Sociotechnical considerations in system evaluation standards

A number of international standards bodies publish best practices on biometric system specifications and quality management. This includes ISO/IEC JTC 1/SC 27 and 37, BSI's IST/44 and CEN-CENELEC's CEN/TC 224. Topics covered by these groups include standards on evaluation (e.g. ISO/IEC 19795-1:2021) which provide useful information on error rates, throughput rates and reducing bias.131 Nevertheless, there is a lack of wider sociotechnical considerations in existing standards, which are important for biometric systems given the degree of sensitive data involved.132 As such, relevant working groups within standards bodies, such as BSI's IST/44, should explore updating current standards on testing and evaluation. This could incorporate potential consequences of system deployments on surveillance concerns and chilling effects in public spaces, alongside the role of human error in potentially undermining the reliability of outputs (e.g. through the poor placement of sensors).

Public-private partnerships

Participants raised several concerns regarding current legal frameworks for public sector acquisition and use of commercial biometric systems.133 In comparison to the stringent accountability mechanisms placed on UK police and law enforcement agencies, the lack of comparable checks in the private sector means that there can be less transparency on system data and testing when sourced through industry.134 In response to an FOI request submitted for this study, the Home Office stated that the decision to purchase a biometric system on a trial basis is for each police force to take locally.135 While such local decision-making enables regional context to be taken into account, insufficient national oversight can lead to duplication of effort and a patchwork of safeguards and oversight mechanisms. With Essex Police recently announcing trials of LFR with equipment supplied by SWP, there remain questions over where accountability lies for the use of the system.136 One option is the creation of a nationwide registry of new biometric systems in policing, which would help to keep track of deployments and improve public transparency. Another option is the development of policing guidance that would streamline requirements applied during any transfers of technology between forces, as well as with procurements from the private sector.

127 Surveillance Camera Commissioner's Office, "What we talk about when we talk about biometrics," Surveillance Camera Commissioner's Office Blog, 12 October 2021, https://videosurveillance.blog.gov.uk/2021/10/12/what-we-talk-about-when-we-talk-about-biometrics/; Pete Fussey and William Webster, Independent report on changes to the functions of the Biometrics and Surveillance Camera Commissioner arising from the Data Protection and Digital Information (No. 2) Bill (CRISP: October 2023), 9, https://assets.publishing.service.gov.uk/media/653f7128e6c968000daa9cae/Changes_to_the_functions_of_the_BSCC.pdf.
128 Interview with academic representative, 25 September 2023; Interview with government representative, 24 October 2023.
129 Matthew Ryder QC (2022), 52.
130 College of Policing, Authorised Professional Practice: Live facial recognition (College of Policing: March 2021), 6, https://assets.college.police.uk/s3fs-public/2021-05/live-facial-recognition-app.pdf.
131 "ISO/IEC 19795-1:2021," ISO, May 2021, https://www.iso.org/standard/73515.html.
132 Interview with academic representative, 28 September 2023; Interview with industry representative, 29 September 2023.
133 Interview with policing representative, 6 October 2023; Interview with academic representative, 6 October 2023; Interview with policing representative, 11 October 2023.
134 Interview with policing representative, 6 October 2023.
135 Freedom of information request submitted to the Home Office by CETaS on 24 November 2023.
136 "Live facial recognition," Essex Police, https://www.essex.police.uk/police-forces/essex-police/areas/essex-police/au/about-us/live-facial-recognition/.

5. Public Attitudes to Biometrics

This section presents the findings of the CETaS survey, which involved a nationally representative sample of 662 UK-based respondents. As a nationally representative sample, we acknowledge that the sample is skewed towards demographic majority groups and may not therefore account for the specific concerns held by minority demographics. "Not sure" or "prefer not to say" responses have been excluded from the data visualisations in this section given the focus on key findings, which means that some of the total percentage counts do not add up to 100%. Additional information on methodology and the full survey results can be found in Appendix 1.

5.1 Comfort and trust vary by application and organisation

Respondents demonstrated a nuanced view of biometric technologies, with expressed levels of comfort varying significantly depending on the application (see Figure 7). For most policing and law enforcement applications, the majority of respondents feel comfortable with such use; namely using biometrics to verify identities at the UK border (85%) or trying to identify criminal suspects in crowded areas with low or high diversity (61% and 60% respectively) (see Figure 7). The exception was the use of biometric data to determine whether someone might not be telling the truth, for which only 29% selected that they felt comfortable. This finding correlates with a similar survey conducted by the Ada Lovelace Institute, which found FR technology applications that were used by the police for criminal investigations received much stronger support than other use cases.137

137 Ada Lovelace Institute, Beyond face value: public attitudes to facial recognition technology (Ada Lovelace Institute: September 2019), 8, https://www.adalovelaceinstitute.org/wp-content/uploads/2019/09/Public-attitudes-to-facial-recognition-technology_v.FINAL_.pdf.

Figure 7: Percentage of participants who selected that they feel comfortable with the given application

[Bar chart omitted: percentage of respondents comfortable with each application, ranging from 8% to 85%, across assessing job interview performance (e.g. gauging enthusiasm); voluntary digital ID schemes; tracking student or employee engagement (e.g. assessing attention levels); none of these; police forces determining whether someone might not be telling the truth; retailers estimating customer ages for age-restricted transactions; retailers trying to identify suspects linked to shoplifting or violence; mandatory digital ID schemes; trying to identify criminal suspects in crowded areas with high diversity; trying to identify criminal suspects in crowded areas with low diversity; accessing personal spaces (e.g. workplaces); verifying identities at the UK border; and unlocking personal devices (e.g. phone).]

When it came to trust in organisations, respondents reported higher levels for public sector organisations such as police forces (79%) and the NHS (66%), but much lower for commercial entities, particularly employers (42%) and retailers (38%). This graph combines the average percentages from two separate survey questions on trust.138

138 For detailed survey results, please see Appendix 1.

Figure 8: Percentage of participants who selected whether they trust completely, trust somewhat or do not trust at all/very much different organisations to use biometric systems responsibly

[Stacked bar chart omitted: trust levels ("trust completely", "trust somewhat", "do not trust at all/very much") across nine organisation types: retailers (e.g. seeking to identify shoplifters); employers; local authorities; other government departments; private healthcare providers (e.g. care home staff); the third sector (e.g. seeking to locate missing children); intelligence agencies; police forces; and NHS staff.]

When asked whether participants were comfortable with biometric data sharing schemes between police forces and the private sector for public safety activities, the majority of respondents were quite or very uncomfortable with this type of data sharing process (57%).

Figure 9: Percentage of participants who selected whether they were very comfortable, comfortable, quite uncomfortable or very uncomfortable with biometric data sharing between the police and private entities

[Bar chart omitted: 24% very uncomfortable, 33% quite uncomfortable, 28% comfortable, 7% very comfortable.]

Responses to this question differed by region, with respondents from Scotland (28% "comfortable" or "very comfortable") and Northern Ireland (11%) being less comfortable with data being shared between the police and the private sector than those in England (36%) and Wales (48%). While this may be due to a number of contextual factors affecting trust in police more generally, these findings highlight that public attitudes to biometrics vary between the nations, which should not be overlooked when designing new policies.

5.2 High levels of concern shown for a wide range of risks

Participants were asked how concerned they were about eight wide-ranging risks from biometric systems, such as privacy intrusion or unreliable systems. For all risks, at least 84% of participants selected that they were somewhat concerned or very concerned (see Figure 10). Those with the highest rates for "very concerned" were as follows: risks around mass collection of personal data without consent (61%), potential mistakes or invalid results being made by biometric systems (60%) and identity theft from data breaches (58%).

Figure 10: Percentage of participants who selected whether they were very concerned, somewhat concerned or not concerned with different risks from biometric systems

[Stacked bar chart omitted: levels of concern ("very concerned", "somewhat concerned", "not concerned") for eight risks: disproportionate invasion of privacy; lack of clarity over how the system works; reduced or excluded access to public services if the systems do not work well; lack of clarity over human involvement in determining decisions; unreliable technology integrated into the system; identity theft from data breaches; the system may produce wrong or invalid results; and mass collection of personal data without consent.]

5.3 A strong desire for explicit regulation and bans on certain use cases

Respondents were asked whether certain biometric applications should be unregulated, explicitly regulated, or banned. For every application, the majority responded that the use case should be explicitly regulated or banned. In terms of outright bans, the majority of respondents reported that the use of novel biometric systems in job interviews to assess performance (63%) and tracking student or employee engagement (60%) should be banned.

Figure 11: Percentage of participants who selected whether they wanted different biometric use cases to be banned, explicitly regulated (e.g. specific laws or provisions in laws) or not regulated by explicit legislation

5.4 Most people perceive benefits will outweigh concerns

Finally, when considering how biometric systems will impact society in the future, a slight majority of respondents (53%) reported that the benefits of biometrics will outweigh their concerns.
