The Future of Privacy by Design Technology
Policy implications for UK security

Sam Stockwell, Andrew Glazzard and Alexander Babuta
September 2023

Contents

About CETaS
Acknowledgements
Executive Summary
Recommendations
Introduction
The context
Methodology and structure
Definitions
1. The Current Privacy Debate
1.1 Privacy and security concepts
1.2 Capabilities and risks to capabilities
1.3 Responses to loss of capability
1.4 Governance and policy issues
1.5 The way forward
2. Future Horizons of Privacy Technology
2.1 Drivers of change shaping the future landscape
2.2 Axes of uncertainty
3. An Alternative Vision for Future PBD Technology
3.1 Privacy 2035: An alternative vision
4. Recommendations
4.1 Create an industry-wide certification scheme for future PBD technology design
4.2 Facilitate cross-sector training and secondments
4.3 Incentivise corporate social responsibility models
4.4 Strengthen international alignment on PBD principles
4.5 Develop a new model for constructive dialogue and stakeholder inclusivity in the IETF
4.6 Engage with a wider range of CSOs to broker useful dialogue
4.7 Establish privacy education and public engagement programmes
4.8 Clarify the impact of PBD technology on interdiction capabilities
5. Conclusion
About the Authors
Appendix: Methodology
Phase I: Data gathering
Phase II: Driver mapping and scenarios
Phase III: Visioning and backcasting
Phase IV: Multistakeholder engagement

About CETaS

The Centre for Emerging Technology and Security (CETaS) is a research centre based at The Alan Turing Institute, the UK's national institute for data science and artificial intelligence. The Centre's mission is to inform UK security policy through evidence-based, interdisciplinary research on emerging technology issues. Connect with CETaS at cetas.turing.ac.uk.

This research was supported by The Alan Turing Institute's Defence and Security Programme. All views expressed in this report are those of the authors, and do not necessarily represent the views of The Alan Turing Institute or any other organisation.

Acknowledgements

The authors are grateful to all those who took part in a research interview or workshop for this project, without whom the research would not have been possible. The authors are also very grateful to Dan Sexton (Internet Watch Foundation), Professor Steven Murdoch (University College London) and the anonymous reviewers for their valuable feedback on an earlier version of this report.

Executive Summary

This CETaS Research Report explores the implications of privacy-by-design (PBD) technology for UK security, and provides evidence-based recommendations for how future policy can achieve a long-term, beneficial outcome on this issue over the next 10 years. The findings have been gathered from a range of experts and organisations across academia, civil society organisations, industry, law enforcement agencies, and UK Government departments.

While there is no standardised or single definition of PBD, this report has adapted terminology from Article 25 of the General Data Protection Regulation (GDPR): the embedding of privacy and data protection considerations into the design of devices, standards and software, such as encryption, anonymisation and traffic obfuscation features.

Digital privacy is a highly contentious and divisive topic. The widespread introduction of data encryption over the last three decades has resulted in significant benefits for citizens and the state, from both a privacy and a cybersecurity perspective. From preventing malicious breaches of personal data to protecting human rights and supporting a trustworthy and secure digital ecosystem, these measures enhance the protection of individuals in multiple ways. At the same time, however, the latest encryption and anonymisation designs being embedded into user devices pose new challenges to lawful data access for detecting, mitigating and prosecuting criminal activities, from the circulation of child sexual abuse material and terrorism planning to financial fraud and cyberattacks. Consequently, the societal implications of future privacy features will be complex, and there is a need to recognise that some level of risk and trade-off will be involved in any decisions made.

The current controversy surrounding the UK's Online Safety Bill (OSB), which would require online services to scan their communication platforms for harmful content in certain circumstances, reflects a situation where constructive dialogue over specific risks and trade-offs has sometimes been lacking. On the one hand, the UK Government's proposals have been characterised by several major online service providers as significantly weakening user privacy and compromising safety. On the other hand, government officials have argued that the mass rollout of ubiquitous end-to-end encryption (E2EE) has failed to embed the safety of the public into system designs.

The research has found that an absence of clarity on the key concepts of security and privacy contributes to narrow views of public good, safety and security risks on both sides of the debate. However, such polarisation has not produced a stalemate: instead, industry is rolling out new PBD features unimpeded, while the UK Government threatens to respond with legislation to set standards for wider societal safety that, critics say, risks fatally undermining user privacy.

In parallel to these challenges, this study found that certain standards developing organisations (SDOs), which are dedicated to the vital task of ensuring the development of robust and interoperable technologies, can contribute to this polarisation. SDOs develop, review and amend Internet communications and routing standards to maintain functionality and interoperability, with some, like the Internet Engineering Task Force (IETF), playing a critical role in shaping digital privacy architecture. Nevertheless, despite being formally open and voluntary, some SDOs have been accused of marginalising input from key voices in civil society, law enforcement and central government. Other SDOs, meanwhile, are dominated by governments and tend to impose a state-centric view at the expense of non-governmental organisations.

Seeking to move beyond this divisive debate, a driver mapping exercise was undertaken to determine the drivers of change influencing the future of the privacy landscape out to 2035. A series of scenarios was generated from the interaction of different driver projections, which highlighted varying positive and negative future outcomes. Using these scenarios, and in consultation with stakeholders through workshops, the most beneficial elements from each outcome were extracted and synthesised into a vision statement. This vision reflects a multistakeholder research and development (R&D) model, where the views of various communities are taken into account before any new features are implemented. At the same time, all sides recognise more comprehensive and nuanced understandings of both privacy and security rights, helping to facilitate constructive engagement. In many ways, it is a mirror opposite of the present situation.

The recommendations summarised below provide an overview of how different sectors can work together to advance progress towards the vision outlined in this report. Across all actions highlighted, three clear objectives emerge which should be placed at the heart of future policy efforts: 1) achieving clarity over concepts; 2) ensuring inclusive stakeholder input in the development of all PBD technology; and 3) identifying areas of compromise and alignment of interests over any new digital privacy proposals.

Figure 1. Key objectives for advancing change in the PBD ecosystem

Recommendations

Recommendation: Create an industry-wide certification scheme for future PBD technology design
Relevant stakeholder(s): UK Government; Industry; Civil society; Academia
Description: The UK Government, the UK Accreditation Service (UKAS) and other relevant UK certification and testing bodies should engage with industry, civil society organisations (CSOs) and academia to develop a domestic voluntary certification scheme for PBD technology. Such a scheme could be akin to existing certification programmes such as Cyber Essentials, meaning companies that comply are publicly recognised via a kitemark. Requirements for achieving a kitemark should include third-party approved evidence of risk assessments that demonstrate how companies have thoroughly considered the risks to different human rights and reduced them to an appropriate level, taking into account competing security trade-offs and the concerns of all stakeholders. Any risk assessments should be technically specific to provide greater clarity on the benefits and risks involved.
Objective: Compromise and alignment

Recommendation: Facilitate cross-sector training and secondments
Relevant stakeholder(s): Industry; Law enforcement agencies; Civil society
Description: UK law enforcement agencies, in collaboration with industry and CSOs, should outline and formalise a secondment programme to facilitate cross-sector networking, training opportunities and trust-building between these communities. These sectors should draw from existing schemes, such as Australia's Defence Industry Secondment Program, to identify transferrable elements, such as offering projects at different security clearance requirements.
Objective: Compromise and alignment

Recommendation: Incentivise corporate social responsibility models
Relevant stakeholder(s): UK Government
Description: The UK Government, in coordination with Innovate UK, the Engineering and Physical Sciences Research Council (EPSRC), and the Science and Technology Facilities Council (STFC), should explore creating a set of grants that offer support to small and medium-sized enterprises (SMEs) on fundamental science and late-stage encryption and anonymisation features that incorporate comprehensive risk-mitigation measures at the heart of designs. It should be made clear that companies which receive funding for these designs must ensure they consistently apply human rights considerations across national markets, regardless of the political regime in question.
Objective: Compromise and alignment

Recommendation: Strengthen international alignment on PBD principles
Relevant stakeholder(s): The UK Government (and other democracies that share UK values, transparency and legal operating models)
Description: The UK Government should review and refine the existing international statement on encryption and public safety, transforming it into a policy statement of key principles on future privacy technology. This document could be inspired by DEFRA's existing Environmental principles policy statement, outlining a comprehensive concept of security and inclusive R&D processes that balance social and organisational interests. Efforts should be made to further develop these principles with like-minded democracies that share the UK's values, and through existing multinational bodies including the EU, OECD, G7 and UN.
Objective: Compromise and alignment

Recommendation: Develop a new model for constructive dialogue and stakeholder inclusivity in the IETF
Relevant stakeholder(s): Standards developing organisations
Description: The Internet Engineering Task Force (IETF) should explore creating a new public safety working group (PSWG), to help ensure that a wider range of security trade-offs is taken into account in future protocol amendments. The PSWG should be multinational and multistakeholder to increase its legitimacy.
Objective: Stakeholder inclusivity

Recommendation: Clarify UK Government positions and concerns within the IETF
Relevant stakeholder(s): UK Government; Law enforcement agencies
Description: The UK Government must make greater efforts to publicly disclose the potential impact of future privacy standards to inform standards development processes. This should include drafting unclassified impact assessments for the engineering community via existing processes, specifying the risks involved in individual proposals.
Objective: Stakeholder inclusivity

Recommendation: Engage with a wider range of CSOs to broker dialogue
Relevant stakeholder(s): UK Government
Description: The UK Government should create a list of CSOs with convening power to act as an expert bridge between industry, civil society and the UK Government on privacy issues. These efforts could also facilitate more constructive dialogue (e.g. on enterprise security issues) and expand the range of potential solutions available that may not currently be under consideration.
Objective: Stakeholder inclusivity

Recommendation: Establish privacy education programmes
Relevant stakeholder(s): UK Government; Law enforcement agencies
Description: The UK Government should coordinate with the Department for Education to facilitate programmes at different educational levels across the UK to improve youth awareness of privacy issues. Any programmes should be multistakeholder in nature, incorporating talks from not only relevant government departments but also CSOs, academics and industry representatives. Such engagement should cover: 1) the importance of adopting comprehensive approaches to security and privacy; 2) the benefits and risks arising from encryption or anonymisation features; 3) the need for inclusive R&D approaches; and 4) the stringent limitations placed on public authorities when accessing user data.
Objective: Conceptual clarity

Recommendation: Develop a public engagement strategy on PBD technology
Relevant stakeholder(s): UK Government; Civil society
Description: Leveraging actions outlined in the 2020 National Data Strategy, the UK Government should coordinate with the Office for National Statistics and commission polling organisations to conduct public perceptions research at national, regional and local levels on digital privacy topics. These initiatives should use focus groups and surveys with citizens to better understand concerns and desires around balancing the different security trade-offs involved with new PBD proposals. It is important that a diversity of communities is consulted, owing to the impact any technology could have on marginalised societal groups. This data can then be shared with industry to inform future R&D processes, such as through a Future Tech Forum event or by engaging with techUK board members.
Objective: Stakeholder inclusivity

Recommendation: Clarify the impact of PBD technology on interdiction capabilities
Relevant stakeholder(s): Industry; Law enforcement agencies
Description: All stakeholders should clarify in public discussion the differences between interdiction and investigative capabilities, and ensure that work to provide technical mitigations for loss of capability treats the two issues as separate.
Objective: Conceptual clarity

Introduction

The Alan Turing Institute's Centre for Emerging Technology and Security (CETaS) conducted an independent research study into the future implications of privacy-by-design (PBD) technology for UK security. For the purposes of this report, PBD technology is defined as the embedding of privacy and data protection considerations into the design of devices, standards and software, such as encryption, anonymisation and traffic obfuscation features. We differentiate these technologies from privacy-enhancing technologies (PETs), such as homomorphic encryption and secure multiparty computation, which can be used to facilitate privacy-preserving analysis of sensitive data.1 Our focus is primarily on privacy technology which is intended to enhance end-user confidentiality and security, but often comes at the expense of degrading digital investigative capabilities by preventing lawful access to data.

As PBD features become increasingly ubiquitous across devices, software, protocols and standards, there remains uncertainty over their long-term impact on public safety and national security. This report therefore explores the key drivers of change influencing the future PBD landscape, and what future outcomes may materialise in 10-15 years' time from different interactions between those drivers. This report aims to move beyond the current deadlock and barriers to constructive dialogue, by articulating an optimal policy position out to 2035 which attempts to accommodate the concerns of all relevant stakeholders.

The context

Public safety and national security are primary responsibilities of any government. In common with many other countries, the UK maintains special investigative capabilities to ensure its law enforcement and intelligence agencies can detect, disrupt and prosecute criminals and other adversaries who cause harm. In the pre-digital age, agencies relied on techniques such as physically intercepting communications to gather intelligence. However, the widespread availability of Internet technologies, from personal computers to smartphones, transformed the operating environment. More users (including malicious actors) used more systems for more activities, so the potential intelligence yield increased exponentially and qualitatively. At the same time, the prolific commercialisation of Internet technologies was accompanied by the availability of privacy features such as public-key cryptography, a powerful form of encryption which was invented in the mid-1970s and became widely available in the 1990s.2 These encryption features enabled the rapid development of services such as Internet banking, e-commerce and digital telephony, but they also posed challenges to law enforcement agencies, who were used to having a major technical edge.

Until recently, it appears that UK authorities were able to maintain that edge. According to Ciaran Martin, former head of the National Cyber Security Centre (NCSC), the period from 2001 (the 9/11 attacks) to 2013 (the Edward Snowden allegations against the US National Security Agency (NSA) and UK Government Communications Headquarters (GCHQ)) can be described as a 'golden age' of signals intelligence, when PBD features were only used patchily while digital data (and therefore intelligence) increased exponentially.3 Digital investigative capabilities at this time were used to counter terrorism and other security threats, but also to bear down on the spiralling problem of online child sexual abuse material (CSAM), which had become a vastly greater problem than in the pre-Internet age.4

The Snowden allegations, however, turbo-charged the deployment of privacy technology, while intensifying public debates over the acceptable limits to the state's ability to conduct digital surveillance.5 In response, over the last decade the UK and allied governments have increasingly warned that modern encryption features, where communications are only accessible to the participants of a conversation, and anonymisation features, where digital identities can be hidden from observers, pose a threat to legitimate law enforcement capabilities unless they are designed to permit, in some way, lawful access. For example, an open letter from the Home Secretary in 2022 warned industry that E2EE creates severe risks to public safety.6

1 For further discussion on the use of PETs for national security and intelligence, see: https://cetas.turing.ac.uk/publications/privacy-and-intelligence.
2 Lily Chen and Matthew Scholl, "The Cornerstone of Cybersecurity - Cryptographic Standards and a 50-Year Evolution," Cybersecurity Insights, NIST, last modified May 26, 2022, https://www.nist.gov/blogs/cybersecurity-insights/cornerstone-cybersecurity-cryptographic-standards-and-50-year-evolution.
3 Ciaran Martin, End-to-end encryption: the (fruitless?) search for a compromise (Oxford: University of Oxford), 4, https://www.bsg.ox.ac.uk/sites/default/files/2021-11/End-to-end%20Encryption%20Ciaran%20Martin%20Blavatnik%20School.pdf.
4 Klaus M. Beier, Maximilian von Heyden and Isabel Schilg, "Child sexual abuse as a global challenge," InPsych 45, no. 3 (August 2021), https://psychology.org.au/for-members/publications/inpsych/2021/august-special-issue-3/child-sexual-abuse-as-a-global-challenge.
5 Jonathan Cable, UK Public Opinion Review: Working Paper - An overview of public opinion polls since the Edward Snowden revelations in June 2013 (Cardiff University: 2015), 9-15, https://orca.cardiff.ac.uk/id/eprint/74140/1/UK%20Public%20Opinion%20Review%20180615.pdf.
6 HM Government, International statement: End-to-end encryption and public safety (Home Office: 2020), https://www.gov.uk/government/publications/international-statement-end-to-end-encryption-and-public-safety.

This was not only in terms of making legitimate law enforcement access to data more difficult, but also by preventing service providers themselves from efficiently detecting and disrupting harmful activity on platforms. The Government's position was supported by certain CSOs in areas such as child protection, who feared that progress in limiting the distribution of CSAM would be undone by the creation of digital safe spaces for abusers.7

Some of the largest platforms and service providers, supported by CSOs and academics working to defend civil liberties, have responded with alarm to the UK's proposals to ensure access to communications data. Several service providers, for example, published an open letter protesting against measures in the draft Online Safety Bill (OSB) requiring the detection and removal of harmful content. They warned that the OSB, which seeks to protect children and adults from harm when using Internet services, could break end-to-end encryption, opening the door to routine, general and indiscriminate surveillance of personal messages by empowering Ofcom (which will become the regulator of Internet services when the OSB becomes law) to force the proactive scanning of private messages on end-to-end encrypted communication services.8 Since the letter was published in April 2023, the UK Government has announced that these measures will not be enforced until the technology to do so is technically feasible.9 Nevertheless, this decision has still not been seen as adequate by privacy advocates, with some arguing that the absence of this statement in the amended version of the bill may simply lead to the issue being postponed until new technical developments make it relevant again.10

At the time of writing this report, these arguments remain polarised and trust between governments, industry and civil society appears in short supply. As reflected in the UK Government's recent statement on the OSB, a technological solution that preserves a sufficient level of user privacy while allowing lawful access, and that commands the support of a wide range of stakeholders, has yet to be found or accepted in principle. Meanwhile, with the proliferation of artificial intelligence (AI) and the Internet of Things (IoT) creating new types of data, public attitudes towards the monitoring of personal information continue to change. Over the last 10 years, the UK has witnessed declining levels of concern with data privacy, coupled with rising levels of satisfaction regarding the amount of data shared.11 At the same time, the prevalence of online crime such as fraud has risen dramatically, with cybercriminals still exploiting vulnerabilities on user devices to illegally gather sensitive data.12

An important issue here is the multitude of conflicting definitions surrounding privacy and security. Perceptions of what constitutes security in the digital realm vary significantly between stakeholder groups, with some seeing security as primarily about public safety in response to serious threats such as terrorism.13 However, security is more complex and context-specific in practice, and within the privacy debate it can incorporate everything from the confidentiality of personal data to the safety of children online.14 A context-specific policy framework is therefore required to account for different public expectations of privacy in relation to this broad spectrum of security considerations.

7 Dan Sexton, "Not all Encryption is the same: social media is not ready for End-to-End Encryption," Internet Watch Foundation, last modified March 14, 2022, https://www.iwf.org.uk/news-media/blogs/not-all-encryption-is-the-same-social-media-is-not-ready-for-end-to-end-encryption/.
8 Matthew Hodgson et al., "An open letter," WhatsApp Blog, last modified April 17, 2023, https:/
9 Emma Loffhagen and Sian Hewitt, "Online Safety Bill: government backs down from privacy battle," The Evening Standard, September 11, 2023, https:/
10 Ibid.

Methodology and structure

This study incorporated a participatory futures research design, based on the methods outlined in the UK Government's Futures Toolkit.15 A participatory futures design encourages long-term strategic thinking in the policy and strategy process.16 This study sought to address the following four research questions:

RQ1: How is emerging PBD technology (including but not limited to encryption technologies) likely to constrain future investigative capabilities?
RQ2: How can these expected constraints be mitigated to limit loss of capability? (Mitigations may be policy, legal, methodological or procedural.)
RQ3: What new challenges may emerging PBD technology present for digital content moderation?
RQ4: How can non-government actors (including but not limited to platform providers) prevent unlawful misuse of emerging PBD technology, and what future policy or regulatory interventions may be required to ensure this?

To address these questions, this study involved four phases of research, which are outlined below. A more detailed description of the methodology is provided in the Appendix.

11 Data and Marketing Association, UK Data Privacy: What the Consumer Really Thinks 2022 (December 2022), https://dma.org.uk/uploads/misc/dma-uk-data-privacy-2022.pdf, 1.
12 Charles Griffiths, "The Latest 2023 Cyber Crime Statistics," AAG IT Support, last modified July 7, 2023, https:/aag-
13 Josephine Wolff, "What we talk about when we talk about cybersecurity: security in Internet governance debates," Internet Policy Review 5, no. 3 (September 2016): 1-2, https://doi.org/10.14763/2016.3.430; Veronika Nagy, "Security: Theories," in Encyclopedia of Security and Emergency Management, ed. Lauren R. Shapiro and Marie-Helen Maras (Cham: Springer, 2020), https://doi.org/10.1007/978-3-319-69891-5_244-1, 1; S.O. Kravchenko and L.A. Tairova, "Concept Of Public Security In The Context Of Modern Society Development," Academy of Municipal Administration, 14, no. 4 (November 2014): 57, https://ideas.repec.org/a/ama/journl/v14y2014i4p57-70.html.
14 Kravchenko & Tairova (2014), 57.
15 See: HM Government, The Futures Toolkit: Tools for Futures Thinking and Foresight Across UK Government (Government Office for Science: 2017), https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/674209/futures-toolkit-edition-1.pdf.
16 Ibid, 1.

Figure 2. Overview of Study Phases: Phase 1 (Data Gathering); Phase 2 (Driver Mapping and Scenarios); Phase 3 (Visioning and Backcasting); Phase 4 (Multistakeholder Engagement)

Study phase: PHASE 1: Data gathering
Activities: Literature review of 36 sources related to new PBD developments, lawful access proposals, debates over concepts and SDO organisational processes. Interviews with 9 experts from security backgrounds to understand key concerns with the future of PBD technology.
Outputs: An Issues Paper summarising different areas of conflict within the current debate and perspectives on potential future outcomes.

Study phase: PHASE 2: Driver mapping and scenarios
Activities: Identification of 13 drivers of change shaping the future PBD landscape, categorised by the uncertainty of their future trajectory. Creation of an axis of uncertainty to identify the range of projections for the uncertain drivers.
Outputs: 3 fictitious scenarios (vignettes) to illustrate the full range of privacy and security concerns that could emerge.

Study phase: PHASE 3: Visioning and backcasting
Activities: Cross-government workshop to determine a unified UK policy position, as well as the necessary steps to achieve such an outcome.
Outputs: A set of visions for an ideal policy outcome on PBD technology from a UK Government perspective, as well as key actions required to achieve this outcome.

Study phase: PHASE 4: Multistakeholder engagement
Activities: A multistakeholder workshop with CSOs, industry, UK Government representatives, SDOs and academia to discuss key questions on achieving progress within the PBD ecosystem. Once actions for advancing these changes were identified, a prioritisation exercise identified the degree of urgency associated with each.
Outputs: A corresponding set of ideal outcomes and actions needed from the perspective of a diverse range of stakeholders.

Definitions

It is important to clearly articulate and distinguish key concepts used throughout this report to avoid creating confusion which, within the context of the contentious privacy debate, can have dangerous ramifications in misinforming perspectives.17 The table below provides a summary definition of key terms used throughout the report.

Key concept: Anonymisation
Definition: Where digital identities can be hidden from observers (e.g. service providers).

Key concept: Encryption
Definition: Where communications are only accessible to the participants of a conversation. (See the illustrative sketch after this table.)

Key concept: Investigative powers
Definition: Capabilities, including the interception of communications, equipment interference and the acquisition of communications data, which are used only for specific purposes such as national security, the detection or prevention of serious crime if there is a threat to life, or to protect the economic wellbeing of the UK.18

Key concept: Privacy-by-design
Definition: The embedding of privacy and data protection considerations into the design of devices, standards and software, such as encryption, anonymisation and traffic obfuscation features.19

Key concept: Right to privacy
Definition: The right to privacy is defined in Article 8 of the European Convention on Human Rights (ECHR) as respect for one's private and family life, his home and his correspondence. Article 8 is a qualified right, meaning it may be interfered with to protect the rights of the wider public and is subject to certain restrictions that are in accordance with law and necessary in a democratic society.20 In the Universal Declaration of Human Rights (UDHR), Article 12 states that no one shall be subjected to arbitrary interference with his privacy, family, home or correspondence, nor to attacks upon his honour and reputation. Everyone has the right to the protection of the law against such interference or attacks.

Key concept: Right to freedom of expression
Definition: Similar to the right to privacy, this is a qualified right subject to certain restrictions. It is defined in Article 10 of the ECHR as the freedom to hold opinions and to receive and impart information and ideas without interference by public authority and regardless of frontiers.21 In the UDHR, Article 19 includes the same description to articulate this right.

Key concept: Security
Definition: The ECHR imposes positive obligations on the state to protect individuals from threats to their security, including through Article 2 (the right to life) and Article 3 (the prohibition on torture and inhumane or degrading treatment). The UDHR's Article 3 also states that everyone has the right to life, liberty and security of person. It is important to recognise that these different considerations create a comprehensive, balanced view of what security means from a human rights perspective.22 In many cases, the state's positive obligations to counter such security threats may involve interfering with other qualified rights, such as the right to privacy or freedom of expression.

17 For an example, see: David Elliot and Eldon Soifer, "AI Technologies, Privacy, and Security," Frontiers in Artificial Intelligence 5 (April 2022): 1-4, https://doi.org/10.3389/frai.2022.826737.
18 "The Powers," IPCO, accessed June 25, 2023, https://www.ipco.org.uk/investigatory-powers/the-powers/.
19 Research team definition, adapted from GDPR Article 25 and refined through workshop consultations.
20 "Article 8: Respect for your private and family life," Equality and Human Rights Commission, accessed June 16, 2023, https:/ ; "Universal Declaration of Human Rights," United Nations, accessed June 16, 2023, https://www.un.org/en/about-us/universal-declaration-of-human-rights.
21 "Article 10: Freedom of expression," Equality and Human Rights Commission, accessed June 16, 2023, https:/
22 Piet Hein van Kempen, "Four Concepts of Security - A Human Rights Perspective," Human Rights Law Review 13, no. 1 (March 2013): 20-23, https://doi.org/10.1093/hrlr/ngs037.
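
As a concrete illustration of the Encryption definition above (communications accessible only to the participants), the hedged sketch below shows how two parties can derive a shared key from exchanged public keys and encrypt a message so that intermediaries carrying the ciphertext cannot read it. It is a minimal, illustrative example only, assuming Python and the third-party cryptography package; it is not any specific product's design.

```python
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.ciphers.aead import ChaCha20Poly1305
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

def derive_key(own_private, peer_public):
    # Derive a symmetric key from an X25519 key agreement between two parties.
    shared_secret = own_private.exchange(peer_public)
    return HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                info=b"illustrative e2ee demo").derive(shared_secret)

# Each participant holds a private key; only public keys are ever exchanged.
alice_private = X25519PrivateKey.generate()
bob_private = X25519PrivateKey.generate()

# Alice encrypts for Bob: anyone relaying the ciphertext sees only random-looking bytes.
key = derive_key(alice_private, bob_private.public_key())
nonce = os.urandom(12)
ciphertext = ChaCha20Poly1305(key).encrypt(nonce, b"hello Bob", None)

# Bob derives the same key from his private key and Alice's public key, then decrypts.
key_bob = derive_key(bob_private, alice_private.public_key())
assert ChaCha20Poly1305(key_bob).decrypt(nonce, ciphertext, None) == b"hello Bob"
```

A deployed messaging protocol adds much more (authentication of public keys, forward secrecy through key ratcheting, metadata handling), which is precisely where the policy trade-offs discussed in this report arise.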

1. The Current Privacy Debate

This section provides an overview of current positions on the privacy debate from various stakeholder perspectives, including: the interpretation of different human rights; the risks of PBD features to law enforcement and intelligence activities; and the nature of industry design processes for the development of new technologies, protocols and standards. Section 1.1 explores the conceptual and technical debates related to digital privacy. Section 1.2 discusses the governance and policy issues associated with PBD technology.

1.1 Privacy and security concepts

The right to privacy, meaning the right to limit who has access to our bodies, our families and our possessions (and hence information and data about us or that we possess), is expressed as a fundamental human right in Article 12 of the UDHR and Article 8 of the ECHR.23 However, in both frameworks privacy is clearly articulated as a qualified right, rather than an absolute one. This means there are circumstances in which the state can interfere with individuals' right to privacy, for instance if such interference is necessary in the interests of public safety, crime prevention or national security.24

The potential conflict between the right of privacy and the responsibilities of the state to protect the public is central to the debate over how PBD technology should be developed. Privacy is where the rights of the individual meet the authority of the state, so it is often controversial. The allegations from former NSA contractor Edward Snowden, on the scale of US and UK Government programmes of digital surveillance, added urgency to, but did not initiate, the debate over the limits of the powers of democratic governments to intrude into the private sphere. Instead, modern computer-based cryptography debates emerged in the 1970s, when technology entrepreneurs in the US began to develop powerful cryptographic products that challenged what previously had been a state monopoly.25

23 See: https://www.un.org/en/about-us/universal-declaration-of-human-rights; https://www.echr.coe.int/Documents/Convention_ENG.pdf
24 Article 12 of the UDHR prohibits arbitrary interference, while Article 8 of the ECHR prohibits interference by a public authority with the exercise of this right except such as is in accordance with the law and is necessary in a democratic society in the interests of national security, public safety or the economic wellbeing of the country, for the prevention of disorder or crime, for the protection of health or morals, or for the protection of the rights and freedoms of others.
25 Craig Jarvis, Crypto Wars: The Fight for Privacy in the Digital Age: A Political History of Digital Encryption (Florida: CRC Press).

While the technology has changed, and governments have become more open about their capabilities, the fundamentals of the debate have remained constant. On one side, privacy advocates hold that PBD technology protects the individual against a range of threats, including that of an overly powerful state exercising arbitrary power, and the consolidation of state capabilities through lawful access designs which could leave backdoors for malicious actors. On the other side, governments and some sections of civil society maintain that the state sometimes needs to intrude into the private sphere to protect the public. While these positions are not necessarily irreconcilable, what constitutes arbitrary interference and what is necessary in the interests of varying notions of security are seen differently by each side. These concepts are also sensitive to a range of contextual factors, including the changing technological landscape, the current security threat environment, and the perceived level of trustworthiness of the state agencies in question.26

As researchers working in the privacy space have highlighted, privacy should not be strictly equated to only ensuring the confidentiality of user content and metadata, but should also incorporate ethical and social values about which citizens care deeply.27 Indeed, the same encryption and anonymisation protocols that can protect the public from arbitrary data interference by third parties can also enable malicious actors to conduct their activities with less risk of detection or prevention, whether that be human trafficking or cybercrime.

26 Geoffrey S. Corn and Dru Brenner-Beck, "Going Dark: Encryption, Privacy, Liberty, and Security in the Golden Age of Surveillance," in The Cambridge Handbook of Surveillance Law, ed. David Gray and Stephen E. Henderson (Cambridge: Cambridge University Press, 2017), 330-371.
27 Claudia Peersman et al., REPHRAIN: Towards a Framework for Evaluating CSAM Prevention and Detection Tools in the Context of End-to-end encryption Environments: a Case Study (February 2023: REPHRAIN), https:/bpb-eu-
28 "Equities Process," GCHQ, accessed May 25, 2023, https://www.gchq.gov.uk/information/equities-process.

The current debate is often framed in terms of privacy versus security. However, this framing simplifies the complexities involved. A PBD enhancement could improve cybersecurity for the public while impeding lawful access to communications or limiting content moderation capabilities. Conversely, an investigative capability, such as a software vulnerability identified and exploited by a signals intelligence agency, may provide national security benefits but could also, at least theoretically, be exploited by a threat actor.28 A PBD feature may also make user devices more secure while blindsiding enterprise-level cybersecurity measures: for example, features introduced as part of the updates to Transport Layer Security (TLS) 1.3 have the potential to increase the difficulty of detecting malicious traffic for some organisations.29
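
To make the TLS 1.3 point above slightly more concrete, the hedged sketch below shows a client that refuses to negotiate anything older than TLS 1.3, using Python's standard ssl module. The host name is a placeholder; the point is only that, once such connections are the norm, the handshake no longer supports the non-forward-secret key exchanges that some enterprise inspection tools relied on for passive decryption.

```python
import socket
import ssl

HOST = "www.example.com"  # placeholder host for illustration

# Build a client context that will only accept TLS 1.3.
context = ssl.create_default_context()
context.minimum_version = ssl.TLSVersion.TLSv1_3

with socket.create_connection((HOST, 443)) as raw_sock:
    with context.wrap_socket(raw_sock, server_hostname=HOST) as tls_sock:
        # With TLS 1.3, key exchange is ephemeral (forward secret), so a passive
        # observer holding the server's long-term key cannot decrypt this session.
        print("Negotiated protocol:", tls_sock.version())  # e.g. 'TLSv1.3'
        print("Cipher suite:", tls_sock.cipher())
```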

Some experts have therefore proposed that what seems like a trade-off between privacy and security should really be conceived as complex trade-offs involving different conceptions of security: individual or collective, offensive or defensive. The Encryption Working Group (EWG), which aims to convene greater constructive dialogue with stakeholders on encryption policy, has gone further than this. They have called for an end to the false dichotomy some have drawn between "security" and "privacy".30 In its place, there must be a recognition that security in the context of the encryption debate can consist not only of national security or public safety, but of privacy itself. As such, the key is not to reject the possibility of compromise or to reduce privacy to the equivalent of confidentiality, but to understand that what is involved is determining how to weigh competing security interests.31

Nevertheless, these nuances can often be overlooked. For instance, a recent UN report exploring human rights challenges and opportunities in SDOs narrowly focuses on the security implications of encryption standards from the perspective of the individual user, disregarding the impact these have for collective security.32 It also asserts that UN Member States may not propose standards that lead to arbitrary interferences with the right to privacy, but does not clearly define what is meant by arbitrary or how this may change on a case-by-case basis.33 As such, contributions such as this only further complicate the ability of policymakers and non-specialists to distinguish between the specific trade-offs arising from new standards proposals.

29 Ian Levy, "TLS 1.3: better for individuals - harder for enterprises," NCSC, accessed May 19, 2023, https://www.ncsc.gov.uk/blog-post/tls-13-better-individuals-harder-enterprises.
30 Encryption Working Group, Moving the Encryption Policy Conversation Forward, 3, (Carnegie Endowment for International Peace: 2019), https://carnegieendowment.org/files/EWG_Encryption_Policy.pdf
31 Ibid, 4; Catarina Fontes and Christian Perrone, Ethics of surveillance: harnessing the use of live facial recognition technologies in public spaces for law enforcement, 6, (Technical University of Munich's Institute for Ethics in Artificial Intelligence: December 2021), https://ieai.mcts.tum.de/wp-content/uploads/2021/12/ResearchBrief_December_Fontes-1.pdf.
32 Office of the United Nations High Commissioner for Human Rights, Human rights and technical standard-setting processes for new and emerging digital technologies, 6, (HRC/53/42: June 2023).
33 Ibid, 7.

1.2 Capabilities and risks to capabilities

One way of looking at these issues is in terms of capabilities, or the actions that technology can enable. PBD technology enables content, and increasingly metadata, to be made unreadable by third parties (encryption), and can irreversibly conceal the identities of parties in a communication (anonymisation).34 These features can impact the following capabilities:

1) Investigative powers that enable governments, law enforcement and intelligence agencies (hereafter, agencies) to investigate and potentially disrupt adversaries through acquiring the content and/or metadata of communications.35
2) Interdiction capabilities that enable services to remove or moderate content and accounts, for violations of either the law or the platform's terms of use.36
3) Cyber defence capabilities that enable enterprises and users to protect their networks and data against malicious attack.37

The current debate over PBD technology has sometimes been confused by a conflation of these capabilities. The most controversial of these capabilities are digital investigative powers, which in the UK are governed under the Investigatory Powers Act 2016 (IPA). These powers include the interception of communications (sometimes referred to as lawful intercept or LI), the collection and retention of communications data (CD), and equipment interference (EI), by which devices can be modified using physical or software implants to obtain data.38 The UK's law enforcement and intelligence agencies, echoing similar concerns from the US government, have become increasingly concerned that PBD technology, particularly E2EE, is making interception and CD collection increasingly difficult.

34 Ales Teska, "Pseudonymization, Anonymization, Encryption. What is the difference?," TeskaLabs Blog, accessed May 31, 2023, https:/
35 HM Government, Home Office report on the operation of the Investigatory Powers Act 2016 (Home Office: 2023), https://www.gov.uk/government/publications/report-on-the-operation-of-the-investigatory-powers-act-2016/home-office-report-on-the-operation-of-the-investigatory-powers-act-2016-accessible-version#chapter-3-changing-operational-environment.
36 Lauren Dudley, "Year in Review: Content Moderation on Social Media Platforms in 2019," Council on Foreign Relations, last modified December 19, 2019, https://www.cfr.org/blog/year-review-content-moderation-social-media-platforms-2019.
37 "National Cyber Force transforms country's cyber capabilities to protect the UK," GCHQ, accessed May 31, 2023, https://www.gchq.gov.uk/news/national-cyber-force.
38 HM Government, Report on the operation of the Investigatory Powers Act 2016 (Home Office: 2023), https://www.gov.uk/government/publications/report-on-the-operation-of-the-investigatory-powers-act-2016.

Concerns over the risk of 'going dark', where agencies are largely unable to collect intelligence from electronic communications, have been raised emphatically by UK Government representatives; some suggest that the point of 'going spotty' has already been reached, i.e. that agencies can collect intelligence in some instances but not in others.39 One concern is that more targeted and less invasive investigative methods will be increasingly undermined, leading to greater reliance on less targeted powers that risk a higher degree of collateral intrusion.40

Alongside this impact on investigative capabilities, the moderation of content for safety and security reasons, such as removing and preventing the distribution of CSAM or terrorist propaganda, can also be constrained by PBD technology. CSOs such as the Internet Watch Foundation have highlighted that while E2EE and anonymisation measures protect the privacy of users during communication, the nature of their design bypasses the tools used to detect CSAM.41 Other types of problematic content which could proliferate via PBD features include disinformation, hate speech and criminal content such as human trafficking advertisements, which may never be detected or removed by service providers.42

There is a need to maintain a clear distinction between investigative and interdiction capabilities. Certain types of use case (e.g. countering CSAM) are often invoked to make the case for mitigations of PBD, while critics often argue that such mitigations could facilitate state surveillance or introduce cyber vulnerabilities that allow malicious actors to unlawfully access content. However, to conflate these capabilities is a category error. Interdiction capabilities are mostly the responsibility of platforms and services rather than governments (though they can be mandated by legislation). Additionally, the aim of interdiction, to prevent harmful content reaching vulnerable audiences, is also categorically different from that of investigation (building intelligence and securing prosecutions).

Critics of the 'going dark' argument point out that law enforcement agencies have generally been successful in conducting investigations despite advances in PBD, with one study which examined prosecutions in the Netherlands concluding that criminal usage of E2EE does not appear to have frustrated prosecutions.43

39 Interviews with government experts, 11 January 2023; 25 January 2023; 21 February 2023. See also: Ian Levy and Crispin Robinson, "Principles for a More Informed Exceptional Access Debate," Lawfare, last modified November 29, 2018, https:/
40 Corn & Brenner-Beck (2017); Jim Baker, "Rethinking Encryption," Lawfare, last modified October 22, 2019, https:/
41 Dan Sexton, "Not all Encryption is the same: social media is not ready for End-to-End Encryption," Internet Watch Foundation, last modified March 14, 2022, https://www.iwf.org.uk/news-media/blogs/not-all-encryption-is-the-same-social-media-is-not-ready-for-end-to-end-encryption/.
42 Sarvesh Mathi, "How End-To-End Encryption Impacts Human Rights? The Good And The Bad," MediaNama, April 21, 2022, https:/

Some privacy advocates have also argued that alternative methods could be used for answering investigative questions on encrypted content. This has included suggestions that targeted exploitation solutions, which focus on pre-existing technical vulnerabilities on platforms, are the only viable approach for reliable interception, as opposed to creating new weaknesses in a network or device.44 More commonly, however, these groups have argued that rather than directly analysing the content of communications (which they see as violating the core principles of encryption), analysis of CD (the who, when, where and how of a communication45) could be a less intrusive option, although this remains a matter of considerable debate.46 Such metadata is already an invaluable source for agencies, where it complements intelligence from the interception of content. However, government officials have said that it cannot replace the value of information derived from the content itself. For content moderation, the analysis of metadata alone by automated systems would be ineffective at balancing privacy concerns and user safety.47 This is due to the rates of false negatives (e.g. poor detection rates of abusive social media accounts, as low as 50% with some models) and false positives (leading to mislabelling of users who have not engaged in any malicious activity).48 Moreover, over-reliance on metadata for content moderation could lead to a chilling effect on free speech if the models falsely accuse users of engaging in malicious activity through large margins of error.49

The utility of CD as an alternative to interception is also put into question by moves to encrypt metadata as well as content. MIT researchers have designed a system called XRD (Crossroads) which would shuffle identity information across a series of servers before content reaches the intended recipient, thereby breaking the link between source and destination users.50 However, this system would provide far weaker protection than encryption does for content.51

43 Pieter Hartel and Rolf van Wegberg, "Going dark? Analysing the impact of end-to-end encryption on the outcome of Dutch criminal court cases," Crime Science 12, no. 5 (March 2023): 6-7, https://doi.org/10.1186/s40163-023-00185-4.
44 Steven Bellovin et al., "Going Bright: Wiretapping without Weakening Communications Infrastructure," IEEE Security & Privacy 11, no. 1 (Jan-Feb 2013): 71, https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=6357177.
45 HM Government, Communications Data: Code of Practice (Home Office: 2018), 8, https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/822817/Communications_Data_Code_of_Practice.pdf.
46 Seny Kamara et al., Outside Looking In: Approaches to Content Moderation in End-to-End Encrypted Systems (Center for Democracy & Technology: August 2021), https://cdt.org/wp-content/uploads/2021/08/CDT-Outside-Looking-In-Approaches-to-Content-Moderation-in-End-to-End-Encrypted-Systems-updated-20220113.pdf.
47 Ian Levy and Crispin Robinson, "Thoughts on Child Safety on Commodity Platforms," ArXiv (July 2022), 50-54, https://doi.org/10.48550/arXiv.2207.09506.
48 Ibid, 50-51.
49 Ibid, 54.
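
The shuffling idea referenced above can be illustrated with a toy sketch: a batch of messages passes through a chain of servers, each of which re-orders the batch before forwarding it, so an observer comparing traffic entering and leaving the chain cannot match senders to recipients by position. This is a conceptual illustration in Python only, not the actual XRD design, which additionally uses layered encryption and verification steps.

```python
import random

def mix_server(batch):
    # Each server shuffles the batch so output order reveals nothing about input order.
    shuffled = list(batch)
    random.shuffle(shuffled)
    return shuffled

# (sender, recipient, payload) tuples entering the first server in a known order.
batch = [("alice", "dave", "msg-1"),
         ("bob", "erin", "msg-2"),
         ("carol", "frank", "msg-3")]

# Pass the batch through a chain of independent mix servers.
for _ in range(3):
    batch = mix_server(batch)

# Messages are delivered in an order unlinkable to the order of submission;
# a real system would also strip and re-encrypt sender information at each hop.
for _, recipient, payload in batch:
    print(f"deliver {payload} to {recipient}")
```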

Similarly, while Android operating systems since 2018 have embedded metadata encryption as a standard feature on internal storage, Apple's iOS 17 enables users to conceal the fact that they are interacting with a browser, in addition to heightened protection against user identification.52 As these features become more widespread, agencies may increasingly lose access to sources of CD on which they currently rely, further pushing them towards techniques that may be more intrusive and less targeted.53

50 Rob Matheson, "Protecting sensitive metadata so it can't be used for surveillance," MIT News, last modified February 26, 2020, https://news.mit.edu/2020/protecting-sensitive-metadata-from-surveillance-0226.
51 Ibid.
52 "Metadata Encryption," Android Open Source Project, accessed May 19, 2023, https:/ ; "Apple announces powerful new privacy and security features," Apple Press Release, last modified June 5, 2023, https:/
53 Interview with government expert, 11 January 2023.
54 Encryption Working Group, Likely Future Adoption of User-Controlled Encryption (Washington DC: Carnegie Endowment for International Peace, 2019), 3, https://carnegieendowment.org/files/Encryption_Working_Group_Future_Adoption_WEB_Updated1.pdf.

UK officials interviewed for this project were clear that the 'going dark' scenario is not just a possibility, but likely to occur if planned and anticipated PBD developments go ahead without any modifications to preserve security capabilities. For instance, user-controlled encryption (UCE), where end users rather than service providers have ultimate control over the keys to encrypt and decrypt data, is likely to be deployed within the next 5-10 years with no new laws, regulations, or other mandates that limit its use.54 UCE means that end users will be the only party able to decide whether or not to cooperate with agencies, resulting in those under investigation being effectively able to deny lawful access to their communications if the encryption is correctly implemented. Additional examples of emerging PBD developments which are now posing risks to investigative capability, or are likely to do so in the near future, include the following:

PBD development: Transport Layer Security (TLS) 1.3
Summary and implications: TLS is one of the most widely used encryption protocols on the Internet and facilitates end-to-end security of data sent between applications (e.g. secure web browsing).55 The latest version of this protocol (v1.3) was developed in 2018; it eliminated obsolete cryptographic algorithms and reduces the amount of user information which is visible on a network.56 However, some of the new features introduced may undermine enterprise security. This is because v1.3 disables the passive mode decryption that many enterprises rely on to inspect data traversing their network to detect potential malware or other cyber threats.57

PBD development: Encrypted Client Hello (ECH)
Summary and implications: ECH is designed to work with TLS v1.3 to encrypt service name identities in TLS connections.58 In doing so, a range of metadata is kept secret from network observers, including the identities of data endpoints and how they use the connection.59 ECH may undermine content filtering due to disruptions to transparent proxies. These act in the same way as a standard proxy server, but do not require special client configuration and users are often not aware of their existence.60 In particular, schools could be impacted most severely, since these proxies are used to prevent children from accessing blocked websites and to safeguard them against phishing attacks.61

PBD development: DNS over HTTPS (DoH)
Summary and implications: DoH encrypts communication between a client and a DNS server, meaning that it can prevent intruders from monitoring Internet traffic. It also makes DNS queries, which are sent from a user's computer to request information, invisible to Internet service providers.62 The method was adopted by the IETF in 2018, with further encryption of Internet traffic seen as a valuable way to enhance user privacy.63 However, analysis has revealed that DoH features are being used for malicious purposes, such as malware distribution or data exfiltration, in a manner that now poses challenges for traditional cybersecurity protection tools.64 Child safety experts have also criticised the feature for undermining website blocklists that protect children from viewing harmful content.65 (An illustrative DoH query sketch follows this table.)

PBD development: Multiplexed Application Substrate over QUIC Encryption (MASQUE)
Summary and implications: MASQUE makes use of another type of Internet protocol known as QUIC (Quick UDP Internet Connection), which is a fully encrypted transport protocol, to handle connections between a user, proxy server and website.66 However, alongside this typical connection, there is an additional tunnel layer between the client and proxy server. This further guarantees confidentiality, making data exchanged between the proxy and server undetectable by network observers.67 One example of the MASQUE protocol is Apple's iCloud Private Relay. This uses two separate and encrypted Internet relays for any user traffic requests, meaning that no one, not even Apple, can identify IP addresses, location or browsing activity.68 In denying any party access to user data, this protocol has been criticised on the basis that it undermines the ability to obtain insights from Internet activity data for digital investigations.69

Figure 3. Overview of Apple's iCloud Private Relay 70

55 "TLS Basics," Internet Society, accessed May 19, 2023, https://www.Internetsociety.org/deploy360/tls/basics/.
56 "Taking Transport Layer Security (TLS) to the next level with TLS 1.3," Microsoft Security Blog, last modified August 20, 2020, https:/
57 Gigamon, "What Do You Mean TLS 1.3 Might Degrade My Security?," Gigamon White Paper (2020), https:/
58 Hartel & van Wegberg (2023).
59 Christopher Patton, "Good-bye ESNI, hello ECH!," The Cloudflare Blog, last modified December 8, 2020, https:/
60 Andrew Campling, Notes From A Roundtable Discussion About Encrypted Client Hello (ECH) (419.Consulting: August 2021).
61 Ibid.
62 Hartel & van Wegberg (2023).
63 Karel Hyenk et al., "Summary of DNS Over HTTPS Abuse," IEEE Access 10 (May 2022): 54668-54680, https://doi.org/10.1109/ACCESS.2022.3175497.
64 Ibid.
65 "Exposing child victims: The catastrophic impact of DNS-over-HTTPs," Internet Watch Foundation, last modified June 10, 2019, https://www.iwf.org.uk/news-media/blogs/exposing-child-victims-the-catastrophic-impact-of-dns-over-https/.
66 Zaheduzzaman Sarker et al., "A collaborative approach to encrypted traffic," Ericsson, last modified June 25, 2020, https:/
67 Ibid.
68 "About iCloud Private Relay," Apple, accessed May 19, 2023, https:/
69 Sophie Webster, "UK Network Operators Voice Concerns About Apple's iCloud Private Relay, Urged CMA to Regulate the Service," Tech Times, March 13, 2022, https:/
70 Apple, iCloud Private Relay Overview: Learn how Private Relay protects users' privacy on the internet (Apple: 2021), 5, https:/

Various solutions to potential loss of capability have been proposed, some of which are techniques to ensure what government calls lawful access to communications and metadata.71 This denotes a targeted government authorisation to access, with the assistance of the service provider, the data belonging to a user when needed, such as when there is a criminal investigation.72 In response to criticism that agencies risk creating security vulnerabilities through this technique by requiring access to cryptographic keys, key escrow systems place a key under a contractual arrangement with a trusted third party, which would ensure that access is strictly lawful. However, some experts view such approaches as being no more secure than simpler approaches to copying cryptographic keys, not least as key escrow databases would become attractive targets for attack by malicious actors.73 In response, more sophisticated systems have been proposed, such as secret splitting mechanisms to escrow two private decryption keys for third parties and three public keys stored on an E2EE server. This aims to ensure that no single group can retrieve the user's private keys unilaterally.74
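The underlying idea of secret splitting can be illustrated with a deliberately simplified sketch: a generic two-share XOR split, not the specific escrow design cited above. Each share on its own reveals nothing about the key, so a single escrow holder cannot recover it unilaterally; reconstruction requires both parties to cooperate.

```python
import secrets

def split_key(key: bytes):
    """Split a secret key into two shares held by different escrow parties.

    Each share is indistinguishable from random bytes on its own; only the
    XOR of both shares recovers the original key.
    """
    share_a = secrets.token_bytes(len(key))
    share_b = bytes(k ^ a for k, a in zip(key, share_a))
    return share_a, share_b

def recombine(share_a: bytes, share_b: bytes) -> bytes:
    """Recover the key, which is only possible when both parties cooperate."""
    return bytes(a ^ b for a, b in zip(share_a, share_b))

# Example: a 256-bit symmetric key split between two trusted third parties.
key = secrets.token_bytes(32)
share_a, share_b = split_key(key)
assert recombine(share_a, share_b) == key
assert share_a != key and share_b != key  # neither share alone reveals the key
```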

The first fully implemented key escrow system was introduced by US law enforcement agencies during the 1990s, known as the Clipper chipset. The NSA sought to create a secure encryption system for telephones that would still enable investigators to gain access.75 It worked by embedding a key within the microchip, an identical copy of which was available to an authorised NSA agent to decrypt the data if needed. However, it was extremely controversial for enabling intrusion into users' phone calls, and some argued that the key copies themselves were vulnerable to malicious theft or hacking that would leave confidential data exposed.76

In 2018, GCHQ publicly proposed a separate mathematical approach that could be considered in certain circumstances to enable lawful access, while seeking to preserve the integrity of encryption protocols.77 Dubbed the 'ghost protocol' by its critics, this proposed technique, whereby a service provider would add a silent third party to a communication server, was portrayed as the digital equivalent of the wiretap methods of the analogue age.78 Nevertheless, the authors advocated six principles to ensure this or any lawful access technique was legitimate and proportionate, as summarised below.79

Six principles for lawful access techniques:
1. Legal authorisation and minimal intrusion
2. Continued development of investigative methods to keep pace with technology
3. A recognition that complete access to a device is unreasonable or unrealistic
4. Limits on government access to the collected data
5. Maintaining trust between service providers and users
6. Maximising transparency

71 Ian Levy and Crispin Robinson, "Principles for a More Informed Lawful Access Debate," Lawfare, last modified November 29, 2018.
72 Levy & Robinson (2018).
73 Dennis Fisher, "Key Escrow By Any Other Name is Still Key Escrow," Decipher, last modified April 27, 2018.
74 Pooja Bharadwaj, Harshita Pal and Bhawna Narwal, "Proposing a Key Escrow Mechanism for Real-Time access to End-to-End encryption systems in the Interest of Law Enforcement," in 2018 3rd International Conference on Contemporary Computing and Informatics (IC3I) (Gurgaon: IEEE, 2020), https://doi.org/10.1109/IC3I44769.2018.9007301.
75 Shaun Nichols, "Remember the Clipper chip? NSA's botched backdoor-for-Feds from 1993 still influences today's encryption debates," The Register, January 27, 2020.
76 Ibid.

Although the principles of this proposal were well received, it was criticised by academics and sections of civil society as both a fundamental challenge to the right of privacy and a potential security risk, effectively just another kind of backdoor vulnerability.80

Other proposed solutions have also been controversial. Mobile device forensic tools which enable the extraction of data from devices have been used by UK agencies in criminal investigations, but require physical access to the device. At least one third-party service provider used by UK agencies to manage mobile forensic access has also been subject to a large-scale cyberattack, illustrating the dangers of outsourcing any data collection process.81

77 Levy & Robinson (2018).
78 Kieren McCarthy, "GCHQ pushes for virtual crocodile clips on chat apps - the ability to silently slip into private encrypted comms," The Register, November 29, 2018.
79 Levy & Robinson (2018).
80 Mayank Varia, "A Roadmap for Lawful Access Research," Lawfare, last modified December 5, 2018.
81 Siddarth Venkataramakrishnan, "UK police and other investigators spend 4m on phone hacking software," Financial Times, November 9, 2020; "Source Leaks 4TB of Cellebrite Data After Cyberattack," Hack Read, August 5, 2022.

The potential risk of losing interdiction capabilities (such as content removal) has also generated technological solutions. In theory, these may be more straightforward to develop and implement, as they do not raise the same privacy issues as investigative capabilities, and the mathematical challenges (while significant) may be more tractable. For example, advanced computational techniques such as secure multi-party computation, perceptual matching and homomorphic encryption could all enable the detection and removal of illegal content without decrypting the content itself.82 However, in practice, proposed solutions have also proved controversial. Client-side scanning (CSS), which involves automated analysis of content on user devices, has been proposed as a content moderation technique that does not compromise E2EE. However, Apple's plan to introduce CSS in 2021 to detect CSAM was abandoned after widespread criticism that it infringed privacy and presented unacceptable security risks.
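As a purely illustrative aid, the sketch below shows the general shape of perceptual matching as invoked in client-side scanning proposals: a compact fingerprint of an image is compared against fingerprints of known illegal material, without the underlying images ever leaving the device. It is a deliberately simplified average-hash scheme, not Apple's system or any deployed tool; the blocklist value is made up, and it assumes the third-party Pillow imaging library is installed.

```python
from PIL import Image  # Pillow imaging library (assumed installed)

def average_hash(path: str, size: int = 8) -> int:
    """64-bit perceptual hash: greyscale, shrink to 8x8, threshold each pixel at the mean."""
    pixels = list(Image.open(path).convert("L").resize((size, size)).getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for value in pixels:
        bits = (bits << 1) | (1 if value >= mean else 0)
    return bits

def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

# Hypothetical blocklist of hashes of known illegal images; in schemes of this
# kind the device only receives these opaque values, never the images themselves.
BLOCKLIST = {0x9F3A6C21D4E0B758}
THRESHOLD = 5  # a small distance tolerates re-encoding, resizing or minor edits

def matches_blocklist(path: str) -> bool:
    candidate = average_hash(path)
    return any(hamming_distance(candidate, known) <= THRESHOLD for known in BLOCKLIST)
```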

In summary, although various technical solutions have been proposed to counter the risk of going dark, specific details on how they work (which can determine feasibility) have often never been disclosed due to their classified nature.83 Some of these proposals do risk compromising device security for all users, in order to enable lawful access to the data of an extremely small minority. As such, rather than attempting to develop technical solutions to respond to PBD developments as they arise, a more constructive approach would involve directly engaging with those creating privacy-focused protocols and services. Collaboration could help to embed innovative approaches that ensure investigative capability is maintained where it is legitimately necessary, while ascertaining that the user security risks created are proportionate to the benefits that are gained. This is the focus of the following section.

1.4 Governance and policy issues

The polarised nature of the current debate obstructs the search for workable solutions to specific challenges relating to PBD technology. UK Government officials interviewed for this research consistently argued for a multistakeholder approach to ensure that policy and technical solutions recognise the range and variety of interests.84 These include:

- Human rights defenders, whose role is to ensure that states do not overreach, such as in repressive regimes.
- Industry, which provides products and services embedded with PBD features.
- CSOs campaigning on issues such as child protection, which are eager to ensure that PBD technology does not inadvertently enable exploitation of the vulnerable.
- Academic experts, who can provide informed judgements on the technical feasibility of proposals and necessary trade-offs.

82 For a review of the technical literature, see Sarah Scheffler and Jonathan Mayer, "SoK: Content Moderation for End-to-End Encryption," Proceedings on Privacy Enhancing Technologies, no. 2 (2023): 403-429, https://doi.org/10.56553/popets-2023-0060.
83 Interview with government expert, 21 February 2023.
84 CETaS workshop, 20 April 2023; Interview with government experts, 11 January 2023; 25 January 2023.

By working in isolation and insisting that other stakeholders are misguided or acting in bad faith, different sides can fail to understand the needs, concerns and activities of others. There is an urgent requirement for governments to actively collaborate with civil society groups and industry, and vice versa for non-governmental stakeholders.

From a government perspective, design of PBD technology appears largely unilateral, with industry (supported by sections of academia and civil society) enabling stronger encryption and anonymisation measures but with few considerations of the risks to collective security and public safety. These privacy features now form a central pillar of companies' marketing strategies, as indicated by recent advertising campaigns from Apple and WhatsApp (among others) emphasising the additional layers of privacy their services provide.85 The majority of these processes, from the proof-of-concept stage to prototype testing, are conducted in-house by corporations, where products are designed for, rather than with, users.86 As a result, assumptions are made about what individuals desire from these systems, without allowing people to engage with them flexibly or customise features after products are launched.87

In parallel to the implementation of proprietary PBD software and hardware, more fundamental decisions about Internet and communications security are indirectly made by Internet governance forums, which convene industry representatives and service providers to implement changes.88 This includes SDOs like the IETF, the Institute of Electrical and Electronics Engineers Standards Association (IEEE SA), and the World Wide Web Consortium (W3C). These bodies have different responsibilities, constitutions and working practices, and some have more relevance and authority than others. For PBD technology, officials interviewed judged the IETF to be the most relevant, where experts propose enhancements to the Internet's most fundamental encryption protocols and embody the principles of embedding privacy at the heart of all decisions, even at the expense of lawful access.89

85 See "Message Privately," WhatsApp, last accessed August 1, 2023.
86 Felix Chang, "To Build More-Inclusive Technology, Change Your Design Process," Harvard Business Review, October 19, 2020, https://hbr.org/2020/10/to-build-more-inclusive-technology-change-your-design-process.
87 Ibid.
88 Corinne Cath, Loud Men Talking Loudly: Exclusionary Cultures of Internet Governance (Critical Infrastructure Lab: April 2023), 1-2.
89 Interviews with government experts, 11 January 2023; 25 January 2023; 21 February 2023.

This includes working groups that focus on specific aspects of user privacy (e.g. TLS 1.3, ECH and MASQUE), while other protocols, such as the Signal Protocol that facilitates E2EE, are proprietary and lie outside the remit of the IETF.90

Formally, the IETF is accessible to all stakeholders, enabling anyone to get involved in designing standards regardless of expertise or affiliation. This is unlike other SDOs that restrict membership to fee-payers or certain groups.91 Internal groups also seek to increase diversity and inclusion, such as the Education and Outreach Directorate (EODIR) that offers mentoring support.92 However, although such principles are supposed to be translated into practice, recent analysis of the organisation has revealed a disconnect. For instance, despite the open meetings and working groups, the IETF is dominated by industry experts from North America, with limited government representation and CSOs rarely present.93 Research has also suggested that there is an unspoken but clear cultural incentive to dismiss political debates within groups developing technical specifications, undermining considerations of the impact that new standards may have on public safety.94 In combination with low diversity along national and ethnic lines, these dynamics contribute to a predisposition within the IETF to focus on a narrow set of political questions (e.g. government surveillance), rather than wider concerns held across society (e.g. personal data scraping by major technology companies).95

Notably, the IETF also includes no organisation-wide forum for exchanging views and technical best practice outside of individual working groups.96 The specific nature of these groups' objectives means that there is limited scope for higher-level discussion on issues of wide strategic relevance across all IETF activities, such as the collective security impacts of newly proposed standards. This is in contrast to other Internet governance forums, such as the Internet Corporation for Assigned Names and Numbers (ICANN). While not an SDO, ICANN is responsible for maintaining several key databases of the Internet which ensure its stable and secure operation. In 2015, ICANN established a Working Group on Public Safety (PSWG), whose focus is those aspects of ICANN's policies and procedures that implicate the safety of the public.97 Nevertheless, the PSWG covers a much wider scope of public safety concerns aside from strictly national security. This includes cybercrime efforts which may have consequences for industry, such as distributed denial-of-service (DDoS) attacks, DNS abuse or botnets.98

90 "Active IETF working groups," IETF, accessed July 21, 2023, https://datatracker.ietf.org/wg/.
91 Mallory Knodel, Joey Salazar and Mehwish Ansari, A Guide to the Internet Engineering Task Force (IETF) for Public Interest Advocates (Center for Democracy & Technology: January 2023), 4, https://cdt.org/wp-content/uploads/2023/02/Art19-Guide-to-the-IETF-2023-03-21.pdf; Cath (2023), 4-5.
92 "Diversity and Inclusion," IETF, accessed June 12, 2023, https://www.ietf.org/diversity/; "Education and Outreach Directorate," IETF, accessed July 20, 2023, https://wiki.ietf.org/en/group/eodir.
93 Ibid, 5; Cath (2023), 7.
94 Cath (2023), 12.
95 Ibid, 7; Knodel et al. (2023), 9-10.
96 Industry participant at CETaS workshop, 10 May 2023.
97 ICANN Public Safety Working Group, "GAC PSWG Terms of Reference," accessed August 1, 2023, https://gac.icann.org/working-group/public/gac-pswg-terms-of-reference-gac-website-main.
98 ICANN Public Safety Working Group, "Work Plan 2018-2019 Final Draft for GAC Endorsement," last modified February 27, 2018, https://gac.icann.org/file-asset/public/pswg-workplan-2018-2019-endorsed-14mar18.pdf.

Alongside the IETF and ICANN, it is important to highlight other prominent SDOs and Internet governance forums that may influence the standards landscape in different ways:

- Internet Research Task Force (IRTF): Unlike the IETF, which focuses on engineering and standards-making, the IRTF is dedicated to longer-term research issues related to the Internet.99 Similar to the IETF's Working Groups, the IRTF has 16 Research Groups which possess the same degree of authority for the publication of new research on topics ranging from privacy enhancement to quantum technology.100 In addition, the IRTF's membership is open to the public, and is likely more accessible to those from a non-technical background.
- Internet Society (ISOC): ISOC acts as an overarching organisation that helps to fund and support the activities of SDOs, such as the IETF, as well as offering education to improve awareness of relevant issues.101 This includes cyber incident responses and trust in the IoT.102
- Internet Architecture Board (IAB): The IAB is both a committee of the IETF and an advisory body of ISOC. Within this capacity, it is primarily responsible for reviewing architectural elements of Internet protocols, acting as an appeals board for complaints over Internet standards and publishing the IETF's work.103 In contrast to both the IETF and ISOC, the IAB is not an open organisation.104 The IAB also holds an important function in convening workshops of specialists and initiating programmes.
- European Telecommunications Standards Institute (ETSI): ETSI is one of only three organisations recognised as developing European Standards that must be adapted into a national standard by all EU Member States.105 Specifically, ETSI deals with telecommunications and broadcasting networks and services. While membership is fee-based, a range of organisations from companies, research organisations, academia and governments can join.
- World Wide Web Consortium (W3C): The W3C develops standards and guidelines for the World Wide Web (WWW). This is distinct from the type of Internet standards created by the IETF, in that the WWW is merely one of the many ways data is shared in the broader Internet ecosystem.106 Membership of the W3C is primarily based on organisational involvement.107
- 3rd Generation Partnership Project (3GPP): 3GPP consists of seven telecommunications SDOs (including ETSI). Membership is constrained to individuals with ties to one of the partnership organisations, who send members to work on the specifications.108 These specifications cover cellular telecommunications, including radio access, as well as core network capabilities.

99 "Internet Research Task Force," IRTF, accessed May 22, 2023, https://irtf.org/.
100 Ibid.
101 Vint Cerf, "IETF and the Internet Society," July 18, 1995, https://www.internetsociety.org/internet/history-of-the-internet/ietf-internet-society/.
102 "OTA and ISOC Combine Resources to Enhance Online Trust, Security and Privacy," ISOC, last modified April 5, 2017, https://www.internetsociety.org/news/press-releases/2017/ota-and-isoc-combine-resources-to-enhance-online-trust-security-and-privacy/.
103 "Internet Architecture Board," IETF, accessed May 22, 2023, https://www.ietf.org/about/groups/iab/.
104 "Description," IAB, accessed May 22, 2023, https://www.iab.org/about/description/.
105 "About Us," ETSI, accessed July 20, 2023, https://www.etsi.org/about/about-us.
106 "Help and FAQ," W3C, accessed May 22, 2023, https://www.w3.org/Help/#activity.
107 "Membership FAQ," W3C, accessed May 22, 2023, https://www.w3.org/Consortium/membership-faq#who.
108 "How We Work," 3GPP, last modified July 20, 2023, https://www.3gpp.org/get-involved/how-we-work.

The influence of SDOs over the standards that set the limits of privacy and security on the Internet raises important questions for democratic accountability. This research has shown that UK officials are concerned that the power of the technology industry, exercised through SDOs as well as by virtue of the market dominance of a handful of global corporations, is increasingly limiting the powers conferred through legislation and exercised by democratically-accountable institutions.109 However, as SDOs are at least in theory global bodies, simply increasing the role of governments within them does not necessarily enhance democratic legitimacy, given the determination of authoritarian nations to use technology to increase surveillance and control over their populations. In the United Nations International Telecommunication Union (ITU), which develops standards on various aspects of telecommunications networks, China submitted 830 draft standards related to wired communications in 2019 alone, more than the next three countries (including the US) combined.110 These standards have primarily been geared towards creating an environment where state authorities have exclusive access to highly intrusive surveillance powers.111

109 CETaS workshop, 20 April 2023.
110 Brett D. Schaefer and Danielle Pletka, Countering China's Growing Influence at the International Telecommunication Union (The Heritage Foundation: 2022), https://www.heritage.org/sites/default/files/2022-03/BG3689.pdf.
111 Eva Xiao, "China Passes One of the World's Strictest Data-Privacy Laws," Wall Street Journal, August 20, 2021.

1.5 The way forward

Significant concerns remain not only over how future PBD technology should be developed (and by whom), but also about the relationship between privacy and security in society. The current situation is characterised by a severe lack of trust and collaboration, with industry continuing to roll out PBD technology to customers, and governments insisting that this cannot come at the expense of security capabilities. In the UK, legislation intended to protect vulnerable users, such as children, has widened in scope to the point that industry and academia now consider it a risk to privacy rights.112 As a result, a relationship between Western governments and communication service providers that was initially undermined by the Snowden allegations has only become more adversarial in nature. Several ingenious technical solutions have been proposed, but none so far address the concerns of companies and civil liberty advocates. In some cases, objections to lawful access may be based on principle rather than the practical limitations of each proposed solution.113 Although it is rarely stated this bluntly, this implies that some believe that governments simply should not have the power to intrude into the private sphere.114 For this reason, some experts believe that a compromise between privacy advocates and governments may not be possible, although the search for technical solutions that can balance privacy and different conceptions of security should continue.115

The EWG have sought to point out that progress may lie not in seeking widespread agreement on technical outcomes, but in the way processes to design this technology are structured.116 For instance, they highlight that absolutist positions should be avoided, with different stakeholders needing to recognise that this is an area where every concern can never be addressed perfectly and compromise will be required.117 Moreover, they criticise the tendency for a complex issue such as encryption to be over-simplified in public discourse, instead advocating that it be broken down into the specific aspects of data that a proposal could implicate, which could make it easier for risks, benefits, trade-offs and options to be identified.118

In light of the current impasse, the following sections of this report are designed to encourage relevant stakeholders operating in the privacy landscape to look forward to the next 10-15 years and set aside any pre-conceptions, taking a strategic long-term view over how different concerns and needs on privacy technology can be reconciled.

112 Hodgson et al. (2023).
113 Interview with government expert, 24 February 2023.
114 CETaS workshop, 20 April 2023.
115 Martin (2021), 11-12; Interview with government experts, 11 January 2023; 24 February 2023.
116 EWG, Moving the Encryption Policy Conversation Forward (2019), 3.
117 Ibid, 4.
118 Ibid, 5.

2. Future Horizons of Privacy Technology

The research team identified the drivers of change shaping the future PBD landscape from both a technical and policy perspective, then systematically analysed these drivers to develop several potential scenarios (vignettes) of what the future could look like depending on how these drivers develop and interact. Section 2.1 summarises the drivers of change considered influential in determining the future PBD policy context, and Section 2.2 presents the axes of uncertainty demonstrating the degree of certainty or uncertainty associated with each driver. The scenario vignettes are not included in full, but rather were used as the basis for the vision for 2035 presented in Section 3.1.

2.1 Drivers of change shaping the future landscape

The following drivers of change were identified based on systematic analysis of primary research data collected for this project, as detailed in the Methodology section. The first 6 drivers are assessed as certain in relation to the next 12 years. This means that the research team has concluded that these factors, dynamics and trends will continue on their current trajectory. These certain drivers are as follows.

Public Safety Pressures on Law Enforcement and Intelligence Agencies: Agencies continue to face significant pressure to use all measures at their disposal to protect citizens from harm. This pressure will push governments towards needing to find new solutions to the challenges posed by PBD technology, given the low tolerance for risk in public perceptions of security. While opinion polls conducted in 2021 show that the UK public has greater trust in the intelligence community than in politicians,119 these agencies face continuous pressure to prevent future threats. Even one successful attack could be perceived as an example of intelligence failure, and tolerance for error is extremely low, as reflected in the aftermath of the 2017 Manchester Arena bombing.120

Perceived Risk of Going Dark: As has been explored in the literature,121 the threat that PBD technology poses to existing intelligence capabilities will make it harder to identify investigative targets and cause increasing reliance on methods that could be more intrusive, such as those involving bulk powers.122 This has potential disadvantages, such as the creation or exploitation of technical vulnerabilities that weaken cybersecurity.123 Increasing availability and use of PBD technology will continue to restrict the array of low-intrusion methods available, leaving agencies with no choice but to resort to more intrusive methods.

Standards Developing Organisations and Internet Governance Forums as an Arena of Geopolitical Competition: SDOs and wider Internet governance forums are increasingly becoming an arena for competition between nation states, and will be important in shaping the future of global Internet governance.124 Authoritarian regimes have invested significant resource in influencing privacy debates within some SDOs in a direction that provides them with greater control over citizen data, enabling them to further repress their populations.125 For instance, China submitted 830 ITU standards in 2019, which was more than the next three countries combined (including the US).126 By contrast, in other SDOs, the UK and other democracies have struggled to successfully raise the case for public safety perspectives relating to new privacy standards.

Supranational Nature of the Technology Industry: The most powerful technology companies often have headquarters based outside the UK amidst their increasing engagement in market activities that span multiple jurisdictions.127 The limited extent to which they can be held accountable through domestic legislation enables them to evade financial penalties and accumulate greater economic influence unimpeded.128

Commercialisation of Privacy Features: Technology companies are increasingly using privacy features as a central component of their marketing campaigns for new products and services. At the same time, privacy features offer companies a competitive advantage as they deprive competitors of valuable data sources.129 Technology providers therefore have a commercial interest in maximising data privacy, but in a way that does not necessarily incorporate public safety considerations.130

Authoritarian Nation-State Market Influence: Authoritarian states constitute substantial and growing markets for technology. While the US is considering the mandatory review of certain outbound investments into foreign countries based on concerns of authoritarian state involvement, industry will also be keen to maintain variable geometry in the privacy capabilities it makes available to nations.131 For instance, Apple's iCloud Private Relay was not implemented in ten countries, including Russia and China, purportedly for regulatory reasons.132

119 Dan Lomas, "More Open to Stay Secret: UK Intelligence Agency Openness and the Public," RUSI Commentary, December 7, 2021, https://rusi.org/explore-our-research/publications/commentary/more-open-stay-secret-uk-intelligence-agency-openness-and-public; Julian O'Neill & Niall Glynn, "Northern Ireland terrorism threat level rises," BBC News, March 28, 2023, https://www.bbc.co.uk/news/uk-northern-ireland-65096493.
120 John Saunders, Manchester Arena Inquiry Volume 3: Radicalisation and Preventability (Home Office: March 2023), https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/1139712/MAI_Final_PDF_Volume_3.pdf, 105.
121 See sub-section on Capabilities and risks to capabilities in Section 1 of this report.
122 Interview with government experts, 11 January 2023; 25 January 2023. It is important to note that the categories of investigative powers are highly context-specific and it is difficult to generalise their degrees of intrusion without the use case in question.
123 Interview with government expert, 11 January 2023; Baker (2019).
124 Interview with government experts, 11 January 2023; 25 January 2023; 21 February 2023; Sneha Dawda, "Competing for the Middle Ground in Internet Governance," RUSI Commentary, June 22, 2022, https://www.rusi.org/explore-our-research/publications/commentary/competing-middle-ground-internet-governance.
125 Interview with government expert, 11 January 2023; Schaefer & Pletka (2022).
126 Schaefer & Pletka (2022).
127 Henry Farrell and Abraham L. Newman, Of Privacy and Power: The Transatlantic Struggle over Freedom and Security (Princeton: Princeton University Press, 2019), 6.
128 Interview with government expert, 11 January 2023; 25 January 2023; 24 February 2023; 2 March 2023.
129 Rick Braddock, "How Big Tech uses data privacy concerns for market dominance," VentureBeat, last modified April 18, 2022.
130 Interview with government expert, 21 February 2023.
131 Emily Benson and Margot Putnam, "The United States Prepares to Screen Outbound Investment," Center for Strategic and International Studies (CSIS), last modified April 27, 2023, https://www.csis.org/analysis/united-states-prepares-screen-outbound-investment.
132 Stephen Nellis and Paresh Dave, "Apple's new private relay feature will not be available in China," Reuters, June 8, 2021.

The above drivers of change are expected to remain consistent, meaning future policy approaches should be developed in such a way that accounts for the reality of these factors, rather than attempting to change or reverse them. By contrast, the following 7 drivers of change are considered uncertain in relation to the next 12 years. This means that it is unclear whether these trends will continue on their current trajectory. They should be considered as fluctuating and context-specific factors, meaning these are the areas where medium-term policy interventions may have the greatest chance of success in influencing the future landscape.

Digital Literacy and Awareness of Privacy Issues among the UK Public: The level of digital awareness and perceptions of cyber vulnerabilities across the general public is likely to depend on several factors, such as the pace of new privacy developments and government initiatives to encourage digital hygiene education in schools.133 Public proficiency in these issues will also influence other drivers, such as perceptions on the relationship between privacy and security rights, as well as attitudes towards proposals for lawful access.134

Privacy Technology Design Processes: At one end of the spectrum, the design processes for PBD features can occur without compromise or a comprehensive exploration of the different trade-offs involved.135 This is often based on a belief that, when weighing up different human rights implications, user privacy cannot be compromised regardless of arguments for other security concerns.136 As such, some influential voices remain opposed to any form of digital intrusion by the state. At the other end of this spectrum, there are actors who accept that the state may have data access in certain situations and that trade-offs are achievable, but believe that the security of systems should not be compromised to facilitate such access.

Legal Frameworks and Jurisdictions: The degree to which new legal frameworks or regulations will have the necessary coverage and enforcement authority to strengthen the integration of collective security concerns related to privacy technology is uncertain. Market activities often span multiple jurisdictions, which results in overlapping regulatory claims made by different sources of authority and the emergence of cross-national tensions.137 The UK has historically lagged behind in the enforcement of other regulatory interventions related to technology companies, such as in antitrust and competition law, despite current legislative efforts to implement the OSB.138

133 CETaS workshop, 20 April 2023.
134 CETaS workshop, 20 April 2023.
135 Interview with government expert, 21 February 2023.
136 Civil society organisation participant at CETaS workshop, 10 May 2023.
137 Farrell & Newman (2019), 6.
138 Tim Cowen, Closing the Enforcement Gap: Proposals for reform and increasing the speed of enforcement action in Big Tech competition cases (ResPublica: May 2022), 3-4, https://www.respublica.org.uk/wp-content/uploads/2022/05/Closing-the-Enforcement-Gap.pdf.

Dynamics of Relations between the Technology Industry and Government: Despite current tensions between industry and government over the issue of privacy technology, these dynamics may improve in the future. This may range from reduced corporate power forcing industry to pursue more cordial relations, to industry recognising the benefits of weighing privacy and public safety considerations with greater balance in new technology, which may encourage proactive engagement with government.139

Public Trust of UK Government Institutions: The level of public trust in government institutions is likely to be determined by factors outside the privacy debate. This will, however, impact the level of support for any proposed expansion of government and law enforcement powers.

Collective Power of the Technology Industry: There is no guarantee that the combined power of the technology industry will continue to grow. Indeed, any sudden changes in market conditions or new legislation may significantly weaken the influence of the sector.140

Pace of Privacy Technology Development: The pace of innovation regarding PBD technology may not continue to rapidly increase. Indeed, there may be a saturation point where industry is unable to advance further on new privacy features or faces a significant time gap before they can be developed.

2.2 Axes of uncertainty

As the previous 7 drivers of change, which are characterised by uncertain trajectories, may have a range of plausible outcomes, the research team determined the most extreme ends of the spectrum at which they could materialise. This was achieved by creating an axis of uncertainty indicating the opposite trajectories for each of these drivers.141 These are provided below in the form of scales.142

- Digital Literacy and Awareness of Privacy Issues among the UK Public: from high digital literacy and awareness to low digital literacy and awareness.
- Privacy Technology Design Processes: from acceptable compromise on designs to no compromise on designs.
- Legal Frameworks and Jurisdictions: from stringent regulation to self-regulation.
- Dynamics of Relations between the Technology Industry and Government: from collaborative to adversarial.
- Public Trust of UK Government Institutions: from full trust to low trust.
- Collective Power of the Technology Industry: from progressively weakened to more powerful than governments.
- Pace of Privacy Technology Development: from plateaued to unbridled.

139 CETaS workshop, 20 April 2023.
140 CETaS workshop, 20 April 2023.
141 See: HM Government, The Futures Toolkit: Tools for Futures Thinking and Foresight Across UK Government (Government Office for Science: 2017), 46-48, https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/674209/futures-toolkit-edition-1.pdf.
142 It is important to note that no values of preference have been attached to either side of the extreme trajectories.

3. An Alternative Vision for Future PBD Technology

The previous section outlined the different pathways which the PBD ecosystem could follow in the next ten years and beyond, highlighting driver projections that require some form of proactive policy action. Based on systematic analysis of these drivers, this section now explores an alternative vision for 2035. In outlining a series of beneficial goals related to the drivers of change, it seeks to achieve a truly multistakeholder approach that integrates a diverse range of perspectives and an acceptable balance between the different trade-off concerns held by government, academia, civil society, industry and law enforcement. The vision presented here is based on the full range of primary research data collected for this project, including from expert interviews, as well as the multistakeholder workshops with civil society organisations, academia, industry, SDOs and policymakers from across UK national security and central policy functions. It is an idealised and optimistic vision, and is intended to provide a tangible end-goal towards which to direct future policy efforts.

3.1 Privacy 2035: An alternative vision

In 2035, new privacy technology is designed and implemented in a way that takes into account a diverse range of stakeholder perspectives, including academia, civil society organisations, government and law enforcement agencies. Industry bodies are encouraged to integrate this approach through a combination of incentives and penalties. On one hand, companies which demonstrate that the impact of PBD proposals on different human rights and the security trade-offs involved have reached an accepted compromise between relevant stakeholders receive financial benefits. This includes R&D funding and grants for the implementation of these programmes. On the other hand, legal obligations are now more clearly understood and consistently enforced in relation to their interaction with new privacy designs. Companies that implement PBD technology preventing them from effectively eliminating illegal content on their platforms receive financial penalties through regulatory enforcement action. Consumers are given the choice to decide what degree of privacy features they would like to enable on a device, such as not embedding children's devices with end-to-end encryption to facilitate child safety monitoring.

At a conceptual level, the debate has also moved on from one that sees the rights of privacy and security as being mutually exclusive and in tension. By 2035, a more diverse range of voices and positions is considered in discussions. There is now recognition among all stakeholders that the scope of both privacy and security should be expanded to incorporate different contexts and perspectives, such as cyber-based security, ethical values and transparency. Rather than treating a topic as complex as encryption as a singular issue, discussions now occur on a case-by-case basis and target specific elements of the debate (e.g. proposals exploring encrypted data at rest vs. data in transit), providing greater clarity on the opportunities and risks involved.

Many of the aforementioned changes have been consolidated by government-led public education strategies. In 2035, citizens are far more aware of the scale and ways in which their personal data is used by companies, leading to criticism over the lack of civil society representation in technical design processes. There has also been a cultural shift away from a viewpoint which emphasises the superiority of individual autonomy and towards one that understands the need to take a more compromising approach to the trade-offs between competing security interests involved in the implementation of new privacy features.

Public trust in both central government and investigative agencies is also at greater levels than it was ten years ago. While there remains a healthy level of scrutiny, more open dialogue from government on the importance of lawful access and the targeted, highly regulated nature of investigative activity has removed much of the previous mistrust over national security efforts in this space. Nevertheless, agencies have also appreciated that, in gaining greater public consent over the use of investigative capabilities, any new methods introduced also pose an increased security risk of enabling unlawful access through the exploitation of cyber vulnerabilities. They also recognise that although certain human rights will be better protected, such as those pertaining to public safety, others, including that of privacy, may be impacted. Given the previous challenges associated with various lawful access proposals, agencies now work closely with experts in civil society, academia and industry to reduce the technical complexity and possible legal infringements of their designs, including by sharing unclassified assessments of the potential investigative impacts of new PBD proposals.

On the international stage, progress over the last decade has seen the creation of a voluntary international policy on privacy technology between the UK, like-minded democracies including the US and EU, and key powers in the Global South, such as India and South Africa. Principles within this agreement standardise jurisdictions between governments and industry to reduce the barriers to constructive engagement on the development of PBD designs, while promoting human-centric approaches which ensure that different human rights implications are fully accounted for. Alongside fostering greater alignment on key positions, multistakeholder coordination has also resulted in significant changes to the dynamics of SDOs. Previous processes for the development of technical specifications, which were primarily based on the level of resources stakeholders possessed, were recognised as being unfair and marginalising wider input from groups with opposing views. Since then, public safety working groups have been integrated into these organisations by a UK-led coalition, leading to greater awareness and compromise over the competing security risks emerging from new standards. This is seen as beneficial to improving the robustness and long-term viability of the Internet's protocols.

Figure 4. Privacy feature design processes in 2035
