
From privacy to partnership
The role of privacy enhancing technologies in data governance and collaborative analysis

Issued: January 2023 DES7924
ISBN: 978-1-78252-627-8
The Royal Society

The text of this work is licensed under the terms of the Creative Commons Attribution License which permits unrestricted use, provided the original author and source are credited. The license is available at: creativecommons.org/licenses/by/4.0. Images are not covered by this license.

This report can be viewed online at: royalsociety.org/privacy-enhancing-technologies

Cover image: Visualisation of the Internet 1997–2021, by Barrett Lyon as part of the Opte Project. Barrett Lyon / The Opte Project.

Contents

Foreword
Executive summary
  Scope
  Methodology
  Key findings
  Recommendations
Introduction
  Background
  Key terms and definitions
Chapter one: The role of technology in privacy-preserving data flows
  Data privacy, data protection and information security
  What are privacy enhancing technologies (PETs)?
  A downstream harms-based approach: Taxonomy of harms
  Recent international developments in PETs
  Interest in PETs for international data transfer and use
  Accelerating PETs development: Sprints, challenges and international collaboration
Chapter two: Building the PETs marketplace
  PETs for compliance and privacy
  PETs in collaborative analysis
  Barriers to PETs adoption: User awareness and understanding in the UK public sector
  Barriers to PETs adoption: Vendors and expertise
Chapter three: Standards, assessments and assurance in PETs
  PETs and assurance: The role of standards
Chapter four: Use cases for PETs
  Considerations and approach
  Privacy in biometric data for health research and diagnostics
  Preserving privacy in audio data for health research and diagnostics
  PETs and the Internet of Things: Enabling digital twins for net zero
  Social media data: PETs for researcher access and transparency
  Synthetic data for population-scale insights
  Collaborative analysis for collective intelligence
  Online safety: Harmful content detection on encrypted platforms
  Privacy and verifiability in online voting and electronic public consultation
  PETs and the mosaic effect: Sharing humanitarian data in emergencies and fragile contexts
Conclusions
Appendices
  Appendix 1: Definitions
  Appendix 2: Acknowledgements

Foreword

The widespread collection and use of data is transforming all facets of society, from scientific research to communication and commerce. The benefits of using data in decision making are increasingly evident in tackling societal problems and understanding the world around us. At the same time, there are inherent vulnerabilities when sensitive data is stored, used or shared.

From privacy to partnership sets out how an emerging set of privacy enhancing technologies (PETs) might help to balance the risks and rewards of data use, leading to wider social benefit. It follows the Royal Society's Protecting privacy in practice: The current use, development and limits of Privacy Enhancing Technologies in data analysis, which gave a snapshot of this rapidly developing field in 2019. This new publication offers a refreshed perspective on PETs, not only as security tools, but as novel means to establish collaborative analysis and data partnerships that are ethical, legal and responsible.

We have three objectives for this report. Our first objective is that the use cases inspire those collecting and using data to consider the potential benefits of PETs for their own work, or in new collaborations with others. Second, for the evidence we present on barriers to adoption and standardisation to help inform policy decisions to encourage a marketplace for PETs. Finally, through our recommendations, we hope the UK will maximise the opportunity to be a global leader in PETs, both for data security and collaborative analysis, alongside emerging, coordinated efforts to implement PETs in other countries.

Our report arrives at a time of rapid innovation in PETs, as well as data protection legislation reform in the United Kingdom. The intention is not to provide a comprehensive view of all technologies under the broad umbrella of PETs; rather, we have chosen to focus on a subset of promising and emerging tools with demonstrable potential in data governance. In demonstrating this value, we cite examples from the UK and international contexts. Realising the full potential of PETs across national borders will require further harmonisation, including consideration of data protection laws in various jurisdictions.

Artificial intelligence and machine learning are transforming our capacity to assess and confront our greatest challenges, but these tools require data to fuel them. As a biomedical engineer using AI-assistive technologies to detect disease, I recognise that the greatest research problems of our time, from cancer diagnostics to the climate crisis, are, in a sense, data problems. The value of data is most fully realised through aggregation and collaboration, whether between individuals or institutions.

I hope this report will inspire new approaches to data protection and collaboration, encouraging further research in and testing of PETs in various scenarios. PETs are not a silver bullet, but they could play a key role in unlocking the value of data without compromising privacy. By enabling new data partnerships, PETs could spark a research transformation: a new paradigm for information sharing and data analysis with real promise for tackling future challenges.

Professor Alison Noble OBE FREng FRS, Chair of the Royal Society Privacy Enhancing Technologies Working Group

Executive summary

Privacy Enhancing Technologies (PETs) are a suite of tools that can help maximise the use of data by reducing risks inherent to data use. Some PETs provide new techniques for anonymisation, while others enable collaborative analysis on privately-held datasets, allowing data to be used without disclosing copies of data. PETs are multi-purpose: they can reinforce data governance choices, serve as tools for data collaboration or enable greater accountability through audit. For these reasons, PETs have also been described as Partnership Enhancing Technologies1 or Trust Technologies2.

1 Trask A. in Lunar Ventures (Lundy-Bryan L.) 2021 Privacy Enhancing Technologies: Part 2: the coming age of collaborative computing. See https:/ (accessed 20 September 2022).
2 Infocomm Media Development Authority (Singapore grows trust in the digital environment). See https://www.imda.gov.sg/news-and-events/Media-Room/Media-Releases/2022/Singapore-grows-trust-in-the-digital-environment (accessed 5 June 2022).
3 The Royal Society. 2019 Protecting privacy in practice: The current use, development and limits of Privacy Enhancing Technologies in data analysis. See https://royalsociety.org/-/media/policy/projects/privacy-enhancing-technologies/privacy-enhancing-technologies-report.pdf (accessed 30 June 2022).

This report builds on the Royal Society's 2019 publication Protecting privacy in practice: The current use, development and limits of Privacy Enhancing Technologies in data analysis3, which presented a high-level overview of PETs and identified how these technologies could play a role in addressing privacy in applied data science research, digital strategies and data-driven business. This new report, developed in close collaboration with the Alan Turing Institute, considers how PETs could play a significant role in responsible data use by enhancing data protection and collaborative data analysis. It is divided into three chapters covering the emerging marketplace for PETs, the state of standards and assurance, and use cases for PETs.

Scope

From privacy to partnership

outlines the current PETs landscape and considers the role of these technologies in addressing data governance issues beyond data security. The aim of this report is to address the following questions:

- How can PETs support data governance and enable new, innovative uses of data for public benefit?
- What are the primary barriers and enabling factors around the adoption of PETs in data governance, and how might these be addressed or amplified?
- How might PETs be factored into frameworks for assessing and balancing risks, harms and benefits when working with personal data?

Methodology

This work was steered by an expert Working Group as well as two closed contact group sessions with senior civil servants and regulators in April and October 2021 (on the scope and remit of the report, and on the use case topics and emerging themes, respectively).

The findings in this report are the result of consultations with a wide range of data and privacy stakeholders from academia, government, third sector and industry, as well as three commissioned research projects on the role of assurance in enabling the uptake of PETs4, PETs market readiness in the public sector5, and a survey of synthetic data: data that is artificially generated based on real-world data, but which produces new data points6. The use cases were drafted with input from domain specialists, and the report was reviewed by expert readers as well as invited reviewers. The details of contributors, Working Group members, expert readers and reviewers are provided in the Appendix.

Key findings

General knowledge and awareness of PETs remains low amongst many potential PETs users7,8, with the inherent risk of using new and poorly understood technologies acting as a disincentive to adoption. Few organisations, particularly in the public sector, are prepared to experiment with data protection9. Without in-house expertise, external assurance mechanisms or standards, organisations are unable to assess privacy trade-offs for a given PET or application. As a result, the PETs value proposition remains abstract

and the business case for adopting PETs is unclear for potential users.

4 Hattusia 2022 The current state of assurance in establishing trust in PETs. The Royal Society. See https://royalsociety.org/topics-policy/projects/privacy-enhancing-technologies/
5 London Economics and the Open Data Institute. 2022 Privacy Enhancing Technologies: Market readiness, enabling and limiting factors. The Royal Society. See https://royalsociety.org/topics-policy/projects/privacy-enhancing-technologies/. This project was partly funded by a grant from CDEI.
6 Jordon J et al. 2022 Synthetic data: What, why and how? See https://arxiv.org/pdf/2205.03257.pdf (accessed 2 September 2022).
7 London Economics and the Open Data Institute. 2022 Privacy Enhancing Technologies: Market readiness, enabling and limiting factors. The Royal Society. See https://royalsociety.org/topics-policy/projects/privacy-enhancing-technologies/.
8 Lunar Ventures, Lundy-Bryan L. 2021 Privacy Enhancing Technologies: Part 2: the coming age of collaborative computing. See https:/ (accessed 20 September 2022).
9 London Economics and the Open Data Institute. 2022 Privacy Enhancing Technologies: Market readiness, enabling and limiting factors. The Royal Society. See https://royalsociety.org/topics-policy/projects/privacy-enhancing-technologies/.
10 Ibid.

Standardisation for PETs, including data standards, is lacking and is cited as a hindrance to adoption by potential users in the UK public sector10. Technical standards are required to ensure the underpinning technologies work as intended, while process standards are needed to ensure users know how and when to deploy them. While few PETs-specific standards exist to date, standards in adjacent fields (such as cybersecurity and AI) will be relevant. In the future, PETs-specific standards could provide the basis for assurance schemes to bolster user confidence.

A significant barrier to the widespread use of PETs is a lack of clear use cases for wider public benefit. To address this, Chapter 4 illustrates the potential benefit of PETs in the contexts of:

- Using biometric data for health research and diagnostics;
- Enhancing privacy in the Internet of Things and in digital twins;
- Increasing safe access to social media data and accountability on social media platforms;
- Generating population-level insights using synthesised national data;
- Collective intelligence, crime detection and voting in digital governance; and
- PETs in crisis situations and in analysis of humanitarian data.

The use cases demonstrate how PETs might maximise the value of data without compromising privacy. A core question for potential PETs users is: What will PETs enable an analyst to do with data that could not be accomplished otherwise? Alternatively: What will PETs prevent an adversary from achieving?

As the use cases illustrate, PETs are not a silver bullet solution to data protection problems. However, they may be able to provide novel building blocks for constructing responsible data governance systems. For

example, in some cases, PETs could be the best tools for reaching legal obligations, such as anonymity.

11 For example, Meta recently conducted a survey collecting personal data, which was encrypted and split into shares between third-party facilitators, namely universities. Analyses can be run using secure multi-party computation; requests for analysis must be approved by all third-party shareholders. See https:/ (accessed 10 October 2022).

Data protection is only one aspect of the right to privacy. In most cases, PETs address this one aspect but do not address how data or the output of data analysis is used, although this could change as PETs mature. Some recent applications utilise PETs as tools for accountability and transparency, or to distribute decision-making power over a dataset across multiple collaborators11, suggesting their potential in addressing elements of privacy beyond data security. The field of

PETs continues to develop rapidly. This report aims to consolidate and direct these efforts toward using data for public good. Through novel modes of data protection, PETs are already enhancing the responsible use of personal data in tackling significant contemporary challenges. The emerging role of PETs as tools for partnership, enhancing transparency and accountability may entail greater benefits still.

Recommendations

AREA FOR ACTION: COORDINATED INTERNATIONAL ACTION TO ENSURE THE RESPONSIBLE DEVELOPMENT OF PETS FOR PUBLIC BENEFIT

RECOMMENDATION 1

National and supranational organisations, including standards development organisations (SDOs), should establish protocols and standards for PETs, and their technical components, as a priority.

12 Hypertext Transfer Protocol.
13 Institute of Privacy Design (The DRAFT Design Process Standard). See https://instituteofprivacydesign.org/2022/02/11/the-draft-design-process-standard/ (accessed 2 September 2022).

PETs have been developed by experts in different fields and with little coordination between them to date. The greatest potential for PETs, whether used in isolation or combination, is as components of data governance systems. Open standards (available for use by anyone) are likely to help drive the development, accessibility and uptake of PETs for data governance. Furthermore, standards will be necessary for audit and assurance, encouraging a marketplace of confident PETs users with effective regulation and quality assurance marks where appropriate.

SDOs such as the British Standards Institute (BSI) (UK), National Physical Laboratory (UK), Institute of Electrical and Electronics Engineers (IEEE) (US), the National Cyber Security Centre (UK) and National Institute of Standards and Technology (NIST) (US) should identify and convene international expert groups to address gaps in PETs technical standards. These should build on existing standards in cryptography and information security (Chapter 3). Open standards will be especially important in PETs that enable information networks, such as secure multi-party computation or federated learning (similar to how HTTP12 provided a common set of rules that enabled communication over the Internet).

Alongside technical standards, process standards should guide best practice in the application of PETs in data governance. Privacy best practice guides, codes of conduct and process standards (such as the draft Institute of Privacy Design Process Standard13) could be used to integrate PETs into a privacy-by-design approach to data governance systems. Whereas technical standards will be essential for technical interoperability, codes of conduct for PETs in data management

and use will be critical for social interoperability and acceptance in partnerships and digital collaborations on new scales (such as international or cross-sector partnerships).

RECOMMENDATION 2

Science funders, including governments and intergovernmental bodies, should accelerate and incentivise the development and maturation of PETs by funding prize challenges, pathfinder projects (such as topic guides or resource lists) and cross-border, collaborative test environments (such as an international PETs sandbox).

14 UK Research and Innovation (Digital security by design challenge). See https://www.ukri.org/what-we-offer/our-main-funds/industrial-strategy-challenge-fund/artificial-intelligence-and-data-economy/digital-security-by-design-challenge/ (accessed 20 September 2022).
15 Commission Nationale de l'Informatique et des Libertés (Un bac à sable RGPD pour accompagner des projets innovants dans le domaine de la santé numérique). See https:/il.fr/fr/un-bac-sable-rgpd-pour-accompagner-des-projets-innovants-dans-le-domaine-de-la-sante-numerique (accessed 15 September 2022).

Science funders should foster a network of independent researchers and universities working on PETs challenges that address PETs in security, partnerships and transparency applications. They could involve the private sector (for example cloud providers and social media platforms) in designing challenges and through international cooperation on standards, guidance and regulation. To date, exemplary programmes include the UK-US PETs Prize Challenge led by the UK's Centre for Data Ethics and Innovation (CDEI) and the US White House Office of Science and Technology Policy; the Digital Security by Design Challenge14 funded through UK Research and Innovation; the Data.org Epiverse Challenge funding call; and the French data protection authority sandbox on digital health and GDPR15.

Intergovernmental bodies such as the United Nations and Global Partnership for Artificial Intelligence should lead by creating test environments and providing data for demonstrations to test the security, privacy and utility potentials of specific PETs, as well as test configurations of PETs. An international PETs sandbox would allow national regulators to collaborate and evaluate PETs solutions for cross-border data use according to common data governance principles.

RECOMMENDATION 3

Researchers, regulators and enforcement authorities should investigate the wider social and economic implications of PETs, for example, how PETs might be used in novel harms (such as fraud or linking datasets for increased surveillance) or how PETs might affect competition in digitised markets (such as monopolies through new network effects).

16 Liberty Human Rights (Challenge hostile environment data-sharing). See https://www.libertyhumanrights.org.uk/campaign/challenge-hostile-environment-data-sharing/ (accessed 20 September 2022).
17 Ongoing research highlights the

negative consequences of data sharing in dual-use or otherwise sensitive contexts. For example: Papageogiou V, Wharton-Smith A, Campos-Matos I, Ward H. 2020 Patient data-sharing for immigration enforcement: a qualitative study of healthcare providers in England. BMJ Open. (https://doi.org/10.1136/bmjopen-2019-033202)
18 Liberty Human Rights (Liberty and Southall Black Sisters super-complaint on data-sharing between the police and Home Office regarding victims and witnesses to crime). See https://www.libertyhumanrights.org.uk/issue/liberty-and-southall-black-sisters-super-complaint-on-data-sharing-between-the-police-and-home-office-regarding-victims-and-witnesses-to-crime/ (accessed 20 September 2022).
19 Go FAIR (FAIR principles). See https://www.go-fair.org/fair-principles/ (accessed 20 September 2022).

The potential follow-on effects of PETs adoption are not well understood, particularly whether and how they might amplify data monopolies, or what oversight mechanisms are required to prevent the type of collaborative analysis that might be considered state surveillance16. For example, the Arts and Humanities Research Council could consider the ethical, social and economic implications of PETs within their programme on AI (particularly where PETs could be dual use or surveillance technologies)17,18. Regulators, such as the Information Commissioner's Office (ICO) and the Competition and Markets Authority (CMA), could investigate the wider economic implications of PETs, particularly where they could enable competition through greater interoperability (as with open banking, for example). It is not understood how the adoption of PETs aligns with FAIR19 principles, particularly where PETs (such as privacy-preserving synthetic data) are used as an alternative to open data. In collaborative analysis, the ability to audit

data that is not shared should be better understood by those who might use PETs (to identify potential for biased outcomes, for example). The relationship between PETs and data trusts also remains ambiguous.

AREA FOR ACTION: A STRATEGIC AND PRAGMATIC APPROACH TO PETS ADOPTION IN THE UK, LED BY THE PUBLIC SECTOR THROUGH PUBLIC-PRIVATE PARTNERSHIPS, DEMONSTRATION OF USE CASES AND COMMUNICATION OF BENEFITS

RECOMMENDATION 4

The UK Government should develop a national PETs strategy to promote the responsible use of PETs in data governance: as tools for data protection and security, for collaboration and partnership (both domestically and cross-border) and for advancing scientific research.

20 US Office for Science and Technology Policy (Request for Information on Advancing Privacy-Enhancing Technologies). https://public-inspection.federalregister.gov/2022-12432.pdf (accessed 17 July 2022).
21 HM Government (National Data Strategy). See https://www.gov.uk/government/publications/uk-national-data-strategy/national-data-strategy (accessed 9 September 2022).
22 HM Government (National AI Strategy). See https://www.gov.uk/government/publications/national-ai-strategy (accessed 9 September 2022).

PETs could reform the way data is used domestically and across borders, offering potential solutions to longstanding problems of siloed and underutilised data across sectors. To ensure the use of PETs for public good, PETs-driven information networks should be stewarded by public sector and civil society organisations using data infrastructure for public good. A coordinated national strategy for the development and adoption of PETs for public good will ensure the timely and responsible deployment of these technologies, with the public sector leading by example.

PETs have a role to play in achieving the objectives outlined in Mission 2 of the National Data Strategy, securing a pro-growth and trusted data regime, positioning the UK internationally as a trusted data partner, with wider implications for national security. This recommendation reflects emerging, coordinated PETs work in foreign governments (such as that led in the US by the White House Office for Science and Technology Policy)20. The PETs strategy should offer a vision that complements the Government's National Data Strategy21 and National AI Strategy22. The PETs strategy should prioritise a roadmap for public sector PETs adoption, addressing public awareness and the PETs marketplace (Chapter 2), technological maturity, appropriate regulatory mechanisms and responsibilities, alongside standards and codes of conduct for PETs users (Chapter 3).

RECOMMENDATION 5

Local, devolved and national governments across the UK should lead by example in the adoption of PETs for data sharing and use across government and in public-private partnerships, improving awareness by communicating PETs-enabled projects and their results.

23 The Royal Society. Creating

trusted and resilient data systems: The public perspective. (To be published online in 2023.)
24 This is in line with the Digital Economy Act 2017. See: The Information Commissioner's Office (Data sharing across the public sector: the Digital Economy Act codes). See https://ico.org.uk/for-organisations/guide-to-data-protection/ico-codes-of-practice/data-sharing-a-code-of-practice/data-sharing-across-the-public-sector-the-digital-economy-act-codes/ (accessed 2 September 2022).

Public sector organisations could partner with small and medium-sized enterprises (SMEs) developing PETs to identify use cases, which could then be tested through low-cost, low-risk pilot projects. Legal experts and interdisciplinary policy professionals should be involved from project inception, ensuring PETs meet data protection requirements and that outcomes and implications are properly communicated to non-technical decision-makers. Use cases illustrated in Chapter 4 highlight areas of significant potential public benefit in healthcare and medical research, for reaching net zero through national digital twins and for population-level data collaboration.

Communication of PETs and their appropriate use in various contexts will be key to building trust with potential users23, encouraging the PETs marketplace (Chapter 2). The ICO should continue its work on using PETs for wider good and communicating the implications, including barriers and potential benefits. The CDEI should continue to provide practical examples that will help organisations understand and build a business case for PETs adoption. Proof of concept and pilot studies should be communicated to the wider public to demonstrate the value of PETs, foster trust in public sector data use and demonstrate value-for-money24.

RECOMMENDATION 6

The UK Government should ensure that new data protection reforms account for the new systems of data governance enabled by emerging technologies such as PETs and ensure any new regulations are supported by clear, scenario-specific guidance and assessment tools.

25 The Information Commissioner's Office (ICO consults health organisations to shape thinking on privacy-enhancing technologies). See https://ico.org.uk/about-the-ico/media-centre/news-and-blogs/2022/02/ico-consults-health-organisations-to-shape-thinking-on-privacy-enhancing-technologies/ (accessed 20 March 2022).
26 Nguyen T, Sun K, Wang S, Guitton F, Guo Y. 2021. Privacy preservation in federated learning: An insightful survey from the GDPR perspective. Computers & Security 110. (https://doi.org/10.1016/j.cose.2021.102402)
27 See for example: Koerner K. 2021 Legal perspectives on PETs: Homomorphic encryption. Medium. 20 July 2021

. See https:/ (accessed 30 June 2022).
28 Trask A, Bluemke E, Garfinkel B, Cuervas-Mons CG, Dafoe A. 2020 Beyond Privacy Trade-offs with Structured Transparency. See https://arxiv.org/ftp/arxiv/papers/2012/2012.08347.pdf (accessed 6 February 2022).
29 See for example: Meta AI (Assessing fairness of our products while protecting people's privacy). See https:/ (accessed 15 August 2022).

While data protection legislation should remain technology neutral so as to be adaptable, current plans to review UK data protection laws provide an opportunity to consider the novel and multipurpose nature of these emerging technologies, particularly as they provide the technical means for new types of collaborative analysis. The ICO should continue its work to provide clarity around PETs and data protection law, encouraging the use of PETs for wider public good25 and drawing from parallel work on AI guidance where relevant (such as privacy-preserving machine learning). Further interpretation may be required to help users understand how PETs might serve as tools for meeting data protection requirements. For example, it may be required to clarify data protection obligations where machine learning models are trained on personal data in federated learning scenarios26, or the degree to which differentially private or homomorphically encrypted data meets anonymisation requirements27. Where PETs enable information networks and international data collaborations, the ICO might anticipate clarification questions specific to international and collaborative analysis use cases. Regulatory sandboxes (as in Recommendation 2) will be useful for testing scenarios, particularly for experimentation with PETs in structured transparency28 (such as in open research and credit scoring systems) and as accountability tools29.

RECOMMENDATION 6 (CONTINUED)

30 Alliance for Data Science Professionals (Homepage). See https://alliancefordatascienceprofessionals.co.uk/ (accessed 20 September 2022).

The ICO could expand on its PETs guidance, for example, through developing self-assessment guides. Data ethics organisations, such as the CDEI, might also develop impact assessment tools, for example, a PETs impact assessment protocol that considers downstream implications on human rights. The Alliance for Data Science Professionals certification scheme30, which defines standards for ethical and well-governed approaches to data

use, could specifically consider the role of PETs in evidencing Skill Areas A (Data Privacy and Stewardship) and E (Evaluation and Reflection).

AREA FOR ACTION: FOUNDATIONAL SCHOLARSHIP AND PROFESSIONALISATION TO ENCOURAGE MATURATION OF PETS, FOSTER TRUST AND DRIVE UPTAKE OF PETS IN DATA-USING ORGANISATIONS

31 (ISC)² ((ISC)² Information Security Certifications). See https://www.isc2.org/Certifications# (accessed 13 May 2022).
32 International Association of Privacy Professionals (Privacy Technology Certification). See https://iapp.org/media/pdf/certification/CIPT_BOK_v.3.0.0.pdf (accessed 30 June 2022).

RECOMMENDATION 7

Universities, businesses and science funders should fund foundational scholarship in PETs-related fields, such as cryptography and statistics.

Foundational training and fellowships in PETs fundamentals (such as cryptography) for graduate-level study will create the skilled workforce required for widespread development and implementation of PETs. Critical future-proofing questions could be addressed through fellowships and research posts (for example, evaluating the security guarantees of PETs in a post-quantum context, or the energy proportionality, sustainability and scalability of energy-intensive, cryptography-based PETs). Internships and work placement programmes in organisations developing PETs could assist new graduates in moving from academic fields into applied PETs research and development.

RECOMMENDATION 8

Organisations providing certifications and continuing professional development courses in data science, cybersecurity and related fields should incorporate PETs modules to raise awareness among data professionals.

Professional certifications and Continuing Professional Development opportunities (including British Computer Society Professional Certifications such as the Alliance for Data Science Professionals certification, Data Science Professional Certificates offered by Microsoft or IBM, or (ISC)² Certifications31) should include a primer on PETs to raise awareness and encourage baseline knowledge of PETs amongst in-house

106、 data professionals.For example,the International Association of Privacy Professionals now includes a module on PETs in their Certified Information Privacy Technologist Certification32.FRoM PRIVACY to PARtneRsHIP PoLICY RePoRt 15ReCoMMendAtIonstABLe 1Summary table of PETs explored in this reporttrus

Trusted execution environment (TEE)
- Context of data use: securely outsourcing to a server, or cloud, computations on sensitive data.
- Privacy risk addressed: revealing sensitive attributes present in a dataset during computation.
- Data protected: in storage: yes; during computation: yes; on release: no.
- Benefits: commercial solutions widely available; zero loss of information; efficient computation of any operations.
- Current limitations: many side-channel attacks possible; current commercial solutions limited with regard to distributed computation on big datasets.
- Readiness level: product.
- Qualification criteria: could be exclusive to established research groups.

Homomorphic encryption
- Context of data use: securely outsourcing specific operations on sensitive data; safely providing access to sensitive data.
- Privacy risk addressed: revealing sensitive attributes present in a dataset during computation.
- Data protected: in storage: yes; during computation: yes; on release: no*.
- Benefits: can allow zero loss of information; FHE can support the computation of any operation.
- Current limitations: FHE, SHE and PHE are usable; highly computationally intensive; bandwidth and latency issues; running time; PHE and SHE support the computation of limited functions; standardisation in progress; possibility for side-channel attacks (current understanding is limited).
- Readiness level: PHE/SHE/FHE in use (FHE on a smaller scale).
- Qualification criteria: specialist skills; custom protocols; computing resources.

Secure multi-party computation (PSI/PIR)
- Context of data use: enabling joint analysis on sensitive data held by several organisations.
- Privacy risk addressed: revealing sensitive attributes present in a dataset during computation.
- Data protected: in storage: no; during computation: yes; on release: no.
- Benefits: no need for a trusted third party; sensitive information is not revealed to anyone; the parties obtain only the resulting analysis or model.
- Current limitations: highly compute and communication intensive; requires expertise in design that meets compute requirements and security models.
- Readiness level: PSI/PIR: product; proof of concept to pilot.
- Qualification criteria: specialist skills; custom protocols; computing resources.

* If the client encrypts their data and sends it to a server for homomorphic computation, only the client is able to access the results (by using their secret decryption key).

KEY: FHE: Fully Homomorphic Encryption; SHE: Somewhat Homomorphic Encryption; PHE: Partial Homomorphic Encryption; PIR: Private Information Retrieval; PSI: Private Set Intersection.

TABLE 1 (continued): Summary table of PETs explored in this report

Federated learning / federated machine learning
- Context of data use: enables the use of remote data for training algorithms; data is not centralised.
- Privacy risk addressed: revealing sensitive information, including an individual's presence in a dataset.
- Data protected: in storage: no; during computation: yes; on release: no.
- Benefits: very little loss of information.
- Current limitations: model inversion and membership inference attacks may be vulnerabilities.
- Readiness level: product, in use.
- Qualification criteria: may require scale of data within each dataset (cross-silo federated learning); distributed systems are complex and difficult to manage; specialist skills; custom protocols; very large datasets.

Differential privacy
- Context of data use: prevents disclosure about individuals when releasing statistics or derived information.
- Privacy risk addressed: revealing sensitive information, including an individual's presence in a dataset; dataset or output disclosing sensitive information about an entity included in the dataset.
- Data protected: in storage: yes (and at point of data collection); during computation: yes (with limitations); on release: yes (with limitations).
- Benefits: formal mathematical proof/privacy guarantee; level of privacy protection may be quantifiable; relative to other PETs, it is computationally inexpensive; applications beyond privacy.
- Current limitations: noise and loss of information, unless datasets are large enough; setting the level of protection requires expertise; precision of analysis limited inversely to level of protection.
- Readiness level: proof of concept, in use.
- Qualification criteria: as yet, no standards for setting privacy parameters; specialist skills required.

Privacy-preserving synthetic data
- Context of data use: prevents disclosure about individuals when releasing statistics or derived information.
- Privacy risk addressed: revealing sensitive attributes or presence in a dataset.
- Data protected: in storage: no; during computation: yes (with limitations); on release: yes (with limitations).
- Benefits: level of privacy protection may be quantifiable (eg with differentially private synthetic data).
- Current limitations: noise and loss of information; setting the level of protection requires expertise; privacy enhancement unclear.
- Readiness level: proof of concept, in use.
- Qualification criteria: as yet, no standards for generation or setting privacy parameters.
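As a minimal illustration of the cross-silo federated learning pattern summarised in Table 1, the sketch below has each silo fit a one-parameter model locally and share only its fitted parameter and sample count for FedAvg-style weighted averaging. All data, names and values are invented for the example; real federated learning iterates over many rounds and typically adds protections such as secure aggregation or differential privacy.

```python
# Each organisation fits the slope of y = w * x on its own data and shares
# only (local_w, n_samples); raw records never leave the silo.
def local_fit(xs, ys):
    """Least-squares slope through the origin for one silo's data."""
    return sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)

def federated_average(updates):
    """FedAvg-style aggregation: weight each local model by its sample count."""
    total = sum(n for _, n in updates)
    return sum(w * n for w, n in updates) / total

silo_1 = ([1.0, 2.0, 3.0], [2.1, 3.9, 6.0])  # hypothetical local datasets
silo_2 = ([1.0, 4.0], [1.9, 8.2])

updates = [(local_fit(*silo_1), 3), (local_fit(*silo_2), 2)]
global_w = federated_average(updates)
print(f"global slope = {global_w:.2f}")  # close to 2, without pooling the data
```

The aggregator sees only the model updates, which is why Table 1 marks federated learning as protecting data during computation but not on release: the shared updates themselves can still leak information (for example, via membership inference).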

INTRODUCTION

33 The Royal Society. 2017 Machine learning: the power and promise of computers that learn by example. See https://royalsociety.org/media/policy/projects/machine-learning/publications/machine-learning-report.pdf (accessed 30 May 2022).

34 The British Academy and the Royal Society. 2017 Data management and use: Governance in the 21st century. See https://royalsociety.org/-/media/policy/projects/data-governance/data-management-governance.pdf (accessed 28 July 2022).
35 Alsunaidi A J et al. 2021 Applications of big data analytics to control COVID-19 pandemic. Sensors (Basel) 21, 2282. (https://doi.org/10.3390/s21072282)
36 The Royal Society. 2020 Digital technology and the planet: Harnessing computing to achieve net zero. See https://royalsociety.org/-/media/policy/projects/digital-technology-and-the-planet/digital-technology-and-the-planet-report.pdf (accessed 20 September 2022).
37 The Royal Society. 2019 Protecting privacy in practice: The current use, development and limits of Privacy Enhancing Technologies in data analysis. See https://royalsociety.org/-/media/policy/projects/privacy-enhancing-technologies/privacy-enhancing-technologies-report.pdf (accessed 30 June 2022).

Background

Data about individuals, their unique characteristics, preferences and behaviours, is ubiquitous, and the power to deliver data-driven insights using this information is rapidly accelerating33,34.

This unprecedented availability of data, coupled with new capabilities to use data, drives the frontiers of research and innovation addressing challenges from the climate crisis to the COVID-19 pandemic35,36. However, the greater collection, transfer and use of data, particularly data which is personal, commercially sensitive or otherwise confidential, also entails increased risks. The tension between maximising data utility (where data is used) and managing risk (where data is hidden) poses a significant challenge to anyone using data to make decisions.

This report, undertaken in close collaboration with the Alan Turing Institute, considers the potential for tools and approaches collectively known as Privacy Enhancing Technologies (PETs) to revolutionise the safe and rapid use of sensitive data for wider public benefit. It examines the possibilities and limitations for PETs in responsible data governance and identifies steps required to realise their benefits.

This work follows the Royal Society's 2019 report Protecting privacy in practice: The current use, development and limits of Privacy Enhancing Technologies in data analysis37, which highlighted the role of PETs in enabling the derivation of useful results from data without providing wider access to datasets. Protecting privacy in practice presented a high-level overview of PETs and identified how these potentially disruptive technologies could play a role in addressing tensions around privacy and utility. The 2019 report made several observations for how the UK could realise the potential of PETs, including:

• The research and development of PETs can be accelerated through collaborative, cross-sector research challenges developed by government, industry and the third sector, alongside fundamental research support for advancing PETs;
• Government can be an important influencer in the adoption of PETs by demonstrating their use and sharing their experience around how PETs unlock new opportunities for data analysis. At the same time, public sector organisations should be given the level of expertise and assurance required to utilise new technological solutions;

• PETs can promote human flourishing through enabling new and innovative ways of governing data, as well as promoting safe and secure data use. The Department for Digital, Culture, Media and Sport (DCMS), the Centre for Data Ethics and Innovation (CDEI), Office for AI, regulators and civil society can consider how PETs play a role in wider data governance structures, including how they operate alongside new data governance models such as data trusts.

Key terms and definitions

This report draws on multidisciplinary concepts from cryptography, business, cybersecurity, ethics and analytics.

Included here is a quick reference glossary of key terms used throughout.

Differential privacy: a security definition which means that, when a statistic is released, it should not give much more information about a particular individual than if that individual had not been included in the dataset. See also privacy budget.

Distributed Ledger Technology (DLT): an open, distributed database that can record transactions between several parties efficiently and in a verifiable and permanent way. DLTs are not considered PETs, though they can be used (as some PETs are) to promote transparency by documenting data provenance.

Epsilon (ε): see privacy budget.

Homomorphic encryption (HE): a property that some encryption schemes have, so that it is possible to compute on encrypted data without deciphering it.
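The additive flavour of this property can be sketched with a deliberately toy scheme: a one-time additive mask, used here purely for illustration. The class and values below are invented and this is not a real or secure HE construction (practical schemes include Paillier for additive homomorphism and lattice-based schemes for FHE); it only shows the shape of the idea that ciphertexts can be added without being decrypted.

```python
import random

N = 2**61 - 1  # public modulus; plaintexts are integers mod N

class ToyAdditiveHE:
    """Toy secret-key scheme with additive homomorphism: the sum of
    ciphertexts decrypts to the sum of plaintexts. Illustration only —
    NOT a real or secure homomorphic encryption scheme."""
    def __init__(self):
        self._keys = []  # one-time masks, kept by the data owner

    def encrypt(self, m: int) -> int:
        k = random.randrange(N)
        self._keys.append(k)
        return (m + k) % N

    def decrypt_sum(self, ciphertext_sum: int) -> int:
        # Removing the sum of the masks reveals the sum of the plaintexts.
        return (ciphertext_sum - sum(self._keys)) % N

owner = ToyAdditiveHE()
ciphertexts = [owner.encrypt(m) for m in (10, 20, 12)]

# An untrusted server can add the ciphertexts without learning 10, 20 or 12.
server_result = sum(ciphertexts) % N

assert owner.decrypt_sum(server_result) == 42
```

Only the data owner, who holds the masks (the analogue of the secret decryption key), can recover the result; the server performs the computation blind.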

Metadata: data that describes or provides information about other data, such as the time and location of a message (rather than the content of the message).

Noise: a random alteration of data/values in a dataset so that the true data points (such as personal identifiers) are not as easy to identify.
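To illustrate how calibrated noise implements differential privacy, the sketch below applies the standard Laplace mechanism to a count query. The function names and example counts are invented for illustration; a count has sensitivity 1 (adding or removing one person changes it by at most 1), so noise drawn from Laplace(0, 1/ε) gives ε-differential privacy.

```python
import math
import random

def laplace_noise(scale: float) -> float:
    """Sample from Laplace(0, scale) via inverse transform sampling."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def dp_count(true_count: int, epsilon: float) -> float:
    """Release a count with epsilon-differential privacy.
    Sensitivity of a count is 1, so the noise scale is 1/epsilon."""
    return true_count + laplace_noise(scale=1.0 / epsilon)

# A smaller epsilon means a tighter privacy budget: noisier, more
# private answers at the cost of precision.
random.seed(0)
for eps in (0.1, 1.0, 10.0):
    print(f"epsilon={eps}: noisy count = {dp_count(1000, eps):.1f}")
```

This is the precision/protection trade-off noted in Table 1: the scale of the noise grows as the privacy budget shrinks.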

Privacy budget (also differential privacy budget, or epsilon): a quantitative measure of the change in confidence of an individual having a given attribute.

Privacy-preserving synthetic data (PPSD): synthetic data generated from real-world data to a degree of privacy that is deemed acceptable for a given application.

Private Set Intersection (PSI): a secure multiparty computation protocol where two parties compare datasets without revealing them in an unencrypted form. At the conclusion of the computation, each party knows which items they have in common with the other. There are some scalable open-source implementations of PSI available.
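A naive flavour of the idea can be sketched with salted hashes. This is an illustrative toy only: real PSI protocols use cryptographic constructions such as oblivious PRFs, because plain hashing with a shared salt would let either side brute-force low-entropy identifiers. The names, salt and values below are invented for the example.

```python
import hashlib

def salted_hashes(items, salt: bytes) -> dict:
    """Map each item to its salted SHA-256 digest (hypothetical helper)."""
    return {hashlib.sha256(salt + item.encode()).hexdigest(): item for item in items}

# Both parties agree on a salt; in a real PSI protocol this step is replaced
# by an oblivious PRF so neither side can test guesses offline.
SALT = b"shared-session-salt"

party_a = ["alice@example.com", "bob@example.com", "carol@example.com"]
party_b = ["bob@example.com", "dave@example.com"]

hashes_a = salted_hashes(party_a, SALT)
hashes_b = salted_hashes(party_b, SALT)

# Intersecting the digests reveals only the common items, not the rest
# of the other party's set (for high-entropy inputs).
common = {hashes_a[h] for h in hashes_a.keys() & hashes_b.keys()}
assert common == {"bob@example.com"}
```

Each party ends the exchange knowing the overlap and nothing else, which is the property the glossary entry describes.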

Secure multi-party computation (SMPC or MPC): a subfield of cryptography concerned with enabling private distributed computations. MPC protocols allow computation or analysis on combined data without the different parties revealing their own private inputs to the computation.
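The additive secret-sharing idea underlying many MPC protocols can be sketched in a few lines. This is a toy, honest-but-curious illustration with invented values: production MPC additionally handles multiplication, secure channels between parties and malicious participants.

```python
import random

PRIME = 2_147_483_647  # field modulus; all arithmetic is mod PRIME

def share(secret: int, n_parties: int) -> list:
    """Split `secret` into n additive shares that sum to it mod PRIME.
    Any n-1 shares together reveal nothing about the secret."""
    shares = [random.randrange(PRIME) for _ in range(n_parties - 1)]
    shares.append((secret - sum(shares)) % PRIME)
    return shares

def reconstruct(shares) -> int:
    return sum(shares) % PRIME

# Three organisations each hold a private value; they want the joint total
# without any party revealing its individual input.
inputs = [120, 45, 300]
all_shares = [share(v, 3) for v in inputs]

# Party i receives the i-th share of every input and publishes only the
# sum of the shares it holds.
partial_sums = [sum(col) % PRIME for col in zip(*all_shares)]

# Combining the published partial sums yields the joint total: 465.
assert reconstruct(partial_sums) == sum(inputs)
```

No trusted third party is needed: each party sees only uniformly random shares, yet the parties jointly obtain the resulting analysis, exactly as the Table 1 entry for SMPC describes.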

Synthetic data: data that is modelled to represent the statistical properties of original data; new data values are created which, taken as a whole, reproduce the statistical properties of the real dataset.
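A minimal, single-column sketch of the idea: fit a distribution to real values, then sample new ones. The dataset and helper name are invented, and a sketch like this carries no formal privacy guarantee on its own; real generators model joint distributions and correlations, and privacy-preserving variants add mechanisms such as differential privacy.

```python
import random
import statistics

def fit_and_sample(real_values, n_synthetic, rng):
    """Generate synthetic values matching the mean and spread of the input
    (toy, single-column sketch only)."""
    mu = statistics.mean(real_values)
    sigma = statistics.stdev(real_values)
    return [rng.gauss(mu, sigma) for _ in range(n_synthetic)]

rng = random.Random(42)
real_ages = [34, 45, 29, 51, 38, 47, 33, 41]  # hypothetical records
synthetic_ages = fit_and_sample(real_ages, 1000, rng)

# The synthetic column reproduces summary statistics rather than the
# real records themselves.
print(round(statistics.mean(synthetic_ages), 1))
```

The synthetic values support aggregate analysis without releasing the original records, though (as Table 1 notes) the degree of privacy enhancement depends heavily on how the generator is built.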

Trusted execution environment (TEE): a secure area of a processor that allows code and data to be isolated and protected from the rest of the system, such that it cannot be accessed or modified even by the operating system or admin users. Trusted execution environments are also known as secure enclaves.

CHAPTER ONE
The role of technology in privacy-preserving data flows

38 The British Academy and the Royal Society. 2017 Data management and use: Governance in the 21st century. See https://royalsociety.org/-/media/policy/projects/data-governance/data-management-governance.pdf (accessed 28 July 2022).

39 Wolf LE. 2018 Risks and legal protections in the world of big-data. Asia Pac J Health Law Ethics 11, 1-15. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6863510/
40 Jain P, Gyanchandani M, Khare N. 2016 Big data privacy: a technological perspective and review. Journal of Big Data 3, 25.
41 The British Academy and the Royal Society. 2017 Data management and use: Governance in the 21st century. See https://royalsociety.org/-/media/policy/projects/data-governance/data-management-governance.pdf (accessed 28 July 2022).
42 The Israel Academy of Sciences and Humanities and The Royal Society. 2017 Israel-UK privacy and technology workshop note of discussions. See https://royalsociety.org/topics-policy/projects/privacy-enhancing-technologies/ (accessed 20 September).
43 This is distinct from aggregation across a population or group.
44 Nissenbaum H. 2010 Privacy in context: Technology, policy, and the integrity of social life. Stanford: Stanford Law Books.
45 Bhajaria N. 2022 Data privacy: A runbook for engineers. Shelter Island: Manning.

The ever-growing quantity of data collected in contemporary life, coupled with increasing power to compute, is opening new possibilities for data-driven solutions38.

At the same time, there is unprecedented potential for the misuse of data, whether intentional or unintentional, leading to downstream harms at individual, community, corporate and national scales39,40.

The Royal Society's 2019 report focused on the role of PETs in addressing data privacy. Acknowledging that privacy is a term with multiple meanings41,42, it referenced Daniel Solove's taxonomy of privacy. Solove's approach considers privacy violation as resulting from problematic data actions pertaining to personal data, including:

• Aggregation: the gathering together of information about an individual, which could be used to generate insights for reidentification or profiling43;
• Identification: the linking of data (which may otherwise be anonymised) to a specific individual;
• Insecurity: the potential for data to be accessed by an intruder due to glitches, cybersecurity breach or intentional misuse of information;
• Exclusion: the use of personal data without notice to individuals;
• Disclosure: the revelation of personal data to others;
• Exposure: the revelation of an individual's physical or emotional attributes to others;
• Intrusion: invasive acts that interfere with an individual's physical or virtual life (such as junk mail).

Data privacy tools can include technologies, legal instruments or physical components (such as hardware keys) that mitigate the risk of problematic data actions. However, data privacy can mean many things, and can be subjective or contextual44. Broadly, privacy may be considered the right of individuals to selectively express themselves or be known. Data privacy entails a degree of control and influence over personal data, including its use. It may therefore be described as "the authorized, fair, and legitimate processing of personal information"45.

A specific definition of privacy may be less useful than considering what privacy is for46, and what is at stake, by examining potential downstream harms. The loss of privacy may also be considered intrinsically harmful to an individual.

Data privacy, data protection and information security

Data privacy is related to information security, but there are important differences. Information security focuses on external adversaries and the prevention of undesired access to information47. Security is a necessary condition for data privacy, but privacy also entails the legitimate and fair use of (secure) data. Data security relates to protecting data as an asset, whereas data privacy is more concerned with protecting people: ensuring the rights of data subjects follow their data. The unauthorised use of data shared for a given purpose is loss of privacy (a violation of intention). This suggests that data privacy tools should address accountability and transparency in data collection and use, in addition to helping meet security requirements. Data protection, on the other hand, refers to the legal safeguards in place to ensure data rights are upheld while data is collected, stored or processed.

46 Zimmermann C. 2022 Part 1: What is privacy engineering? The Privacy Blog. 10 May 2022. See https://the-privacy-blog.eu/2022/05/10/part1-what-is-privacy-engineering/ (accessed 20 September 2022).

47 According to NIST, security is "the protection of information and information systems from unauthorized access, use, disclosure, disruption, modification, or destruction in order to provide confidentiality, integrity, and availability". National Institute of Standards and Technology (Computer security resource center). See https://csrc.nist.gov/glossary/term/is (accessed 20 September 2022).
48 World Economic Forum. 2019 The next generation of data-sharing in financial services: Using privacy enhancing technologies to unlock new value. See https://www3.weforum.org/docs/WEF_Next_Gen_Data_Sharing_Financial_Services.pdf (accessed 20 September 2022).
49 Lunar Ventures, Lundy-Bryan L. 2021 Privacy Enhancing Technologies: Part 2, the coming age of collaborative computing. See https:/ (accessed 20 September 2022).
50 Infocomm Media Development Authority (Singapore grows trust in the digital environment). See https://www.imda.gov.sg/news-and-events/Media-Room/Media-Releases/2022/Singapore-grows-trust-in-the-digital-environment (accessed 5 June 2022).

What are privacy enhancing technologies (PETs)?

PETs are an emerging set of technologies and approaches that enable the derivation of useful results from data without providing full access to the data. In many cases, they are tools for controlling the likelihood of breach or disclosure. This potentially disruptive suite of tools could create new opportunities where the risks of using data currently outweigh the benefits. PETs can reduce the threats typically associated with collaboration48, motivating new partnerships, for example, between otherwise competing organisations. For this reason, PETs have more recently been described as "Partnership Enhancing Technologies"49 and "Trust Technologies"50.

The term Privacy Enhancing Technologies originates in a 1995 report co-authored by the Information and Privacy Commissioner of Ontario

and the Dutch Data Protection Authority, which described technologies that allowed online transactions to remain anonymous51. Since then, PETs have evolved in different fields with limited coordination, and there is no consensus around a single definition of PETs. This report follows the European Union Agency for Cybersecurity (ENISA) definition of PETs: a group of technologies that support "data minimisation, anonymisation and pseudonymisation as well as other privacy and security principles central to data protection"52.

A downstream harms-based approach: taxonomy of harms

This report considers PETs beyond data security mitigation. However, a framework for data protection and risk is useful in understanding the drivers of data governance decisions (including reluctance to partner or share data). PETs can help prevent downstream harms through bolstering data protection practices. A taxonomy of harms (Figure 1) provides a conceptual overview of how data might be used or shared, alongside the harms that may follow problematic data actions. It classifies harms into domains (individual, organisation, societal, national) and types (physical/psychological, relational, reputational, personal, economic, security).

51 Information and Privacy Commissioner of Ontario and Registratiekamer (Netherlands). 2008 Privacy-Enhancing Technologies: The path to anonymity. Volume 1.
52 European Union Agency for Cybersecurity (Data protection: Privacy enhancing technologies). See https://www.enisa.europa.eu/topics/data-protection/privacy-enhancing-technologies (accessed 20 September 2022).
53 National Institute of Standards and Technology (NIST privacy engineering objectives and risk model discussion draft). See https://www.nist.gov/system/files/documents/itl/csd/nist_privacy_engr_objectives_risk_model_discussion_draft.pdf (accessed 20 September 2022).

To demonstrate the interconnectedness of risk factors and harms, the model shows both practical elements that may result in harm, as well as downstream effects, including damage that can occur far outside the perceived system53. It is important to note that, while there are general trade-offs between privacy and utility, the relationship is rarely a simple or linear one. Threats to privacy are not always external to a data-holding institution. Internal actors may intentionally or unwittingly disclose personal data or other sensitive information. Additionally, there is no simple one-to-one mapping between an attack and the target (type of information release) or an outcome. Multiple attacks may be used in a sequence to reveal information. The taxonomy is not an exhaustive list of all potential attacks and harms, but provides an illustrative tool designed to encourage a harms-based approach to data protection risks.

Recent international developments in PETs

Beyond data security applications, PETs are gaining attention for their role in facilitating data use across national borders. In 2019, the World Economic Forum published a comprehensive review of PETs in financial services, a sector that is among the most cited in emerging PETs uptake54. In 2020, the Organisation for Economic Co-operation and Development (OECD) recommended data sharing arrangements that use technological access controls, such as PETs, in guidance on cross-border data flows and international trade. For international data use, they suggest PETs may be complemented with legally binding and enforceable obligations to protect the rights and interests of data subjects and other stakeholders55. In January 2022, the United Nations Committee of Experts on Big Data and Data Science for Official Statistics launched a pilot PET lab programme, which aims to enhance international data use with PETs56. The UN PET Lab is currently working with four National Statistical Offices (NSOs) and collaborating with PETs providers to safely experiment with PETs and identify barriers to their implementation.

54 World Economic Forum. 2019 The next generation of data-sharing in financial services: Using privacy enhancing techniques to unlock new value. See https://www3.weforum.org/docs/WEF_Next_Gen_Data_Sharing_Financial_Services.pdf (accessed 20 September 2022).
55 Organisation for Economic Co-operation and Development (Recommendation of the Council on Enhancing Access to and Sharing of Data). See https://legalinstruments.oecd.org/en/instruments/OECD-LEGAL-0463 (accessed 20 September 2022).
56 Hurst A. 2022 UN launches privacy lab pilot to unlock cross-border data sharing benefits. Information Age. 25 January 2022. See https://www.information- (accessed 20 March 2022).
57 Infocomm Media Development Authority (Singapore grows trust in the digital environment). See https://www.imda.gov.sg/news-and-events/Media-Room/Media-Releases/2022/Singapore-grows-trust-in-the-digital-environment (accessed 5 June 2022).
58 HM Government (U.K. and U.S. governments collaborate on prize challenges to accelerate development and adoption of privacy-enhancing technologies). See https://www.gov.uk/government/news/uk-and-us-governments-collaborate-on-prize-challenges-to-accelerate-development-and-adoption-of-privacy-enhancing-technologies (accessed 13 June 2022).
59 Ibid.

In June 2022, Singapore's Minister for Communications and Information launched the new Digital Trust Centre, which will lead research and development in Trust Technologies, including PETs and explainable artificial intelligence57. Also in June 2022, the US Office for Science and Technology Policy and DCMS in the UK launched a joint PETs prize challenge to accelerate the adoption of PETs as tools for democracy58. Both governments are working closely with NIST (US) and the US National Science Foundation in developing the challenge. The transatlantic initiative is deemed an expression of "our shared vision: a world where our technologies reflect our values and innovation opens the door to solutions that make us more secure"59.

FIGURE 1: Taxonomy of harms

[Figure: a flow diagram, spanning internal and external trust boundaries and a privacy/utility axis, tracing how held data may be used by others for processing and analytics, or released as anonymised datasets, aggregated statistics or machine learning models (including external code executed on a model). Example attacks: de-anonymisation (individuals' identities are revealed); data reconstruction; tracing attacks; model inversion/reconstruction attacks (the data used to train a model is reconstructed); poisoning/classifier influence/trojan attacks (a dataset is altered to disrupt robustness or integrity, warping outcomes); unintended disclosure of information (eg as a by-product of training neural networks); security violations; and insider threats (data disclosed or reused for intended or unintended purposes). Example outcomes: sensitive information of individuals is released or widely known; personal information, especially related to protected categories, is revealed; personal information is used for profiling and predictive purposes. Downstream harms are classified by domain of harm (individual, organisational, societal, national) and type (physical/psychological, economic, reputational, relational, security, personal), with examples including identity theft, loss of life, financial harm, anxiety/worry, wrongful accusation, discrimination and embarrassment at the individual level; loss of revenue, punitive damages, loss of profits and competitive advantage, damaged trust relationships, disrupted operations and damaged reputation at the organisational level; undermined democratic processes, less trust in research, industry, democracy and the justice system, detriment to public services, less willingness to share data or participate, and discrimination against groups at the societal level; and compromised security and breached national intelligence at the national level.]

Source: Royal Society meetings with Working Group for Privacy Enhancing Technologies, November 2021 and April 2022.

Interest in PETs for international data transfer and use

A fragmented array of legal requirements covers data use across the globe. As of March 2022, there are 157 countries with data protection laws, entailing various stipulations for data transfer and use60. PETs can provide means for secure collaboration across borders, preventing unauthorised access to datasets; however, data use is still subject to local legal requirements. PETs do not provide loopholes to data protection laws in the UK. Rather, PETs can be used as tools to help data users comply with regulatory requirements, such as anonymisation. While this report refers primarily to current UK GDPR, it restricts legal commentary to high-level observations, noting ongoing data reform in the UK and the international relevance of PETs in other jurisdictions.

Accelerating PETs development: sprints, challenges and international collaboration

Other PETs development initiatives include the PRIViLEDGE project, funded by Horizon Europe between 2017 and 2021. The project aimed to develop cryptographic protocols in support of privacy, anonymity and efficient decentralised consensus using distributed ledger technologies (DLTs).

60 Greenleaf G. 2022 Now 157 countries: Twelve data privacy laws in 2021/22. Privacy Laws & Business International Report 1, 38. See https:/ (accessed 24 May 2022).
61 Livin L. 2021 Achievements of the PRIViLEDGE project. PRIViLEDGE blog. 30 June 2021. See https://priviledge-project.eu/news/achievements-of-the-priviledge-project (accessed 30 June 2022).
62 Infocomm Media Development Authority (Singapore grows trust in the digital environment). See https://www.imda.gov.sg/news-and-events/Media-Room/Media-Releases/2022/Singapore-grows-trust-in-the-digital-environment (accessed 5 June 2022).
63 The DTC will serve as implementation partner for an international collaboration between the Centre of Expertise of Montreal for the Advancement of Artificial Intelligence (CEIMIA) and the Infocomm Media Development Authority (IMDA) in Singapore. This partnership seeks to develop solutions to demonstrate how PETs can help organisations leverage cross-institution and cross-border data.

As well as online voting (see Use case 5.3, page 95), PRIViLEDGE developed a number of toolkits and prototypes61, including privacy-preserving data storage using ledgers (data residing on a blockchain) and secure multi-party computation (SMPC) on distributed ledgers, which allows two or more parties to compute using a ledger as a communication channel. Many of these resources have been opened up for further development.

State-level collaborations to accelerate PETs include the Digital Trust Centre (DTC), launched in 2022 in Singapore62,63. The DTC is set to lead Singapore's efforts in research and development for trust technologies, such as PETs, which provide solutions for data sharing and evaluation of trustworthy AI systems. This national effort includes sandbox environments, academic-enterprise partnerships and national and international collaborations between research institutes. As a founding member of the Global Partnership for AI (GPAI), Singapore intends to use this platform to enhance its contributions to GPAI.

These initiatives have the potential to drive innovation and are raising the profile of PETs for privacy, partnership and trust. This will be key in motivating new users and creating a wider marketplace for PETs.

The following section focuses on the UK public sector, describing enabling factors and barriers in the adoption of PETs.

BOX 1: PETs in financial services

A series of challenges, technology sprints and collaborative projects have propelled the development of PETs in financial services. The World Economic Forum has outlined potential uses for PETs in determining creditworthiness, identifying collusion, or flagging fraudulent transactions between multiple banks64. Financial information sharing is key in tackling financial crime, which amounts to around $1.6 trillion annually (between 2% and 5% of global GDP). This requires collaboration and data sharing in a way that safeguards client data, adheres to legal requirements and does not compromise the competitive advantage of banking institutions.

In the UK, the Financial Conduct Authority (FCA) explored potential use cases for PETs such as secure multi-party computation in enabling data-based financial crime detection and prevention, launching a TechSprint on Global Anti-Money Laundering and Financial Crime in July 201965,66. This event included over 140 active participants, and concluded with ten proofs of concept, including:

• Using homomorphic encryption to enable banks to share and analyse sensitive information in order to uncover money-laundering networks, or to support the identification of existing and new financial crime typologies, or to allow banks to distinguish good from bad actors through question-and-answer when onboarding new clients;
• Using secure multi-party computation to uncover patterns of suspicious transactions across networks involving multiple banking institutions, or to highlight transactional mismatches in risky categories, such as account names;
• Using federated learning to improve risk assessment between multiple banks by enabling sharing of typologies;
• Using pseudonymised and hashed customer data to enable sharing and cross-referencing, to highlight potential areas of concern or for further investigation.

identify criminal behaviour in order to target enforcement action. While this use case is applauded by those working to tackle financial crime, it is worth considering how the same methods might be used for surveillance of other behaviours (for example, to profile customers for targeted advertisements, or for enhanced credit scoring).

64 World Economic Forum. 2019 The Next Generation of Data-Sharing in Financial Services: Using Privacy Enhancing Techniques to Unlock New Value. See https://www3.weforum.org/docs/WEF_Next_Gen_Data_Sharing_Financial_Services.pdf (accessed 20 September 2022).
65 Financial Conduct Authority. 2019 Global AML and Financial Crime TechSprint. See https://www.fca.org.uk/events/techsprints/2019-global-aml-and-financial-crime-techsprint (accessed 20 September 2022).
66 Cook N. 2019 It takes a network to defeat a network: tech in the fight against financial crime. Royal Society blog, 19 September 2019. See https://royalsociety.org/blog/2019/09/it-takes-a-network-to-defeat-a-network/ (accessed 16 February 2022).
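The last proof of concept in Box 1, cross-referencing pseudonymised and hashed customer data, can be sketched in a few lines. The sketch below is illustrative only (the account identifiers and shared key are invented for the example): two banks replace customer identifiers with keyed hashes (HMAC-SHA256), exchange only the hashes, and flag accounts that appear in both sets.

```python
import hmac
import hashlib

# Illustrative only: in practice the key is agreed privately between the banks,
# because unkeyed hashes of low-entropy identifiers can be reversed by brute force.
SHARED_KEY = b"agreed-out-of-band"

def pseudonymise(customer_ids, key=SHARED_KEY):
    """Map keyed hash -> raw identifier; only the hashes are ever shared."""
    return {hmac.new(key, cid.encode(), hashlib.sha256).hexdigest(): cid
            for cid in customer_ids}

bank_a = pseudonymise(["AC-1001", "AC-2002", "AC-3003"])
bank_b = pseudonymise(["AC-2002", "AC-9999"])

# Each bank shares only its hash set; matches flag accounts for investigation.
matches = bank_a.keys() & bank_b.keys()
flagged = sorted(bank_a[h] for h in matches)
assert flagged == ["AC-2002"]
```

A fuller deployment would use a dedicated private set intersection protocol, which also hides the hashes of non-matching entries from the other party.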

Chapter two: Building the PETs marketplace

67 London Economics and the Open Data Institute. 2022 Privacy Enhancing Technologies: Market readiness, enabling and limiting factors. The Royal Society. See https://royalsociety.org/topics-policy/projects/privacy-enhancing-technologies/
68 Centre for Data Ethics and Innovation. Privacy Enhancing Technologies Adoption Guide. See https://cdeiuk.github.io/pets-adoption-guide/ (accessed 20 September 2022).
69 Gartner. Gartner identifies the top strategic technology trends for 2022. See https:/ (accessed 20 September 2022). Note that in Gartner's analysis PETs
are defined similarly to this report.
70 Lunar Ventures (Lundy-Bryan L). 2021 Privacy Enhancing Technologies: Part 2: the coming age of collaborative computing. See https:/ (accessed 20 September 2022).
71 Ibid.
72 Ibid.

As highlighted in market research commissioned by the Royal Society and CDEI, the market for PETs is
nascent67. However, a growing number of documented examples demonstrate PETs already being used in a range of contexts68, with a substantial number of large organisations expected to use one or more privacy-enhanced computation techniques by 2025, particularly in secure cloud infrastructures69.

In addition to safeguarding personal data (which is required by data protection legislation), PETs are increasingly used wherever data is sufficiently valuable (for example, where data is tied to intellectual property or natural resource management). PETs are rapidly evolving through private enterprise, as well as
significant third sector and open initiatives. The development of the technology is thus greater than might be expected, given the modest size of the PETs market70.

While this chapter explores the UK public sector market for PETs, it does not fully consider how PETs might shape future digital and data markets at large. In some cases, PETs negate the need to make copies of datasets, allowing data holders to provide insights as-a-service and potentially disincentivising open data approaches. Considering the potentially disruptive nature of PETs in this way, further research is required to understand the full implications of PETs in digital and data markets.

PETs for compliance and privacy

Neither EU nor UK data protection regulation explicitly mentions PETs (nor privacy). However, compliance with data protection law is a substantial motivating factor for organisations using data protection approaches. One investment firm contends that the EU GDPR has created the enterprise privacy market71. Data processors want to understand how PETs can help them with compliance (particularly where data analysis is a weakness in the data lifecycle). While privacy challenges are risk-related, they are not always assessed as commercial problems72, particularly where the use of data is not commercially motivated (or where data use is altogether optional). Many data-holding organisations already use secure cloud services and analytics by default, and PETs are unlikely to be more cost-effective security tools in the near-term. In the wider marketplace, collaborative analysis may provide the most compelling business case for these technologies.

PETs in collaborative analysis

Collaborative analysis (including collaborative computing73 and collaborative learning74) is a growing area of interest in PETs applications. Researchers requiring data to generate insights, or to fuel machine learning and other AI applications, can leverage PETs to establish data partnerships, effectively augmenting the data available to them. For example, organisations with a mandate to use data for public good are using PETs to make in-house data usable for external analysts75; cross-sector partnerships between crime agencies and human rights NGOs involve the pooling of datasets for analysis without revealing their contents to one another76, enabling efficient, collective intelligence between
analysts who do not see the original data.

73 Ibid.
74 Melis L, Song C, De Cristofaro E, Shmatikov V. 2018 Inference attacks against collaborative learning. Preprint. See https:/ (accessed 20 September 2022).
75 See Use case 1.1, page 57.
76 See Use case 6, page 97.
77 London Economics and the Open Data Institute. 2022 Privacy Enhancing Technologies: Market readiness, enabling and limiting factors. The Royal Society. See https://royalsociety.org/topics-policy/projects/privacy-enhancing-technologies/

Data availability and access is a priority for public sector bodies with a remit to use data for public benefit, provision of services or to provide digital functions. For example, the Greater London Authority's London Datastore is designed to proactively link data assets to generate insights. Likewise, DataLoch, a service developed between the University of Edinburgh and NHS Lothian, aims to encourage non-typical researchers, such
as charitable organisations, to use in-house health and social care data for the region of South-East Scotland. In interviews, PETs for collaborative analysis were seen by such public sector bodies as possible methods for reaching these aims; however, no examples of this application of PETs were identified by the UK organisations interviewed77.

Legal and technical friction points prevent timely and straightforward access to public sector data, limiting its value as a public resource. PETs that allow the sending or processing of datasets internationally could be key to realising the value of data use across institutions and borders, which has been estimated at between $3 trillion and $5 trillion USD annually78. Governments and data-holding organisations are beginning to understand this value in terms of both economic and social benefits, and are seeking technology-based tools to enable collaboration79. The same PETs could also enhance data use across departments within an organisation, whether for reuse or when subject to further restrictions (as with International Traffic in Arms Regulations compliance in the US).

78 McKinsey. 2013 Collaborating for the common good: Navigating public-private data partnerships. See https:/ (accessed 18 July 2022).
79 World Economic Forum. 2019 The Next Generation of Data-Sharing in Financial Services: Using Privacy Enhancing Techniques to Unlock New Value. See https://www3.weforum.org/docs/WEF_Next_Gen_Data_Sharing_Financial_Services.pdf (accessed 20 September 2022).
80 Lunar Ventures (Lundy-Bryan L). 2021 Privacy Enhancing Technologies: Part 2: the coming age of collaborative computing. See https:/ (accessed 20 September 2022).
81 Gartner. Gartner Top Strategic Technology Trends for 2021. See https:/ (accessed 26 September 2022).
82 Geppert T, Deml S, Sturzenegger D, Ebert N. 2022 Trusted Execution Environments: Applications and Organizational Challenges. Front. Comput. Sci. 4. (https://doi.org/10.3389/fcomp.2022.930741)
83 Gartner. Gartner Top Strategic Technology Trends for 2021. See https:/ (accessed 26 September 2022).
84 The Confidential Computing Consortium, which is run by the Linux Foundation, is promoting the use of TEEs in cloud services internationally. The Consortium includes every large cloud provider (Alibaba, Baidu, Google Cloud, Microsoft, Tencent), demonstrating confidential computing as a priority for leaders in digital technology. Confidential Computing Consortium. Defining and Enabling Confidential Computing (Overview). See https://confidentialcomputing.io/wp-content/uploads/sites/85/2019/12/CCC_Overview.pdf (accessed 15 March 2022).

For these reasons, collaborative analysis has been predicted by one firm to be the largest new technology market to develop in the current decade80. Cloud services are one substantial market already being impacted through the widespread use of Trusted Execution Environments (TEEs), which allow for data processing and analysis in a secure environment with restricted access81. TEEs can provide an application domain for SMPC, enabling collaborative analysis of confidential datasets82. Given its role in secure and collaborative analysis, confidential cloud could be an area of significant market growth in the near future83,84.

Barriers to PETs adoption: User awareness and understanding in the UK public sector

A number of barriers prevent the widespread use of PETs for data protection and collaborative data analysis in the UK public sector. The first obstacle is general knowledge and awareness of PETs, their benefits and potential use cases85,86. Researchers and analysts are often
familiar with traditional privacy techniques (such as anonymisation, pseudonymisation, encryption and data minimisation); for some, it is unclear what PETs can add to these approaches. PETs that enable collaborative analysis include some of the most technically complex and least used to date (such as secure multi-party computation and federated learning). While these PETs may be some of the most promising, the risk inherent in using new and poorly understood technologies is a strong disincentive to adoption: few organisations, particularly in the public sector, are prepared to experiment with privacy87.

85 London Economics and the Open Data Institute. 2022 Privacy Enhancing Technologies: Market readiness, enabling and limiting factors. The Royal Society. See https://royalsociety.org/topics-policy/projects/privacy-enhancing-technologies/
86 Lunar Ventures (Lundy-Bryan L). 2021 Privacy Enhancing Technologies: Part 2: the coming age of collaborative computing. See https:/ (accessed 20 September 2022).
87 London Economics and the Open Data Institute. 2022 Privacy Enhancing Technologies: Market readiness, enabling and limiting factors. The Royal Society. See https://royalsociety.org/topics-policy/projects/privacy-enhancing-technologies/
88 Ibid.
89 Information Commissioner's Office. What is personal data? See https://ico.org.uk/for-organisations/guide-to-data-protection/guide-to-the-general-data-protection-regulation-gdpr/key-definitions/what-is-personal-data/ (accessed 20 September 2022).
90 GDPR Info. EU GDPR Recital 26. See https://gdpr-info.eu/recitals/no-26/ (accessed 20 September 2022).
91 London Economics and the Open Data Institute. 2022 Privacy Enhancing Technologies: Market readiness, enabling and limiting factors. The Royal Society. See https://royalsociety.org/topics-policy/projects/privacy-enhancing-technologies/
92 Ibid.
93 Ibid.

A lack of understanding around PETs within wider data protection requirements means stakeholders are hesitant to adopt them88. For example, anonymised personal data is not subject to the principles of data protection requirements
detailed in the UK GDPR or EU GDPR89,90; however, in the UK, there is no universal test of anonymity. Technology-specific guidance may be useful in interpreting requirements and best practices in emerging technologies, for example, how archived synthetic data should be handled91. Currently, organisations must turn to assessments by internal or external parties for guidance.

These uncertainties lead to a culture of risk-aversion described by some UK public bodies92. Without assurance or technical standards, some question the genuine security PETs offer, particularly where privacy threats and adversaries are undefined or hypothetical93.

Where organisations are unable to assess privacy trade-offs for a given PET or application, cost-benefit analysis becomes impractical. As a result, the PETs value proposition remains speculative and the business case for adopting PETs is unclear. Demonstrations are needed to establish the potential benefit of PETs, for example, through case studies that include cost-benefit analyses94. The use cases and examples in Chapter Four (page 56) provide a starting point for such an approach. According to those interviewed, market confidence could be enhanced through better data readiness and the development of standards (Chapter Three)95.

PETs are subject to relevant legal frameworks and existing regulators, such as the ICO in the UK. However, they are not specifically regulated as technologies, and their efficacy is illegible to non-experts. Standards could be followed by assurance and certifications. Implementation frameworks for PETs would allow some elements of decision-making to be outsourced, although additional expertise will likely be required in practice96.

94 Ibid.
95 Ibid.
96 Ibid.
97 Ibid.
98 London Economics and the Open Data Institute. 2022 Privacy Enhancing Technologies: Market readiness, enabling and limiting factors. The Royal Society. See https://royalsociety.org/topics-policy/projects/privacy-enhancing-technologies/
99 Lunar Ventures (Lundy-Bryan L). 2021 Privacy Enhancing Technologies: Part 2: the coming age of collaborative computing. See https:/ (accessed 20 September 2022).

Other barriers are institutional in nature. For example, where technical expertise does exist in-house, these individuals are often organisationally removed from decision-makers97. Foundational data governance issues, such as data quality and interoperability, are primary concerns for many organisations and, as such, new, unknown technologies are deprioritised. Compute power is also a practical limiting factor, particularly with energy-intensive approaches such as homomorphic encryption98.

Barriers to PETs adoption: Vendors and expertise

The development of PETs requires a deep understanding of cryptography. However, unlike other computing-related fields (such as software engineering), the cutting edge of cryptography remains largely in academia. This leads to a gap between cryptography expertise and market drivers, such as cost and convenience. As a result, theoretical cryptography risks over-serving the market on security99. Bridging the gap between cryptography talent and entrepreneurs could create viable PETs vendors.
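The compute cost mentioned above can be made concrete with a toy, Paillier-style additively homomorphic scheme. This is a sketch with deliberately tiny, insecure parameters (real deployments use moduli of 2048 bits or more, one reason homomorphic computation is energy-intensive); it shows how an untrusted party can add encrypted values without ever decrypting them.

```python
import math
import secrets

def keygen(p=1789, q=1861):
    """Toy Paillier key pair; p and q are tiny primes for illustration only."""
    n = p * q
    lam = math.lcm(p - 1, q - 1)
    mu = pow(lam, -1, n)  # valid because the generator is fixed to n + 1
    return (n,), (lam, mu, n)

def encrypt(pub, m):
    (n,) = pub
    n2 = n * n
    r = secrets.randbelow(n - 1) + 1
    while math.gcd(r, n) != 1:  # r must be a unit modulo n
        r = secrets.randbelow(n - 1) + 1
    return (pow(n + 1, m, n2) * pow(r, n, n2)) % n2

def decrypt(priv, c):
    lam, mu, n = priv
    x = pow(c, lam, n * n)
    return (((x - 1) // n) * mu) % n

pub, priv = keygen()
a, b = encrypt(pub, 12), encrypt(pub, 30)
# Multiplying ciphertexts adds the underlying plaintexts: 12 + 30 = 42.
total = decrypt(priv, (a * b) % (pub[0] ** 2))
assert total == 42
```

Even in this miniature form, every operation is modular exponentiation over numbers the square of the modulus in size, which hints at why full-scale homomorphic encryption is costly (the sketch requires Python 3.9 or later for `math.lcm`).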

Professional certifications and online courses for privacy professionals could integrate a PETs primer into existing courses to raise awareness and expertise in the profession. For example, the Alliance for Data Science Professionals100, which defines standards to ensure ethical and well-governed data use, could consider PETs in designing standards around data stewardship and analysis. Modules on general and specific PETs are appearing in university syllabuses, particularly at the postgraduate level. Several of the universities within the Academic Centres of Excellence in Cyber Security Research have a focus on privacy, and PETs and privacy is a remit of the doctoral training. In more informal education, online courses are starting to appear, such as OpenMined's Our Privacy Opportunity, Foundations of Private Computation and Introduction to Remote Data Science101. These can go a long way in raising general awareness and inspiring use cases.

Conclusions

A flourishing PETs market will require both trust in the technology and users' ability to discern appropriate applications. PETs vendors can help address scepticism by integrating PETs in wider
data governance approaches, rather than promoting one-size-fits-all solutions. Where public sentiment around the use of PETs is unknown, further research, including focus groups or public dialogues, could be used toward ensuring end-user acceptance of (and demand for) the technologies102.

100 British Computer Society. The Alliance for Data Science Professionals: Memorandum of Understanding July 2021. See https://www.bcs.org/media/7536/alliance-data-science-mou.pdf (accessed 2 September 2022).
101 OpenMined. The Private AI Series. See https://courses.openmined.org/ (accessed 7 October 2022).
102 The Royal Society. Creating trusted and resilient data systems: The public perspective. (To be published online in 2023.)
103 Lunar Ventures (Lundy-Bryan L). 2021 Privacy Enhancing Technologies: Part 2: the coming age of collaborative computing. See https:/ (accessed 20 September 2022).
104 London Economics and the Open Data Institute. 2022 Privacy Enhancing Technologies: Market readiness, enabling and limiting factors. The Royal Society. See https://royalsociety.org/topics-policy/projects/privacy-enhancing-technologies/

Today, businesses are incentivised to accumulate data for exclusive use. PETs may engender new business models, for example data or analytics as-a-service. This could entail a data-holding organisation allowing clients to query or run analyses on in-house datasets, using PETs that do not reveal the data, only the insights or solutions gathered from the query or analysis. Data is not transferred and remains unseen by the external client. In this way, PETs may enable a shift from data sharing (through agreements or otherwise) to a dynamic data processing and analytics market103, such as through commissioned analyses104. It will be important to consider this potential shift and incentivise organisations
to utilise PETs for collaboration, rather than data gatekeeping.

Table 2. PETs described in this report and their function with regard to security and collaborative analysis105.

Homomorphic encryption
What does this PET do? Allows the use, or analysis, of encrypted data without decrypting it.
In what circumstances would it be used? To create meaningful insights in computation without revealing the contents of a dataset to those running the analysis (which could be done by a trusted third party).
Whose data is being protected, and from whom? The data held by the institution running the computation is protected from whoever runs the analysis, whether a third party or the institution themselves. If the third party were to act in bad faith, they would not have access to the data in question.
Whose interests are being protected, and what are they?
• The data controller: they have an interest in carrying out their computation in the safest and most effective way possible.
• The data subjects: those who the data is about have an interest in making sure their data is not accessed by bad actors.
Relevance to security and collaborative analysis:
• Security: data is protected from unauthorised access.

Trusted execution environments (TEEs)
What does this PET do? Allows data to be used or analysed within a secure, isolated environment.
In what circumstances would it be used? When data needs to be stored securely, or to generate insights from data without revealing the dataset to the party running the analysis or hosting the TEE.
Whose data is being protected, and from whom? The data held by the institution running the research can only be decrypted and used within the TEE, and only by approved code. The TEE is protected from the outside environment, including the operating system and admin users.
Whose interests are being protected, and what are they?
• The data controller: they have an interest in carrying out their research in the safest and most effective way possible.
• The data subjects: those who the data is about have an interest in making sure their data is not accessed by bad actors.
Relevance to security and collaborative analysis:
• Security: data is protected from unauthorised access.

Secure multi-party computation (SMPC)
What does this PET do? Allows multiple parties to run analysis on their combined data, without revealing the contents of the data to each other106.
In what circumstances would it be used? Removes the need for a trusted central authority that would have access to everyone's data. Rather, multiple organisations can keep their datasets private from each other, but still run joint analysis on the combined data.
Whose data is being protected, and from whom? Each collaborating organisation holds data about individuals (or other sensitive data), and that data is protected from those collaborating on analysis. The data is also protected from any potential misconduct or incompetence from any of the parties.
Whose interests are being protected, and what are they?
• The collaborating organisations: they have an interest in carrying out their research in the safest and most effective way possible.
• The data subjects: those who the data is about have an interest in making sure their data is not accessed by bad actors.
Relevance to security and collaborative analysis:
• Security: data is protected from unauthorised access.
• Collaborative analysis: multiple parties can work on datasets held by parties of mutual distrust; the data remains safe from unwarranted interference.

Differential privacy (DP)
What does this PET do? Mostly for use with large datasets, DP allows institutions to reveal data or derived information to others without revealing sensitive information about the groups or individuals represented in the dataset.
In what circumstances would it be used? An institution may want to share analytical insights that it has derived from its data with another group or with the public, but its dataset contains sensitive information which should be kept private.
Whose data is being protected, and from whom? Sensitive information about the groups or individuals present in the dataset is protected from whoever the data is shared with or analysed by, whether that is a trusted third party, the general public, or the institution themselves.
Whose interests are being protected, and what are they?
• The data controller: they have an interest in carrying out their research and sharing data in the safest and most effective way possible.
• The data subjects: those who the data is about have an interest in making sure their data is not accessed by bad actors.
Relevance to security and collaborative analysis:
• Security: data is protected from unauthorised access.
• Collaborative analysis: there is potential for open access to the data without revealing the presence or attributes of individuals.

Federated learning
What does this PET do? Allows the training of an algorithm across multiple devices or datasets held on servers.
In what circumstances would it be used? An organisation wants to train a machine learning model, but has limited training data available. It sends the model to remote datasets for training; the model returns having benefitted from those datasets.
Whose data is being protected, and from whom? Each collaborating organisation holds data about individuals (or other sensitive data), and that data is protected from those collaborating on analysis. Only the trained model is exchanged.
Whose interests are being protected, and what are they?
• The collaborating organisations: they have an interest in carrying out their research in the safest and most effective way possible.
• The data subjects: those who the data is about have an interest in making sure their data is not accessed by bad actors.
Relevance to security and collaborative analysis:
• Security: data is protected from unauthorised access.
• Collaborative analysis: federated learning is also called collaborative learning; multiple parties are required.

105 Modified from Hattusia. 2022 The current state of assurance in establishing trust in PETs. The Royal Society. See https://royalsociety.org/topics-policy/projects/privacy-enhancing-technologies/ (accessed 20 September 2022).
106 A type of HE called multi-key FHE can perform a similar function: several parties each have a secret key and can encrypt their own data, which is sent to a trusted third party for computation. The result can be decrypted by all parties who contributed data to the process.
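Of the PETs in Table 2, secure multi-party computation can be illustrated with one of its simplest building blocks, additive secret sharing. In this hypothetical sketch (the party count and input values are invented, and it is not a full protocol), each data holder splits its input into random shares that sum to the input modulo a public prime, so the compute parties can add contributions without any single party learning an input:

```python
import secrets

Q = 2**61 - 1  # public prime modulus; all arithmetic is done modulo Q

def share(value, n_parties=3):
    """Split `value` into additive shares that sum to `value` modulo Q."""
    shares = [secrets.randbelow(Q) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % Q)
    return shares

def reconstruct(shares):
    return sum(shares) % Q

# Two data holders secret-share their counts with three compute parties.
a_shares, b_shares = share(1200), share(3400)

# Each compute party adds the two shares it holds; no party sees 1200 or 3400.
sum_shares = [(x + y) % Q for x, y in zip(a_shares, b_shares)]
assert reconstruct(sum_shares) == 4600
```

Each individual share is uniformly random, so any subset of fewer than all the parties learns nothing about the inputs; only recombining every share reveals the result.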

Chapter three: Standards, assessments and assurance in PETs

107 Hattusia. 2022 The current state of assurance in establishing trust in PETs. The Royal Society. See https://royalsociety.org/topics-policy/projects/privacy-enhancing-technologies/ See also Table 3.
108 Ibid.
109 Zimmermann C. 2022 Part 1: What is Privacy Engineering? The Privacy Blog, 10 May 2022. See https://the-privacy-blog.eu/2022/05/10/part1-what-is-privacy-engineering/ (accessed 20 September 2022).
110 Edelman. 2020 Trust Barometer. See https:/ (accessed 15 February 2022).

The Royal Society's 2019 report, Protecting privacy in practice, suggested a system of standards and certification for PETs may provide a pathway for assurance, leading to wider adoption of the technologies. Similar initiatives have shaped the development and uptake of emerging technologies (such as cybersecurity products) and global information sharing platforms (as with the protocols that continue to enable the internet). However, PETs are unlike cybersecurity in that they address highly contextual, often intersectional, privacy concerns107.

This chapter reviews the role of trust and assurance in PETs implementation108. The review finds that, given their current state of maturation, PETs are generally best used in a systems approach to data privacy by addressing the twin goals of compliance and trust109. Compliance is adherence to legal and statutory obligations (such as the UK GDPR) to avoid penalties, while trust enables data flows and collaboration. The 2020 Edelman Trust Barometer110 identified two types of trust:

• Moral: the trustor believes the trustee can articulate and act on the best interests of the trustor; and
• Competence: the trustor believes the trustee has the ability to deliver on what has been agreed.

Trust in privacy systems is similarly twofold (see Table 3):

• trust that the PET will be used in a way that protects the rights of the data subject (moral); and
• trust in the technical ability of the PET as a security tool (competence).

Currently, only technical standards exist for PETs (and these are few). These pertain to the technical capabilities of PETs in achieving security (trust in competency). The following sections explore data privacy frameworks, technical standards and assurances in fostering the rapid and responsible use of PETs.

PETs and assurance: The role of standards

Assurance in new technologies takes many forms. Certifications, Kitemarks and other formal guarantees for products are perhaps the most well-known. These official marks of assurance require external audit based on formal standards, which set out requirements for a product or system. Global standards have been effective in cybersecurity and privacy; likewise, encryption-based PETs may rely on encryption standards. Similar approaches may be feasible where the risk of disclosure is quantifiable, such as with differential privacy.

Table 3. Assurances and trust relationships in the use of PETs in privacy-preserving data governance systems.

Trustors: PETs users (eg, engineers or data scientists)
Trustees: the technology itself; collaborators; external actors; the organisation's executives (decision-makers) or PETs vendors (if using).
Moral trustworthiness: Have the executives or PETs vendors prescribed the right PET for the application, such that it functions in a privacy-preserving way?
Trust in competence: Will the PET fulfil its expected technical function? Will the data remain secure from outside actors who want access to it?
Assurances needed:
• Technical assurance: technological specifications demonstrating the PET will function as intended.
• Assurance in the application: the use of the PET is appropriate for the given use case; the PET is part of wider responsible data governance.

Trustors: Executives and PETs vendors (those diagnosing use cases and deploying PETs)
Trustees: PETs users; PETs vendors; PETs developers; the technology itself.
Moral trustworthiness: N/A
Trust in competence: Are the developers competent in delivering a fit-for-purpose technology? Will the PET fulfil its expected function?
Assurances needed:
• Technical assurance: professional qualifications detailing the PET user's ability.
• Technical assurance: technological specifications demonstrating the PET will function as intended.

Trustors: Data subjects (the people whom the data is about)
Trustees: the data governance ecosystem of organisations that collect and use their data.
Moral trustworthiness: Will personal data be used in accordance with intent, and not lead to increased surveillance and exploitation?
Trust in competence: Will data remain safe from interference from unauthorised users?
Assurances needed:
• Assurance in the application: the PET is used as part of wider responsible data governance.

A standardisation of approach to PETs will be essential in:

• developing higher-level guidance for best practice and codes of conduct;
• facilitating the early phases of PETs adoption;
• incorporating PETs into privacy frameworks and impact assessments in an informed and responsible manner.

The National Institute of Standards and Technology (NIST) also highlights the need for technical standards111. NIST promotes the standardisation of technologies that underpin PETs (such as secret-sharing and encryption regimes), alongside a guidance-based approach
314、 to the standardised use of PETsthemselves.Process standards for data protectionProcess standards can be used to assist in compliance with data protection law and general privacy protection.Privacy frameworks are one example;these are built around a set of questions or controls:points that must be c

315、onsidered and addressed in building an effective system.This structure allows frameworks to specifically address data protection laws,such as the UK GDPR.111 National Institute of Standards and Technology(Roadmap to the Privacy Framework).See https:/www.nist.gov/privacy-framework/roadmap(accessed 15

316、 March 2022).112 The AI Standards Hub is led by the Alan Turing Institute with support from the British Standards Institution and the National Physical Laboratory.HM Government(New UK initiative to shape global standards for Artificial Intelligence).See https:/www.gov.uk/government/news/new-uk-initi

317、ative-to-shape-global-standards-for-artificial-intelligence(accessed 19 March 2022).A popular privacy framework approach entails:1.mapping of information flows.2.conducting a privacy risk assessment(or privacy impact assessment).3.strategising to manage identified risks.Frameworks do not prescribe m

318、ethods or technologies for implementation;rather,the implementer may decide to use classic and emerging PETs to fulfil the frameworkrequirements.Existing standards,guidance and frameworks that address privacy systems are highlighted in Table 4.The pathway to PETs standardsStandards for PETs are bein

319、g developed through a range of international,national and sector-specific SDOs.In addition,there is an emergence of open standards initiatives.These initiatives seek to make standards on PETs accessible by anyone and can entail a collaborative approach to standards development,involving community-le

320、d groups and stakeholders from government,industry and academia.There is a growing movement for this standardisation approach,particularly within emerging technologies.An example of this is the UKs AI Standards Hub which aims to create practical tools and standards to improve the governance of AI112
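Secret sharing, one of the PET-underpinning techniques NIST has flagged for standardisation, can be illustrated with a toy example. The Python sketch below shows additive secret sharing over a prime field: each party holds one random-looking share, any subset of fewer than all the shares reveals nothing about the secret, and shares of two values can be added locally so that reconstruction yields the sum. The modulus and party count are arbitrary choices for illustration, not parameters from any standard.

```python
import secrets

# Illustrative parameters (not from any standard): a public prime modulus.
P = 2**61 - 1  # a Mersenne prime; all share arithmetic is modulo P

def share(secret: int, n_parties: int = 3) -> list[int]:
    """Split `secret` into additive shares: they sum to the secret mod P,
    and any subset of fewer than n_parties shares is uniformly random."""
    parts = [secrets.randbelow(P) for _ in range(n_parties - 1)]
    parts.append((secret - sum(parts)) % P)
    return parts

def reconstruct(shares: list[int]) -> int:
    """Recombine shares by summing them modulo P."""
    return sum(shares) % P

# Additive homomorphism: each party adds its shares of two secrets
# locally; reconstructing the resulting shares yields the sum.
a, b = share(20), share(22)
c = [(x + y) % P for x, y in zip(a, b)]
assert reconstruct(c) == 42
```

This additive property is what lets parties in a secure multi-party computation aggregate values (for example, sums or averages over distributed datasets) without any party seeing another's raw input; standardisation efforts target exactly such primitives.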

Box 2
Lessons from standardisation: open standards and the internet

The internet operates smoothly thanks to consensus-driven protocols that continue to be developed by a vast community of technologists. The Internet Engineering Task Force (IETF) is an informal, volunteer-led group that serves as the standards body for the internet. The IETF has played a critical role in the development of the internet in the absence of a formal, centralised standards body. It developed inter-domain standards such as HTTP (HyperText Transfer Protocol) and TCP (Transmission Control Protocol), allowing users to access the same internet and transfer data around the world.

Open standards can be led by technologists, who know what is technically possible and can propose standards that adapt to meet new legal or other requirements. They may also benefit from additional input from other stakeholders. In being open, standards are made available to anyone who wishes to use them. Innovators can then use these protocols in the development of new technology; assurance against such standards becomes a marketable added value for such organisations. The development of open standards in PETs will be crucial in ensuring PETs work for everyone by allowing for the global and interoperable use of data.

Table 4. Standards and guidance relevant to data privacy

| Name | Number | Standards development organisation | Date published |
| --- | --- | --- | --- |
| Information technology – Security techniques – Privacy framework | ISO/IEC 29100:2011/AMD 1:2018 | ISO and IEC | June 2018 |
| Information technology – Security techniques – Privacy architecture framework | ISO/IEC 29101:2018 | ISO and IEC | Nov 2018 |
| Information technology – Security techniques – Information security management systems – Requirements | ISO/IEC 27001:2013 | ISO and IEC | Oct 2013; will be replaced by ISO/IEC FDIS 27001 (under development) |
| Information security, cybersecurity and privacy protection – Information security controls | ISO/IEC 27002:2022 | ISO and IEC | Feb 2022 |
| Security techniques – Extension to ISO/IEC 27001 and ISO/IEC 27002 for privacy information management – Requirements and guidelines | ISO/IEC 27701:2019 | ISO and IEC | Aug 2019 |
| Information technology – Security techniques – Privacy capability assessment model | ISO/IEC 29190:2015 | ISO and IEC | Aug 2015; reviewed 2021 |
| Data protection – Specification for a personal information management system | BS 10012:2017+A1:2018 | BSI | Jul 2018 |
| IEEE Standard for Data Privacy Process | IEEE 7002-2022 | IEEE | Apr 2022 |
| Privacy enhancing data de-identification terminology and classification of techniques | ISO/IEC 20889:2018 | ISO and IEC | Nov 2018 |
| Privacy enhancing data de-identification framework | ISO/IEC DIS 27559 | ISO and IEC | TBC |
| Anonymisation, pseudonymisation and privacy enhancing technologies guidance | | ICO | TBC |
| Information technology – Security techniques – Code of practice for personally identifiable information protection | ISO/IEC 29151:2017 | ISO and IEC | Aug 2017 |
| De-Identification of Personal Information | NISTIR 8053 | NIST | Oct 2015 |

Table 4 (continued)

| Training available | Description | Reference to PETs |
| --- | --- | --- |
| Certificate | Privacy framework for Personal Identifiable Information (PII) use; focus on ICT systems for PII | PETs used as privacy controls; refers to PETs such as pseudonymisation, anonymisation or secret sharing. Briefly mentions homomorphic encryption (HE) in regard to encryption. |
| Certificate; courses available113 | Cyber security focussed standard, with related standards that include guidance for auditing | Includes reference materials for security controls and implementation guidance, used regularly in conjunction with ISO/IEC 27001. |
| Certificate | Guidance for Privacy Information Management Systems (PIMS), building on ISO 27001 | Provides high-level guidance for organisations to assess their management of p… |

113 See for example the Professional Evaluation and Certification Board training courses. https:/
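Several entries in Table 4 concern de-identification techniques such as pseudonymisation and anonymisation. As a rough illustration of what a pseudonymisation control looks like in practice, the sketch below replaces a direct identifier with a stable keyed-hash (HMAC) pseudonym. The key and field names are invented for this example; note that because the key holder can re-compute pseudonyms and re-link records, pseudonymised data generally remains personal data under the UK GDPR.

```python
import hashlib
import hmac

# Hypothetical key, invented for this example. In practice the key would
# be managed and rotated by the data controller; anyone holding it can
# re-link pseudonyms, so this is pseudonymisation, not anonymisation.
SECRET_KEY = b"example-only-key"

def pseudonymise(identifier: str) -> str:
    """Replace a direct identifier with a stable keyed-hash pseudonym."""
    digest = hmac.new(SECRET_KEY, identifier.encode("utf-8"), hashlib.sha256)
    return digest.hexdigest()[:16]

records = [
    {"patient_no": "943-476-5919", "condition": "asthma"},
    {"patient_no": "943-476-5919", "condition": "eczema"},
]
pseudonymised = [
    {"patient_id": pseudonymise(r["patient_no"]), "condition": r["condition"]}
    for r in records
]

# The same individual maps to the same pseudonym, so records can still be
# linked for analysis, but the direct identifier no longer appears.
assert pseudonymised[0]["patient_id"] == pseudonymised[1]["patient_id"]
assert "patient_no" not in pseudonymised[0]
```

A keyed hash rather than a plain hash is the usual design choice here: without the key, an attacker cannot rebuild the identifier-to-pseudonym mapping by hashing a dictionary of known identifiers.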
