Human Rights in the Digital Age
Final Report

Jacopo Bellasio, Linda Slapakova, Fiona Quimbre, Sam Stockwell, Erik Silfversten

For more information on this publication, visit www.rand.org/t/RRA1152-1

About RAND Europe
RAND Europe is a not-for-profit research organisation that helps improve policy and decisionmaking through research and analysis. To learn more about RAND Europe, visit www.randeurope.org.

Research Integrity
Our mission to help improve policy and decisionmaking through research and analysis is enabled through our core values of quality and objectivity and our unwavering commitment to the highest level of integrity and ethical behaviour. To help ensure our research and analysis are rigorous, objective, and nonpartisan, we subject our research publications to a robust and exacting quality-assurance process; avoid both the appearance and reality of financial and other conflicts of interest through staff training, project screening, and a policy of mandatory disclosure; and pursue transparency in our research engagements through our commitment to the open publication of our research findings and recommendations, disclosure of the source of funding of published research, and policies to ensure intellectual independence. For more information, visit www.rand.org/about/principles. RAND’s publications do not necessarily reflect the opinions of its research clients and sponsors.

Published by the RAND Corporation, Santa Monica, Calif., and Cambridge, UK
© 2021 RAND Corporation
R® is a registered trademark.
Cover: cienpiesnf/Adobe Stock

Limited Print and Electronic Distribution Rights
This document and trademark(s) contained herein are protected by law. This representation of RAND intellectual property is provided for noncommercial use only. Unauthorised posting of this publication online is prohibited. Permission is given to duplicate this document for personal use only, as long as it is unaltered and complete. Permission is required from RAND to reproduce, or reuse in another form, any of its research documents for commercial use. For information on reprint and linking permissions, please visit www.rand.org/pubs/permissions.

Preface
This report constitutes the Final Public Report for a study commissioned by the United Kingdom Foreign, Commonwealth and Development Office (FCDO) and specifically by the FCDO’s Conflict, Stability and Security Fund (CSSF) – Cyber Security and Tech Programme. The document provides an overview of human rights in the digital age (HURDA), the trends and challenges associated with them and the capacity-building approaches that have been adopted to foster and protect them. The report presents a series of recommendations to be considered by stakeholders involved in capacity-building efforts focusing on human rights. This report is intended for an audience with an interest in and familiarity with policy issues and challenges associated with HURDA, as well as with capacity-building initiatives aimed at fostering and protecting these.

RAND Europe is a not-for-profit research organisation that aims to improve policy and decision making in the public interest through research and analysis. RAND Europe’s clients include European governments, institutions, non-governmental organisations, and others with a need for rigorous, independent, interdisciplinary analysis.
For more information about RAND Europe or this study, please contact:
Jacopo Bellasio
Senior Analyst – Defence, Security & Infrastructure
Westbrook Centre, Milton Road
Cambridge CB4 1YG
United Kingdom
Tel: +44 (0)1223 353 329
bellasio@randeurope.org

Executive summary

Digitalisation and its underlying technologies and services provide both opportunities and challenges for fostering and promoting human rights and fundamental freedoms. Digital spaces provide a unique opportunity for individuals and groups to interact and exercise their rights and freedoms. Conversely, the digital age also features challenges stemming from the increasing use of digital technologies and services as economic or governance commodities, as well as their deliberate exploitation in the context of rising digital authoritarianism, mis- and disinformation, malicious cyber activities, and other online harms. The implications of digital technologies for human rights are already pervasive, blurring the boundaries between human rights in the physical and digital worlds, thus requiring new approaches towards safeguarding, advancing, and exercising human rights in the digital age (HURDA).

In this context, RAND Europe, with support from the UK Foreign, Commonwealth and Development Office, conducted a study to identify effective ideas for interventions and activities to foster and protect human rights and fundamental freedoms in a manner consistent with UK policy objectives. This document provides a summary of the key findings and recommendations of the study.

Capacity-building interventions can serve as an important mechanism for fostering and promoting HURDA, though implementers have to recognise various factors that can shape the impact of interventions as well as potential unintended consequences. The study identified several overarching principles that should inform and guide HURDA-related capacity-building efforts to enable implementers to maximise their impact. These are presented in Figure ES.1 below.

Figure ES.1: Overarching principles for HURDA-related capacity-building

• Adopt a holistic approach to human rights and fundamental freedoms, embedding a focus on these across all capacity-building initiatives and abandoning any residues of digital exceptionalism.
• Ensure that policy and international engagement work support, are coherent with and resonate with capacity-building initiatives to foster and protect human rights and fundamental freedoms.
• Develop a strategic approach and overarching framework to guide capacity-building initiatives focusing on human rights and fundamental freedoms:
  a. Identify overarching priorities and the desired outputs and impact to achieve through a portfolio of initiatives.
  b. Identify prioritisation approaches for selecting individual capacity-building initiatives to ensure their relevance to and coherence with the overall strategic vision.
• Build on established principles and good practices for the delivery of individual interventions to maximise their impact, sustainability, effectiveness, and efficiency:
  a. Consider the adoption of complex interventions spanning multiple activities, objectives, and beneficiaries of focus.
  b. Tailor interventions to local and regional contexts to facilitate local ownership and ensure adequate nuance in intervention activities and content.
  c. Ensure the adoption of inclusive, multi-stakeholder approaches to capacity-building intervention design and implementation.
  d. Incorporate comprehensive planning, risk assessment, and evaluation activities to mitigate potential unintended consequences and maximise learning.
  e. Embed knowledge-, skills-, and competence-transfer components in capacity-building initiatives and interventions.

HURDA-related capacity-building interventions take place in a rapidly evolving technological, political and socio-cultural landscape with concrete implications for HURDA. Recognising the various opportunities, threats and challenges this landscape provides, organisations active in capacity building can employ a range of approaches to help foster and safeguard HURDA. Figure ES.2 below provides an overview of these intervention approaches, organised within four thematic categories.

Figure ES.2: Overview of HURDA-related capacity-building approaches

• Governance and regulation: dialogue and consultative platforms and initiatives; knowledge generation and consolidation; lobbying and watchdog activities; strategic litigation and legal campaigns.
• Technical interventions: enabling software for civil society and the general population; enabling software for governmental and private sector actors; databases and repositories.
• Education: training for senior decision-makers and institutional actors; training for civil society actors; education and lifelong learning; hackathons and competitions.
• Strategic communications: awareness raising campaigns; strategic communications campaigns.

HURDA-related capacity building is likely to include the sequencing of multiple activities and instruments from across those presented in the figure above. In designing and implementing capacity-building activities, implementers should consider the tradeoffs, strengths, and limitations of different approaches, particularly in relation to the specific contexts in which they operate, the beneficiary groups they target, and the resources, expertise and know-how available. The main study report provides an in-depth description of each intervention approach, as well as an overview of illustrative intervention examples and further considerations for implementers.

Table of contents

Preface
Executive summary
  Overarching principles for HURDA-related capacity-building
  Overview of HURDA-related capacity-building approaches
Figures
Tables
Abbreviations
Acknowledgements
1. Introduction
  1.1. Study background
  1.2. Purpose and scope
  1.3. Research approach and methodology
  1.4. Structure of this document
2. Understanding HURDA
  2.1. Overview of human rights in the digital age
  2.2. Cross-cutting trends and challenges to HURDA
3. Capacity-building approaches for HURDA
  3.1. Introduction
  3.2. Governance and regulation
  3.3. Technical interventions
  3.4. Education
  3.5. Strategic communications
  3.6. Comparative and cross-cutting analysis of capacity-building approaches for HURDA
4. Conclusions and recommendations
  4.1. Summary of results
  4.2. Recommendations
References
Figures
Figure ES.1: Overarching principles for HURDA-related capacity-building
Figure ES.2: Overview of HURDA-related capacity-building approaches
Figure 1.1: Overview of project tasks, underpinning activities, and resulting outputs and deliverables
Figure 3.1: Capacity-building intervention approaches relevant to HURDA-focused interventions
Figure 3.2: Comparative assessment of the shortlisted intervention approaches
Figure 4.1: Mapping human rights and fundamental freedoms in the digital age
Figure 4.2: Cross-cutting opportunities, trends, challenges, and threats for HURDA
Figure 4.3: Intervention approaches for HURDA-focused capacity-building interventions
Figure 4.4: Overarching principles for HURDA-related capacity-building efforts and programmes

Tables
Table 1.1: Study research questions
Table 2.1: Overview of human rights in the digital age
Table 2.2: Overview of cross-cutting trends and challenges to HURDA
Table 3.1: Figure key – intervention approach labels
Table 4.1: Recommendations for governance and regulation-related interventions
Table 4.2: Recommendations for technical interventions
Table 4.3: Recommendations for education-focused interventions
Table 4.4: Recommendations for strategic communications

Abbreviations
AI – Artificial Intelligence
BT – British Telecom
CEPOL – European Union Agency for Law Enforcement Training
CSAM – Child Sexual Abuse Material
CSO – Civil Society Organisation
CSSF – Conflict, Stability and Security Fund
DCMS – Department for Digital, Culture, Media & Sport
DDoS – Distributed Denial-of-Service
ECHR – European Convention on Human Rights
EU – European Union
FCDO – Foreign, Commonwealth & Development Office
HURDA – Human Rights in the Digital Age
ICCPR – International Covenant on Civil and Political Rights
ICERD – International Convention on the Elimination of All Forms of Racial Discrimination
ICTs – Information and Communications Technologies
IFES – International Foundation for Electoral Systems
IoT – Internet of Things
IoX – Internet of X
ML – Machine Learning
NGO – Non-Governmental Organisation
NLP – Natural Language Processing
OECD – Organisation for Economic Co-operation and Development
OHCHR – Office of the United Nations High Commissioner for Human Rights
OII – Oxford Internet Institute
OSINT – Open-source Intelligence
RQ – Research Question
SIS – Smart Information System
SME – Small and Medium Enterprise
STREAM – Systematic Technology Reconnaissance, Evaluation, and Adoption Methodology
UDHR – Universal Declaration of Human Rights
UK – United Kingdom of Great Britain and Northern Ireland
UN – United Nations
UNCRC – United Nations Convention on the Rights of the Child
UNESCO – United Nations Educational, Scientific and Cultural Organization
VPN – Virtual Private Network
WP – Work Package

Acknowledgements
The RAND Europe team is grateful to several individuals for their support in the conduct of this study. Firstly, the team is thankful to the FCDO for commissioning this research and to the Conflict, Stability and Security Fund for their support throughout delivery. Thanks are especially owed to Luke Champion, Thiona Philips and Michael Potter for their contributions. Secondly, the team is grateful to colleagues from RAND Europe and experts who supported the research through their participation in interviews and workshops, providing invaluable insights on current trends, challenges and opportunities in the human rights in the digital age landscape.
Finally, the team wish to thank the two RAND Quality Assurance reviewers, Luke Huxtable and Ruth Harris, for their feedback and comments on the report, as well as Jessica Plumridge for her work on graphic design. While all these individuals have made important and valued contributions to the study, any and all errors contained in this report remain the sole responsibility of the authors.

1. Introduction

1.1. Study background
Promoting human rights and fundamental freedoms is a key responsibility of governments. In recent years, the significance, saliency, safeguarding and exercising of human rights and fundamental freedoms have been influenced by digital technologies. Cyberspace and online activities play a central role in modern life in what has been called the ‘digital age’ – a period within which information and communications technologies (ICTs) have led to an increasing ability to generate and transfer information and a shift towards economies and societies that are underpinned by ICT-enabled activities. While the growing importance of digital technologies and spaces creates myriad opportunities for contemporary socio-economic activities, they are equally vulnerable to exploitation to undermine human rights and democratic values through a variety of malicious activities. These include, but are not limited to, mis- and disinformation, online surveillance, the implementation of controls to restrict online discourse and other actions. The transnational nature and inherent difficulty of attribution in cyberspace further exacerbate the challenges of addressing or mitigating threats, including those to civil society, human rights, fundamental freedoms and democratic values. The UK National Cyber Security Strategy recognises the need to ‘rigorously protect and promote’ the UK’s core values in cyberspace, including democracy, the rule of law, privacy and human rights.

There are various approaches and avenues for strengthening national and international action on human rights in the digital age (HURDA), including capacity-building interventions. Building on a definition for cyber capacity building formulated by Hohmann et al. (2017), this study defines capacity-building interventions as activities aimed at: (1) empowering individuals, communities, private and public sector actors to foster human rights and fundamental freedoms in the digital age; or (2) providing relevant safeguards to prevent, limit or mitigate risks to human rights and fundamental freedoms stemming from access to and use of digital technologies and services. Capacity-building approaches focusing on HURDA may be characterised by different benefits and limitations and entail engagement from different stakeholder groups such as governments, civil society, international organisations, and private sector organisations. Furthermore, barriers and enablers to the implementation of interventions to safeguard human rights and fundamental freedoms online may need to be accounted for when devising interventions in this space.

1.2. Purpose and scope
In this context, the United Kingdom Foreign, Commonwealth and Development Office’s (FCDO’s) Conflict, Stability and Security Fund (CSSF) – Cyber Security and Tech Programme commissioned RAND Europe to conduct a study on HURDA.
The objective of this study was to identify and scope, on the basis of the available evidence base and expert consultations, impactful ideas for interventions and activities to foster and protect human rights and fundamental freedoms in the digital age in a manner consistent with UK policy objectives. Findings of this study are expected to inform FCDO decision-making regarding the funding of future interventions concerning the fostering and protection of human rights and fundamental freedoms in the digital age. To pursue these objectives, the study was guided by three overarching research questions (RQs), outlined in Table 1.1 below.

Table 1.1: Study research questions
RQ1: What does the current landscape of interventions to promote human rights and fundamental freedoms online comprise? What interventions and activities are conducted to foster and protect human rights and fundamental freedoms online?
RQ2: What are the benefits, drawbacks, barriers and enablers that characterise different interventions and approaches to promoting human rights and fundamental freedoms online?
RQ3: What interventions and activities should the UK FCDO prioritise with its funding instruments to promote human rights and fundamental freedoms online, in order to maximise the impact of interventions across a wide range of stakeholder groups while keeping in line with UK policy?
Source: RAND Europe.

1.3. Research approach and methodology
To address the objectives and answer the RQs, the study was structured around three analytical Work Packages (WPs). These comprised:

• WP1 Project Inception and Scoping. This WP entailed project inception activities as well as data collection and initial analysis efforts to: (1) develop an understanding of challenges and threats concerning human rights and fundamental freedoms in the digital age; and (2) take stock of the landscape of existing capacity-building interventions and approaches fostering human rights and fundamental freedoms in the digital age.
• WP2 Identifying and Selecting Interventions. This WP expanded on WP1 results and activities to comprehensively identify and map interventions on HURDA, and collect data concerning the potential impact, benefits, limitations, barriers and enablers for different capacity-building interventions. On the basis of these data, the WP also shortlisted interventions for further examination through comparative analysis in WP3.
• WP3 Final Analysis and Reporting. This WP consolidated the evidence base established under WPs 1-2 by gathering expert and stakeholder perspectives on the potential impacts and barriers to implementation of various interventions and then produced relevant recommendations for future intervention priorities for the UK FCDO. The WP also entailed the cross-cutting analysis of all study data and the development of policy recommendations for the FCDO.

Figure 1.1 provides an overview of the study’s WPs and of the specific activities, outputs, and deliverables characterising them.

Figure 1.1: Overview of project tasks, underpinning activities, and resulting outputs and deliverables
Source: RAND Europe.

Caveats and assumptions
A number of caveats, limitations and assumptions should be considered in relation to the activities and findings presented in this study, including:

• Limited extent of data collection and analysis activities conducted. The scoping nature of this study, as well as the limited timeframe and resources available to it, required the adoption of a focused approach to data collection and analysis activities. As such, while recognising the breadth of the existing evidence base regarding challenges pertaining to HURDA and the number of potential capacity-building approaches for fostering and protecting them, the study team could only engage with part of the existing evidence base and available expertise. To minimise the impact of these challenges, the study team prioritised the selection of the most up-to-date sources and publications, triangulating findings and analysis with insights stemming from consultations with stakeholders.
• Background of interviewees and insights generated during stakeholder engagement activities. The study team conducted a limited number of key informant interviews over a short timeframe. In total, sixteen interviewees from academia, government, international organisations, and civil society were consulted. Data and insights generated during consultations inherently reflect the expertise and the background of those engaged. As such, findings and insights from these should not be taken as wholly representative of the entire stakeholder and expert community. To mitigate for this, the study team endeavoured to integrate and triangulate the views and inputs collected during consultations with data from literature and documentation available in the public domain, with a view to minimising the under- or overrepresentation of individual or sectoral perspectives.
• Robustness of quantitative insights generated during the Systematic Technology Reconnaissance, Evaluation and Adoption Methodology (STREAM) workshop. The expert STREAM workshop gathered a limited number of experts. As such, insights stemming from this activity should be reviewed and understood in the context of the broader insights generated and discussed in the study and report.
• Focus on capacity-building interventions. Initiatives and approaches for fostering and protecting HURDA comprise a wide range of technical, political, legislative, legal and regulatory mechanisms. While recognising the breadth of interventions available to stakeholders for promoting HURDA, the focus of the study is on capacity-building interventions, as its ultimate aim is to inform the work of the UK FCDO CSSF Cyber Security and Tech Programme.
• Focus of recommendations. Recommendations for this study are generated on the basis of data collected and analysed during study activities. Recommendations proposed are not intended to be prescriptive indications for how FCDO funding and resources for HURDA-focused capacity building should be allocated. Rather, recommendations seek to provide guiding principles and informative perspectives that could help inform and support FCDO decision-making and prioritisation of future capacity-building investments.

1.4. Structure of this document
This report constitutes the Final Report of a study commissioned by the FCDO’s CSSF Cyber Security and Tech Programme. As such, it has been drafted for an audience with knowledge and understanding of policy issues and challenges associated with HURDA, as well as FCDO priorities and initiatives in the space of capacity building. Further to this introductory chapter, this document contains three additional chapters:

• Chapter 2 – Understanding HURDA provides an overview of key trends and considerations concerning human rights and fundamental freedoms in the digital age.
• Chapter 3 – Capacity-building approaches for HURDA discusses capacity-building approaches for HURDA, analysing their characteristics, strengths, weaknesses, barriers and enablers.
• Chapter 4 – Summary and next steps provides a final summary of project results and proposes recommendations to be considered by the FCDO and like-minded organisations.

2. Understanding HURDA

This chapter provides an overview of key human rights and fundamental freedoms whose saliency and application are particularly affected by socio-technological developments that have occurred in recent decades and in the context of the so-called digital age. The chapter begins with an introduction to and overview of selected human rights whose saliency, safeguarding, and exercising have been affected in the digital age (Section 2.1). The chapter then describes key cross-cutting trends, opportunities, and challenges characterising HURDA (Section 2.2).

2.1. Overview of human rights in the digital age
Human rights are enshrined in the Universal Declaration of Human Rights (UDHR) and in accompanying covenants and instruments, including the International Covenant on Civil and Political Rights (ICCPR).1 The responsibility for upholding, safeguarding, and promoting human rights and fundamental freedoms falls on nation states. Individuals, however, also have a right to promote and to work to protect human rights and fundamental freedoms at the national and international levels, both alone and in association with others.2 Furthermore, while human rights law typically does not directly govern the activities or responsibilities of the private sector, a number of instruments and initiatives have been adopted in recent years to help private businesses comply with human rights and fundamental freedoms. These instruments include, but are not limited to, the Human Rights Council-endorsed Guiding Principles on Business and Human Rights. As such, while ultimate responsibility for upholding, implementing, and promoting human rights and fundamental freedoms falls on states, both individuals and associations, including private sector actors and businesses, have a role to play in such actions.

These considerations are particularly relevant in the context of human rights and fundamental freedoms in the digital age. This is because private sector actors play a pivotal, and at times unique, role in organising, managing, providing access to, regulating and populating the internet. Such responsibilities and roles include those pertaining to: enabling connection to the internet, designing and maintaining hardware and operating systems, allocating web domains, hosting information, facilitating the aggregation, sharing, and searching for information, producing and regulating access to content, connecting users and communities, selling goods and services, facilitating transactions, and collecting, repurposing and selling data.3

1 OHCHR (2021a); UN (2021).
2 OHCHR (1998).

The next sections of this chapter provide an overview of a selection of human rights whose saliency, significance, safeguarding, and exercising have been influenced by socio-technological changes that have occurred in the digital age. The rights presented below were identified on the basis of a non-systematic review of academic and grey literature.
The list of rights presented in this chapter should not be taken as exhaustive, but rather as illustrative of how digital technologies and their uses and applications have influenced a wide range of human rights and fundamental freedoms. Furthermore, it should be noted that while it may be possible to differentiate between different rights and freedoms at an abstract level, such distinctions are less applicable in practice. This is particularly so in the context of capacity-building efforts, as the safeguarding and exercising of different rights and freedoms are inherently interconnected and overlapping. Table 2.1 provides a summary overview of the human rights discussed in the chapter, along with a brief definition and a cross-reference to the relevant section where each right is discussed in greater detail.

Table 2.1: Overview of human rights in the digital age

Right to privacy (Section 2.1.1)
Definition: This right guarantees that all individuals are allowed to hold a private life, family, home and correspondence, and that they can choose when, where, how and to whom information about them may be disclosed to others, in the absence of arbitrary and unlawful interference and excessive unsolicited intervention from government and other uninvited entities.4
Implications in the digital age: Informational privacy requires that both substantive information and metadata be safeguarded to protect this right.

Right to freedom of opinion and expression (Section 2.1.2)
Definition: Freedom of opinion encapsulates the ability to hold, change and form opinions about all matters, free from any interference or restriction. The right to freedom of expression is all-encompassing and guarantees that all individuals should be able to seek, receive, impart and express their opinions, ideas and views openly through any means of communication.5
Implications in the digital age: Individuals should have free and equal access to the internet and digital technologies and be able to form, hold, and express their opinions in the digital environment.

Right to peaceful assembly and association (Section 2.1.3)
Definition: The right to peaceful assembly and association enshrines the right of individuals to peaceful assembly and their freedom to associate with others, also to enable the broader exercising of their civil, political, economic, social, and cultural rights.6
Implications in the digital age: The right applies also in the digital environment; digital technologies may be used as enablers for exercising the right both in the digital and physical worlds.

Right to equal participation in political and public affairs (Section 2.1.4)
Definition: This enshrines the right of individuals to fully participate in and effectively influence public decision-making processes that affect them, including through direct and indirect participation in political and public affairs.7
Implications in the digital age: The right applies also in the digital environment, whose underlying technologies may be used both as an enabler of and a barrier to its safeguarding.

Right to free and fair elections (Section 2.1.5)
Definition: This enshrines the right of citizens to take part in and influence the conduct of public affairs, either directly or through freely chosen representatives, and to vote for representatives or be elected as one in genuine, periodic elections.8
Implications in the digital age: Digitalisation may facilitate participation in electoral affairs but carries risks of manipulation, interference, and enabling violation of other rights.

Right to education (Section 2.1.6)
Definition: This recognises that no person shall be denied education, providing safeguards for those receiving it and providing it, as well as enshrining that education should be in conformity with their religious and philosophical convictions.
Implications in the digital age: Digitalisation can facilitate education delivery but also lead to the exclusion of disadvantaged groups with no access to digital technologies.

Right to health (Section 2.1.7)
Definition: This recognises the right to health as a basic human right, outlining standards for addressing the health needs of individuals and specific groups.9
Implications in the digital age: Digitalisation can facilitate innovative healthcare practices but also carries risks and vulnerabilities associated with mis- and disinformation.

3 OHCHR (2018); Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression (2015, 2016).
4 OHCHR (2018).
5 Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression (2015, 2016).
6 Special Rapporteur on the Rights to Freedom of Peaceful Assembly and of Association (2019).
7 OHCHR (2014b).
8 OHCHR (2021a).
9 Leary (1994).

2.1.1. Right to privacy
The right to privacy is recognised and guaranteed by Article 12 of the UDHR,10 Article 17 of the ICCPR,11 and other major international, regional and national measures.12 The right guarantees that all individuals are able to hold a private life, family, home and correspondence, and that they can choose when, where, how and to whom information about them is disclosed. It also guarantees the absence of arbitrary and unlawful interferences and excessive unsolicited interventions from government and other uninvited entities. This right applies not only to private secluded spaces, but also to public spaces and to information that is publicly available. This is based upon the assumption that all individuals have the right to liberty, to autonomy and to live with dignity.13

The right to privacy is intrinsically tied with other human rights and fundamental freedoms, as well as democratic values and principles.14 It is a key right and enabler for the enjoyment and exercise of other fundamental rights and freedoms, including those concerning freedom of expression, freedom of association and assembly, the prohibition of discrimination, the right to health and others. For example, freedom of assembly may be affected or discouraged by the collection, retention and disclosure of private information concerning political actors, where that information is used against individuals to coerce them, intimidate them, or prevent them from gathering.15 In relation to democratic values and freedoms, anonymous participation in all aspects of political life represents a prerequisite of a healthy, liberal democracy free from internal and external interferences. As such, any interference with this right must be neither arbitrary nor unlawful, and must meet overarching principles of legality, necessity and proportionality.16

Implications in the digital age
In the digital age, the right to privacy is linked to informational privacy, a concept that covers information that exists or can be derived about a person and their life.
In the online and digital context, informational privacy and the protection of the right to privacy are seen as encompassing not only substantive information contained in communications, but also metadata that, if aggregated and analysed, may help generate insights on individuals’ behaviours, relations, preferences, and identity.17 Various international and national legislative frameworks, and other measures, have been developed and implemented to provide relevant safeguards. This includes international frameworks (e.g. international and regional instruments on the protection of personal data)18 and national data protection and privacy legislation. While these measures and provisions have facilitated the exercise of the right to privacy in the digital age, in some instances legislation has also been designed to promote or facilitate mass surveillance practices.19 As such, the enjoyment of this right depends on a wide range of legal, regulatory, and institutional frameworks providing adequate safeguard and oversight mechanisms.

10 United Nations (2021).
11 OHCHR (2021a).
12 OHCHR (2018); Privacy International (2017).
13 DeCuw (2018); OHCHR (2018).
14 OHCHR (2014a).
15 Aston (2017).
16 Gardner (2011); OHCHR (2018).
17 OHCHR (2018).
18 Privacy International (2017).
19 Shahbaz et al. (2020).

2.1.2. Right to freedom of opinion and expression
The right to freedom of opinion and expression is recognised and guaranteed by Article 19 of the UDHR,20 Article 19 of the ICCPR,21 and other major international, regional and national measures.22 The right is interlinked with other human rights including the right to privacy, the right to peaceful assembly and association, freedom of information, and freedom of the press and the media.23 For example, the right to freedom of opinion and expression can be seen as an enabler for the broader exercising of the right to equal participation in political affairs, as well as for safeguarding the right to free and fair elections.24

While forming and holding an opinion and articulating an expression are intimately intertwined, human rights law has traditionally drawn a conceptual distinction between the two.25 The ability to hold an opinion freely has been understood as fundamental to human dignity and democratic self-governance, encapsulating without exception or restriction the ability to hold, change and form opinions about all matters, free from any interference.26 This is based upon the recognition that all individuals are endowed with mental autonomy, reason, conscience and with an inviolable and inalienable control over their mind.27 In contrast, the right to freedom of expression is all-encompassing and guarantees that all citizens should be able to seek, receive, impart and express their opinions, ideas and views openly through any means of communication without interference. Although the right to freedom of expression is inalienable and inviolable, restrictions to this right may apply for protecting the exercise of other rights (e.g. the right to dignity or freedom of religion) and safeguarding national security and public order.28

Implications in the digital age
The right to freedom of opinion and expression is equally protected in the digital environment, as it applies regardless of frontiers and through any media, in accordance with Article 19 of the UDHR and ICCPR.29 Individuals should therefore be able to form and hold an opinion not only in their own mind, but also digitally, including by seeking access to and storing information and using available data, without external influences (e.g. public, private or algorithmic interventions).30 Individuals should also be able to circulate opinions and content, to share and express their views, and to receive and transfer information and ideas in the digital space, without interference from public or private actors and regardless of geographic boundaries. The exercising of this right assumes that all citizens should also have free and equal access to the internet and digital technologies.31

20 United Nations (2021).
21 OHCHR (2021a).
22 OHCHR (2003).
23 Benedek & Kettemann (2013).
24 Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression (2015, 2016).
25 Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression (2015, 2016).
26 McCarthy-Jones (2019); Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression (2015, 2016).
27 OHCHR (2003).
28 Any restrictions to the right to freedom of expression must still comply with the principles of legality, proportionality and necessity for legitimate use. OHCHR (2003); Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression (2015, 2016).
29 Benedek & Kettemann (2013).
30 Taylor et al. (2018).
31 Benedek & Kettemann (2013); Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression (2015, 2016).

2.1.3. Right to peaceful assembly and association
The right to peaceful assembly and association is recognised by Article 20 of the UDHR, as well as in Articles 21 and 22 of the ICCPR.32 This right recognises that individuals have a right to peaceful assembly and freedom of association with others, also to enable the broader exercising of their civil, political, economic, social and cultural rights.33 The safeguarding of the right to peaceful assembly and association is underpinned by a number of key principles, including a requirement for civil society actors and organisations to operate in a civic space where they have the ability to:34

• Ensure the existence and effective operations of any peaceful association.
• Seek, receive and use resources and funding, and mobilise resources available within society as well as from the international community and foreign donors under international human rights law.
• Be governed by an equitable set of rules and regulations.

The rationale for these principles stems from an understanding that civil society and civil society organisations (CSOs) represent critical components and enablers for facilitating the promotion and upholding of human rights, democratic values and the rule of law. As such, states are seen as having both a negative obligation not to hinder or interfere with this right, and a positive obligation to facilitate its exercise (e.g. through facilitating access to necessary resources; by establishing a regulatory environment conducive to the flourishing of a plural civil society; etc.).35

Implications in the digital age
The right to peaceful assembly and association is recognised to apply equally in the digital environment. Digital technologies can be used both as an instrument to facilitate the organisation and coordination of offline activities pertaining to the exercising of this right, as well as to establish digital spaces to exercise this right directly and fully in a digital context. As such, digital technology is recognised as a critical enabler and instrument necessary to exercise and enjoy the right to peaceful assembly and association.36

32 OHCHR (2021a); UN (2021).
33 OHCHR (2021a); UN (2021).
34 OHCHR (2021c).
35 OHCHR (2021c).

2.1.4. Right to equal participation in political and public affairs
The safeguarding of the right to equal participation in political and public affairs is central to the development and sustainment of democratic societies and polities, as well as for the broader advancement of human rights through upholding the rule of law, social inclusion and economic development. This enshrines a right for all individuals to fully participate in and effectively influence public decision-making processes that affect them, including through:37

• Direct participation in political and public affairs (e.g. through voting in referenda or by participating in public affairs as a freely elected representative).
• Indirect participation through the election of representatives in free and regular electoral processes and participating in public affairs through public debate or dialogue with elected officials.

The right to equal participation in political and public affairs is closely linked to other rights, including the right to peaceful assembly and association, as well as the right to freedom of expression and opinion. The right to peaceful assembly and freedom of association, for example, comprises the freedom to hold meetings and assemblies for public mobilisation in public and political processes. Article 25 of the ICCPR and the interpretative General Comment and jurisprudence adopted by the Human Rights Committee specify the obligations of states to provide relevant safeguards for participation in political and public affairs. These centre on addressing various obstacles to participation in political and public affairs, including direct and indirect discrimination against vulnerable and marginalised groups (e.g. women, minority groups and indigenous peoples).38 Such obstacles include any unreasonable and discriminatory restrictions to participation in political and public affairs stemming from considerations related to the automatic disenfranchisement of vulnerable groups, impairment determined on the grounds of intellectual or psychological reasons, as well as the adoption of arbitrary administrative barriers (e.g. linguistic requirements).39

Implications in the digital age
The digital age can play both an enabling and a stifling role for the right to equal participation in public and political affairs. As reported by the Office of the United Nations High Commissioner for Human Rights (OHCHR),40 digital technologies and services, including social media platforms, have enabled new horizontal forms of sociopolitical mobilisation and participation.41 Digital technologies and applications, however, have also facilitated the implementation of mass surveillance and targeted monitoring activities, as well as the persecution of political opposition groups and vulnerable groups in non-democratic societies (e.g. political opposition movements, minorities, etc.).

36 Special Rapporteur on the Rights to Freedom of Peaceful Assembly and of Association (2019).
37 OHCHR (2014b).
38 OHCHR (2014b).
39 OHCHR (2014b).
40 OHCHR (2014b; 2016c).
41 OHCHR (2016c).
As such, digital technologies can be considered a double-edged sword as regards the facilitation of the implementation and safeguarding of the right to equal participation in political and public affairs.42

2.1.5. Right to free and fair elections
Free and fair elections are considered a cornerstone of democratic systems and processes. The right to free and fair elections is established in existing human rights law. In particular, Article 25 of the ICCPR recognises the right of citizens to take part in public and political affairs, either directly or through freely chosen representatives, and to do so by voting in genuine and periodic elections.43 Similarly, Protocol 1, Article 3 of the European Convention on Human Rights (ECHR) recognises the right of citizens to free and democratic elections. Under the ECHR, a further distinction is made between active and passive electoral rights. Active rights are seen as encompassing the right to vote (i.e. the right to participate in an election as a voter), whereas passive rights encompass the right to participate in elections as a candidate.44

The right to free and fair elections is closely linked to other rights, including those pertaining to freedom of expression, equal participation in public and political affairs, and privacy. The right to freedom of expression during election periods is, for example, a core pillar and enabler of the right to free and fair elections. In practice, the realisation of the right to free and fair elections may include lawful restrictions on the freedom of expression during an election period (e.g. in relation to impartiality of public media during elections in order to support civil society in exercising their vote).45

Implications in the digital age
In the digital age, links between the right to free and fair elections and others, such as the right to freedom of expression and the right to privacy, have become increasingly salient due to the potential for personal data of voters to be vulnerable to risks of surveillance, interception, and manipulation.46 The exploitation of digitalisation for novel forms of mass and targeted digital campaigns has raised concerns for and pressures on the right to privacy and data protection safeguards.47 In some instances, digital technologies have been leveraged by actors for the purposes of electoral manipulation and interference, posing a direct threat to the safeguarding and exercising of the right to free and fair elections. Digital communication technologies may, for example, be employed to confuse or mislead voters, or to leverage at scale targeted manipulation tactics through personalised appeals underpinned by big data analytics and mass communication tools. From an indirect perspective, digital technologies and personalised mass communication through social media have also led to the creation of so-called ‘filter bubbles’ and ‘echo chambers’, which have been highlighted as a challenge for the right to free and fair elections due to their associated risk of distortion for democratic deliberations.48

42 OHCHR (2014b; 2016c).
43 OHCHR (2021a).
44 ECHR (2013).
45 Equality and Human Rights Commission (2017).
46 Bennett & Oduro-Marfo (2019).
47 Bennett & Oduro-Marfo (2019).
48 Kofi Annan Foundation (2020).

2.1.6. Right to education
The right to education recognises that no individual should be denied education in conformity with their own religious and philosophical convictions.
International human rights law recognises the right to education under a number of existing frameworks, including Article 26 of the UDHR, as well as the International Covenant on Economic, Social and Cultural Rights (ICESCR), the Convention on the Rights of the Child, and the United Nations Educational, Scientific and Cultural Organisation (UNESCO) Convention against Discrimination in Education.49 The right to education encompasses primary, secondary, and higher education, as well as fundamental education for those who did not receive primary education, the right to quality education in public and private institutions, and a right for parents to choose education approaches in conformity with their religious, philosophical, and moral convictions. Through this approach, the right entails safeguards enshrining a right to education including in the context of conflict, as well as for vulnerable groups, such as for example women and girls, individuals with disabilities, migrants and refugees, and members of minorities and indigenous groups.50

The right to education includes safeguards also for individuals and organisations providing education. In particular, the right is seen as encompassing a recognition of the freedom of teachers, as well as a right for individuals and associations to establish and direct education institutions in conformity with the minimum standards established by states.51

Implications in the digital age
The use of digital technologies has become ubiquitous not only in the context of sociopolitical activities, but also for the provision of basic services, including education. The adoption of digital technologies has notably been described as revolutionising education by facilitating access to and dissemination of knowledge, thus providing new learning opportunities, including by facilitating access to education through online education materials and e-learning.52 These novel tools and practices facilitate not only the multiplication of learning pathways for various communities and interest groups, but also enable the diversification of learning approaches.53

It should be noted, however, that the digitalisation of education delivery may also lead, inadvertently or purposefully, to the exclusion of individuals and disadvantaged parts of society who have no access to digital technologies and platforms. Other challenges associated with the right to education in the digital age pertain to the spreading of mis- and disinformation, which may negatively affect the education pathway of consumers of such information. For example, with regard to the right to health, misinformation may be conducive to the materialisation of health and safety risks for individuals crediting such information.54

49 OHCHR (2021a); Right to Education Initiative (2018); Special Rapporteur on the right to education (2016); UN (2021); UNESCO (2020).
50 Right to Education Initiative (2018); Special Rapporteur on the right to education (2016); UNESCO (2020).
51 Special Rapporteur on the right to education (2016); UNESCO (2020).
52 Special Rapporteur on the right to education (2016).
53 Special Rapporteur on the right to education (2016).
54 Carnegie Mellon University (2020).
2.1.7. Right to health
The right to health is recognised by Article 25(1) of the UDHR, as well as by several other international human rights treaties and instruments, such as the International Convention on the Elimination of All Forms of Racial Discrimination and the Convention on the Rights of the Child.55 In this context, the right to health is understood not as an obligation for states, international organisations or individuals to guarantee a person’s health.56 Rather, it provides a concept under which different principles of international law seek to recognise health as a basic human right and facilitate the establishment of standards and means to both implement the right to health and provide for the health needs of all individuals, including groups with specific needs.57 The right to health and its safeguarding are closely linked with the realisation of other human rights recognised under international law. These include the right to social security, the right to food and the right to education – since access to education provides opportunities for protecting one’s health through, for example, access to relevant knowledge and services.58

Implications in the digital age
The digital age has fostered several opportunities for strengthening human health, including through providing connectivity and digital technologies for innovative healthcare practices, such as e-health and telehealth. Digital platforms also facilitate democratised access to health advice through internet connectivity and dedicated platforms. Equally, digitalisation and democratised access to digital technologies have produced risks to the health and wellbeing of various vulnerable populations and introduced new vulnerabilities to the right to health. For example, the COVID-19 pandemic has highlighted the impact of mis- and disinformation spread through digital platforms on public health. Here, for example, we might cite false or misleading information incentivising unsubstantiated treatments or seeking to undermine the adoption of evidence-based public health measures, such as social distancing and the wearing of respiratory masks.59 Existing research also highlights the potential detrimental effects of digitalisation on the health and wellbeing of children due to heightened risks of exposure to harmful online content or abuse through digital platforms (e.g. child sexual exploitation).60

55 Leary (1994); OHCHR (2021a, 2021b); UN (2021).
56 Leary (1994).
57 Leary (1994).
58 OHCHR (2021b).
59 Galvao (2020).
60 Holly (2020).

2.2. Cross-cutting trends and challenges to HURDA
As described above, digitalisation has provided various opportunities and challenges for the exercising and advancing of HURDA. The study team identified a number of cross-cutting opportunities, trends, challenges, and threats that shape the HURDA landscape. These are summarised in Table 2.2 below and discussed in greater detail in the remainder of this section.
15 Table 2.2: Overview of cross-cutting trends and challenges to HURDA Category Key trends and challenges Section • Opportunities for • human rights in • the digital age • Democratisation of access to digital spaces, technologies, and services Strengthening of human rights safeguards Enhanced access to information and knowledge regarding HURDA Enhanced opportunities for the exercising and protecting of human rights and fundamental freedoms 2.2.1 Data, digital technologies and services as an economic commodity • Commodification of data and surveillance capitalism • Mass-scale adoption of machine learning-driven personalisation in communication • Lack of oversight and control for the uptake of data- intensive technologies • Lack of security and human rights safeguards by default 2.2.2 Data, digital technologies and services as a governance commodity • Increasing digitalisation of public services and 2.2.3 emergence of digital welfare states • Unintended consequences of legislative and regulatory provisions • Mass surveillance • and digital • authoritarianism • Weakening or criminalisation of encryption and privacy- 2.2.4 enhancing technologies Increased capabilities for mass surveillance Overbearing biometric data collection practices Exploitation of tools for monitoring, filtering, and blocking the dissemination of information or removing content online • Illicit or criminal activities in digital • spaces Weaponisation of the information environment for mis- 2.2.5 and disinformation, propaganda, and recruitment Exploitation of digital spaces for targeted online harms • Wider political and socio-cultural • trends in the digital age • Rise of digitally transparent societies and limited societal understanding of challenges for HURDA Declining social trust, public uncertainty and disengagement Worsening digital divide and inequality Source: RAND Europe analysis. 2.2.6 16 Human Rights in the Digital Age 2.2.1. Opportunities for civic space, to extend to digital platforms human rights in the and virtual networks.64 digital age • Digital technologies have facilitated Digital technologies and access to information and knowledge, services have evolved over enabling civil society and mitigating the years enabling individuals to exercise the impacts of restrictions, limitations, their human rights in both online and offline and censorship. The development of settings in previously unseen ways.61 telecommunications infrastructure has Different stakeholder groups have leveraged facilitated unprecedented access to connectivity, and digital technologies and information and knowledge, enabling services as critical enablers facilitating the individuals and civil society across the exercising, safeguarding, and fostering of globe to develop informed opinions. The HURDA. Some of the opportunities and breakdown of boundaries in the digital beneficial implications of digital technologies space has also encouraged more rapid for HURDA include the following. and widespread dissemination of ideas, • Digital technologies and services have provided enabling tools for the exercising of human rights and fundamental freedoms, both online and in the physical world. The advent of the Internet and digital technologies has presented numerous opportunities for promoting the right to freedom of opinion and expression.62 The multiplication of online platforms has provided opportunities for reaching the disenfranchised, engaging large crowds as well as targeting specific individuals. 
• Digital technologies have facilitated access to information and knowledge, enabling civil society and mitigating the impacts of restrictions, limitations, and censorship. The development of telecommunications infrastructure has facilitated unprecedented access to information and knowledge, enabling individuals and civil society across the globe to develop informed opinions. The breakdown of boundaries in the digital space has also encouraged more rapid and widespread dissemination of ideas, opinions and content, including to new audiences and previously excluded groups, further enabling the participation of different communities, including marginalised and disenfranchised ones, in informed discourse. This access to information and the engagement of individuals and communities in open dialogue based on the open expression of views and opinions, including dissent, represent key factors underpinning and facilitating the development of an open and pluralistic civic space.65

• Some technologies offer opportunities for directly strengthening human rights safeguards. Digital technologies have equipped CSOs with greater means to advance their agendas and facilitate their activities, reaching broader audiences and strengthening their capabilities. Social media and other telecommunications platforms and tools have also been used to mobilise individuals and large strata of society across the globe, as well as to facilitate the activity of human rights defenders and activists, and to address social and human rights issues.66 HURDA-related safeguards have also been advanced through specific technological tools, such as privacy-enhancing technologies or distributed ledger systems.67

While connectivity and digital technologies provide several opportunities to uphold HURDA and strengthen relevant safeguards, the increasing ubiquity of digital technologies and a growing permanent digital footprint also pose challenges to human rights. In particular, the digitalisation of public services and the democratisation and increased use of smart devices, including the so-called 'Internet of X' (IoX),68 are shaping the HURDA landscape by increasing the amount of data and metadata, as well as personal information, shared and stored online.69 Such data can be legitimately shared or used, with consent, to improve the provision of services. However, it may also be traded and exchanged by public and private actors for purposes other than the original and legitimate uses, a phenomenon known as data repurposing. In addition, awareness about individual digital footprints and their uses remains limited, creating various challenges and increased vulnerabilities for human rights.

The following sections discuss some of the main cross-cutting challenges and threats for human rights stemming from actions by private and public sector actors, as well as individuals and civil society, to use and exploit digital technologies and spaces for economic, political, security or other gains. The cumulative effects of these challenges and threats form a complex and multi-faceted landscape.

61 RAND Europe interview (February 2021; ID03). 62 Benedek & Kettemann (2013). 63 Benedek & Kettemann (2013); Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression (2015, 2016). 64 Civic space is understood as the 'environment that enables civil society to play a role in the political, economic and social life of our societies'. See OHCHR (2021a). 65 OHCHR (2021a).
2.2.2. Exploitation of data, digital technologies and services as an economic commodity

While regional and international discourses on privacy and data protection have increasingly recognised these as fundamental rights, their safeguarding has also hinged on a balancing of privacy safeguards with the economic opportunities provided by the digital economy and by data as an economic resource or commodity.70 Activities by public and private sector actors to capitalise on the increasing availability of data have therefore incentivised a number of practices that highlight the tension between human rights and the economic opportunities and incentives provided by digital services and technologies. Such practices include:

• Commodification of data and surveillance capitalism: Increasing data availability and the potential to generate data-driven insights that contribute to commercial, political, or other advantages has created incentives to monetise data, for example by reselling information through data brokers.71 Such practices can result in privacy asymmetry, whereby data brokers hold more information about the general public than the general public does about them.72 Furthermore, a number of factors, including the availability of big data and advances in computing and analytical capabilities, are enabling the re-identification or de-anonymisation of specific individuals by matching anonymous data points with publicly available information (a minimal illustrative sketch of this linkage mechanism is provided at the end of this subsection).73

• Mass-scale adoption of machine learning-driven personalisation in communication: The widespread adoption of machine learning (ML) by traditional and social media has altered the way people access and consume information. It has enabled the customisation and recommendation of personalised content, as well as the targeting of individuals based on personal data, metadata and feedback loops. As such, ML-driven personalisation has minimised the diversity of information individuals access, enabling the development of filter bubbles and self-reinforcing polarised clusters (also known as 'echo chambers'), which may negatively affect freedom of opinion and expression and contribute to societal polarisation.74

66 Special Rapporteur on the Rights to Freedom of Peaceful Assembly and of Association (2019). 67 Bellasio et al. (2020). 68 The Internet of X (IoX) is a concept capturing an integrated system in which 'all things, entities, people and thinking are interacted seamlessly'. See Ning et al. (2020). 69 Internet Society (2021); OHCHR (2018). 70 Prasad & Aravindakshan (2020).
• Lack of oversight and control for the uptake of data-intensive technologies: The exploitation of data and digital technologies as economic commodities has incentivised the rapid development and uptake of technologies that require extensive data collection. Further to the collection of personal information for identification purposes, which is required for several online services, the selling of data to interested third parties adds additional risks to HURDA.75 Many technologies, however, are developed without associated oversight and control mechanisms in relation to their data collection and transfer practices. This is partly due to the relative freedom of action and manoeuvre that private sector actors have enjoyed in providing digital platforms and tools. Consumer interest in privacy and data protection safeguards has incentivised the adoption of both bottom-up and top-down regulatory measures. HURDA-related safeguards, on the other hand, particularly in the context of privacy and data protection standards, have often centred on privacy limits originating in the private sector, thereby inverting the relationship between the regulator and the regulated.76 A general opacity of data collection practices – i.e. a lack of understanding, even among public sector actors, of how data is collected and held – has exacerbated this challenge by limiting the ability of public sector actors to identify what HURDA-related safeguards are needed in the first place.77

• Lack of security and human rights safeguards by default: In conjunction with the above-described lack of oversight and control for the exploitation of data as an economic commodity, a further challenge in technology development practices for HURDA has been a lack of security and human rights safeguards by design. Digital technologies and tools are frequently developed with security and safeguards for issues such as privacy being mere afterthoughts. In addition, commercial interests often drive rapid technology development, despite the presence of potential vulnerabilities, notably those relating to privacy and data protection. While privacy represents an increasingly relevant issue guiding consumer preferences (e.g. in relation to the adoption of IoX devices), restrictions on data sharing practices by default remain limited.

The rapid pace of technological development represents an additional cross-cutting trend that has augmented the economic incentives for the exploitation of data and digital technologies, with limited restrictions on such activity through relevant technical or regulatory safeguards. As noted by experts interviewed for the study, as regulation struggles to keep pace with technological development, it is often commercial interests, rather than regulation, that shape its direction.

71 McKinsey (2017). 72 Crain (2016). 73 Rocher et al. (2019). 74 Ignatidou (2019). 75 Weber (2015). 76 Jones (2020). 77 Jones (2020).
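To illustrate the de-anonymisation risk described under the commodification of data above, the following minimal Python sketch (using the pandas library, with entirely invented data, names and postcodes) shows how records stripped of direct identifiers can be re-identified simply by joining them to a public dataset on shared quasi-identifiers such as postcode, birth year and gender. It illustrates the mechanism only and does not represent any specific dataset or study.

```python
import pandas as pd

# Hypothetical 'anonymised' dataset: direct identifiers removed, but
# quasi-identifiers (postcode, birth year, gender) retained.
anonymised = pd.DataFrame({
    "postcode":   ["CB4 1YG", "CB4 1YG", "SW1A 2AA"],
    "birth_year": [1984, 1991, 1984],
    "gender":     ["F", "M", "F"],
    "sensitive_attribute": ["condition_a", "condition_b", "condition_c"],
})

# Hypothetical publicly available records (e.g. scraped profiles or an
# electoral roll) that do carry names.
public_records = pd.DataFrame({
    "name":       ["A. Example", "B. Example"],
    "postcode":   ["CB4 1YG", "SW1A 2AA"],
    "birth_year": [1984, 1984],
    "gender":     ["F", "F"],
})

# Joining on the shared quasi-identifiers is enough to re-attach names to
# 'anonymous' records that are unique on those fields.
linked = anonymised.merge(public_records, on=["postcode", "birth_year", "gender"])
print(linked[["name", "sensitive_attribute"]])
```

In practice, the more attributes a dataset retains and the more external sources an actor can draw on, the more likely such joins are to single out individuals, which is why removing names alone rarely amounts to meaningful anonymisation.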
2.2.3. Exploitation of data, digital technologies and services as a governance commodity

Further to their economic potential, data and digital technologies have been exploited by public sector actors as governance commodities. In particular, data and digital technologies have been used in governance processes and mechanisms supporting services across the security, healthcare, housing, education and other sectors.78 The scale of digitalisation across public services has prompted commentators to describe futures in which government agencies may rely on full-scale automation and digitalisation.79 Some cross-cutting trends in relation to the digitalisation of public services include:

• Emergence of digital welfare states: As highlighted in existing literature, the digital age has entailed the emergence and increasing proliferation of digital welfare states (i.e. states in which social protection and assistance systems are driven by digital data and technologies).80 While this provides opportunities for the improvement of welfare services and associated governance mechanisms, it also poses various HURDA-related risks. These can include a reduction in access to digital services for disadvantaged societal strata, the perpetuation of discrimination and the marginalisation of communities, the elimination of welfare services for contentious groups, and the imposition of stronger sanctions regimes that may fundamentally undermine traditional governance models. As such, digitalised and automated systems may enable the weaponisation of welfare systems against specific groups and communities thanks to wider predictive, surveillance and targeting capabilities.81

• Overbearing data collection approaches for policy- and decision-making: The digitalisation of public services has included the digitalisation of public health and other interventions leveraging extensive data collection and analysis capabilities. The COVID-19 pandemic has highlighted both the utility and the risks associated with these approaches, notably with regard to the possible tensions between the use of COVID-19 track and trace mobile apps as public health crisis management tools, their potential exploitation for wider mass surveillance practices, and risks associated with privacy and civil liberties.82 Experts interviewed in the context of this study noted that the scale of data collection in the context of COVID-19 may produce long-term consequences for the HURDA landscape, including by reimagining the significance of connectivity for public access and participation, as well as by reshaping socio-cultural structures (e.g. the relations between the state and civil society).83

The leveraging of data and digital technologies as a governance commodity has, similarly to their exploitation for economic opportunities, incentivised the development of a range of legislative and regulatory measures to facilitate their effective exploitation. This has highlighted the potential risks of these uses of data and digital technologies, as well as the potential for legislative and regulatory actions themselves to produce unintended consequences. The key trends and challenges in this context include:

• Data localisation: Data localisation refers to legislation requiring data to be stored or processed within national or regional borders. Driven by an increasing perceived imperative to control global data flows and enhance information security within national borders, the number of data localisation laws has steadily risen globally, including in non-democratic countries.84 While some data localisation practices are seen as relatively uncontroversial, the expansion of data localisation beyond cybersecurity and data protection into social and political issues has posed several risks.85 These include overarching concerns over 'digital protectionism' and the exploitation of data localisation for the increasing collection of personal data, including information on sexual, religious and political orientations, which may be used by authorities for cracking down on minorities and on political and human rights dissidents.86

78 Van Veen (2019). 79 Coglianese & Lehr (2017). 80 Special Rapporteur on extreme poverty and human rights (2019). 81 Special Rapporteur on extreme poverty and human rights (2019).
• The imposition of social media taxes: Driven by efforts to restrict social media discourse and strengthen national tax bases, various countries have opted to adopt taxes on the use of social media. This has sparked concerns over the potential of these taxes to limit access to the internet and to information for civil society and poorer strata of populations, and to exacerbate the digital divide. The effects of social media taxation on access inequalities also carry potential constraints on the exercising of the right to peaceful assembly and association, by limiting the free flow of information and communication between members of civil society and by placing de facto restrictions on both offline and online mobilisation.87

• Splintering of the Internet architecture along national and judicial boundaries: Efforts to restrict data flows and impose controls on digital services within national boundaries have contributed to the cumulative effect of the so-called 'splinternet'. 'Splinternet' connotes the fragmentation of digital systems and services along national, judicial or economic boundaries, through the development of distinct Internet architectures or the use of firewalls; it has the potential to undermine the right to peaceful assembly and association.88 Such fragmentation can restrict the flow of information on global events, or of international views on local events, which could, in turn, undermine the ability to form or hold an opinion and express it through peaceful assembly and association.

Experts interviewed in the context of this study highlighted that while tensions between HURDA and the exploitation of data, digital technologies and services as a governance commodity can constrain the scope and scale of relevant safeguards, interventions to foster or promote HURDA may also have unintended consequences producing risks for other human rights. This is particularly noted in relation to content moderation, spanning technical and regulatory interventions to monitor, detect and remove false, misleading or harmful online content.89 A lack of transparency in content moderation measures, reliance on self-regulation in the private sector, as well as a reliance on automated content moderation, may exacerbate the risks that content moderation poses to rights such as freedom of expression online.90 This is due to the potential for content moderation practices to systematically produce false positives and moderate legitimate online content, e.g. through systematic biases reproduced by algorithmic content moderation models trained on biased data.91

82 Boudreaux et al. (2020). 83 RAND Europe interview (February 2021, ID03). 84 Issa & Jha (2020). 85 Svantesson (2020). 86 Shahbaz et al. (2020).
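To make the false-positive risk noted above concrete, the following minimal Python sketch (using scikit-learn, with a deliberately tiny and entirely invented training set) shows how an automated moderation model trained on skewed examples can learn an identity term itself, rather than abusive language, as the signal for 'toxic' content, and consequently flag a legitimate post by a member of that group. The term 'group_x' and all posts are hypothetical placeholders.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Invented training data: the identity term 'group_x' only ever appears in
# posts labelled toxic, so the model associates the term itself with toxicity.
train_posts = [
    "group_x are awful",               # toxic
    "i hate group_x",                  # toxic
    "group_x should go away",          # toxic
    "group_x ruin everything",         # toxic
    "what a lovely day",               # acceptable
    "great match last night",          # acceptable
    "looking forward to the weekend",  # acceptable
]
train_labels = ["toxic"] * 4 + ["acceptable"] * 3

model = make_pipeline(CountVectorizer(), MultinomialNB())
model.fit(train_posts, train_labels)

# A legitimate post by a member of the group is misclassified as toxic:
# a false positive driven entirely by the biased training data.
test_post = "proud member of the group_x community"
print(model.predict([test_post])[0])  # -> 'toxic'
```

Real moderation systems are trained on far larger corpora, but the underlying dynamic is the same: if the training data over-represents a term, dialect or community in the harmful class, legitimate speech using it is disproportionately likely to be removed.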
2.2.4. Exploitation of the information environment for mass surveillance and authoritarian practices

As discussed in previous sections, data and digital services and technologies have generated a number of opportunities as well as risks to HURDA through their exploitation as governance commodities. While such exploitation assumes a deliberate use of data and digital technologies and services for liberal, democratic governance practices and processes, the same technologies and services may also be exploited for nefarious activities and purposes such as mass surveillance and authoritarian practices. As a result, normative, legal and regulatory safeguards for HURDA have been purposely weakened in the context of what has been labelled digital authoritarianism. This is a phenomenon encapsulating a range of trends and activities relating to the adoption of digitally-enabled authoritarian practices, including provisions to limit human rights and fundamental freedoms within one country, as well as the use of adversarial tools and tactics to undermine third countries and institutions perceived as challenging or opposing them.92 Several threats and trends can be identified that have led to an erosion of HURDA or have restricted the safeguards established to uphold them:

• The weakening or criminalising of encryption or privacy-enhancing technologies: Concerns over the exploitation of encryption or privacy-enhancing technologies for illicit or criminal purposes, such as mis- and disinformation, have in some instances incentivised their weakening or prohibition by state authorities. Some states have implemented or proposed measures to purposely weaken encryption and privacy-enhancing tools, including through the use of so-called backdoors. This entails the deliberate embedding of vulnerabilities and weaknesses in digital technologies and products to facilitate the monitoring of their use and the circumventing of encryption and other privacy-enhancing measures. While from a legal standpoint these may be justified for use by law enforcement, intelligence agencies and the wider national security apparatus under specific circumstances with adequate oversight, their misuse and leveraging for mass and targeted surveillance or prosecution may lead to a significant erosion of human rights safeguards and practice.

• Increased capabilities for mass surveillance: ICTs have enhanced the effectiveness and the capacity of public, private and individual actors to conduct mass surveillance, as well as to intercept, collect and share data. ICTs have significantly expanded the scale and potential duration of mass surveillance and authoritarian practices, whilst reducing the costs associated with them. For example, the ubiquity of IoT and IoX devices provides new means for eavesdropping and collecting information on behavioural patterns and habits, typically without user awareness. Similarly, the availability of user data combined with advances in data analytics enables the detection of unusual behaviours (e.g. changes in search history or movement location) for enhanced scrutiny of individuals' behaviour and preferences.

87 Special Rapporteur on the Rights to Freedom of Peaceful Assembly and of Association (2019). 88 Tkacheva et al. (2013). 89 RAND Europe interview (February 2021, ID01, ID04). 90 Ofcom (2019); RAND Europe interview (February 2021, ID04). 91 Ofcom (2019).
• Measures for monitoring, filtering, and blocking the dissemination of information online or for removing online content: Certain security risks are exacerbated by, or born out of, the dissemination of online content (e.g. online harms, disinformation, and the promotion of terrorism and extremist ideologies). This has prompted governments to adopt and implement measures for monitoring, filtering and blocking the dissemination of harmful content online or for removing harmful online content. Proportionate and necessary measures can contribute to fostering the development of a safe cyberspace and, in doing so, the promotion of human rights. By contrast, disproportionate and abusive measures for online content removal can seriously harm free speech and freedom of expression, for example by incentivising proactive or automatic content removal and facilitating government control of information.

• Measures for restricting or blocking access to the Internet and online information: In recent years, several governments have on occasion restricted or altogether blocked access to the Internet through a range of measures, including Distributed Denial-of-Service (DDoS) attacks and country-wide internet shutdowns, as well as by imposing costs on Internet access or by limiting access at specific times and places. In particular, the number of intentional disruptions of internet-based services to exert control over the flow of information (also known as 'internet shutdowns') has increased significantly over the last two decades. Such practices have implications for the exercising of the right to freedom of opinion and expression in the digital age, as well as for wider rights concerning freedom of association and peaceful assembly, and participation in public and political life.

• Collection of biometric data: The implementation of systems to collect biometric data (e.g. facial recognition systems, voice recognition technology and other forms of uniquely identifiable data) for security, political, sanitary and counter-terrorism purposes may erode the right to privacy, as this type of uniquely identifiable data and its collection and use may be inadequately safeguarded and protected in a number of countries.93

92 Yayboke & Brannen (2020).

2.2.5. Exploitation of the information environment for illicit or criminal purposes

Democratising access to information through digital channels has provided many opportunities for contemporary societies by facilitating popular participation in political processes and enabling closer information exchange between democratically elected elites and their electorates. It has, however, also introduced and exposed various national security threats and vulnerabilities. These include, for example, the exploitation of digital platforms for malicious purposes and the so-called weaponisation of the information environment. Relevant trends and phenomena include:

• Mis- and disinformation: Wingfield (2019) defines misinformation as the 'inadvertent or unintentional spread of false or inaccurate information without malicious intent', while disinformation is described as entailing the spreading of false or inaccurate information 'designed, presented and promoted to intentionally cause public harm or for profit'.94 Digital technologies and services have had several impacts on mis- and disinformation by facilitating access to and use of tools and platforms, such as social media. This has expanded opportunities for malign actors to spread false or misleading information at scale and to target vulnerable and at-risk groups and individuals. Further, and beyond providing access to wider and new audiences, digital technologies have also created opportunities for actors to create novel tools for more sophisticated mis- and disinformation threats and for spreading false or misleading information more rapidly.95

• Propaganda and recruitment: Digital spaces, including social media platforms, have been extensively leveraged by state and non-state actors for the purposes of spreading propaganda. This has been done for various reasons, including facilitating recruitment.
Existing research has highlighted the extensive use of sophisticated manipulation techniques on social media to spread propaganda, with implications for the integrity of democratic processes (including elections), as well as for public and political participation in democratic processes.96 Similarly, non-state actors such as terrorist organisations and violent extremist groups have leveraged social media platforms to propagate their narratives, enhance communication with existing followers, and conduct outreach to potential new ones, thereby strengthening recruitment.97

• Targeted attacks (e.g. trolling, cyberattacks) against human rights defenders and political dissidents: The evolving HURDA landscape features a multitude of deliberate information threats targeted against civil society actors, such as human rights defenders and political dissidents. Such threats include targeted acts of mis- and disinformation, as well as cyber harassment and hate speech. Existing research, however, highlights a wider range of digital tools and tactics employed by state actors to target civil society actors. These include the hacking of individual or organisational devices, networks, and platforms (including social media profiles), as well as the causation of information leaks that seek to undermine the work of specific individuals or groups or, more broadly, undermine democratic processes and institutions.98

Further to these challenges, digital spaces have also been used to propagate a range of targeted online harms. As noted in existing literature, definitional and conceptual ambiguities have made it difficult to comprehensively assess the extent of the problem of harmful online content. There is, however, increasing consensus that the growing magnitude and complexity of harmful online content are facilitated by the digitalisation and democratisation of access to online platforms and services.99 Targeted online harms include a range of threats to individuals online, including:

• Online hate speech: While hate speech is a definitionally contested concept, it can be understood as encapsulating all forms of expression that disseminate, incite, promote or justify acts of intolerance or aggression against a group of people on the basis of their national, ethnic or other forms of identity.100 As such, hate speech is defined by the content, tone, nature, targets and implications of speech acts.101

• Cyber harassment and cyberbullying: This may be defined as involving threats of violence and death, as well as other nefarious practices such as invasions of privacy, the dissemination of personal information, and the diffusion of false information causing reputational damage.102 While definitions of cyber harassment correspond with those of harassment outside the digital space, digital content may exacerbate the damage suffered by victims of cyber harassment due to the extended lifetime of information posted online, as well as the typically greater visibility and accessibility of such information.103

96 See, for example, Helmus et al. (2018). 97 Cox et al. (2018). 98 Maschmeyer et al. (2021). 99 Titley et al. (2014). 100 Baldauf et al. (2019); Titley et al. (2014). 101 Titley et al. (2014). 102 Citron (2016).
• Extremist and terrorist content: Extremist or terrorist narratives can be considered a form of harmful online content due to the potential harms that such narratives can cause by promoting messages propagating intolerance, hostility or discrimination against selected groups or communities. Extremist content may also pose direct threats to human life by legitimising or inciting extremist or terrorist attacks. It remains, however, challenging to define 'extremist content' without risking infringement on legitimate speech.104

• Child sexual exploitation and abuse: The UK Online Harms White Paper conceptualises child sexual exploitation and abuse as an online harm propagated through the use of online platforms to 'view and share Child Sexual Abuse Material, groom children online, and live stream the sexual abuse of children'.105

2.2.6. Further political and socio-cultural trends shaping the HURDA landscape

The landscape of normative, legal and regulatory HURDA safeguards has been shaped from the top down through the actions of public and private sector actors, and from the bottom up by a wide range of trends and phenomena. A range of political and socio-cultural trends have shaped the HURDA landscape by defining public attitudes and actions towards digital technologies and services and their use by private and public sector actors. Some of these trends include:

• The rise of digitally transparent societies and changes in social and cultural attitudes towards human rights: Social and cultural attitudes towards privacy are changing.106 People have become more willing to place information into public domains, by sharing content on their social media or live streaming parts of their lives. At the same time, the distinction between public and private spheres is becoming more blurred.107 The increasing use of social networking sites for business, as well as remote working and employee monitoring, are contributing to blurring the boundaries between social and work identities. The result is a cultural shift towards more transparent societies.108

• Limited societal understanding of threats and challenges for HURDA: While public debate concerning issues such as privacy and data protection has become increasingly saturated, existing research highlights an ongoing challenge posed by limited societal understanding of the scope of threats and challenges for HURDA. In part, this reflects the wider challenge of capturing the full scope and scale of relevant trends, even for public and private sector actors.

• Declining social trust, public uncertainty and disengagement: As highlighted by existing research, online harms and phenomena such as mis- and disinformation have evolved in a wider socio-cultural context of declining social trust, increasing public uncertainty and disengagement from political processes and public participation.109 In conjunction with an overall decline in the importance of fact and data in political discourse, these trends have worked to erode civil discourse and the key institutions that provide HURDA-related safeguards to civil society.110 The associated deterioration of public debate, characterised by disagreements over objective facts and a blurring of the lines between opinion and fact, also increases the vulnerability of communities to malign uses of digital spaces for targeted online harms and other illicit or criminal purposes.

103 Citron (2016). 104 Wingfield (2019b). 105 HM Government (2019). 106 GOS (2013). 107 GOS (2013). 108 Flyverbom (2016).
• Worsening digital divide and inequality: The interaction of globalisation and digitalisation is producing significant opportunities for economies globally. This interaction, however, is also linked to a worsening global digital divide (i.e. a gap between countries characterised by high connectivity and penetration of digital technologies and those that are not) and its potential implications for global inequality.111 Forecasts of further increases in data availability and associated growth in the digital economy note that such trends are likely to drive an increasing concentration of digital market value and influence in the hands of a select few actors, including states and others in the private sector.112

These trends highlight the need to consider the impact of digitalisation in a wider political and socio-cultural context, beyond the isolated impacts of technology development and adoption. As states continue to navigate the risks and opportunities of digitalisation for economic, governance, security, and other gains in ways that are consistent with existing human rights safeguards, it is thus important to consider how the impact of individual interventions may interact with this evolving landscape of HURDA-related trends and challenges. The next chapter (Chapter 3) builds on this overview by elaborating on a selection of intervention approaches that can be leveraged in this context.

109 Kavanagh & Rich (2018). 110 Kavanagh & Rich (2018). 111 UNCTAD (2019). 112 UNCTAD (2019).

3. Capacity-building approaches for HURDA

3.1. Introduction

This chapter presents an overview of selected approaches for delivering HURDA-related capacity-building interventions. The intervention approaches discussed in this chapter were shortlisted through internal analysis of the data gathered through the initial literature review, desk-based research to map interventions, expert interviews, and consultations with the FCDO. The shortlisting of interventions was based on:

• Expected impact: Potential impact on human rights and fundamental freedoms in the digital age with a view to protecting and promoting the UK's core values in cyberspace, including democracy, the rule of law, privacy and human rights.

• Relevance to the FCDO CSSF programme: The extent to which an intervention approach is likely to be of relevance and palatable to the FCDO's CSSF programme.

Figure 3.1 provides an overview of the intervention approaches discussed in subsequent sections of this chapter.

3.2. Governance and regulation
3.2.1. Dialogue and consultative platforms and initiatives

Fostering HURDA is, at its core, a multi-stakeholder process that necessitates multi-stakeholder engagement and dialogue, particularly in relation to the shaping of the policy, regulatory, and normative environment. To facilitate this engagement and dialogue, implementers can establish, fund or otherwise support interventions in the form of platforms (intended here as collaborative and/or dialogue mechanisms), multi-stakeholder fora, bespoke or recurring conferences, seminars and other types of events. Such initiatives can be carried out at the local, national, regional and international level. Interventions in this category could be used to facilitate interaction between different stakeholder groups (e.g. social media platforms and wider private sector actors, government stakeholders, and civil society actors), as well as to foster the participation of particular stakeholder groups in legislative, regulatory, normative, and policy-related processes. This could help strengthen the legitimacy and likelihood of successful implementation of governance and regulatory frameworks by ensuring early buy-in from relevant stakeholder groups. Mechanisms in this category could also facilitate the active exchange of knowledge and best practices across different stakeholder groups.

Figure 3.1: Capacity-building intervention approaches relevant to HURDA-focused interventions
• Governance and regulation: dialogue and consultative platforms and initiatives; knowledge generation and consolidation; lobbying and watchdog activities; strategic litigation and legal campaigns.
• Technical interventions: enabling software for civil society and general population; enabling software for governmental and private sector actors; databases and repositories.
• Education: training for senior decision-makers and institutional actors; training for civil society actors; education and lifelong learning; hackathons and competitions; awareness raising campaigns.
• Strategic communications: strategic communications campaigns.

Experts consulted and interviewed in the context of this study emphasised the existence of a demand and appetite for dialogue and consultative platforms, stemming from perceived opportunities associated with such initiatives, as well as from what were perceived as relatively low barriers to implementation.113 Interventions such as discussion forums, conferences and events appear to be generally considered palatable from the perspective of funding organisations.114 The multiplication of virtual platforms for online participation and collaboration, and increased teleworking in the context of the COVID-19 pandemic, have also facilitated the implementation of these initiatives and reduced the costs associated with organising large-scale events.115 Experts consulted, however, noted that the proliferation of these initiatives and the saturation of the policy space with a wide range of dialogue and exchange initiatives pose a risk of diluting their impact and sustainability.

113 RAND Europe interview (February 2021, ID02, ID04, ID09, ID10); RAND Europe workshop (March 2021).
In other words, the creation of a significant number of new platforms, panels or workshops runs the risk of causing stakeholder fatigue.116 As such, experts consulted in the study highlighted the need to focus on strengthening existing dialogue and consultative platforms rather than duplicating efforts through the creation of new initiatives.117 Further, the creation of new platforms and initiatives to cover an increased breadth of issues risks perpetuating thematic silos. This can implicitly limit the potential effectiveness and impact of these programmes by constraining the ability of existing networks of expertise that touch on different HURDA themes to engage with them. Focusing on deeper-dive initiatives addressing fewer but more specific issues may be beneficial for enhancing the overall impact of initiatives. Leveraging the personal networks of target beneficiaries and strategic communications may also be used to increase visibility and raise awareness of the outputs of relevant initiatives.118

The underrepresentation of marginalised and local actors in dialogue and consultative platforms and initiatives represents a significant limit for interventions in this space. Whilst the goal of these interventions is to promote multi-stakeholder engagement, dialogue, knowledge-sharing and the exchange of perspectives, marginalised populations and local actors often remain inadequately represented in these spaces. As such, there may be merit in supporting multi-stakeholder approaches in combination with increasing the representation and agency of marginalised voices and local actors, in order to enhance sustainability and impact through ensuring buy-in at the implementation stage of these initiatives.119

3.2.2. Knowledge generation and consolidation

HURDA initiatives may focus on generating and consolidating knowledge, particularly on issues concerning how human rights apply and have evolved or are evolving in the digital age. This type of intervention can take the form of research projects and evaluation activities aimed at strengthening the evidence base on human rights and fundamental freedoms in the digital age, and at understanding and identifying evidence-based practices and approaches for fostering these. Activities in this category of interventions may be designed to provide greater clarity to relevant stakeholders on emerging areas of interest, such as new technologies or wider socio-economic developments affecting the digital domain. These interventions could inform the development of more robust and evidence-based measures and approaches to fostering and protecting human rights and fundamental freedoms in the digital age.

Knowledge generation and consolidation can help provide opportunities for advancing knowledge and fostering an enabling environment for HURDA, as well as for raising awareness of human rights and fundamental freedoms and their implications in the digital age. Specifically, interventions can enable the development of necessary foundational knowledge of HURDA-related issues.120 They can also contribute to strengthening the evidence base concerning best practices to foster HURDA in beneficiary countries through different initiatives and interventions.121

114 RAND Europe workshop (March 2021). 115 RAND Europe workshop (March 2021). 116 RAND Europe workshop (March 2021). 117 RAND Europe workshop (March 2021). 118 RAND Europe workshop (March 2021). 119 RAND Europe workshop (March 2021).
Consultations with experts highlighted that existing interventions should be comprehensively evaluated in order to promote evidence-based approaches to HURDA. While some elements of the HURDA landscape, such as freedom of expression, are more comprehensively covered by existing research, other issues have received less attention, resulting in gaps in the evidence base.122 Experts interviewed highlighted a need to target research towards key issues of interest in specific local contexts or towards key gaps in the evidence base. Such gaps include, but are not limited to, the impact of surveillance technologies, content moderation approaches, and the changing use of the Internet.123 Experts also noted that, as researchers often lack access to relevant and up-to-date data (e.g. from private sector actors on content moderation practices), wider data access and greater transparency from social media and technology platforms were identified as key enablers for the generation of evidence-based research on a number of existing gaps.124

From the UK perspective, the continuous influence of the UK legislative and academic systems and ideas over partner countries, and especially Commonwealth countries, may serve as an important enabler in this context. This influence also points to opportunities for the UK to provide thought-leadership on HURDA-related issues, leveraging the FCDO's experience in assisting partner countries in the redrafting of legislation on HURDA-related issues.125 This further adds to the relatively high palatability to the FCDO as a funding organisation and the low financial requirements for design and implementation that were highlighted by experts consulted.126

Whilst knowledge generation and consolidation activities may present opportunities to strengthen HURDA-related safeguards in partner countries, the scale of impact of these activities may be limited by a relative lack of local contextualisation and by skills gaps. As such, experts consulted for the study highlighted the need to ensure the relevance and contextualisation of knowledge produced to recipient countries in order to maximise impact.127 They also highlighted the need to consider potential competence and skills barriers in partner countries that may constrain meaningful engagement with, and adoption of, knowledge and ideas generated during interventions.128

120 RAND Europe workshop (March 2021). 121 RAND Europe workshop (March 2021). 122 RAND Europe interview (February 2021, ID09). 123 RAND Europe workshop (March 2021). 124 RAND Europe interview (February 2021, ID01). 125 RAND Europe interview (February 2021, ID11). 126 RAND Europe workshop (March 2021).

3.2.3. Lobbying and watchdog activities

Lobbying and watchdog measures enable a degree of observation, scrutiny and accountability to be exercised over an array of different stakeholder groups that interact with and influence developments within the HURDA ecosystem. Watchdog activities are designed to verify that actors who play a role in implementing legislation, technical initiatives and other relevant interventions do not act unethically or illegally in a manner that undermines HURDA. They can also be designed to ensure that any programmes designed to protect and promote HURDA meet their specified commitments. In addition, lobbying efforts may be used to leverage influence from human rights defenders and other civil society stakeholders in an attempt to integrate a greater inclusion of principles and safeguards for HURDA into the activities of relevant policy-makers and institutional actors.
Interventions under this category can be beneficial in establishing an avenue for the general public and CSOs to positively shape legislation and other initiatives in a way that protects and promotes HURDA. At the same time, initiatives of this type may help generate relatively swift changes in the HURDA environment, helping to counteract violations or initiatives that negatively affect the HURDA landscape in a given context. Indeed, watchdog and lobbying activities can help gather support for the actors involved in these activities from other sections of society, for instance through publicly disseminated reports, videos or news articles that may help increase the incentives for targeted stakeholders to incorporate greater HURDA considerations into their activities.

3.2.4. Strategic litigation and legal campaigns

Strategic litigation and legal campaigns offer an opportunity for civil society actors to facilitate change in legislation or regulations towards a greater inclusion of safeguards for HURDA. Initiatives that fall under this category, particularly strategic litigation campaigns, can therefore generate long-term positive changes in the HURDA landscape by focusing on creating legal precedents that may affect important societal norms, regulations or cultural traits that contribute towards an undermining of HURDA. Broader legal initiatives may also be beneficial for protecting at-risk groups in society that may suffer from HURDA violations stemming from a lack of legal safeguards or from flawed existing legal provisions.

3.3. Technical interventions

3.3.1. Enabling software for civil society actors and general population

Civil society actors, including journalists, human rights defenders and other grassroots organisations, as well as the general public, should have access to adequate software resources and tools to foster, exercise, and safeguard human rights and fundamental freedoms in the digital age. A range of technical tools and instruments may be required to ensure adequate protection when leveraging ICTs, including software to empower different stakeholder groups to contribute to debates on, and practices of, human rights and fundamental freedoms. Implementers may therefore orient interventions towards the development and uptake of enabling software solutions targeting different stakeholder groups and actors from among civil society and the general public. This could include detection software that can flag certain types of malicious content, such as 'deepfakes', browser extensions or plug-ins, and tools to enable human rights defenders to conduct digital activism more securely (e.g. online collaborative platforms, virtual private networks (VPNs), etc.).

Interventions aimed at developing and facilitating the uptake of software may help bolster cyber resilience and cyber hygiene among civil society actors. Equally, access to and use of software solutions can improve the cyber hygiene and resilience of the general public, enabling a greater understanding of, and protection from, threats and processes that may undermine their digital rights.

127 RAND Europe workshop (March 2021). 128 RAND Europe workshop (March 2021).
Experts consulted noted that enabling software for civil society actors and the general population is often perceived as a double-edged sword for promoting HURDA.129 On the one hand, when deployed hand in hand with educational interventions such as digital literacy programmes, enabling software can equip civil society with the digital tools for exercising HURDA or strengthening their safeguards from the bottom up.130 For example, enabling software can provide end users with novel tools for enhancing privacy, countering mis- and disinformation, and exercising their right to freedom of expression and association. On the other hand, technical interventions can be adopted as quick fixes to complex challenges within the HURDA problem space. As such, technical interventions can fail to address the underlying roots of threats to HURDA. Additional challenges for technical interventions include:

• Gaps in local skills, competences and infrastructure: Relatively low levels of digital literacy and a lack of technological understanding among civil society actors and the general population may hinder the uptake of software solutions.131 Similarly, the conditionality of these interventions on reliable Internet access and bandwidth may limit their impact, particularly in the face of limits to connectivity and restrictions on Internet access (e.g. due to internet shutdowns).132

• Mis- and distrust in new and emerging technologies and lack of sustainability: Mis- or distrust in emerging technologies due to political beliefs, low levels of digital literacy and increasing cybersecurity risks may constrain the adoption of technical interventions by different stakeholder groups.133 Similarly, the pace of technological change and a potential 'technological hyperdrive' may increase the risk of rapid obsolescence of technical solutions and reduce the sustainability of impact.134

• Unintended consequences for HURDA: Experts consulted noted the existence of risks associated with unintended consequences and risks for HURDA stemming from software solutions, due for example to poor security provisions in the software developed or in the solutions and services underpinning their use.135

Expert consultations highlighted a number of principles that can increase the impact of enabling software for civil society and the general population. Technical solutions could be fostered in tandem with educational measures for improving digital literacy and cyber hygiene, thus informing end users of the risks tied to these tools and ultimately lowering barriers related to skills and competences.136 Cross-government collaboration and multi-stakeholder engagement could support the integration of technical interventions with other initiatives, as well as encourage the incorporation of HURDA standards in technology design.137 To facilitate the uptake of relevant tools, sponsors may also prioritise funding for software targeting individuals with limited digital skills and competences (e.g. user-friendly encryption tools) and provide guidance, advice and tips for end-users on important considerations to factor in when adopting software to address HURDA-related challenges.138

129 Daly (2014). 130 RAND Europe interview (February 2021, ID14). 131 RAND Europe workshop (March 2021). 132 Internet Society (2019); RAND Europe workshop (March 2021). 133 Nishinaga & Natour (2019); RAND Europe workshop (March 2021). 134 RAND Europe workshop (March 2021). 135 RAND Europe workshop (March 2021). 136 RAND Europe workshop (March 2021). 137 RAND Europe interview (February 2021, ID05). 138 RAND Europe workshop (March 2021).
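As an illustration of the kind of user-friendly, privacy-enhancing tooling referred to above, the following minimal Python sketch uses the third-party cryptography package (assumed to be installed) to encrypt a sensitive note before it is stored or transmitted. It is a sketch of the basic mechanism only: a real tool aimed at human rights defenders would also require careful key management, usability testing and threat modelling.

```python
from cryptography.fernet import Fernet

# Generate a symmetric key once and keep it safe; in practice, secure key
# storage and sharing is the hardest part of any such tool.
key = Fernet.generate_key()
cipher = Fernet(key)

# A hypothetical sensitive note to protect before saving it on a shared
# device or sending it over an untrusted channel.
note = "Meeting with source at 14:00, location B".encode("utf-8")

token = cipher.encrypt(note)       # ciphertext that is safe to store or send
recovered = cipher.decrypt(token)  # only holders of the key can recover it

print(token)
print(recovered.decode("utf-8"))
```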
3.3.2. Enabling software for governmental and private sector actors

Providing technical tools and software for governmental and private sector actors may facilitate a top-down fostering and protection of human rights and fundamental freedoms in the digital age. Software solutions designed to support government and private sector stakeholders can focus on managing technical issues, as well as content. For example, software solutions could be designed to assist law enforcement agencies in monitoring, detecting or safeguarding against malicious activities in cyberspace. Enabling software in this category can vary in complexity, scope, and objectives. It may include complex solutions that leverage new and emerging technologies (e.g. Artificial Intelligence (AI) and ML) to autonomously conduct, at greater scale, activities previously considered labour intensive (e.g. content moderation and takedown). Enabling software for governmental and private sector actors may be designed to contribute to a structural strengthening, monitoring, and moderation of digital spaces against exploitative or harmful activities. This may, in turn, entail benefits for broader communities, including civil society actors and the general public.

Enabling software for governmental and private sector actors offers opportunities for stakeholders to adopt a range of HURDA-supporting practices to slow the spread of misinformation and respond to hateful extremism online. For example, verification and fact-checking software used by social media platforms may help prevent the further dissemination of false information.139 Takedowns of harmful online content by private sector actors can also prevent exposure to hateful extremist content (a minimal sketch of the hash-matching mechanism that often underpins such takedowns is provided at the end of this subsection).140 Interventions of this type may, for example, include databases containing instances of hate crime or manipulated images, or interactive online maps that provide overviews of certain practices that may impact human rights.

Enabling software for governmental and private sector actors shares many barriers, challenges and limitations pertaining to its impact and implementation with technical interventions designed to develop software for civil society and the general population. These include skills and competence gaps among stakeholders, and low palatability for funders due to reputational risks and the lack of a human rights framework in technology design (see Section 3.3.1).141 The risk of enabling software being exploited for purposes harmful to HURDA, or producing unintended consequences, is particularly salient in relation to software provided to government actors and civil society actors. As such, funding organisations may opt to encourage the uptake of tools built externally rather than directly sponsoring the development of specific technologies.142

139 Amazeen et al. (2018). 140 CCDH & Restless Development (2020); Polyakova & Fried (2019). 141 RAND Europe workshop (March 2021). 142 RAND Europe workshop (March 2021).
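As a simple illustration of the takedown tooling discussed above, the following Python sketch shows how a platform can screen uploads against a shared list of previously identified harmful material by comparing content hashes. Deployed systems typically rely on perceptual hashes so that slightly altered copies still match; exact SHA-256 hashes are used here, with invented placeholder content, only to keep the sketch self-contained and runnable.

```python
import hashlib

def sha256_digest(data: bytes) -> str:
    """Return the SHA-256 hex digest of a byte string."""
    return hashlib.sha256(data).hexdigest()

# Hypothetical shared hash list of previously identified harmful files,
# of the kind maintained collaboratively by platforms and trusted bodies.
known_harmful_hashes = {
    sha256_digest(b"previously identified harmful file #1"),
    sha256_digest(b"previously identified harmful file #2"),
}

def should_block(upload: bytes) -> bool:
    """Flag an upload that exactly matches known harmful material."""
    return sha256_digest(upload) in known_harmful_hashes

print(should_block(b"previously identified harmful file #1"))  # True
print(should_block(b"an unrelated, legitimate upload"))        # False
```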
3.3.3. Databases and repositories

Databases and online repositories may provide an additional means of fostering HURDA. These technical measures can be seen as providing access to 'one-stop shops' for users to access and analyse information or raw data concerning particular phenomena. Measures that fall under this category may increase the accessibility of data that human rights defenders, policymakers, researchers, and other activists could use to improve their understanding and activities in the context of HURDA. These repositories may also reduce the barriers associated with obtaining information through traditional data-gathering means, which can often be time-consuming, technically complicated and financially burdensome.

Databases and repositories present opportunities for filling gaps in knowledge or evidence and, as such, could have an important enabling impact for promoting HURDA.143 As technical enablers, databases and repositories could provide the evidence base required to design more effective interventions or to raise awareness concerning the trends and challenges to HURDA. If data are shared through databases and repositories, this may also lead to less duplication in initiatives and thereby free up resources to be concentrated in other pressing areas of research.144

Databases and repositories, however, may face barriers and challenges to implementation. In particular, there are persistent difficulties in accessing raw data (especially data on content moderation and removal) from social media platforms for building repositories and conducting research.145 As such, the development of effective database and repository resources may be combined with support to efforts by academia or civil society stakeholders to increase data transparency through regulation, codes of ethics and other mechanisms.146 Databases may also lack context-specific resources such as language or image/vision resources, which may increase the risk of bias in inputted data and limit their relevance and feasibility of use for designing interventions in a third beneficiary country.147

143 RAND Europe workshop (March 2021). 144 RAND Europe workshop (March 2021).

3.4. Education

3.4.1. Training for senior decision-makers and institutional actors

This intervention approach concerns activities entailing a time-bound and active transfer of knowledge, skills and competences to stakeholders within government and public institutions, including senior policy and decision-makers, members of law enforcement agencies, the judiciary and prosecution services. Interventions within this category may include programmes such as training sessions, modules or workshops designed to develop skills and competences on different trends, challenges and issues related to HURDA. Intervention approaches within this cluster can provide beneficiaries with an enhanced understanding and awareness of issues concerning human rights and fundamental freedoms in the digital age, which may in turn lead to a greater integration of relevant provisions and perspectives on these issues into their work and activities. This can, for example, help complement CSO activities to lobby for direct policy and legislative changes towards human rights-centric digital and technology policies. As such, this category of interventions can support the formulation and development of both new safeguards and an enabling environment for human rights and fundamental freedoms in the digital age, as well as a more effective implementation and protection of existing measures and provisions.
Experts consulted highlighted that training for senior decision-makers and institutional actors provides opportunities to generate a high level of impact for protecting and promoting HURDA. Notably, due to the degree of institutional power these stakeholders possess, training programmes for this stakeholder group could help integrate rights-based considerations into legislative and regulatory frameworks while also facilitating organisational change to enhance the protection of HURDA.148

A key consideration for implementers of this intervention approach is to adopt different approaches to training programmes in order to tailor individual interventions. Instead of approaching the provision of training for this stakeholder group in a standardised manner, experts consulted in the study recommended offering a combination of baseline training and more targeted initiatives for selected beneficiaries. Whereas the former may involve training focused on establishing a foundational level of considerations for HURDA, the latter could focus on more in-depth training on specific issues within the HURDA problem space.

Expert interviews highlighted that a lack of appetite from senior decision-makers and institutional actors can, however, represent a barrier for training-focused interventions. In a context in which incentives for participation may be limited, securing engagement from relevant stakeholder groups can be a significant challenge for implementers.149 Implementers should therefore seek to institutionalise or formalise incentives for policy-makers and institutional actors to engage in relevant training, such as through formal government departmental policies. Similarly, the impact of training for larger stakeholder groups may be limited beyond immediate awareness-raising, with little impact on the longer-term behaviour of decision-makers and institutional actors. Given the potential risk of stakeholder fatigue, it may be beneficial to prioritise training for selected individuals who could then act as 'HURDA champions' within key stakeholder groups, raising awareness and acting as influencers on HURDA-related issues.150

145 RAND Europe interview (February 2021, ID01, ID04). 146 RAND Europe interview (February 2021, ID01, ID04, ID09). 147 RAND Europe workshop (March 2021). 148 RAND Europe workshop (March 2021).

3.4.2. Training for civil society actors

This intervention approach concerns activities entailing a time-bound and active transfer of knowledge, skills, and competences to stakeholders and actors within civil society, including human rights defenders, journalists, lawyers and other CSOs. Interventions within this category may include programmes such as training sessions, modules or workshops designed to provide participants with skills and competences on specific aspects of HURDA and on ways to foster, protect and exercise them. While such training interventions may typically focus on tactical improvements (e.g. training for journalists on detecting sources of disinformation), they may contribute to and support the achievement of longer-term, higher-level outcomes (e.g. limiting the spread of disinformation and enhancing the resilience of the national information environment).
Experts consulted during study activities highlighted that training for civil society actors can help to empower stakeholders in this category and foster widespread engagement across all thematic areas of HURDA.151 This includes engagement from various groups of civil society actors that may traditionally not be at the centre of HURDA-related interventions. For example, journalists were one particular stakeholder group highlighted by experts interviewed as often suffering from inadequate training provision.152 Interventions may also contribute to broadening the pool of practitioners and actors with relevant technical skills and know-how, empowering a larger number of organisations, activists, lawyers and other relevant civil society actors with the competences to influence, protect and exercise HURDA. However, to enhance the impact of any measures implemented, a strong understanding of the local skills gaps and needs of civil society actors is required to foster local ownership and increase buy-in from beneficiaries.153 This could be achieved by conducting adequate training-needs analysis and including train-the-trainer components to increase the sustainability of the impact achieved.154

149 RAND Europe interview (February 2021, ID6). 150 RAND Europe interview (February 2021, ID12). 151 RAND Europe workshop (March 2021). 152 RAND Europe interview (February 2021, ID8). 153 RAND Europe workshop (March 2021).

A specific gap in current training interventions for civil society actors concerns capacity to identify and counter visual forms of online misinformation. The sophistication, quantity and speed at which manipulative visual content is shared digitally, especially through new formats such as deepfakes, has resulted in journalists and media outlets struggling to use traditional manual detection tools for combatting these threats.155 At the same time, such actors have reportedly received insufficient training on how to effectively use emerging technical tools designed to counteract these complex sources of misinformation.156 In this sense, the implementation of training courses may benefit from a combination with technical interventions in order to enable civil society actors to exploit the potential benefits that technical measures could offer in this environment.

Finally, the adoption of a strategy focused on engagement with CSOs, including through training programmes, is a measure that may allow sponsors to address limitations in current training practices. This would entail establishing a clear vision and objectives for promoting and protecting the civic space dimension of HURDA, which could include mapping an overall picture of existing training requirements to identify gaps that need addressing.157 Indeed, such strategies may provide the additional benefit of integrating medium- to long-term considerations that could enhance the future-proofing aspects of new training measures that may be implemented.158

3.4.3. Education and lifelong learning

Intervention approaches that fall under this category concern longer-term education programmes that aim, through formal education and lifelong learning, to foster and structurally enhance the availability of relevant knowledge, skills and competences on HURDA and associated challenges, threats and opportunities. Unlike training initiatives, these interventions are usually broader in scope and audience, focusing on developing a wider set of skills and competences.
Such interventions may include the development of curricula at the higher education level, but also early childhood interventions or mid-career education interventions. These interventions can be beneficial in facilitating long-term positive outcomes for the availability of relevant skills and competences to foster and protect HURDA.

Experts consulted during the study noted that educational measures must be highly tailored to local contexts. This is not only in terms of local involvement and ownership over educational programmes, but equally in regard to ensuring that interventions address the specific learning needs of key communities.159 Given the strong local element of this intervention approach, external implementers may be better placed to indirectly support education and lifelong learning by providing relevant enablers such as financial and other resources, as well as guidance for relevant beneficiary organisations on the design and implementation of educational programmes.160

154 OECD (2020). 155 Thomson (2020). 156 Thomson (2020). 157 OECD (2020). 158 OECD (2020). 159 RAND Europe workshop (March 2021).

Experts consulted indicated that there were opportunities to expand the audiences of educational projects to foster lifelong learning. Rather than exclusively targeting the young, initiatives could also aim to educate other age groups in society, such as the elderly, on important HURDA topics.161

Experts consulted noted, however, that implementers may also face difficulties in implementing lifelong learning interventions. Notably, defining and measuring the concrete outcomes of education and lifelong learning interventions remains challenging.162 Additionally, there is a risk of standalone educational courses addressing only the symptoms rather than the underlying roots of threats to HURDA. As such, any educational programmes may benefit from parallel implementation with other interventions.163 Finally, the steep financial costs involved with high-quality, long-term educational interventions may also further limit the palatability of this intervention approach to external international sponsors such as the FCDO.164

3.4.4. Hackathons and competitions

Further to training and educational programmes, hackathons and competitions offer an alternative mechanism for enhancing skills and knowledge by engaging relevant stakeholder groups (e.g. technologists, students) in collaborative learning events. Furthermore, hackathons and competitions are often designed to solve practical challenges and issues, and thus lead not only to improved skills and competences, but potentially also to additional outputs. Hackathons and competitions include the provision of measurable rewards (e.g. prize money, scholarships, project funding) for achieving defined objectives (e.g. development of ethical AI applications or tools for tackling harmful online content) within a competitive, fast-paced environment.

Hackathons and competitions represent mechanisms for engaging relevant stakeholder groups through hands-on, practical experience as compared to other educational interventions. During hackathons participants can engage in the formulation and development of initiatives to foster and promote human rights and fundamental freedoms in the digital age. Initiatives may be targeted at certain groups (e.g. university students) or open to broad, multi-stakeholder audiences, as well as being relatively niche (e.g. a hackathon on COVID-19 disinformation) or broad in scope (e.g.
a competition on constructing ethical AI). Mechanisms that fall under this cluster may be useful for actively engaging a wide range of relevant stakeholders, such as industry professionals, technology developers, students and academics. The interactive nature of hackathons and competitions may encourage individuals to pursue further endeavours that positively shape the human rights landscape or create new networks between different actors to work on promoting human rights and fundamental freedoms. By facilitating engagement between different groups of stakeholders, interventions in this category may also enable the identification of new pools of talent or project ideas that policymakers, industry leaders and other institutional actors can directly integrate to improve existing strategies and safeguards for human rights and fundamental freedoms online.

160 RAND Europe workshop (March 2021). 161 RAND Europe interview (February 2021, ID8). 162 Huguet et al. (2019). 163 RAND Europe interview (February 2021, ID6). 164 RAND Europe workshop (March 2021).

Despite these opportunities, experts consulted highlighted the limited sustainability and substantive impact of these types of interventions as a key concern. These concerns stem from hackathons' and competitions' short-term nature and from their tendency to engage audiences that have pre-existing interests and knowledge in the subject matter of focus.165 Hackathons and competitions are typically designed to attract a digitally literate audience who may not be among the sections of society most at risk of HURDA violations.166 Current practice also suggests that competitions and hackathons remain relatively narrow in scope.167 Experts consulted during study activities noted that there are various opportunities to expand the traditional target audiences and thematic focus of hackathons and competitions.168 Expanding the target audiences of interventions could include, for example, competitions and hackathons focused on engaging women and girls in STEM-related initiatives.169

As such, implementers should give careful consideration to the objectives and underpinning strategy for delivering competitions and hackathons as HURDA-related capacity-building interventions. Novel approaches to leveraging competitions and hackathons as educational mechanisms may be needed to foster linkages with new kinds of audiences. Additionally, follow-up events, such as multi-stakeholder conferences, may be organised after competition or hackathon events with a focus on identifying positive outcomes drawn from the conclusions of these programmes.

3.4.5. Awareness-raising campaigns

Awareness-raising campaigns entail the promotion and advocacy of different issues within the HURDA ecosystem through a wide variety of formats (e.g. social media posts, videos, posters and advertisements). They are intended to increase the recognition of relevant topics by certain stakeholder groups. Interventions that fall under this category may be tailored in order to generate awareness with a particular set of actors who may be relevant for achieving the desired outcome of the campaign, or aimed more broadly at wider society to garner greater engagement on the issue at hand.
Awareness-raising campaigns can provide opportunities to positively shape certain trends relating to HURDA that may require only the engagement and involvement of end users, by highlighting ways that these stakeholders can contribute towards beneficial actions and behaviours. At the same time, such interventions can generate greater awareness of HURDA topics that may not receive attention through alternative or existing initiatives, by disseminating targeted messages that highlight the importance of the relevant issue. Finally, these campaigns can empower traditionally marginalised communities who may have suffered from HURDA violations with a platform and influence to advocate for the changes necessary to prevent these infringements. This, in turn, may garner recognition from key actors that could result in alterations to legislation, working practices or behavioural norms that foster a more conducive environment towards HURDA.

165 RAND Europe workshop (March 2021). 166 RAND Europe workshop (March 2021). 167 RAND Europe workshop (March 2021). 168 RAND Europe workshop (March 2021). 169 RAND Europe workshop (March 2021).

3.5. Strategic communications

Strategic communication measures provide an opportunity for implementers to design and promote narratives concerning HURDA and actively shape national and international discourse. Strategic communications can be understood as encompassing interventions that aim to 'reclaim' narratives from hostile and malicious actors in the online information space. In doing so, they ensure that positive messages concerning human rights and fundamental freedoms in the digital age reach the wider public or specific at-risk communities and segments. Although these can be conceived of as government-led initiatives, strategic communications can involve a wider range of actors – from the public and CSOs to the private sector. Strategic communications approaches can include so-called counterspeech strategies, which aim to counteract specific instances of online hate speech, misinformation, or disinformation through alternative narratives and framing or rebuttals.

Strategic communications can offer a proactive way of directly challenging online content propagated by malicious actors and designed to erode or undermine HURDA. Interventions of this kind may also raise awareness among digital users of the multifaceted nature of online content and narratives. Furthermore, strategic communication interventions can be rapidly produced and deployed by leveraging new and emerging technologies to mitigate the influence of messages deployed by malicious actors (including through in-person messaging and messaging through 'bot' spam accounts) that seek to negatively impact the online information ecosystem.

Experts consulted during study activities highlighted that integrating local actors into strategic communications and ensuring context-specificity were critical for maximising their impact.170 Furthermore, experts consulted noted that funders should carefully consider how certain target audiences may be digitally excluded while experiencing HURDA violations.
As such, strategic communication projects require a comprehensive understanding of the potential impacts of the local socio-economic and sociocultural environment on the receptiveness of local audiences to different kinds of messages.171

170 RAND Europe workshop (March 2021). 171 RAND Europe interview (February 2021, ID16).

While strategic communications are considered a key cross-cutting mechanism for the promotion and fostering of HURDA, a number of challenges may limit the impact of these interventions. These include the absence of planning or risk assessments at the design stage regarding the potential ways that these interventions could fail to properly engage beneficiaries or have unintended consequences. Such consequences could include, for example, backlash from target audiences based on a general distrust or suspicion of fact-checkers and public authorities scrutinising certain narratives as mis- or disinformation. Considerations regarding potential unintended consequences are therefore seen as a key part of advancing the wider future-proofing of strategic communication initiatives.172 Additionally, the challenges of evaluating the impact of counter-narrative projects and low levels of digital literacy among at-risk groups may undermine the ability of funders to effectively reach relevant target audiences and improve practices in an evidence-based way.173

To enhance the impact of strategic communication interventions, an integration of online messaging with offline initiatives or other types of interventions (e.g. educational measures) may be pursued.174 Beyond current practice, strategic communication initiatives may also be expanded to include alternative stakeholders or narratives. This could, for example, include strategic communications based on storytelling narratives and showcasing the success or progress of ongoing projects as a means of engaging a wider range of potential donors.175

172 RAND Europe workshop (March 2021). 173 RAND Europe workshop (March 2021). 174 RAND Europe workshop (March 2021). 175 RAND Europe interview (February 2021, ID10).

3.6. Comparative and cross-cutting analysis of capacity-building approaches for HURDA

Building on the discussion of the shortlisted intervention approaches in the previous section, this section presents a comparative and cross-cutting analysis of the opportunities, challenges and other considerations for funders in relation to delivering HURDA-related capacity-building interventions. Insights informing this chapter were derived from the data collection and analyses carried out throughout the study, including the literature review, interventions mapping, expert interviews and STREAM workshop activities.

3.6.1. Comparative analysis

To comparatively assess the opportunities associated with different HURDA-related capacity-building intervention approaches, the study team engaged experts and stakeholders from across academia, CSOs and government departments in a STREAM scoring exercise. This scoring exercise served to elaborate on the relative benefits and limitations of different intervention approaches by assessing their impact and feasibility of implementation across a range of criteria. Figure 3.2 provides an overview of the findings from this comparative analysis, showing the assessed impact and feasibility of implementation of each of the shortlisted intervention approaches, with a supporting figure key in Table 3.1.
Figure 3.2: Comparative assessment of the shortlisted intervention approaches

Table 3.1: Figure key – intervention approach labels
Governance and regulation: 1. Dialogue and consultative platforms and initiatives; 2. Knowledge generation and consolidation.
Technical interventions: 3. Enabling software for civil society and general population; 4. Enabling software for governmental and private sector actors; 5. Databases and repositories.
Education: 6. Training for senior decision-makers and institutional actors; 7. Training for civil society actors; 8. Education and lifelong learning; 9. Hackathons and competitions.
Strategic communications: 10. Strategic communications campaigns.

The scoring results and workshop discussions provided a number of high-level insights concerning the relative impact and feasibility of implementation of each category of interventions as well as individual intervention approaches:

• Governance and regulation-focused interventions provide important opportunities for fostering stakeholder collaboration and incentivising evidence-based approaches to fostering the HURDA landscape. This represents opportunities for systemic change by institutionalising key principles for strengthening HURDA-related safeguards, such as 'multi-stakeholderism' and evidence-based policy-making. Intervention approaches in this category also generally face lesser barriers to implementation from the perspective of the FCDO.

• Technical intervention approaches are believed to provide important technical enablers for fostering HURDA, including through enabling the filling of gaps in evidence or knowledge, particularly relevant in light of rapid technological change. Technical interventions, however, require careful consideration about their integration with wider, non-technical responses. This is due to the risk of technological solutions being considered a quick and easy fix to technology-driven issues. The design of technical tools can also significantly shape the impact of technical interventions, with early integration of inclusivity and transparency in the design of technical tools a key priority. Overall, the feasibility of implementation of technical interventions may be limited for the FCDO due to potential reputational risks if technical interventions have unintended consequences on the HURDA landscape or are deliberately exploited for alternative purposes.

• The scoring exercise indicated significant variations among education-focused interventions in relation to both their impact and feasibility of implementation. While some intervention approaches, such as hackathons and competitions, were believed to contribute relatively little to the strengthening of the HURDA landscape despite low barriers to implementation, others such as education and lifelong learning promised significant potential impact but faced significant barriers to implementation. As highlighted by experts consulted during the study, a key cross-cutting factor shaping the impact of education and training is the extent to which interventions can engage new audiences that may not have pre-existing interests in issues such as digital technologies and human rights. While innovative mechanisms such as hackathons can provide more practical, hands-on experience through which relevant audiences can strengthen their skills and knowledge, there is a risk that interventions deliver only short-term visibility for HURDA-related issues.

• Strategic communications represent an important overarching narrative-focused layer of HURDA-related interventions. Experts consulted, however, highlighted that strategic communications campaigns may struggle to engage key audiences (e.g. most at-risk communities), limiting their scope and scale of impact. Additionally, the impact of interventions in this category can be more uncertain than in the case of other intervention approaches due to the difficulty of measuring the impact of strategic communications in an isolated way.

While the workshop scoring exercise provided insights concerning potential trade-offs between different intervention approaches stemming from their relative impact and feasibility of implementation, the scoring also highlighted similarities between many of the intervention assessments. This highlighted the need for more nuanced considerations of how interventions are designed and implemented – factors that may contribute to maximising their impact and facilitate their implementation by the FCDO. The following sections discuss these additional factors in greater detail.

3.6.2. Cross-cutting considerations on HURDA-related capacity-building interventions

As the comparative assessment of the shortlisted HURDA-related interventions indicated, while some intervention approaches are generally believed to be more impactful in relation to HURDA, interventions are likely to aim at fostering impact at different levels of the HURDA landscape (e.g. strengthening capabilities in contrast to shaping culture and attitudes). As such, organisations may benefit from focusing on a portfolio-style approach of multiple, mutually reinforcing interventions in order to achieve a sufficient breadth, depth and sustainability of impact. This also recognises that while some intervention approaches may be more impactful than others, no single intervention approach is likely to achieve sustainable outcomes in isolation. There are several additional considerations in relation to such a portfolio-style approach:

• The impact of individual interventions may materialise differently in combination with other interventions as opposed to in isolation. Experts consulted, for example, highlighted that technical tools such as databases and repositories are likely to make an important contribution to the strengthening of the HURDA landscape through serving as technical enablers. The scope of the impact of this intervention approach in isolation, however, is more uncertain. As such, the prioritisation of funding for HURDA-related capacity-building may require a comprehensive assessment of the interaction of individual interventions as well as their complementarity.

• The fostering of a comprehensive portfolio of HURDA-related capacity-building interventions may facilitate the integration of HURDA as a cross-cutting dimension to wider capacity-building approaches and engagements between the FCDO and key partner nations or regions.176 Several factors underpin the importance of fostering an integrated HURDA framework through the embedding of human rights considerations across capacity-building activity. Firstly, it is important to consider how funding organisations can foster consensus over shared human rights values from a principled standpoint across different lines of effort in capacity-building. Frequently, while human rights considerations are well integrated in certain portfolios, they are lacking in others.
This may undermine the overall impact of efforts pursued by a funding organisation due to a lack of coherence. Such coherence is key in both a horizontal and a vertical dimension. In relation to the former, it should ensure coherence in HURDA-related activities between the UK's domestic policy actions and external initiatives.177 In relation to the latter, coherence is also needed between the UK's and other actors' activities in a given national or regional context.

The fostering of a coherent and integrated HURDA framework also recognises the inherent limits of capacity-building activities in fostering human rights and fundamental freedoms online. While capacity-building can make significant contributions to the strengthening of HURDA, a sustainable improvement in structures, systems and practices relating to HURDA may require additional levers of power across government to be applied in tandem. This includes the mobilisation of diplomatic, financial and political resources across the HURDA problem space. The mobilisation of these resources alongside HURDA capacity-building may be required not least due to the myriad trends and challenges that shape the HURDA landscape but fall outside of the reasonable scope of capacity-building interventions, such as:

• Direct shaping of international norms and standards concerning HURDA through bilateral or multilateral engagements (including through relevant UN bodies and initiatives).
• Engagement with and regulation of the private sector on matters of concern to HURDA (including fostering transparency in content moderation and accountability of digital service providers).
• The fostering of broader socio-economic development with implications for HURDA (e.g. access to education and healthcare).

176 RAND Europe interview (February 2021, ID13). 177 RAND Europe interview (February 2021, ID11, ID13).

In mobilising a variety of governmental levers and resources to support HURDA, experts interviewed emphasised that the scale of engagement can have a significant impact in its own right. The scale of UK investment in priority issue areas and in priority countries and regions represents an opportunity for the UK to signal its commitment to these issues, countries and regions. Given the overall importance of credibility and buy-in from local stakeholders to the success of HURDA-related capacity building, as discussed further below, the signalling of such commitment can serve to: (1) directly enhance the UK's credibility as an actor invested in strengthening the HURDA landscape; and (2) foster trust and strategic partnerships between the UK and key stakeholders.178

Further to the scale of investment and resources dedicated to HURDA-related capacity-building interventions, the impact of individual interventions is likely to depend significantly on features of their design and implementation. The next section discusses in greater detail some of the key cross-cutting principles for consideration in the context of HURDA-related capacity building.

3.6.3. Cross-cutting challenges for HURDA-related interventions

The implementation of capacity-building interventions to foster and promote HURDA may be constrained by various challenges for funding organisations. These stem from structural trends shaping the HURDA landscape, the inherent complexity of the HURDA problem space and the intersectionality of human rights in general, as well as the dynamics among and behaviour of different stakeholders.
As such, it is important to recognise a number of cross-cutting challenges that funding organisations and implementers face in effectively navigating the landscape of HURDA-related capacity-building. These challenges include:

• Rapid pace of technological change: As discussed in Chapter 2, the HURDA landscape is most significantly shaped by advances in ICTs and technologies such as biotechnology, space-based technologies (e.g. satellite imagery) and quantum computing. The pace of technological advances, while providing opportunities for the rapid improvement of technical tools underpinning HURDA safeguards, also presents significant challenges, particularly in relation to regulation, legislation and governance. In addition to potential gaps in the skills and expertise on data use and data protection that constrain public sector organisations, the pace of technological change poses challenges to the ability of funding organisations to design 'future-proof' interventions rather than playing catch-up with technological advances.179 As such, capacity-building requires consideration of the possible future-proofing of interventions.

• Lack of shared understanding of key concepts and taxonomies: HURDA-related capacity-building interventions frequently face challenges stemming from a lack of consensus concerning the conceptual and definitional delineations of key concepts at the heart of interventions. This includes, for example, the concepts of the civic space and hate speech, undermining efforts by the international community to focus resources as well as shape a coherent international governance and regulatory environment that is conducive to fostering sustainable outcomes for human rights and fundamental freedoms online. As such, there is a need to clearly define the concepts that constitute the strategic objectives of different interventions and consider how these may differ from the understanding of other actors in the wider international community.180

• Heterogeneous understanding of HURDA principles: Further to the lack of shared understanding on key concepts and taxonomies in HURDA capacity building, experts interviewed noted that there are significant differences in national and regional approaches to some of the key HURDA principles. This includes, for example, freedom of speech and expression, which, historically, has been interpreted differently in various national and regional contexts.181 The heterogeneity of understanding of these principles in turn constrains the potential for coherent international action on HURDA. It may also undermine efforts to highlight the universality of human rights in strategic communications and other categories of interventions.

• Uncertainties concerning the impact of interventions and potential unintended consequences: Though a maturing evidence base concerning the factors and principles that underpin effective capacity building may help guide future identification of best practices, there are persistent uncertainties concerning the potential impact of many intervention approaches. This is not least due to the difficulties of measuring and evaluating the impact of individual interventions, and possible inconsistencies in how evaluation and assessment practices are incorporated into capacity-building interventions as well as how robust these practices are.182 Furthermore, HURDA interventions may face the risk of sparking adverse impacts through unintended consequences, for example if technical capacities are exploited by stakeholders for alternative purposes, or if efforts to strengthen certain HURDA principles (e.g. by strengthening counter-disinformation frameworks) work to undermine others (e.g. freedom of expression online).

• Lack of inter-stakeholder coordination and stakeholder fatigue: Identifying local partners and fostering their interest, participation and buy-in for capacity-building interventions and initiatives can be challenging.183 Furthermore, the role of some stakeholders (e.g. different UN agencies) may not be understood equally, risking that local and international stakeholders with relevant expertise and capabilities may be excluded or missing from capacity-building activities.184 Where partners are effectively identified and engaged in participatory HURDA capacity-building activities, funding organisations however also face the risk of stakeholder fatigue.185 As such, interventions require careful consideration of who the key partners are and how they can be meaningfully engaged to foster sustainable change in the understanding of and practices vis-à-vis HURDA.

178 RAND Europe interview (March 2021, ID16). 179 RAND Europe interview (February 2021, ID03, ID08, ID12). 180 RAND Europe interview (February 2021, ID12). 181 RAND Europe interview (February 2021, ID01). 182 RAND Europe interview (February 2021, ID15, ID16). 183 RAND Europe interview (February 2021, ID12). 184 RAND Europe interview (February 2021, ID14). 185 RAND Europe workshop (March 2021).

4 Conclusions and recommendations

4.1. Summary of results

As discussed throughout this report, digitalisation and its underlying technologies and associated applications and services provide both opportunities and challenges for the fostering and promotion of human rights and fundamental freedoms. Digital spaces provide a unique opportunity for individuals and groups to interact and exercise their rights and freedoms. Conversely, digitalisation has given rise to novel threats and risks for human rights and fundamental freedoms, as well as augmenting existing threats. Overall, human rights and fundamental freedoms have seen, albeit in different ways, their saliency, application, safeguarding and exercising affected by socio-technological developments that have occurred in the so-called digital age.

Chapter 2 of the report presented a characterisation of the key human rights and fundamental freedoms that are shaped by digitalisation as well as of the main associated trends, challenges, opportunities, and threats. Figure 4.1 below presents a visual summary of the content of this chapter.
As noted in Chapter 2, the rights discussed in this report should not be taken as exhaustive, but rather as illustrative of how digital technologies and their uses and applications have influenced a wide range of human rights and fundamental freedoms. Furthermore, it should be noted that while it may be possible to discern between different rights and freedoms at an abstract level, such distinctions are less applicable in practice, and particularly in the context of capacity-building efforts, as the safeguarding and exercising of different rights and freedoms are inherently interconnected and overlapping.

While the impacts of technological advances and digitalisation may materialise differently in relation to each of the human rights and fundamental freedoms outlined above, the study identified a number of cross-cutting trends and challenges. These stem both from the impacts of the increasing utilisation of data and digital technologies and services as economic or governance commodities, and from the deliberate exploitation of the information environment in the context of rising digital authoritarianism or for purposes such as mis- and disinformation, targeted cyberattacks, and a range of other online harms. As such, the evolving HURDA landscape is characterised by considerable complexity and rapid change, as well as tensions between multiple competing interests in the utilisation of maturing and advancing technological tools and services. Figure 4.2 provides a visual summary of some of the main opportunities, trends, challenges, and threats for HURDA.

Figure 4.1: Mapping human rights and fundamental freedoms in the digital age

Right to privacy. Definition: Individuals are allowed to hold a private life, family, home and correspondence, and choose when, where, how and to whom information about them is disclosed to others. Implications in the digital age: Informational privacy requires that both substantive information and metadata be safeguarded to protect this right.

Right to freedom of opinion and expression. Definition: Individuals can seek, hold, receive, impart and express their opinions, ideas and views openly through any media. Implications in the digital age: Individuals should have free and equal access to the internet and digital technologies and be able to form, hold, and express their opinions in the digital environment.

Right to peaceful assembly and association. Definition: Individuals have a right to peaceful assembly and freedom of association with others, also to enable broader exercising of their civil, political, economic, social, and cultural rights. Implications in the digital age: The right applies also in the digital environment; digital technologies may be used as enablers for exercising the right in both the digital and physical worlds.

Right to equal participation in political and public affairs. Definition: Individuals can fully participate in and effectively influence public decision-making processes that affect them, including through participation in political and public affairs. Implications in the digital age: The right applies also in the digital environment, whose underlying technologies may be used both as an enabler of and a barrier to its safeguarding.

Right to free and fair elections. Definition: Individuals can participate in the conduct of public affairs, directly or through freely chosen representatives, and can vote or be elected in free, fair and periodic elections. Implications in the digital age: Digitalisation may facilitate participation in electoral affairs but carries risks of manipulation, interference, and enabling violation of other rights.
Right to education. Definition: No person shall be denied a right to education in conformity with their own religious and philosophical convictions. Implications in the digital age: Digitalisation can facilitate education delivery but can also lead to the exclusion of disadvantaged groups with no access to digital technologies.

Right to health. Definition: Expresses the right to health as a basic human right and encapsulates standards for meeting the health needs of specific groups or individuals, and the means for implementing the right to health. Implications in the digital age: Digitalisation can facilitate innovative healthcare practices but also carries risks and vulnerabilities associated with mis- and disinformation.

Other human rights and fundamental freedoms: Other human rights and fundamental freedoms which may be shaped or influenced by digitalisation, ICTs, and associated opportunities, trends and challenges.

Source: RAND Europe analysis.

Figure 4.2: Cross-cutting opportunities, trends, challenges, and threats for HURDA

Opportunities for human rights in the digital age: Democratisation of access to digital spaces, technologies, and services; strengthening of human rights safeguards; enhanced access to information and knowledge regarding HURDA; enhanced opportunities for the exercising and protecting of human rights and fundamental freedoms.

Wider political and socio-cultural trends in the digital age: Rise of digitally transparent societies and limited societal understanding of challenges for HURDA; declining social trust, public uncertainty, and disengagement; worsening digital divide and inequality.

Data, digital technologies and services as an economic commodity: Commodification of data and surveillance capitalism; mass-scale adoption of machine learning-driven personalisation in communication; lack of oversight and control for the uptake of data-intensive technologies; lack of security and human rights safeguards by default.

Data, digital technologies and services as a governance commodity: Increasing digitalisation of public services and emergence of digital welfare states; unintended consequences of legislative and regulatory provisions.

Mass surveillance and digital authoritarianism: Weakening or criminalisation of encryption and privacy-enhancing technologies; increased capabilities for mass surveillance; overbearing biometric data collection practices; exploitation of tools for monitoring, filtering, and blocking the dissemination of information or removing content online.

Illicit or criminal activities in digital spaces: Weaponisation of the information environment for mis- and disinformation, propaganda, and recruitment; exploitation of digital spaces for targeted online harms.

Source: RAND Europe analysis.

In the context of the above-described HURDA landscape, donor organisations, human rights and civil society organisations, as well as capacity-building implementers can leverage a number of approaches and methods to build capacity to foster and safeguard HURDA. As discussed in Chapter 3, these approaches can be broadly grouped into four categories encompassing: governance and regulation-focused interventions, technical interventions, education-related interventions, and strategic communications. Within these categories, a range of activities and approaches can be leveraged to build capacity and contribute to a wider fostering and safeguarding of human rights and fundamental freedoms.
To provide a better characterisation of the spectrum of capacity-building approaches and the benefits and limitations of different intervention mechanics, as well as relevant corresponding considerations for the UK FCDO, the study carried out an in-depth comparative analysis of ten intervention approaches, presented in Figure 4.3 below.

Figure 4.3: Intervention approaches for HURDA-focused capacity-building interventions
Governance and regulation: Dialogue and consultative platforms and initiatives; knowledge generation and consolidation.
Technical interventions: Enabling software for civil society and general population; enabling software for governmental and private sector actors; databases and repositories.
Education: Training for senior decision-makers and institutional actors; training for civil society actors; education and lifelong learning; hackathons and competitions.
Strategic communications: Strategic communications campaigns.
Source: RAND Europe analysis.

As discussed in Section 3.6, a capacity-building intervention would typically be characterised by the presence and sequencing of multiple activities and instruments from across those presented in Figure 4.3 above. Furthermore, intervention implementation activities would be characterised by distinct benefits, limitations, enablers and barriers stemming from the specifics of the context of work. As such, donors and implementers of capacity-building interventions should consider, during intervention design and implementation stages, the trade-offs, strengths, and limitations of different approaches. Such considerations should take particular note of the specific contexts in which they would operate, the beneficiary groups they would target, and the resources, expertise and know-how available.

Further to these considerations, HURDA-related capacity-building should also be guided and informed by a range of cross-cutting principles to maximise the potential impact and minimise the risk of any unintended consequences resulting from interventions being implemented. The following section outlines a set of recommendations concerning overarching principles to be considered by relevant stakeholders during the design and implementation of HURDA-related capacity-building interventions.

4.2. Recommendations

A core aim of this study was to propose recommendations for action to be implemented by the FCDO CSSF Cyber and Tech Programme and the FCDO's capacity-building work more broadly. The following sections present cross-cutting recommendations emerging from this study that are of relevance not only for the FCDO, but also for donor and implementer organisations more widely that are active in the space of HURDA-related capacity building. In particular, a review of data and insights collected during expert interviews and workshop consultations helped generate overarching recommendations and principles that donor and implementer organisations should consider adopting and implementing to further strengthen their capacity-building efforts on HURDA (see Section 4.2.1), as well as practical insights concerning specific intervention approaches considered in study activities (see Section 4.2.2). When formulating recommendations for action, the study team strived to focus on areas where donor and implementer organisations can take independent action and achieve impact.
4.2.1. Overarching principles

A critical appraisal of the study evidence base collected, including interviews and workshop consultations with relevant stakeholders, led to the identification of several overarching principles that donor and implementer organisations should consider during the planning, design, and implementation stages of capacity-building efforts touching on HURDA. These are summarised in Figure 4.4 and are discussed in greater detail below.

Figure 4.4: Overarching principles for HURDA-related capacity-building efforts and programmes
1. Adopt a holistic approach to human rights and fundamental freedoms, embedding a focus on these across all capacity-building initiatives and abandoning any residues of digital exceptionalism.
2. Ensure that policy and international engagement work support and are coherent and resonate with capacity-building initiatives to foster and protect human rights and fundamental freedoms.
3. Develop a strategic approach and overarching framework to guide capacity-building initiatives focusing on human rights and fundamental freedoms:
a. Identify overarching priorities and desired outputs and impact to achieve through a portfolio of initiatives.
b. Identify prioritisation approaches for selecting individual capacity-building initiatives to ensure their relevance to and coherence with the overall strategic vision.
4. Build on established principles and good practices for the delivery of individual interventions to maximise their impact, sustainability, effectiveness, and efficiency:
a. Consider the adoption of complex interventions spanning multiple activities, objectives, and beneficiaries of focus.
b. Tailor interventions to local and regional contexts to facilitate local ownership and ensure adequate nuance in intervention activities and content.
c. Ensure the adoption of inclusive, multi-stakeholder approaches to capacity-building intervention design and implementation.
d. Incorporate comprehensive planning, risk assessment, and evaluation activities to mitigate potential unintended consequences and maximise learning.
e. Embed knowledge-, skills-, and competence-transfer components in capacity-building initiatives and interventions.

1. Adopt a holistic approach to human rights and fundamental freedoms, embedding a focus on these across all capacity-building initiatives and abandoning any residues of digital exceptionalism. Stakeholders and experts consulted during study activities lamented that HURDA-related considerations in capacity building are often present only in those initiatives specifically designed to focus on HURDA. Furthermore, they lamented that HURDA-related capacity-building interventions often have a narrow thematic approach – touching only on specific segments and challenges of the HURDA problem space. Donors and implementers should strive to embed an active focus on and consideration of HURDA across wider portfolios of capacity-building interventions, not limiting these to initiatives with a specific human rights and fundamental freedoms focus. Furthermore, given the pervasiveness of the implications stemming from digital technologies for human rights (as evidenced in Chapter 2), any focus on human rights in capacity-building interventions should move away from more traditional concepts of human rights towards recognising the saliency and significance of the implications for human rights that stem from socio-technological advances.
Confining HURDA-related considerations to specific interventions and initiatives was seen by experts consulted as running a significant risk of undermining the ability of donors and implementers to achieve meaningful and sustainable impact with regard to HURDA.

2. Ensure that policy and international engagement work support and are coherent and resonate with capacity-building initiatives to foster and protect human rights and fundamental freedoms. Experts consulted over the course of study activities noted the inherent limitations and challenges associated with taking an approach that considers only capacity-building efforts for issues related to HURDA. The presence of a culture and environment responsive to HURDA in target and beneficiary countries and regions was noted as a key enabler and requirement for achieving sustainable and meaningful impact. In this regard, governmental donor and funder organisations and programmes, such as those of the FCDO, should consider framing and supporting HURDA-related capacity-building efforts through wider diplomatic initiatives. This should employ the full range of instruments and levers of power available to foster the development of a conducive context and culture for HURDA in partner and beneficiary countries where this may not exist. Stakeholders and experts also noted a need to ensure that any violations committed with regard to HURDA by partner and beneficiary countries be addressed and stigmatised on a scale comparable to violations touching on human rights in a more traditional sense. Experts and stakeholders consulted lamented a tendency in global human rights practice for such violations to be considered of lesser significance or impact than traditional human rights violations taking place in the physical world.

3. Develop a strategic approach and overarching framework to guide capacity-building initiatives focusing on human rights and fundamental freedoms. As evidenced in Chapter 2, recent and ongoing developments at the socio-technological and political levels provide significant opportunities and challenges for HURDA. Recognising the complexity of the HURDA problem space, and in light of a general context characterised by resource constraints and finite opportunities for action, prior to the selection of specific initiatives and interventions donor and funder organisations should consider developing an overarching intervention logic and:

a. Identify overarching priorities and desired outputs and impact to achieve through a portfolio of initiatives. This could contribute to ensuring that organisations are setting themselves up to achieve impact where it is most critical for their mission and strategic priorities. Efforts in this regard should aim to clearly define the strategic and operational objectives that different organisations and institutions wish to achieve with their capacity-building interventions in regard to HURDA. The vision and guidance set out in an overarching intervention logic could also help facilitate coherence and consistency of action as regards HURDA in wider capacity-building interventions and initiatives whose main focus may rest on other policy issues and challenges.

b. Identify prioritisation approaches for selecting individual capacity-building initiatives to ensure their relevance to and coherence with the overall strategic vision.
In developing an overarching vision and strategic approach to fostering and protecting HURDA overseas, organisations and institutions involved in capacity-building work should also identify criteria and approaches for prioritising their efforts. Criteria could focus on identifying those areas where they may offer a unique value proposition or benefit from a comparative advantage compared to other actors and donors active on HURDA. Prioritisation criteria should also be informed by an identification of those countries, regions, and constituencies characterised by an environment more permissive of and responsive to HURDA-related interventions. Several stakeholders consulted during study activities emphasised the critical importance of political buy-in and support among beneficiary communities to facilitate the sustainable uptake and impact of HURDA-related initiatives.

4. Build on established principles and good practices for the delivery of individual interventions to maximise their impact, sustainability, effectiveness, and efficiency. Donor, funder, and implementer organisations should continue building on good international development and capacity-building practices, leveraging their expertise and experience in these spaces to maximise the results stemming from the implementation of individual HURDA-related interventions. To achieve this, they should:

a. Consider the adoption of complex interventions spanning multiple activities, objectives and beneficiaries of focus. No single solution exists for fostering and protecting HURDA within any context. Organisations and institutions active in HURDA-related capacity building should consider that, to achieve widespread and sustainable impact, capacity-building initiatives should typically comprise a mix of interventions and programmes spanning different thematic and issue areas and targeting different beneficiaries and stakeholder groups within a given country or region. Furthermore, sustained efforts over protracted periods of time are likely to be required to ensure the sustainability of results in the long term and facilitate transformational change not only at the technical, but also at the cultural level.

b. Tailor interventions to local and regional contexts to facilitate local ownership and ensure adequate nuance in intervention activities and content. Experts consulted in the study noted the importance of ensuring that HURDA interventions and capacity-building efforts are tailored to local contexts, not only in terms of responding to actual and specific local needs, but also in terms of content framing and nuance. In this regard, organisations involved in HURDA-related capacity building should strive to adopt framing approaches and content in their interventions that highlight the universality of human rights whilst also linking them to the specific context and culture of the region or country of intervention. To achieve this, organisations and institutions should consider developing long-term, strategic partnerships with local actors and organisations based in beneficiary areas and countries. These actors and organisations could in turn facilitate the bridging and adaptation of existing initiatives, interventions, and materials to local contexts to maximise their potential for uptake.
To achieve this, governmental institutions and programmes, such as those of the FCDO, should consider leveraging embassies and consulates in a structured manner to develop and maintain access to local networks of implementers, enablers and gatekeepers who could support and/or facilitate capacity-building efforts. Embassies could also play a significant role in facilitating coordination with, and maintaining situational awareness of, other international actors and donors active in specific countries or regions. This could help ensure that duplication of effort is avoided, stakeholder fatigue is minimised, and coherence of action towards HURDA is strengthened.

c. Ensure the adoption of inclusive, multi-stakeholder approaches to capacity-building intervention design and implementation. Fostering human rights and fundamental freedoms in the digital age is at its core a multi-stakeholder process that necessitates wide engagement and dialogue practices, particularly in relation to the shaping of the policy, regulatory and normative environment. Organisations and institutions active in HURDA-related capacity building should ensure that their capacity-building interventions for HURDA are cognisant of this and secure the participation of different international, regional and local stakeholder groups in the design and implementation of their activities. In conjunction with this, efforts should also be made to minimise the risk of stakeholder fatigue. This can be achieved by limiting engagements to those initiatives and interventions where stakeholders can add meaningful contributions to the work being delivered, and by minimising the risk of duplicating efforts through maintaining situational awareness of wider initiatives and efforts conducted at the global, regional and national levels by other actors and donors.

d. Incorporate comprehensive planning, risk assessment, and evaluation activities to mitigate potential unintended consequences and maximise learning. As discussed in Section 2.8 of this report, HURDA-related interventions may face the risk of sparking adverse impacts through unintended consequences. Organisations and institutions active in HURDA-related capacity-building work should ensure that adequate risk assessments and planning are conducted during the design stages of intervention development to minimise any such risks. Such exercises should also entail a rapid evidence assessment of available literature and evidence on similar initiatives and efforts to benefit from lessons identified and learned by other implementers and organisations. Donors and funders should also ensure that the design, planning and conduct of monitoring and evaluation activities take place from the inception phase of an intervention and throughout the duration of its implementation. The timing of an evaluation is an important element that should be considered carefully. In many instances an evaluation can usefully happen before an intervention or policy is launched. So-called ex-ante evaluation can help assess the expected results of a programme and provide recommendations as to how the design could be improved to amplify its expected impacts. Before committing to a full evaluative undertaking, evaluability assessments should be considered with a view to establishing the extent to which an activity or project can be evaluated in a reliable and credible fashion.
During the implementation of an intervention, real-time evaluations and process evaluations can generate insights concerning adjustments needed to the intervention’s design or its approach. Lastly, with regard to impact evaluations, consideration should be given to their timing (i.e. assessing when the results of an intervention may start to emerge and become measurable). Donors and funders should also ensure that minimum quality and robustness requirements for evaluations of their interventions are articulated. This should entail the use of empirical data, multiple methods and adequate stakeholder engagement.

e. Embed knowledge-, skills-, and competence-transfer components in capacity-building initiatives and interventions. Embedding knowledge- and skills-transfer components can strengthen the sustainability of initiatives. In this regard, organisations and institutions delivering HURDA-related capacity-building efforts should strive to ensure that their initiatives and interventions maximise opportunities for such skills and competence transfer to take place. This could be done, for instance, through the embedding of train-the-trainer components or approaches in education interventions, as well as through the inclusion of modules facilitating the handover of developed tools, techniques or programmes to local stakeholders and implementers when developing knowledge, software, or strategic communication materials.

4.2.2. Operational insights on individual intervention approaches

In addition to overarching principles stemming from cross-cutting considerations on HURDA-related capacity building, the study team developed a series of practical recommendations and operational insights that donors and implementers may leverage during the commissioning, design and implementation of capacity-building interventions that touch on HURDA. These are discussed in the following sections.

Governance and regulation

As discussed in Section 3.2, regulation and governance play an integral part in the HURDA landscape by shaping processes, attitudes and the general environment for HURDA safeguards from a top-down, structural perspective. This study focused on two particular mechanisms to strengthen the regulation and governance layers of the HURDA landscape: dialogue and consultative platforms and initiatives, and knowledge generation and consolidation. Table 4.1 provides an overview of the key recommendations in relation to these two intervention approaches.

Table 4.1: Recommendations for governance and regulation-related interventions

Dialogue and consultative platforms and initiatives
Definition: Interventions that aim at facilitating interaction between different stakeholder groups and their participation in shaping the policy, regulatory and normative environment.
Recommendations:
• Conduct pre-engagement mapping of existing platforms and initiatives: To facilitate meaningful dialogue and participation that fosters sustainable impacts, a comprehensive mapping of existing mechanisms, platforms and initiatives should be carried out to avoid duplication and the risk of stakeholder fatigue.
• Enable tailoring of existing platforms and initiatives to new themes or audiences: Existing platforms and initiatives may be tailored to cover novel issues or to expand their target audience.
Rather than create novel mechanisms, there may therefore be opportunities to leverage existing mechanisms through tailoring and adaptation, while avoiding the emergence or reinforcement of thematic silos.
• Leverage innovations in and expansion of teleworking: There are considerable opportunities to leverage new practices in teleworking to foster dialogue and consultations at lower cost. Mechanisms such as teleworking may also facilitate the inclusion of new stakeholder groups who may otherwise face greater resource-related constraints on participating in large-scale international initiatives.
• Link interventions with strategic communications: Interventions in this category may leverage strategic communications to foster interest in key themes and raise the visibility of the agenda of the given platform or initiative.

Knowledge generation and consolidation
Definition: Interventions, such as research projects and evaluations, that aim at strengthening the evidence base on HURDA and identifying evidence-based practices for fostering these.
Recommendations:
• Link interventions with dialogue and consultations: Knowledge generation and consolidation may follow, or be conducted in tandem with, dialogue and consultative initiatives in order to identify gaps and opportunities for knowledge generation and consolidation.
• Orient interventions towards filling gaps in foundational knowledge of HURDA: Interventions in this category may be particularly suited to contexts in which there are greater gaps in foundational knowledge of HURDA-related issues and less maturity in HURDA-related legal and regulatory frameworks.
• Foster local mechanisms for knowledge generation and consolidation: Interventions should seek to transfer mechanisms for knowledge generation and consolidation to foster local skills, competences and expertise on HURDA-related issues. As such, there may be opportunities for linking knowledge generation and consolidation with training or educational interventions.
• Ensure access to and foster availability of data: Interventions should ensure access to data. As such, interventions in this category may go hand in hand with efforts to improve data access and transparency from key ‘data providers’ (e.g. private sector actors) to facilitate evidence-based policy-making.
• Leverage interventions to navigate technological complexity: Knowledge generation and consolidation may help stakeholders navigate a rapidly evolving and increasingly complex technological landscape. However, activities should focus on technology-agnostic, future-proof frameworks to foster the replicability and sustainability of initiatives.

Source: RAND Europe analysis.

Technical interventions

While emerging technologies continue to play a key role in shaping the HURDA problem space, the study highlighted that HURDA capacity building should not centre on technological solutions. Rather, technical interventions can serve as enablers for wider capacity-building activity, with careful consideration of their design and uptake necessary to prevent potential misuse or unintended consequences. Table 4.2 provides a summary of the practical considerations and recommendations for the CSSF programme in relation to this category of interventions.

Table 4.2: Recommendations for technical interventions

Enabling software for civil society actors and the general population
Definition:
Technical resources and tools (e.g. browser extensions, apps) that enable the bottom-up exercise and safeguarding of HURDA by civil society actors, including journalists, human rights defenders, and other grassroots organisations, as well as the general public.
Recommendations:
• Foster awareness of and trust in technological solutions: The uptake of enabling software tools among civil society may be constrained by a general mistrust of emerging technologies or a lack of technical skills and competencies. As such, the development of technical software for this stakeholder group may benefit from accompanying awareness-raising initiatives or training and education-focused activities, or from the development of adjacent products such as guidance and toolkits.
• Conduct comprehensive risk assessment to identify dependencies: Interventions should be preceded by a comprehensive risk assessment to identify and mitigate potential risks stemming from disruptions to connectivity. More broadly, interventions should consider how dependencies on technological solutions can be avoided, so as to minimise the risk of disruption should enablers such as connectivity fail (e.g. during internet shutdowns).
• Foster cross-government collaboration on technology development: Cross-government collaboration and multi-stakeholder engagement may support the development of inclusive, innovative technical solutions, as well as encourage the incorporation of HURDA standards in technology design.

Enabling software for governmental and private sector actors
Definition: Technical resources and tools (e.g. fact-checking software) that facilitate a top-down fostering and protection of HURDA and contribute to a structural strengthening, monitoring and moderation of digital spaces by governmental and private sector actors.
Recommendations:
• Ensure tailoring of interventions to local capacities: Technical software for governmental and private sector actors can range from baseline software to more high-end tools (e.g. those integrating AI to detect mis- and disinformation). The fostering of either type of software tool should be based on a comprehensive assessment of local skills, competencies and existing infrastructure to facilitate uptake and ensure context-specificity.
• Foster oversight and accountability mechanisms to minimise the risk of unintended consequences: Interventions focusing on providing enabling software to government actors may be accompanied by initiatives to strengthen mechanisms for oversight and accountability (e.g. through lobbying and watchdog activities) to mitigate potential unintended consequences.
• Foster cross-government collaboration to identify relevant technological advances: As with enabling software for civil society actors and the general population, cross-government collaboration may help identify key trends in innovation relating to technical tools for government and private sector actors. In relation to more high-end technical tools, this could for example include advances in AI explainability and the ability of AI models to recognise contextual nuance, improving their effectiveness in the context of content moderation.
• Link interventions to training in futures methods: To enable the identification of opportunities to innovate enabling software solutions for this stakeholder group, technical interventions in this category may also be linked with projects aiming to foster competencies among the target beneficiaries in futures methods such as technology watch and horizon scanning.
Databases and repositories
Definition: Resources that provide users, including human rights defenders, policymakers, and researchers, with ‘one-stop-shop’ access to information from the wild on particular phenomena.
Recommendations:
• Foster development of context-specific databases and repositories: Databases and repositories can serve as technical enablers to knowledge generation and consolidation activities and the fostering of evidence-based policy-making. To facilitate such practice with sufficient contextual nuance, the design of databases and repositories should therefore also support context-specific resources (e.g. repositories in different local languages).
• Facilitate data availability and transparency in data sharing: As with knowledge generation and consolidation, consideration should be given to data availability and potential opportunities for enhancing the transparency of data-sharing practices. Dialogue and consultative platforms and initiatives may be used to identify gaps in data availability or systemic constraints (e.g. lack of transparency in content moderation), as well as to engage different groups of stakeholders to identify opportunities to strengthen data availability and transparency.
• Ensure sufficient safeguards for data protection and research ethics: Interventions in this category require careful consideration of data protection, privacy and research ethics, given the potentially sensitive nature of the data captured in databases and repositories (e.g. instances of hateful online content).

Source: RAND Europe analysis.

Education

Education-focused initiatives provide opportunities for holistically strengthening the skills, capacities and resilience of at-risk communities and key stakeholder groups in light of emerging challenges for HURDA. These include a range of time-bound mechanisms for tactical improvements in key skills and competencies through training initiatives or hackathons and competitions, as well as longer-term, formal engagements with beneficiaries through education and lifelong learning. Table 4.3 captures the study team’s recommendations for education-focused initiatives, highlighting the need to tailor interventions to the needs and existing skills and competencies of the target beneficiaries and to consider a wide range of mechanisms for the transfer of these skills and competencies.

Table 4.3: Recommendations for education-focused interventions

Training for senior decision-makers and institutional actors
Definition: Interventions entailing a time-bound and active transfer of knowledge, skills, and competences to stakeholders within government and public institutions, including senior policy and decision-makers, and members of law enforcement agencies, the judiciary, and prosecution services.
Recommendations:
• Identify opportunities for targeted training for ‘digital champions’: To minimise risks of stakeholder fatigue, enhance buy-in from key stakeholder groups, and foster a long-term transfer of skills and competencies, in-depth training initiatives for selected groups of ‘digital champions’ may be preferred. This is particularly the case in contexts in which a significant level of training activity has already taken place and where a good level of baseline skills and expertise has already been established.
• Prioritise baseline training for a broad audience where relevant: In contrast, in contexts in which HURDA-related issues have only recently emerged on the agenda of decision-makers and institutional actors, training may focus on the transfer of a broader set of baseline skills and competencies.
• Foster skills and competencies of beneficiaries to navigate technological advances: Given the challenges associated with navigating a rapidly evolving and complex technological landscape, training for senior decision-makers and institutional actors may be particularly oriented towards strengthening awareness and understanding of emerging technology. This can be oriented both towards enabling governmental and institutional actors to identify opportunities to leverage emerging technologies to strengthen HURDA safeguards, and towards mitigating risks stemming from technological advances.

Training for civil society actors
Definition: Interventions that entail a time-bound and active transfer of knowledge, skills, and competences to stakeholders and actors within civil society, including inter alia human rights defenders, journalists, lawyers and other CSOs.
Recommendations:
• Engage with a variety of civil society actors: Interventions in this category should consider the unique roles of different civil society actors in the HURDA ecosystem, and their potential skills and competency gaps or needs. Interventions should therefore not focus exclusively on CSOs and human rights defenders, but also engage journalists, librarians, and other stakeholders who may traditionally not be considered in relation to HURDA-related education initiatives.
• Leverage ‘train the trainer’ models to enhance sustainability: ‘Train the trainer’ models may serve as a mechanism for more sustainable skills and knowledge transfer for all stakeholder groups, particularly civil society actors. Donors and implementers may leverage existing links with trusted partners in the CSO community to facilitate this mechanism, through which these partners can then carry out further training activities with local civil society actors.
• Identify opportunities to strengthen regulation and governance through training initiatives: Contexts with maturing legislative and regulatory frameworks for HURDA safeguards may benefit from training oriented towards judges and lawyers to foster awareness of the utility of mechanisms such as strategic litigation. Similarly, training may be oriented towards strengthening CSO skills and capacities to carry out lobbying and watchdog activities, thus contributing to a more robust governance and regulatory framework (including through observation and accountability mechanisms).

Education and lifelong learning
Definition: Long-term educational programmes (e.g. formal curricula) that aim to foster and structurally enhance broader sets of relevant knowledge, skills, and competences among various beneficiary groups.
Recommendations:
• Identify opportunities to support education and lifelong learning through knowledge generation and consolidation: Given the potentially low palatability of this intervention approach to donors and implementers as chief funding organisations, and the significant context specificity required, donors and implementers may rather focus on strengthening the evidence and knowledge base concerning best practices for integrating HURDA in education and lifelong learning activities (e.g. in relation to media and digital literacy).
• Engage with a variety of target beneficiaries in different age groups: Interventions in this category should consider addressing the skills and knowledge gaps of all relevant age groups, avoiding an exclusive focus on youth. Where the evidence base concerning opportunities to engage a particular age group is lacking, donors and implementers may consider carrying out knowledge generation and consolidation activities to address those gaps.

Hackathons and competitions
Definition: An alternative mechanism for enhancing skills and knowledge by engaging relevant stakeholder groups (e.g. technologists, students) in collaborative learning events, which include the provision of measurable rewards (e.g. prize money) for achieving defined objectives within a competitive, fast-paced environment.
Recommendations:
• Prioritise contexts with an emerging community of interest: Hackathons and competitions may be prioritised for contexts with an emerging, rather than a mature, community of interest in HURDA-related issues. Interventions of this type could be used to spark interest in HURDA-related issues as well as in related disciplines such as computer science and cybersecurity.
• Leverage interventions to ensure initial buy-in from senior decision-makers: Interventions in this category could also be used to secure early buy-in from senior decision-makers, given the conventionally high interest in hackathons and competitions from this stakeholder group.
• Identify opportunities for a regular programme of activity or follow-on events: Initiatives in this category require careful consideration of potential follow-on events or their integration with wider educational initiatives, so as to foster a sustainable transfer of skills and competencies. As such, the formulation of a regular ‘drumbeat’ of activity, or a framework of hackathons and competitions combined with other activities, should be prioritised over the implementation of one-off events.

Source: RAND Europe analysis.

Strategic communications

Strategic communications, including messaging campaigns and counterspeech, play an important role in promoting HURDA by directly contesting, or providing an alternative to, narratives that may undermine HURDA. While strategic communications campaigns are unlikely to significantly shape the behaviour of target beneficiaries in isolation, they may significantly strengthen the HURDA landscape in conjunction with other interventions. Table 4.4 provides an overview of the study team’s recommendations in relation to strategic communications to maximise their impact vis-à-vis key target audiences.

Table 4.4: Recommendations for strategic communications

Strategic communications campaigns
Definition: The design and promotion of HURDA narratives that aim to shape national and international discourse, as well as to ‘reclaim’ narratives from hostile and malicious actors in the online information space, ensuring that positive messages reach the wider public or specific at-risk communities and segments.
Recommendations:
• Ensure coherence of messaging, including through cross-government coordination: Coherence is a key guiding principle for strategic communications. The formulation of concise and coherent narratives should be prioritised to ensure the effectiveness of strategic communications initiatives.
Cross-government collaboration may help ensure coherence of messaging, as well as the sharing of best practices to enhance the effectiveness of strategic communications.
• Ensure context specificity in messaging: Ensuring context specificity in strategic communications is key to ensuring the credibility of the narratives being promoted. As such, the formulation of narratives should be conducted in close coordination with local actors to fully capture contextual sensitivities and nuances. Similarly, an assessment of the local context in which strategic communications are being implemented is needed to ensure messaging is designed to resonate with the defined context and target audience.
• Consider a wide variety of techniques and narratives: Donors and implementers may consider leveraging different types of messages, including alternative peer-to-peer messaging initiatives and storytelling, to enhance the visibility of priority issues among key audiences.
• Leverage different kinds of online and offline mechanisms: A combination of online and offline mechanisms can be leveraged for strategic communications, and interventions should take a view on the role of different types of media in a given local context to understand how best to reach a given target audience.

Source: RAND Europe analysis.
