Bridging the digital global governance gap: views from Future Leaders

CIDOB Briefings, 53
Publication date: 12/2023
Author: Inés Arco Escriche, Researcher, CIDOB

This document is based on the debates of the Santander-CIDOB Future Leaders Forum online session titled “Bridging the digital global governance gap: international cooperation and the regulation of emerging technologies”, which took place on November 21st, 2023, and on the video interviews with the selected leaders from the Santander-CIDOB 35 under 35 List. The document is structured in three blocks: first, it introduces the current landscape of international digital cooperation; second, it identifies the key challenges to achieving a global framework to regulate technology; and finally, it highlights three proposals for international digital cooperation identified by the participants. The text was finalised on December 21st, 2023.


The rapid development of emerging technologies is driving unprecedented changes with profound implications for our societies. On the one hand, innovations such as Artificial Intelligence (AI), including its generative capabilities, are welcomed by public administrations, businesses, and citizens because they bear the promise of enormous opportunities, the potential to help solve global challenges and the positive transformation of our societies. Indeed, these technological innovations are already being used by governments and businesses alike, and AI is increasingly considered a common good, with the potential to help us in decision-making processes, improve efficiency and service delivery, and address global challenges such as climate change or pandemic prevention. On the other hand, this swift progress is filled with risks which need to be prevented if possible and mitigated if not. While some of these risks are still unknown, it has become evident that societies cannot afford the cost of not regulating these technologies. The potential disruption of established social structures, rising inequality, the concentration of power in digital companies, the material and social costs linked to new technologies, threats to fundamental rights – such as privacy or freedom of expression – and the increase in cyber threats are some of the reasons why regulation is imperative.

This has become evident in recent years, with renewed enthusiasm and hyperactivity in the governance of digital technologies alongside the development of multiple initiatives to promote international cooperation in digital and technological areas. However, the ever-changing landscape of emerging and disruptive technologies has exposed the lack of global governance and international cooperation frameworks capable of responding to the challenges arising from these developments, with many of these initiatives only finding traction in a reactive – rather than proactive – manner. Additionally, as the United Nations highlights, there are many gaps in global digital cooperation, with multiple areas of digital governance and new technologies still unregulated. Moreover, in areas where some progress has been achieved, it has come at the cost of fragmentation and voluntary frameworks. Thus, a new push towards global digital cooperation is more necessary than ever, especially in a complex context characterised by permacrises, growing conflict, changing globalisation patterns, and the erosion of democratic governance.

1. What is global digital cooperation?

In May 2020, as the world was grappling with the impact of the coronavirus pandemic, the United Nations Secretary-General published a report to establish a Roadmap for Digital Cooperation. This effort, which signals the relevance of digital technologies for rethinking the role of effective multilateralism, aimed to identify a set of five areas where the international community should collaborate and cooperate regarding the use of digital technologies while, at the same time, reducing and mitigating potential risks. One of these five key areas is fostering global digital cooperation, which is defined as a multi-stakeholder effort in which governmental actors and other stakeholders, including the private sector, technology companies, civil society, or academia, jointly work to achieve an interoperable framework for digital technologies. This approach aims to guarantee the adoption of effective, inclusive, and practical solutions and policies in the digital and technological domains (UN, 2020, p. 22). 

The prioritisation of global digital cooperation within the UN framework was further emphasised in the Secretary-General’s 2021 report, titled Our Common Agenda. This document called for the first time for the adoption of a Global Digital Compact based on shared principles for an “open, free and secure digital future for all” (p. 63). Between 2022 and 2023, negotiations between member states and consultations with relevant stakeholders advanced within the UN with the intention of avoiding the fragmentation of the Internet, increasing digital connectivity, building trust within cyberspace and promoting the regulation of Artificial Intelligence. The culmination of this process will be the adoption of the Global Digital Compact during the 2024 Summit of the Future.

However, the UN has not been the only institution promoting new initiatives for global digital cooperation. Indeed, the unprecedented irruption of generative AI at the end of 2022 set off a global – although uncoordinated – push towards regulation, with significant advances in technical and standard-setting procedures and around the social and ethical aspects of AI. Initiatives by other international organisations, like the Organisation for Economic Co-operation and Development (OECD); plurilateral agreements such as the Bletchley Declaration on the security risks of AI, adopted by 28 countries – including China – and the European Union during the United Kingdom’s AI Safety Summit in 2023; as well as regulations at the national level and guidelines by private actors are rapidly proliferating. The most recent example is the G-7’s adoption of the Hiroshima AI Process Comprehensive Policy Framework in December 2023, which includes guiding principles for the development of AI systems and a code of conduct with multiple recommendations for developers and users, with an explicit focus on disinformation, as well as project-based cooperation.

The European Union (EU) has been at the forefront of many of these efforts, aiming to provide the world’s first comprehensive AI legislation with solid standards. The AI Act represents an act of ‘courage’: it will establish a series of technical standards, but it will also create moral ones. Through a de-risking approach to regulation, the initiative aims to identify certain no-go zones in the development, deployment, and use of AI technologies – especially for those considered high-risk. In December 2023, the European Parliament and the Council reached a provisional agreement on the AI Act, which is expected to be formally adopted in early 2024.

The EU’s AI Act is the latest addition to Brussels’ arsenal of digital regulations, which includes the Digital Services Act (DSA) and the Digital Markets Act (DMA). In 2022, the EU adopted the European Declaration on Digital Rights and Principles, proposing a digital transition defined by European values and six principles: a people-centric approach, solidarity and inclusion, freedom of choice, sustainability, safety and security, and participation. Moreover, the EU has adopted further legislation in highly specialised domains, such as the management of crypto assets, with the adoption of the Markets in Crypto-Assets Regulation (MiCA) in 2023. Concurrently, the EU and the United States have strengthened cooperation on standards and the technical underpinnings of regulation through the Trade and Technology Council (TTC). These elements are setting the framework for the future development of the data economy, European industry, and the digital future of Europe – with potential expansion beyond European borders, reminiscent of the ‘Brussels Effect’ that followed the adoption of the General Data Protection Regulation (GDPR).

Cities are another actor of utmost relevance. While local governance is embedded in and shaped by national regulations, cities are also key players in experimentation, cross-border collaboration, and regulation. In trying to close the global governance gap, local governments are adopting their own frameworks – such as AI strategies or public procurement clauses sensitive to human rights – and implementing bans on specific applications, including facial recognition technologies. One successful example of good practice in AI governance at this level is the adoption of AI registries by cities such as Helsinki or Amsterdam to ensure transparency and accountability.

However, no actor – country, organisation or forum – has become the centre of digital cooperation and technology regulation. Given the transnational nature of the digital and cyber domains and growing digital interdependence, no single approach can address the multiplicity of global challenges posed by emerging technologies; what these examples show, however, is that the current governance landscape is fragmented, both nationally and internationally (Fay, 2022). Furthermore, there is considerable overlap between the different initiatives, regulations and mechanisms addressing digital issues. This creates a highly complex architecture for coordination and cooperation, without any certainty about its effectiveness (UN, 2019).

2. What are the challenges to adopting an effective global governance framework to regulate emerging technologies?

While there have been increasing calls from different stakeholders to adopt a global approach to the regulation of these technologies, especially AI, it is important to ask why this has not yet been achieved.

Firstly, given the transnational nature of digital issues alongside the speed of technological change and development, it is challenging to rely on traditional forms of governance based on sovereignty and territoriality to regulate technology. Our current tools and structures for regulation are insufficiently agile and lack the flexibility to ensure adaptation to future challenges, needs and unknown risks (Wheeler, 2023). Indeed, deep, continuous international collaboration will be fundamental to adapt to groundbreaking developments and ensure that adopted frameworks do not foreclose the opportunity for civil society and latecomer actors to get their perspectives on the table. Moreover, the multidimensional impact of digital technologies cuts across different policy issues managed by different governmental structures or international organisations. The lack of a global institution with a substantive mandate to develop a policy model or regulation of technology that is truly universal further complicates the efforts to adopt a global framework for cooperation.

Secondly, there has been a lack of consensus on critical, baseline issues. Taking the example of artificial intelligence, the first of these barriers has been the lack of consensus on such fundamental questions as its definition, the venue or process that is desirable for the governance of disruptive technologies, the authority and responsibility of the actors involved in regulation – including the role of the private sector and big tech – or the digital future (a more utopian or a more dystopian one?) that we imagine (Colomina, 2023). This absence of consensus is also visible in the lack of a shared understanding among different actors of how the basic foundations and principles of international law apply to the use of technologies. As such, there is a mismatch of focus and agreement on what we are regulating, which tools we have or should create, and which areas we should prioritise in global cooperation.

Thirdly, past efforts to adopt a global framework have failed given the diversity of interests, values and approaches to risk. Regulation faces an inherent tension between the promotion and defence of national interests and values, on the one hand, and the balancing of ethical issues, human rights and the protection of every citizen’s fundamental freedoms, on the other. In other words, it is a tension between protecting rights and promoting innovation. A clear example is the more consumer-oriented approach of EU technology regulation, which contrasts with the security- and control-focused Chinese model or the United States’ laissez-faire approach. According to Tiberghien, Luo and Pourmalek (2022), digital governance is fragmenting around the US, European, Chinese and Indian models – marked by multiple splits over the role of the state, data ownership, industrial innovation and competitiveness, and the protection of fundamental rights.

At the same time, there is a significant disparity in substantive participation among the actors involved in global digital cooperation. Developing countries, for example, still face significant digital divides and may lack the resources to participate successfully in some of these debates and initiatives, and are then forced to follow systems that do not fit their realities, concerns or needs. A similar trend is visible at the individual level, where non-experts, indigenous communities, women, young and elderly people, and people with disabilities are unable to join the discussions or may lack the capacity to participate in a meaningful way.

Fourthly, the most evident challenge is the growing trend towards the politicisation and securitisation of digital technologies and their intersection with the growing geopolitical rivalry between the United States and China. Together with the EU and India, these actors are bidding to achieve technological supremacy and dominate the standard-setting of these technologies in order to harvest the benefits of their development and use. In parallel, each jurisdiction is becoming wary of the risks stemming from data and digital technologies, prompting the adoption of more protectionist measures to achieve data sovereignty. The centrality of technology in this competition heavily constrains the capacity to reach a consensus on international standards, as each actor promotes contrasting approaches to regulating digital issues.

In conclusion, the lack of a coherent, global approach is unsettling the international order in digital governance and negatively impacting the delivery of effective and innovative solutions for the governance of digital and technological issues. This situation carries consequential risks, such as the splintering of the Internet or the incapacity to respond successfully to critical problems, given the failure to conduct a comprehensive, in-depth assessment of the multiple risks, vulnerabilities, and outcomes of digital and technological developments. Differing rules and regulations – as well as the existing gaps, for example around the military use of these technologies – can have deep impacts on governance and, as a result, on citizens’ lives. Paradoxically, guidelines and regulations are more needed than ever in the current context.

3. Towards effective global digital cooperation

Taking into consideration the challenges of establishing a set of shared values to guide technology development and deployment, global digital cooperation should be people-centred, transparent, open, ethical, inclusive, and equitable while keeping in mind the multi-level, multi-issue and multi-stakeholder nature of digital and tech governance.

Considering the current challenges and developments, the international community should focus on making progress in three different areas: 

Meaningful multi-stakeholderism

Recent digital advances show the tension and interplay between two different cultures of governance: a bottom-up multi-stakeholder approach – for example, in the open consultation processes adopted by the UN for the Global Digital Compact – and a top-down multilateral approach which gives primacy to the role of states. However, even in these multi-stakeholder initiatives, the current objective remains a multilateral solution for a better tomorrow, implying the subordination of multi-stakeholder processes to multilateral outcomes.

As a result, the emphasis must be placed on achieving meaningful multi-stakeholderism while upholding inclusivity and effective participation. Current efforts at regulating these technologies are being led and dominated by traditional technological powerhouses – such as the US, the EU or China – creating a highly specialised conversation among a limited number of countries and a small pool of big tech companies. Countries from the Global South are mostly absent or overlooked in ongoing regulatory processes. As such, adopted international agreements may not be suitable for non-Western realities.

Besides more geographically representative global cooperation, the different actors involved – governmental representatives, civil society actors, academia and the private sector – should have the opportunity to participate in and influence the conversations on an equal footing. The participation of diverse genders and generations and of underrepresented communities – including the most vulnerable populations, indigenous communities, and people with disabilities – must be ensured. This is especially relevant when it comes to youth participation, as the decisions taken today will ultimately define their future. Each of these groups can bring a unique perspective to the table and, through communication and trust-building measures, these initiatives can help build consensus and common understandings and identify shared challenges and risks. In conclusion, the governance of technology must incorporate democratic and participatory elements at both the national and international levels.

Ensuring interoperability across regulatory frameworks and enforcement

The current hyperactivity in the international landscape risks creating a patchwork approach with too many loopholes that allow easy forum shopping. As a result, the most urgent task at hand is coordination. Feedback loops should be established between ad hoc, regional and international initiatives to avoid duplicated, overlapping – and contradictory – efforts. As Internet governance is a cross-cutting issue, the current siloed governance should be connected in order to accurately address and respond to issues around digital technologies that cross borders, topics, rights, and regulations. As such, for a truly comprehensive and harmonised regulatory framework, intergovernmental processes and global multilateral forums should be aligned, with a clear division of labour and consistency in the rules that apply to the work of these forums.

Beyond ensuring policy coordination, two further concerns arising from current efforts are the interoperability of regulations and the consequent protection of citizens, who could be subject to different jurisprudential criteria depending on the applicable legislation. If international frameworks are built on shared values adopted by consensus, different jurisdictions should commit to following the international community’s lead while retaining enough flexibility to develop regimes tailored to their domestic environments. This can be further encouraged through capacity-building initiatives in the digital and cyber domains at the global level, using cooperation to provide countries with practical insights on regulation and implementation. Moreover, further collaboration that brings in legal expertise and knowledge will be necessary to support countries in transposing international agreements and standards into their own legislation, as well as in their implementation and enforcement.

Finally, a further challenge will be how to fulfil the promises made in regulations to safeguard rights effectively. Enforcement and sanctioning will be a requirement for the international community. As such, these international agreements need to become binding. The development of global, joint enforcement mechanisms and a sanctions framework for those who fail to comply should also be part of global digital cooperation debates and efforts.

Going beyond regulation

Besides the challenges of interoperability and enforcement, global digital cooperation should extend beyond regulation. While regulation is a fundamental first step, it is important to acknowledge that it is not enough on its own to produce the desired cooperation and risk mitigation around emerging technologies. Previous experiences, such as the GDPR, offer relevant insights into the limitations of regulation in prompting a shift in business models or in behaviour on the Internet. While the GDPR established clear obligations on the processing of personal data by operators, some have managed to circumvent or avoid these obligations. The €1.2 billion fine imposed on Meta for violating the GDPR’s data privacy rules is a clear example of how enforcement is not working as intended. As such, other creative and innovative approaches should be considered – including the establishment of a new, digital social contract.

Moreover, the unequal development and adoption of technologies around the world, and the uneven knowledge of these issues, require further research and the development of capacity-building actions. Sharing best practices, promoting training for public administrations and the private sector, and ensuring the exchange of knowledge will be essential to guarantee that the benefits of these technological changes are equally shared. Regulations should also be coupled with awareness-raising campaigns to ensure that citizens, users, and developers are aware of their rights and responsibilities under these new frameworks.

Additionally, given the unpredictable risks and impacts of these disruptive technologies, it is crucial to establish common safe spaces for experimental development, including sandboxes funded by public bodies. The deployment of these spaces can help identify and understand the risks of specific technologies in the early stages of the development process, and also test the effectiveness of regulations, making them useful tools for risk assessment. Adopting standards based on value-sensitive design and participatory approaches for assessing the impact of these technologies before they are deployed in the market will test their respect for human rights and limit their negative externalities.

Furthermore, global digital cooperation needs to provide global public goods and technological solutions for all. Government involvement can further enhance innovation by adopting the role of supporter, investor and early customer for these technological advances. As such, countries should invest in and develop open, shared digital public infrastructure – from computing power to democratically and justly governed data layers – to boost global digital connectivity and ensure it is accessible to entrepreneurs and citizens. More critically, technology transfer between developed, emerging, and developing countries will also be key to leveraging the opportunities of digital technologies and closing the digital divide.

Finally, one ambitious proposal concerns the need to establish new, effective and flexible institutions of global governance to manage the profound changes that digital technologies pose for our societies. From international agencies that monitor and verify compliance to global advisory bodies for truly multi-stakeholder and all-inclusive processes, public participation must be ensured in order to build the foundations of the future and take ownership of the governance of these unprecedented transformations.

References

Colomina, Carme. “Una IA ética: la UE y la gobernanza algorítmica”. CIDOB Opinion, 784, December 2023 (online). https://www.cidob.org/es/publicaciones/serie_de_publicacion/opinion_cidob/2023/una_ia_etica_la_ue_y_la_gobernanza_algoritmica

Fay, Robert. “Global Governance of Data and Digital Technologies: A Framework for Peaceful Cooperation”. Center for International Governance Innovation (CIGI), February 2022 (online). https://www.cigionline.org/articles/global-governance-of-data-and-digital-technologies-a-framework-for-peaceful-cooperation/

Tiberghien, Yves; Luo, Danielle and Pourmalek, Panthea. “Existential Gap: Digital/AI Acceleration and the Missing Global Governance Capacity”. Center for International Governance Innovation (CIGI), February 2022 (online). https://www.cigionline.org/articles/existential-gap-digitalai-acceleration-and-the-missing-global-governance-capacity/

United Nations (UN). Report of the Secretary-General’s High-level Panel on Digital Cooperation. Internet Governance Forum, 2019 (online). https://intgovforum.org/en/content/report-of-the-un-secretary-general%E2%80%99s-high-level-panel-on-digital-cooperation

UN. Report of the Secretary-General: Roadmap for Digital Cooperation. New York: United Nations, May 2020 (online). https://www.un.org/techenvoy/sites/www.un.org.techenvoy/files/general/Roadmap_for_Digital_Cooperation_9June.pdf

UN. Report of the Secretary-General: Our Common Agenda. New York: United Nations, 2021 (online). https://www.un.org/en/content/common-agenda-report/assets/pdf/Common_Agenda_Report_English.pdf

Wheeler, Tom. “The three challenges of AI regulation”. Brookings Commentary, June 2023 (online). https://www.brookings.edu/articles/the-three-challenges-of-ai-regulation/