
American Civil Liberties Union
2 Projects
Project (2021 - 2025)
Partners: Home Office, University of Edinburgh, Ada Lovelace Institute, American Civil Liberties Union, Royal United Services Institute, The Home Office, Fair Trials International, UNICRI
Funder: UK Research and Innovation
Project Code: MR/T041552/1
Funder Contribution: 975,068 GBP

Terrorist risks and threats are increasingly identified and countered through new forms of data analytics made possible by rapid advances in machine learning (ML) and artificial intelligence (AI). Private actors, including social media platforms, airlines and financial institutions, now actively collaborate with states and international organisations (IOs) to implement ambitious data-led security projects to support global counterterrorism efforts. The UN Security Council (UNSC) has called on all states to intensify the exchange of information about suspected terrorists by building watchlists and sharing biometric data, using ML to predictively identify 'future terrorists' in advance. Social media platforms are using AI to detect extremist content online and regulate global data flows on an unprecedented scale. Passenger data from the aviation industry is analysed to identify suspicious 'patterns of behaviour' and control the movements of risky travellers. Financial data is mined by banks to spot suspicious transactions and terrorist 'associations'. These changes are all putting new and far-reaching global information infrastructure projects into motion.

Yet the implications of these shifts for how international law is practiced, how global security threats are known, and how powerful actors are held accountable remain uncertain. The data infrastructures underlying global governance have been largely neglected in legal scholarship. And whilst the potential problems that AI poses (discrimination and privacy violations) are becoming clearer, solutions remain elusive - especially in the security domain, where secrecy is key and the inner workings of algorithms are 'black-boxed' even more than usual. Regulatory theorists argue that we urgently need to 'expand our frame of rights discourse to encompass our socio-technical architecture' to respond to the accountability challenges of AI (Yeung 2019). Data infrastructures, in other words, might provide the basis for reimagining how information and rights could be reconnected in our digital present.

This project rethinks global security law from the 'infrastructure space' it is creating, focusing on (i) countering terrorism online and (ii) controlling the movements of 'risky' individuals. My hypothesis is that the most far-reaching changes to global security governance are not being written in the language of international law, or created through the formal powers of states and IOs, but built through new socio-technical infrastructures and the expertise they are enabling. Data infrastructures are critical for understanding how rights might be extended through AI. I develop the concept of 'infra-legalities' (or, the regulatory effects of data infrastructures) to analyse these shifts and develop a new approach for studying international law and regulation in the age of algorithmic global governance. Infrastructure is usually disregarded as an invisible substrate on which powerful actors act. It is rarely seen as something through which knowledge and governance can be created and shaped.
Drawing from Science and Technology Studies, computer science and security studies, this project performs what Bowker and Star (1999) call an 'infrastructural inversion' by mapping the seemingly mundane governance work of data infrastructures in this domain. By 'following the data' - and tracing the socio-technical relations, norms, knowledge practices and power asymmetries that security infrastructures are enacting - a different method of studying global governance can emerge. States, IOs and tech platforms are all calling for the ethical development of AI. Different regulatory approaches are proposed with no consensus on how to mitigate the adverse effects of AI whilst embracing its vast potentialities. Studying the infra-legalities of global security law opens space for addressing these challenges and shaping current policy debates on security, trust and accountability in the age of AI and automation.
Project (2025 - 2028)
Partners: FSU, University of Copenhagen, The Alan Turing Institute, KCL, University of Edinburgh, American Civil Liberties Union, Tech Against Terrorism (UK), University of Chicago, UNSW Sydney, New York University, Ada Lovelace Institute, Meta, HOME OFFICE, Bellingcat
Funder: UK Research and Innovation
Project Code: MR/Z00036X/1
Funder Contribution: 594,294 GBP

Terrorist risks and violent extremist threats are increasingly identified and governed via new forms of data analytics and automated decision-making (ADM) made possible by advances in machine learning (ML) and artificial intelligence (AI). Private actors, including social media platforms, airlines and financial institutions, are collaborating with states, global bodies and international organisations (IOs) to implement ambitious data-led global counterterrorism projects. The UN Security Council, for example, has called on all states to intensify the exchange of information about suspected terrorists by building and sharing watchlists and analysing biometric data, bolstering the use of ML to predictively identify 'future terrorists' in advance. Social media platforms are using AI to detect extremist content online and regulate global data flows on an unprecedented scale. Passenger data from the aviation industry is analysed to identify suspicious 'patterns of behaviour' and control the movements of risky travellers. Financial data is mined by banks to spot suspicious transactions and terrorist 'associations'. These changes are putting new and far-reaching global security infrastructure projects into motion.

Yet the implications of these shifts for how international law is practiced, how global security threats are known, and how powerful actors are held accountable remain uncertain. The data infrastructures underlying global governance have been largely neglected in legal scholarship. And whilst the potential problems that AI poses (discrimination and privacy violations) are becoming clearer, solutions remain elusive - especially in the security domain, where secrecy is key and the inner workings of algorithms are 'black-boxed' even more than usual. Regulatory theorists argue that we urgently need to 'expand our frame of rights discourse to encompass our socio-technical architecture' to respond to the accountability challenges of AI (Yeung 2019). Studying global security infrastructures in action might help us reimagine how data, security and rights could be reconnected in our digital present.

This project rethinks global security law and governance from the 'infrastructure space' it is creating. It focuses on three areas: (i) digital bordering infrastructures for controlling the cross-border movements of 'risky' people (Work Package 1); (ii) platform infrastructures for moderating terrorist and violent extremist content online (Work Package 2); and (iii) counterterrorism watchlisting infrastructures (Work Package 3). The project contends that the most far-reaching changes to global security governance are not being written in the language of international law or created through the formal powers of states and IOs but built through new socio-technical infrastructures and the data-driven security expertise they are enabling.
I use the concept of 'infra-legalities' (or, the co-productive effects of data infrastructure, law and regulation) to analyse these shifts and develop a novel approach for studying international law and regulation in the age of algorithmic global governance. Infrastructure is often disregarded as an invisible substrate on which powerful actors act, but it helps create and shape power, knowledge and governance. Drawing from Science and Technology Studies, computer science, critical data studies and critical security studies, this project performs what Bowker and Star (1999) call an 'infrastructural inversion' by mapping the seemingly mundane governance work of AI-driven global security infrastructures. By 'following the data' - and tracing the socio-technical relations, norms, knowledge practices and power asymmetries that security infrastructures are enacting - a different method of studying global governance can emerge. Studying the infra-legalities of global security opens space for addressing key challenges and shaping policy debates on security, responsibility and accountability in the age of AI and automation.