
HOME OFFICE
2 Projects
Project (2024 - 2027)
Partners: Refugee Council, HOME OFFICE, British Red Cross, University of Brighton, Greater Manchester Immigration Aid Unit
Funder: UK Research and Innovation
Project Code: ES/Z50371X/1
Funder Contribution: 459,926 GBP

Last year, the UK received 5,152 applications for asylum from unaccompanied children: children under 18 who arrived seeking asylum without a parent or legal guardian to care for them. We refer to this small but vulnerable group as UCYP: Unaccompanied Children and Young People. Their vulnerability is underscored by reports that many UCYP housed in hotels since July 2021 have gone missing. Research has also repeatedly found poor mental health among UCYP, often linked to post-migration factors. There is an urgent need for a deeper understanding of the day-to-day lives of UCYP, to improve their welfare, and to reduce risks of harm. Critical to this is understanding how UCYP engage with digital technology. Excessive screen time and social media use are affecting the mental health of today's youth, and young refugees increasingly rely on digital technology: it is a key tool for meeting their needs during their flight, for establishing and maintaining social connections, and for integrating in their new country. However, it also exposes them to risks. Investigating the digital worlds of UCYP is important for examining safeguarding risks, but it is also crucial to investigate how their engagement with digital tools relates to their sense of belonging, social integration and wellbeing. This understanding will enable those responsible for their care to better support and safeguard their wellbeing. We will investigate this in a multidisciplinary, participatory mixed-methods project.
The research takes place throughout the UK and adopts a longitudinal approach so that the social networks and wellbeing of UCYP can be tracked over time. It addresses the following questions:

RQ1: How do UCYP in the UK engage with and experience digital technology, and how does this change over time and across context and place?
RQ2: How does UCYP's engagement with the digital world link to their social networks (online, offline, in the UK, and elsewhere), their sense of belonging, social risks, and their wellbeing?
RQ3: How can services and stakeholders better support and protect the wellbeing of UCYP while they navigate the intersection of child protection and immigration control in an increasingly digital world?

The project uses a participatory mixed-methods design: we collaborate with organisations supporting UCYP, and four UCYP will have pivotal roles in all stages of the research process as co-researchers, with the Project Leads and Research and Innovation Associate (RIA) providing training and support throughout. Data collection consists of:

- Life-mapping interviews with UCYP in Brighton and Manchester (N = 20) to gain biographical and visual information about how UCYP use digital technology over time, and how this links to their social connections and their wellbeing at various stages of their lives (links to RQ1);
- A longitudinal online survey of 200 UCYP throughout the UK, available in multiple languages, to measure relationships between variables such as their digital technology use, online/offline social capital, sense of belonging, and wellbeing (links to RQ2);
- Six group workshops with 15 UCYP each, in Brighton and Manchester, where UCYP will discuss their own experiences, reflect on and add to the findings of the above, and discuss dissemination (links to RQ3).

The UK Home Office, Refugee Council, British Red Cross, organisations supporting UCYP, and academics are represented on an Advisory Group to advise on the research and ensure the widest dissemination and impact.
The project will finish with a policy-focused conference.
For further information contact us at helpdesk@openaire.eu
Project (2025 - 2028)
Partners: FSU, University of Copenhagen, The Alan Turing Institute, KCL, University of Edinburgh, American Civil Liberties Union, Tech Against Terrorism (UK), University of Chicago, UNSW Sydney, New York University, Ada Lovelace Institute, Meta, HOME OFFICE, Bellingcat
Funder: UK Research and Innovation
Project Code: MR/Z00036X/1
Funder Contribution: 594,294 GBP

Terrorist risks and violent extremist threats are increasingly identified and governed via new forms of data analytics and automated decision-making (ADM) made possible by advances in machine learning (ML) and artificial intelligence (AI). Private actors, including social media platforms, airlines and financial institutions, are collaborating with states, global bodies and international organisations (IOs) to implement ambitious data-led global counterterrorism projects. The UN Security Council, for example, has called on all states to intensify the exchange of information about suspected terrorists by building and sharing watchlists and analysing biometric data, bolstering the use of ML to predictively identify 'future terrorists' in advance. Social media platforms are using AI to detect extremist content online and regulate global data flows on an unprecedented scale. Passenger data from the aviation industry is analysed to identify suspicious 'patterns of behaviour' and control the movements of risky travellers. Financial data is mined by banks to spot suspicious transactions and terrorist 'associations'. These changes are putting new and far-reaching global security infrastructure projects into motion. Yet the implications of these shifts for how international law is practised, how global security threats are known, and how powerful actors are held accountable remain uncertain.
The data infrastructures underlying global governance have been largely neglected in legal scholarship. And whilst the potential problems that AI poses (such as discrimination and privacy violations) are becoming clearer, solutions remain elusive - especially in the security domain, where secrecy is key and the inner workings of algorithms are 'black-boxed' even more than usual. Regulatory theorists argue that we urgently need to 'expand our frame of rights discourse to encompass our socio-technical architecture' to respond to the accountability challenges of AI (Yeung 2019). Studying global security infrastructures in action might help us reimagine how data, security and rights could be reconnected in our digital present. This project rethinks global security law and governance from the 'infrastructure space' it is creating. It focuses on three areas: (i) digital bordering infrastructures for controlling the cross-border movements of 'risky' people (Work Package 1); (ii) platform infrastructures for moderating terrorist and violent extremist content online (Work Package 2); and (iii) counterterrorism watchlisting infrastructures (Work Package 3). The project contends that the most far-reaching changes to global security governance are not being written in the language of international law or created through the formal powers of states and IOs, but built through new socio-technical infrastructures and the data-driven security expertise they are enabling. I use the concept of 'infra-legalities' (that is, the co-productive effects of data infrastructure, law and regulation) to analyse these shifts and develop a novel approach for studying international law and regulation in the age of algorithmic global governance. Infrastructure is often disregarded as an invisible substrate on which powerful actors act, but it helps create and shape power, knowledge and governance.
Drawing from Science and Technology Studies, computer science, critical data studies and critical security studies, this project performs what Bowker and Star (1999) call an 'infrastructural inversion' by mapping the seemingly mundane governance work of AI-driven global security infrastructures. By 'following the data' - and tracing the socio-technical relations, norms, knowledge practices and power asymmetries that security infrastructures are enacting - a different method of studying global governance can emerge. Studying the infra-legalities of global security opens space for addressing key challenges and shaping policy debates on security, responsibility and accountability in the age of AI and automation.
For further information contact us at helpdesk@openaire.eu