Powered by OpenAIRE graph

DETECTOR

Deepfake Evidence and Technology for Forensic Content Oversight and Research
Funder: European Commission
Project code: 101225942
Call for proposal: HORIZON-CL3-2024-FCT-01
Funded under: HE | HORIZON-RIA
Overall budget: 4,489,410 EUR
Funder contribution: 4,489,410 EUR


Description

AI is transforming law enforcement, offering new tools for policing but also enabling advanced criminal tactics that challenge traditional methods. The global nature of crime, including cyber threats, trafficking, and terrorism, calls for innovative solutions as law enforcement agencies (LEAs) face vast data volumes and increasingly sophisticated criminal activity.

AI has also raised serious concerns through deepfakes: highly realistic but fabricated audio, video, or text that can depict individuals saying or doing things they never did. Deepfakes threaten politics, the economy, and social trust. Examples include fabricated videos of political figures and voice-cloned audio used for financial fraud, often spread through social networks to deceive and defraud at scale. Forensic institutes and courts struggle to distinguish authentic evidence from AI fabrications, especially in cases involving national security. Despite promising detection research, existing methods fall short: current models rely on limited, non-diverse datasets and produce results with limited legal admissibility.

The DETECTOR initiative aims to address these challenges, supporting LEAs and forensic experts in analyzing altered media. It offers an integrated solution through cross-border collaboration among AI researchers, LEAs, forensic scientists, legal experts, and ethicists. DETECTOR's goals include developing specialized tools for detecting media manipulation, creating comprehensive datasets, researching cross-border exchange of digital evidence, engaging stakeholders, informing policymakers, and training forensic experts in digital media and AI. Through these efforts, DETECTOR seeks to safeguard the authenticity of digital evidence and strengthen forensic capabilities against AI-driven media manipulation across Europe.



