This project is a direct response to significant changes taking place in the domain of computing and the arts. Recent developments in artificial intelligence and machine learning are revolutionising how music and art are created by researchers (Broad and Grierson, 2016). However, this technology has not yet been integrated into software aimed at creatives: because of the complexity of machine learning and the lack of usable tools, such approaches remain accessible only to experts. To address this, we will create new, user-friendly technologies that enable lay users - composers as well as amateur musicians - to understand and apply these new computational techniques in their own creative work. The potential for machine learning to support creative activity is growing rapidly, both in terms of creative understanding and potential applications. Emerging work in music and sound generation extends from musical robots to generative apps, and from advanced machine listening to devices that can compose in any given style. By leveraging the internet as a live software ecosystem, the proposed project examines how such technology can best reach artists and live up to its potential to fundamentally change creative practice in the field. Rather than focussing on the computer as an original creator, we will create platforms where the newest techniques can be used by artists as part of their day-to-day creative practices.

Current research in artificial intelligence, and in particular machine learning, has led to an incredible leap forward in the performance of AI systems in areas such as speech and image recognition (Cortana, Siri, etc.). Google and others have demonstrated how these approaches can be used for creative purposes, including the generation of speech and music (DeepMind's WaveNet and Google's Magenta), images (Deep Dream) and game intelligence (DeepMind's AlphaGo).
The investigators in this project have been using deep learning - Convolutional Neural Networks (CNNs), Recurrent Neural Networks (RNNs), Long Short-Term Memory networks (LSTMs) and other approaches - to develop intelligent systems that artists can use to create sound and music. We are already among the first in the world to create reusable software that can 'listen' to large amounts of sound recordings and use them as examples from which to generate entirely new recordings at the level of raw audio. Our systems produce outcomes that outperform many previously funded research outputs in these areas. In this three-year project, we will develop and disseminate creative systems that musicians and artists can use in the creation of entirely new music and sound, and we will show how such approaches can affect the future of other forms of media, such as film and the visual arts. We will do so by developing a creative platform on the most accessible public forum available: the World Wide Web. We will achieve this through the development of a high-level live coding language for novice users, with simplified metaphors for understanding complex techniques, including deep learning. We will also release the machine learning libraries we create for more advanced users who want to incorporate machine learning into their own creative tools. The project will involve end-users throughout - graduate students, professional artists, and participants in online learning environments - and we will disseminate our work early, gaining the essential feedback required to deliver a solid final product and outcome. The efficacy of such techniques has been demonstrated by systems such as Sonic Pi and Ixi Lang, within a research domain already supported by the AHRC through the Live Coding Network (AH/L007266/1) and by the EC through the H2020 project RAPID-MIX.
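As an illustration of the kind of technique referred to above - and not code from the project itself - the core recurrence of an LSTM, the network family used for sample-level audio modelling, can be sketched in a few lines of NumPy. All names, shapes and weights here are hypothetical; a real audio model would stack trained layers of such cells and predict the next sample at each step:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h, c, W, U, b):
    """One LSTM time step.

    x: current input vector (e.g. the current audio sample),
    h, c: hidden and cell states carried between time steps,
    W, U, b: stacked weights for the four gates (shapes (4n, d), (4n, n), (4n,)).
    """
    n = h.shape[0]
    z = W @ x + U @ h + b          # pre-activations for all four gates at once
    i = sigmoid(z[:n])             # input gate: how much new information to admit
    f = sigmoid(z[n:2 * n])        # forget gate: how much old cell state to keep
    o = sigmoid(z[2 * n:3 * n])    # output gate: how much of the cell to expose
    g = np.tanh(z[3 * n:])         # candidate cell update
    c_new = f * c + i * g          # blend old memory with new candidate
    h_new = o * np.tanh(c_new)     # hidden state read out from the cell
    return h_new, c_new

# Toy run: feed a short sine waveform through one cell with random weights.
rng = np.random.default_rng(0)
n, d = 4, 1                        # hidden size and input size (hypothetical)
W = rng.standard_normal((4 * n, d))
U = rng.standard_normal((4 * n, n))
b = np.zeros(4 * n)
h, c = np.zeros(n), np.zeros(n)
for sample in np.sin(np.linspace(0.0, 1.0, 8)):
    h, c = lstm_step(np.array([sample]), h, c, W, U, b)
```

The gating structure is what lets such networks retain context over long audio sequences; generative use repeats this step, sampling each new audio value from the network's output and feeding it back in as the next input.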
Finally, this research will strongly contribute to dialogues surrounding the future of music and the arts, consolidating the UK's leadership in these fields.