
Aix-Marseille University
6 Projects
Project (2015 - 2019). Partners: University of Nottingham, Max Planck Institutes, Nottingham Uni Hospitals NHS Trust, Aix-Marseille University, NTU, GU. Funder: UK Research and Innovation. Project Code: MR/M022722/1. Funder Contribution: 556,941 GBP.
Information about the external and internal world is conveyed to the brain by an extensive system of sensory nerves. The skin contains multiple types of sensory receptors/nerves which inform the brain about events occurring on the body surface. Many very basic questions about the neuroscience underlying human tactile processing remain unanswered. We aim to use recent advances in the neuroimaging techniques of ultra-high field functional magnetic resonance imaging (UHF-fMRI) and magnetoencephalography (MEG), in conjunction with nerve recording (microneurography) and stimulation techniques (intraneural microstimulation, INMS), to provide novel insight into the brain mechanisms involved in operating the sense of touch in humans. Using fMRI, we can measure changes in the local blood flow that occur with increased neural activity. These changes cause an increase in the signal intensity of the MR image in the part of the brain that is active. This means that we can measure, for example, which parts of the brain are more active while subjects feel an object touch their finger. One of the problems we face when studying the mechanisms underlying our sense of touch is that the changes in signal intensity are relatively small. We have overcome this problem by using a very high field magnetic resonance scanner, which allows us to measure robust neural responses to touch, non-invasively, with much higher spatial resolution than has previously been possible, and we can now obtain robust activation maps of individual participants' brains.
This makes UHF-fMRI a very attractive tool for clinical applications. MEG is another non-invasive neuroimaging technique that offers a way to probe the temporal aspects of somatosensory processing. The technique of microneurography allows unprecedented access to the earliest stages of information transfer to the brain: a very fine needle is inserted through the skin into an underlying nerve, so that the messages the nerve sends to the brain can be heard and seen in the recording. A step further is to electrically stimulate a single nerve fibre with a very small current, using the technique of INMS, so that a person feels touch when there is no actual skin stimulus. Combining INMS with UHF-fMRI and MEG neuroimaging will allow us to reveal the representation of single sensory nerves in the brain. In this project, we will use these cutting-edge techniques and take a multidisciplinary approach, combining expertise in MRI, neuroscience, neurophysiology and neurology, to improve our understanding of sensory pathways. Specifically, we will use UHF-fMRI to map carefully the detailed anatomy and function of the somatosensory cortex, and will use MEG to characterize the temporal dynamics of brain responses to tactile stimulation of the skin. We will develop a new MR- and MEG-compatible device to perform INMS in the UHF-fMRI scanner and the MEG scanner. The use of fMRI during INMS will allow us to map the brain's response to single sensory afferents (in contrast to vibration, which stimulates multiple sensory receptors of various types). We will also apply these methods to measure alterations in the somatosensory pathways in patient groups with neuropathologies. Specifically, we will study Focal Hand Dystonia and Carpal Tunnel Syndrome, and assess how somatosensory processing is altered by therapeutic interventions.
Overall, this research will considerably advance our understanding of human somatosensation and perception and will be relevant to a wide range of clinical disorders related to neurotraumatic injury, neurology, neurodevelopment, neurodegeneration, neuropathology, pharmaceutical interventions and pain.
Project (2014 - 2019). Partners: University of Birmingham, ASTRAZENECA UK LIMITED, UF, Florida State University, Birmingham Childrens Hospital NHS FT, Aix-Marseille University, Thermo Fisher, Thermo Electron Corporation, Texas A & M University, Waters UK, Owlstone Limited, AstraZeneca plc, Advion Ltd, National Physical Laboratory (NPL), UT System. Funder: UK Research and Innovation. Project Code: EP/L023490/1. Funder Contribution: 1,484,530 GBP.
The aim of the research is to develop novel approaches for the analysis of biomolecules, and in particular proteins, directly from their natural (or actual) environment, i.e., to develop approaches for in situ biomolecular analysis. Proteins are the work-horses of the cell and perform all the functions required for life. They also find uses as therapeutics and in consumer products. To gain insight into the various and specific roles of proteins in life processes, to determine the therapeutic efficacy of protein drugs, or to establish the environmental fate of protein additives in consumer products, it is necessary to be able to analyse proteins at a molecular level. Mass spectrometry, in which ionised molecules are characterised according to their mass-to-charge ratio, is ideally suited to this challenge, offering high sensitivity, broad specificity (all molecules have a mass), and the capability for chemical structure elucidation. The ultimate goal is to link molecular analysis directly to molecular environment.
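As an illustrative aside (not part of the project description), the mass-to-charge ratio mentioned above can be made concrete: an electrosprayed protein of neutral mass M carrying z protons is observed at m/z = (M + z·m_p)/z, where m_p ≈ 1.00728 Da is the proton mass. The protein mass used below (ubiquitin, monoisotopic mass ≈ 8559.6 Da) is a standard textbook example, not a value from this project:

```python
PROTON_MASS = 1.00728  # Da; mass added per protonation

def mz(neutral_mass: float, charge: int) -> float:
    """Observed m/z for a molecule of the given neutral mass carrying `charge` protons."""
    return (neutral_mass + charge * PROTON_MASS) / charge

# Charge-state series for ubiquitin (monoisotopic mass ~8559.6 Da):
# each charge state appears at a different m/z in the spectrum.
for z in range(8, 13):
    print(f"z={z}: m/z = {mz(8559.6, z):.2f}")
```

This is why a single protein produces a whole series of peaks in an electrospray spectrum: one peak per charge state, all consistent with the same neutral mass.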
Much like a forensics officer tasked with determining the presence of an illicit substance, there is much greater reliability and credibility afforded to an analysis performed at the scene of the crime than to one performed following removal of the sample to a separate location and alternative surroundings. Growing evidence suggests in situ protein analysis has groundbreaking roles to play in biomarker discovery, diagnosis and early detection of disease, targeting of therapeutics (personalised medicine) and assessment of therapeutic efficacy. The benefits of in situ protein analysis can be illustrated by considering a thin tissue section through a drug-treated tumour. In principle, in situ analysis would inform on drug-target interactions (i.e., is the drug binding to the correct protein?). Moreover, with in situ protein analysis the capacity for artefact introduction as a result of sample preparation (e.g., application of a matrix) or sample damage is eliminated. Nevertheless, a number of challenges exist. Proteins are large molecules associated with a vast array of chemical modifications, and which form loosely-bound complexes with themselves, other proteins and other molecule types. It is not only their chemical structure but also their overall 3-D structure which dictates their function. Other molecular classes that are hugely important in biological processes also have an intricate relationship with proteins. Any in situ mass spectrometry approach needs to be able to meet these analyte-driven challenges, i.e., it must be capable of (a) measuring proteins and characterising any modifications, (b) detecting protein complexes and determining their constituents, (c) providing information on 3-D structure, and (d) detecting other relevant molecular classes. Moreover, there are technique-driven challenges for in situ analysis, including inherently high sample complexity and wide-ranging concentrations, and opportunities for quantitation.
The research will meet these challenges by developing a newly emerging in situ approach, liquid extraction surface analysis mass spectrometry, in combination with two complementary types of ion mobility spectrometry (which can either provide information on 3-D structure, or separate ionised molecules in the mass spectrometer on the basis of their 3-D shape) and a structural elucidation strategy known as electron-mediated dissociation mass spectrometry. The research will be undertaken primarily at the University of Birmingham in the Advanced Mass Spectrometry Facility in the School of Biosciences and the School of Chemistry mass spectrometry facility. The programme involves a number of academic and industrial collaborators and additional research will be carried out during scientific visits to National Physical Laboratory (NPL), Thermo Fisher Scientific, Waters, Owlstone, Florida State University, Texas A&M University and Université d'Aix-Marseille.
Project (2010 - 2011). Partners: Aix-Marseille University, Universites d'Aix-Marseille Paul Cezanne, Swiss Federal Institute of Technology, University of Glasgow, University of Southampton, EPFZ, Gold Standard Simulations (GSS), UCL, Institute of Material Sciences Barcelona. Funder: UK Research and Innovation. Project Code: EP/I004084/1. Funder Contribution: 712,368 GBP.
Computers and electronic gadgets, such as the iPhone, have transformed modern life. The silicon transistor is at the core of this revolution, having been continuously made faster and smaller over the last forty years. In a chip, millions of them are squeezed into an area the size of a pinhead, switching a billion times in one second. Transistor size has now reached nanometre dimensions; one nanometre is only ten times larger than an atom. Moore's law, which dictates that transistor size halves every two years and has been the driving force behind the success of the electronics industry, has come to a halt. The happy and easy days of transistor scaling are now gone. Quantum mechanical laws conspire against transistor function, making it leak when switched off and generating poor electrical control. Also, our inability to control the precise atomic structure of interfaces and chemical composition during fabrication makes transistors less predictable. Hence semiconductor companies are searching for alternative, non-planar (multigate) transistor architectures and novel devices such as nanowires, nanotubes, graphene and molecular transistors, which will ultimately break through the nano-size barrier, resulting in a completely new era of miniaturization.
There is a significant gap between our ability to fabricate transistors and to predict their behaviour. The simulation and prediction of the silicon transistor has become a vital mission. The current planar transistor architecture presents serious scalability problems regarding leakage and controllability. Transistors of nanometre dimensions are more vulnerable to the atomic nature of matter than their cousins of micrometre dimensions. Furthermore, at the nanoscale, heat transfer is a source of heat death for novel transistor applications due to the decrease of thermal conductivity. Within this context, I propose to develop a Quantum Device simulator with atomic resolution that will enable the accurate prediction of present and future transistor performance. The simulator will deploy a quantum wave description of electron propagation, treating the interaction of electrons with crystal lattice vibrations (heat) at a fully quantum mechanical level. It will have the capability of describing the interactions of electrons with the roughness of the semiconductor/dielectric interface and with each other under the effect of a high electric field. Devices will be properly tested and optimised regarding materials, chemical composition and geometry, without the high costs implicit in fabrication. A wide range of transistors will be explored: planar, non-planar and novel. This is timely, as existing computer design tools lack predictive capabilities at the nanoscale and the industrial build-and-test approach has become prohibitively costly. Efficient quantum models, algorithms, methodologies and tools will be developed. These are dynamic times as device dimensions move closer to the realm of atoms, which are inherently uncontrollable. In this regime two streams collide, the classical and quantum worlds, making the need for new regularities and patterns vital as we strive to conquer nature at this scale.
This offers exciting opportunities to merge an engineering top-down approach with a physics bottom-up approach. As 21st-century environmental concerns rise, the need for greener technology is increasing. My proposal addresses lower power consumption, reductions in raw materials while delivering more functionality, and a cheaper way to assess new design technologies. Collectively, these will help companies to provide a greener alternative to consumers.
Project (2020 - 2023). Partners: Bordeaux INP, City, University of London, Aix-Marseille University, University of London. Funder: UK Research and Innovation. Project Code: EP/T018313/1. Funder Contribution: 249,526 GBP.
The proposed research lies at the interface of the areas of verification and machine learning, whose interactions are currently attracting a lot of attention and hold potentially huge benefits for both sides. Verification is the domain of computer science aiming at checking and certifying computer systems. Computer systems are increasingly used at all levels of society and people's lives, and it is paramount to verify that they behave the way they are designed to and that we expect (examples of crucial importance, among many others, are embedded software for aeroplane autopilots or self-driving cars). Unfortunately, the verification of complex systems encounters limits: there is no universal, fully automated way to verify every system, and one needs to find a good trade-off between the constraints of time, memory space and accuracy, which are often difficult to overcome. Machine learning has been studied since the 1950s and regained much attention recently with breakthroughs in speech recognition, image processing and game playing. The development of neural networks (studied since the 1960s) earned Hinton, LeCun and Bengio the 2019 Turing Award, and using deep learning, the British firm DeepMind developed its successful AlphaGo and AlphaGo Zero, which were impressive steps forward and reaffirmed the amazing potential of machine learning. This project proposes to apply learning techniques in verification, both to improve the efficiency of some algorithms which certify computer systems and to compute fast, accurate models for real-life systems.
Automata are one of the mathematical tools used in verification to model computer or real-life systems. Giving certifications on these systems often boils down to running some algorithms on the corresponding automata. The efficiency of such algorithms usually depends on the size of the considered automaton. Minimising automata is thus a paramount problem in verification, as a way to verify large computer or real-life systems faster. This proposal aims at studying the minimisation of some streaming models of quantitative automata using machine learning techniques. The kinds of automata we are going to focus on are streaming models, in the sense that the input is not stored but received as a stream of data and dealt with on the fly, making them particularly suitable for the treatment of big data. They are also suited to optimisation problems such as minimising the resource consumption of a system or computing the worst-case running time of a program, for example. Minimising this kind of automaton is highly challenging and linked to the long-standing open problem of the determinisation of max-plus automata. This proposal gives several directions of research, such as using learning methods to tackle it.
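The max-plus automata mentioned above can be made concrete with a small sketch (the state names, weights and helper function below are illustrative, not taken from the project): a max-plus automaton assigns to each input word the maximum, over all runs, of the sum of the transition weights along that run, and this value can be computed in streaming fashion with a single pass, keeping one best weight per state:

```python
NEG_INF = float("-inf")

def maxplus_run(n_states, init, final, delta, word):
    """Evaluate a max-plus automaton on `word` in streaming fashion.

    init/final: dicts state -> initial/final weight (absent = -inf)
    delta: dict (state, letter) -> list of (next_state, weight) transitions
    Maintains, per state, the best (max) total weight of any run reaching it.
    """
    best = [init.get(q, NEG_INF) for q in range(n_states)]
    for letter in word:  # one pass; the word itself is never stored
        nxt = [NEG_INF] * n_states
        for q, w in enumerate(best):
            if w == NEG_INF:
                continue
            for q2, dw in delta.get((q, letter), []):
                nxt[q2] = max(nxt[q2], w + dw)
        best = nxt
    return max((w + final.get(q, NEG_INF) for q, w in enumerate(best)),
               default=NEG_INF)

# Example: the value of a word is the length of its longest block of 'a's.
# State 0: before the counted block; 1: inside it; 2: after it.
delta = {
    (0, "a"): [(0, 0), (1, 1)],  # skip this 'a', or start counting here
    (0, "b"): [(0, 0)],
    (1, "a"): [(1, 1), (2, 0)],  # extend the block, or stop counting
    (1, "b"): [(2, 0)],          # block ends
    (2, "a"): [(2, 0)],
    (2, "b"): [(2, 0)],
}
print(maxplus_run(3, {0: 0}, {0: 0, 1: 0, 2: 0}, delta, "aabaaab"))  # -> 3
```

The nondeterminism does real work here: the automaton guesses which block of 'a's to count, and the max over runs picks the longest one. The one-pass evaluation matches the streaming setting described above; it is the *minimisation* of such automata, not their evaluation, that is the hard open problem.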
Project (2024 - 2026). Partners: University of Central Lancashire, Aix-Marseille University. Funder: UK Research and Innovation. Project Code: EP/Y003489/1. Funder Contribution: 140,287 GBP.
Microrobotic systems have the potential to revolutionise medicine and treatment in many applications, including highly localized drug delivery, cancer therapies such as hyperthermia and brachytherapy, minimally invasive surgery, and cell transportation. Swimming microrobots, for example, are tiny machines that swim in the body's intravascular or interstitial fluids to perform biomedical operations. The application of microrobots in medicine requires a delicate, multidisciplinary investigation. A current challenge in developing autonomous systems is to provide power and control for the microrobots. Magnetic actuation remains the most practical way for untethered powering and control of microrobots, as it transfers power for movement and enables guidance for delivery. However, magnetic microrobots so far afford insufficient functionality to accomplish the foreseen tasks, lacking, for example, the ability to sense their environment, make real-time decisions, and induce desired changes. Further, the effectiveness and flexibility of magnetic actuation drastically declines when controlling a team of microrobots, since magnetic actuation provides an identical driving force for all devices in the team, which makes the control of individual robots highly complicated. Additionally, artificial microrobots lack the ability to sense the status of their mates in the group. The amount of medicine that can be carried by a single microrobot is not sufficient for an effective treatment; hence, a large group of microrobots should be utilised.
Alternatively, harnessing microorganisms, especially bacteria, as intelligent tiny robots provides a novel strategy for cargo delivery at the micro-/nanoscale owing to several advantages. Bacteria are small, swim fast and contain a system of sensors and actuators that automatically responds to environmental stimuli. The feasibility of biomicrorobotic systems has been studied in recent works. For example, drug-loaded microparticles have been attached to bacteria and driven to the target position. We propose, for the first time, a biohybrid system composed of a magnetic microrobot (a "master") and groups of bacteria ("followers"), to benefit from the subtle sensory system of bacteria and their collective motility, along with the precise magnetic navigation of the synthetic microrobot. In this system, a magnetic synthetic swimmer is functionalised with a chemical and controlled so as to trigger the tactic response of bacteria. The bacteria are attracted towards the chemical because of this tactic response. Therefore, the microrobot can lead them and direct them to a target position to deliver medicine. This research overcomes the insufficient functionality and communication of solitary artificial microrobots, challenges in artificial microrobotic swarm control, and instability and inaccuracy in the navigation of bacteria as bio micro/nanorobots. Chemotaxis- and phototaxis-based motion of bacteria in response to the precisely controlled microrobot creates a platform that can carry a large-volume cargo and has more sensitive local sensing (e.g., biochemical and light) to reach specific microenvironments and perform intricate applications in biomedicine and nanotechnology. This research would help efforts in creating a reliable microengineered system with multiple functions, including propulsion, sensing, guidance, cargo delivery and operation, moving micro/nanorobotic technology toward clinical trials more rapidly.