
Laboratoire d'informatique système, traitement de l'information et de la connaissance
11 Projects, page 1 of 3
Project (from 2022) — PARFAIT
Partners: CEDRIC; Research Centre Inria Sophia Antipolis - Méditerranée; Laboratoire d'Informatique du Parallélisme; Laboratoire d'informatique système, traitement de l'information et de la connaissance; Laboratoire d'Informatique d'Avignon
Funder: French National Research Agency (ANR)
Project Code: ANR-21-CE25-0013
Funder Contribution: 559,192 EUR

New generations of mobile access networks promise low-delay, high-throughput data connections paired with in-network processing capabilities. IoT data and local information available to users' devices will feed AI-based applications executed in proximity on edge servers, and service composition will routinely include such applications and their microservice components. PARFAIT tackles the new resource-allocation problems that emerge from the need for distributed edge orchestration of both computing and communication, in a context where the unknown footprint of AI-based applications requires advanced learning capabilities to permit efficient and reliable edge service orchestration. The PARFAIT project develops theoretical foundations for distributed and scalable resource-allocation schemes on edge computing infrastructures tailored for AI-based processing tasks. Algorithmic solutions will be developed based on the theory of constrained, delayed, and distributed Markov decision processes to account for edge service orchestration actions and to quantify the effect of orchestration policies.

Furthermore, using both game and team formulations, the project will pave the way for a theory of decentralized orchestration, a missing building block needed to reconcile applications' quest for data proximity with the synchronization problems that arise when multiple edge orchestrators cooperate under a local or partial system view. Finally, to achieve efficient online edge service orchestration, these solutions will be empowered with reinforcement learning techniques to define a suite of orchestration algorithms able both to adapt over time to the applications' load and to cope with the uncertain information available about AI-based applications' footprints. Validation activities will be designed to demonstrate real-world solutions for practical orchestration use cases, using both large-scale simulation experiments and research testbeds.
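To give a concrete sense of the Markov decision process machinery underlying such orchestration policies, here is a minimal value-iteration sketch. This is a generic textbook example, not the project's constrained or distributed algorithms; the toy edge states, actions, and rewards are invented purely for illustration.

```python
import numpy as np

# Toy MDP: 2 states (edge server "idle"/"busy"), 2 actions ("offload"/"keep local").
# P[a][s, s'] = transition probability; R[a, s] = expected reward. All values illustrative.
P = np.array([[[0.8, 0.2],
               [0.3, 0.7]],
              [[0.9, 0.1],
               [0.6, 0.4]]])
R = np.array([[1.0, -0.5],
              [0.3,  0.2]])
gamma = 0.9  # discount factor

V = np.zeros(2)
for _ in range(500):                      # value iteration to a fixed point
    Q = R + gamma * P @ V                 # Q[a, s] = R[a, s] + gamma * sum_s' P[a, s, s'] V[s']
    V_new = Q.max(axis=0)
    if np.max(np.abs(V_new - V)) < 1e-10:
        break
    V = V_new
policy = Q.argmax(axis=0)                 # greedy orchestration policy per state
print(V, policy)
```

The fixed point V satisfies the Bellman optimality equation; the project's actual setting adds constraints, delays, and multiple decentralized decision makers on top of this basic structure.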
Project (from 2024) — DIRECTA
Partners: Laboratoire d'informatique système, traitement de l'information et de la connaissance; Université Savoie Mont Blanc; CNRS; LAPTh
Funder: French National Research Agency (ANR)
Project Code: ANR-23-CE31-0021
Funder Contribution: 332,241 EUR

DIRECTA (Deep learnIng in REal time for the Cherenkov Telescope Array) is, as the name states, a project to apply deep learning solutions based on convolutional neural networks (CNNs) to the Cherenkov Telescope Array (CTA) in real time. It is a continuation of the GammaLearn project, which already demonstrated the applicability of CNNs to CTA data, and of the ACADA work package, which is developing the real-time analysis for CTA using the standard reconstruction techniques. Its objective is to demonstrate the applicability of CNNs in real time for CTA with a working proof of concept applied to the already observing Large-Sized Telescope 1 (LST-1) and later to the LST-2 and Medium-Sized Telescope 1, whose construction will start in 2023. It will greatly improve CTA's real-time reconstruction performance, which is necessary for the study of transient sources such as gamma-ray bursts and flaring active galactic nuclei, of Lorentz Invariance Violation, and of the Extragalactic Background Light.
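The basic building block of the CNNs mentioned above is a discrete 2-D convolution slid over camera images. A minimal NumPy sketch of that operation follows; it is purely illustrative and unrelated to the actual GammaLearn architectures or CTA data formats.

```python
import numpy as np

def conv2d(image, kernel):
    """Valid-mode 2-D cross-correlation, the core operation of a CNN layer."""
    kh, kw = kernel.shape
    h = image.shape[0] - kh + 1
    w = image.shape[1] - kw + 1
    out = np.empty((h, w))
    for i in range(h):
        for j in range(w):
            # dot product of the kernel with the image patch under it
            out[i, j] = np.sum(image[i:i+kh, j:j+kw] * kernel)
    return out

image = np.arange(16.0).reshape(4, 4)   # toy 4x4 "camera image"
edge = np.array([[1.0, -1.0]])          # horizontal gradient filter
print(conv2d(image, edge))
```

A real CNN stacks many such filtered maps with learned kernels, nonlinearities, and pooling; deep learning frameworks implement the same operation with highly optimized kernels.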
Project (from 2018) — MARGARITA
Partners: Laboratoire d'informatique système, traitement de l'information et de la connaissance; CS; Laboratoire de l'Intégration du Matériau au Système; L2S; University of Paris-Saclay; Laboratoire Énergétique Mécanique Électromagnétisme; Paris Nanterre University; CNRS
Funder: French National Research Agency (ANR)
Project Code: ANR-17-ASTR-0015
Funder Contribution: 297,477 EUR

When it comes to sensing the environment (radar, imaging, seismic, ...), the current trend is to develop acquisition systems that are more and more sophisticated. For example, we can point to an increase in the number of sensors, the use of multiple arrays for either emission or reception, and the integration of several modalities such as polarization, interferometry, temporal, spatial and spectral information, or waveform diversity. This sophistication aims to enrich the obtained information and to reach better performance than classical systems: improved resolution, improved detection performance (especially in low-SNR settings), and better discrimination between physical phenomena. However, simply transposing classical processes and algorithms to these new systems does not necessarily lead to the expected performance gains.

Indeed, several effects require the models and processing chains to be deeply re-derived:
- the response of the sensed environment becomes complex and heterogeneous,
- the size of the data increases, so the estimation of statistical parameters may become difficult,
- in systems with multiple modalities, the construction of the data vector is non-trivial,
- there are more uncertainties on the model of the useful signal (and therefore on its parameterization).

The MARGARITA project aims at solving the aforementioned issues by developing new estimation/detection processes for multi-sensor/multi-modal systems operating in a complex heterogeneous environment. These new methods will be based upon the combination of recent tools and advances in signal processing: robust estimation, optimization methods, differential geometry, and large random matrix theory. Hence, the project aims at:
+ integrating an accurate statistical modeling (i.e. handling non-Gaussianity and heterogeneity) for estimation/detection problems in large-dimension settings;
+ integrating prior information and model uncertainties in a modern robust estimation/detection framework;
+ accurately characterizing the theoretical performance of the developed processes. Apart from providing theoretical guarantees, this characterization will also offer tools for system design and specification;
+ demonstrating that the proposed tools can be applied in fields that involve modern acquisition systems. We propose to adapt these processes to specific radar applications (STAP, MIMO-STAP, SAR) as well as other civilian applications (hyperspectral imaging, radio astronomy and GPR).

From a scientific and technical perspective, this project will:
- use tools from the robust estimation framework and the optimization framework (majorization-minimization and optimization on manifolds) to propose new estimators (notably for covariance matrices) that exploit available prior information to counter the large-dimension problem;
- extend Bayesian subspace estimation methods to a robust estimation/detection framework in order to integrate uncertainties on the signal model;
- exploit the misspecified performance bounds framework to solve the problem of multi-sensor/multi-modal system calibration;
- use recent theoretical tools (large random matrix theory and intrinsic bounds) to characterize the performance of the developed processes.
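A classic example of the robust covariance estimators in this family is Tyler's M-estimator, computed by a fixed-point iteration that is insensitive to heavy-tailed (non-Gaussian) sample distributions. The sketch below is the plain textbook version, not the project's regularized or prior-informed variants; the test data are invented.

```python
import numpy as np

def tyler_estimator(X, n_iter=100, tol=1e-8):
    """Tyler's fixed-point M-estimator of scatter for p-dim samples in the rows of X:
    Sigma <- (p/N) * sum_i x_i x_i^T / (x_i^T Sigma^{-1} x_i), trace-normalized to p."""
    N, p = X.shape
    sigma = np.eye(p)
    for _ in range(n_iter):
        inv = np.linalg.inv(sigma)
        w = np.einsum('ij,jk,ik->i', X, inv, X)      # quadratic forms x_i^T Sigma^{-1} x_i
        new = (p / N) * (X.T / w) @ X                # weighted sample scatter
        new *= p / np.trace(new)                     # fix the inherent scale ambiguity
        if np.linalg.norm(new - sigma) < tol:
            return new
        sigma = new
    return sigma

rng = np.random.default_rng(0)
X = rng.standard_normal((500, 3)) * np.array([3.0, 1.0, 0.5])  # heteroscedastic toy data
print(tyler_estimator(X))
```

Because each sample is normalized by its own Mahalanobis distance, a few outliers with large amplitude cannot dominate the estimate, unlike the sample covariance matrix.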
Project (from 2017) — RainbowFS
Partners: Laboratoire d'informatique système, traitement de l'information et de la connaissance; Université Pierre et Marie Curie; IMT, Télécom SudParis; LIG; SCALITY
Funder: French National Research Agency (ANR)
Project Code: ANR-16-CE25-0013
Funder Contribution: 919,534 EUR

RainbowFS proposes a "just-right" approach to storage and consistency for developing distributed, cloud-scale applications. Existing approaches shoehorn the application design into some predefined consistency model, but no single model is appropriate for all uses. Instead, we propose tools to co-design the application and its consistency protocol. Our approach reconciles the conflicting requirements of availability and performance vs. safety: common-case operations are designed to be asynchronous; synchronisation is used only when strictly necessary to satisfy the application's integrity invariants. Furthermore, we deconstruct classical consistency models into orthogonal primitives that the developer can compose efficiently, and provide a number of tools for quick, efficient and correct cloud-scale deployment and execution. Using this methodology, we will develop an enterprise-grade, highly scalable file system, exploring the rainbow of possible semantics, and we will demonstrate it in a massive experiment.
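One well-known example of a synchronization-free primitive in this design space is a state-based grow-only counter (G-Counter) CRDT: every replica updates only its own slot, and replicas converge by exchanging and merging state, with no coordination on the common path. This is a generic illustration of the idea, not RainbowFS code.

```python
class GCounter:
    """State-based grow-only counter: one slot per replica; merge = element-wise max.
    Increments are purely local (asynchronous); replicas converge after exchanging state."""
    def __init__(self, replica_id, n_replicas):
        self.id = replica_id
        self.slots = [0] * n_replicas

    def increment(self, n=1):
        self.slots[self.id] += n          # only touch our own slot: no conflicts possible

    def merge(self, other):
        self.slots = [max(a, b) for a, b in zip(self.slots, other.slots)]

    @property
    def value(self):
        return sum(self.slots)

a, b = GCounter(0, 2), GCounter(1, 2)
a.increment(3); b.increment(2)
a.merge(b); b.merge(a)                    # merges commute: both replicas converge
print(a.value, b.value)                   # both read 5
```

Operations like this need no synchronization at all; the co-design question is identifying which application invariants genuinely require stronger, coordinated primitives.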
Project (from 2014)
Partners: Laboratoire de glaciologie et géophysique de l'environnement (LGGE); Environnements, dynamiques et territoires de la montagne; Institut des sciences de la terre; Laboratoire d'informatique système, traitement de l'information et de la connaissance; Centre Européen de Recherche et d'Enseignement des Géosciences de l'Environnement; UGA; Université Savoie Mont Blanc; CNRS; Université Bourgogne
Funder: French National Research Agency (ANR)
Project Code: ANR-14-CE03-0006
Funder Contribution: 498,930 EUR

This project aims at better understanding the impact of climate change on the morphologic and environmental processes in the Mont-Blanc Massif (MBM), with particular focus on the reduction of glacier surface area, the increase in rock falls related to permafrost warming, and downstream changes in water and sediment fluxes. Adequately tackling the environmental and societal challenges arising from the acceleration of these processes requires 1) documenting the spatio-temporal evolution of each component, i.e. local climate, rock faces, glaciers, sediment production and hydrological regimes; and 2) understanding the complex interactions between these components. To address these two issues, we formed a team of climatologists, geomorphologists, glaciologists, permafrost specialists and hydrologists that will pursue a systemic approach within five work packages.

The first work package is dedicated to coordination; the other four focus on the study of the spatio-temporal changes of the different components influencing the evolution of the MBM: climate, hydrology, permafrost, erosion products, and present-day and Holocene glacier dynamics. In order to investigate the complex interplay between these parameters, active exchange between work packages will ensure cross-analysis of the resulting data. The project is based on both observations (field measurements, remote sensing and geochemistry) and modeling. Direct field observations will benefit from: 1) the contributions of the GLACIOCLIM observatory (LGGE-LTHE) regarding glacio-hydrological processes; 2) the expertise of the EDYTEM lab in permafrost studies; and 3) that of the ISTerre lab in erosional processes. Climate modeling will be handled by the "Centre de Recherche de Climatologie" of BioGeoscience. Remote sensing will benefit from the expertise of LISTIC in satellite image processing, while the study of long-term glacial and peri-glacial processes will be based on cosmogenic nuclides, notably including the new in-situ-produced 14C dating tool currently being implemented at CEREGE. Several models will be applied to the present-day (last ~50 years) period: the 1979-to-present regional climate variability around the MBM will first be analyzed through kilometer-scale numerical climate modeling and compared with statistically downscaled fields derived from atmospheric re-analyses and general circulation models. In addition to climate analysis (mostly focused on local orographic effects), the derived high-resolution data will be used to feed hydrological, permafrost and glacier models. The glacio-hydrological model will rely on degree-day modeling.

Glacier modeling will be based on functions linking mass balance and surface-elevation changes, the thermal evolution of the permafrost on physical modeling of the rock-surface temperature distribution, and sub-glacial erosion will be estimated as a function of the basal-ice velocity. Glacier fluctuations, including glacier retreat during the warm periods of the Holocene, will be studied using in-situ-produced cosmogenic nuclides (14C and 10Be). An erosion/ice-cover history will be deduced from modeled glacier mass balance, and the sub-glacial erosion functions will be calibrated on the present-day period and forced by different Holocene climate scenarios. Projections of future environmental evolution will be achieved through statistical downscaling of climate-change simulations using the most recent IPCC scenarios. The reliability of the regionalized climate will be evaluated through comprehensive comparisons with observations under present conditions before applying the downscaling technique to a multi-model, multi-scenario (RCP2.6 and RCP8.5 radiative forcings) ensemble of global climate models throughout the 21st century. Projections of glacier extents and permafrost changes until at least the mid-21st century will be statistically deduced from the multi-scenario climatic ensemble applied to the mass-balance and thermal models.
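The degree-day approach mentioned for the glacio-hydrological model relates melt linearly to positive air temperatures. A minimal sketch of the generic textbook formulation follows; the degree-day factor and temperature series are invented illustrative values, not the project's calibrated parameters.

```python
def degree_day_melt(daily_temps_c, ddf=7.0, t_threshold=0.0):
    """Total melt (mm w.e.) over a period: melt_d = DDF * max(T_d - T0, 0).
    ddf: degree-day factor in mm w.e. per degC per day (illustrative value for bare ice)."""
    return sum(ddf * max(t - t_threshold, 0.0) for t in daily_temps_c)

# One illustrative summer week of mean daily temperatures (degC) near a glacier tongue
week = [2.5, 4.0, -1.0, 0.5, 6.0, 3.5, 1.0]
print(degree_day_melt(week))              # 7.0 * (2.5+4.0+0+0.5+6.0+3.5+1.0) = 122.5
```

Days below the threshold contribute nothing, so the model needs only temperature series and one or two calibrated factors, which is why it is a standard choice when driving glacio-hydrological models from downscaled climate fields.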