

April 25, 2017, Paris Saclay, France

Bâtiment Alan Turing - INRIA - Palaiseau

Morning talks

Internet of Things Analytics
Presenter: Albert Bifet (Telecom ParisTech, LTCI)
The Internet of Things (IoT), the large network of physical devices that extends beyond typical computer networks, will soon be creating huge quantities of Big Data streams in real time. The realization of the IoT depends on being able to gain the insights hidden in the vast and growing seas of available data. Since current approaches do not scale to IoT volumes, new systems with novel mining techniques are necessary to handle not only the velocity, but also the variety and variability, of such data. This challenging IoT setting needs algorithms that use an extremely small amount (an iota) of time and memory, and that are able to adapt to changes without stopping the learning process. These algorithms should be distributed, so that they can run on top of Big Data infrastructures. How to do this accurately, in a fully automatic, transparent, elastic, real-time system is going to be the main challenge for IoT analytics systems in the near future.
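As a minimal illustration of the "iota of time and memory" requirement, the sketch below implements an online mean estimator with exponential forgetting: it processes each stream item in O(1) time and memory and tracks a drifting signal without ever stopping to retrain. This is a generic textbook technique, not the specific algorithms of the talk; the class name and parameters are hypothetical.

```python
import random

class ForgetfulMean:
    """Online mean with exponential forgetting: O(1) time and memory
    per item, able to track a drifting data stream (toy sketch)."""
    def __init__(self, alpha=0.05):
        self.alpha = alpha      # forgetting factor: larger = adapts faster
        self.value = None
    def update(self, x):
        if self.value is None:
            self.value = x
        else:
            self.value = (1 - self.alpha) * self.value + self.alpha * x
        return self.value

random.seed(0)
est = ForgetfulMean(alpha=0.05)
# synthetic stream whose true mean jumps from 0 to 5 halfway through
for t in range(2000):
    true_mean = 0.0 if t < 1000 else 5.0
    est.update(random.gauss(true_mean, 1.0))
print(est.value)   # tracks the post-change mean, close to 5
```

A batch learner would have to be retrained after the change; the forgetting factor lets the estimate adapt continuously, which is the kind of behavior the abstract asks of IoT stream algorithms.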
Equivalence of Deterministic Pushdown Automata on Finite and Infinite Words
Presenter: Stefan Göller (CNRS, LSV)
The equivalence problem of deterministic pushdown automata (dpda) is a famous problem in formal language theory. Sénizergues showed decidability in 1997 and Stirling showed a TOWER upper bound in 2002.
In my talk I will briefly sketch a link, found by Sénizergues in 2003, that connects the complexity of the dpda equivalence problem with the growth of a Ramsey-like function f from word combinatorics:
Given an alphabet size k and an unavoidable pattern p (a pattern that all but finitely many words over any k-letter alphabet encounter), the function f(p,k) denotes the smallest length n such that every word of length at least n over the alphabet {1,...,k} encounters p.
I will present some recent results on the growth rate of this function f.
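To make the definition of f concrete, here is a brute-force computation for the simplest unavoidable pattern, p = xx (a square: a word encounters xx iff it contains a factor uu). The pattern xx is unavoidable over a 2-letter alphabet but avoidable over 3 letters (Thue), so this naive search is only run with k = 2; it is an illustration of the definition, not the methods of the talk.

```python
from itertools import product

def has_square(w):
    """True if the word w contains a nonempty square uu as a factor,
    i.e. w encounters the pattern p = xx."""
    n = len(w)
    for i in range(n):
        for L in range(1, (n - i) // 2 + 1):
            if w[i:i + L] == w[i + L:i + 2 * L]:
                return True
    return False

def f_square(k):
    """f(xx, k): smallest n such that EVERY word of length n over a
    k-letter alphabet contains a square.  Only terminates when xx is
    unavoidable over k letters (k <= 2)."""
    n = 1
    while True:
        if all(has_square(w) for w in product(range(k), repeat=n)):
            return n
        n += 1

print(f_square(2))   # -> 4: the longest square-free binary words, e.g. 010, have length 3
```

For richer unavoidable patterns and larger k the growth of f(p,k) is exactly the quantity whose rate the talk discusses; brute force like this becomes infeasible almost immediately.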

Based on joint work with Arnaud Carayol.
Contracts in Model-Based Design of Cyber-Physical Systems
Presenter: Antoine Girard (CNRS, L2S)
Cyber-physical systems (CPS) result from the integration of computational devices with physical processes and are set to become ubiquitous in modern societies (autonomous vehicles, smart buildings, robots, etc.). The development of rigorous model-based approaches to the design of CPS therefore constitutes a major challenge for the coming years. Hybrid systems are natural models of CPS, able to capture the tight interactions between "discrete" computing devices and the "continuous" physical world. Despite considerable progress in the field, current techniques only apply to hybrid systems of moderate complexity. The design of complex CPS thus requires dividing large design problems into smaller sub-problems that can be solved using existing tools. The CODECSYS project aims at developing such approaches by decomposing a complex CPS into components that are designed independently. Each component is assigned a contract, which specifies the guarantees that the component must fulfill under assumptions on the behavior of the other components. For a given desired behavior of the global system, the decomposition into contracts to be satisfied by components is generally not unique: some contracts may be infeasible for components, resulting in an unsuccessful overall design; and even when all contracts can be satisfied, their choice may affect the robustness of the overall system.
The CODECSYS project will contribute to contract-based design of CPS by developing novel approaches that systematically explore the space of possible design contracts. In this talk, we will first explain our methodology using a simple example. Then, its application to embedded control systems design will be discussed. We will conclude by highlighting some of the current research directions explored within the CODECSYS project.
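The core contract-checking idea can be sketched in a few lines. In the standard set-based formulation (contracts as pairs of behavior sets), an implementation contract refines a specification contract when it accepts more environments (weaker assumptions) and allows fewer behaviors (stronger guarantees). The toy sketch below uses hypothetical behavior labels and is not the CODECSYS methodology, only the textbook refinement check it builds on.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Contract:
    # Sets of *allowed* behaviors: a larger assumption set is a WEAKER
    # assumption (more environments accepted); a smaller guarantee set
    # is a STRONGER guarantee (fewer behaviors permitted).
    assumptions: frozenset
    guarantees: frozenset

def refines(c1, c2):
    """c1 refines c2: c1 works in every environment c2 accepts,
    and everything c1 permits is also permitted by c2."""
    return c2.assumptions <= c1.assumptions and c1.guarantees <= c2.guarantees

# hypothetical heating-controller example
spec = Contract(assumptions=frozenset({"grid_power"}),
                guarantees=frozenset({"temp_ok", "temp_small_overshoot"}))
impl = Contract(assumptions=frozenset({"grid_power", "battery_power"}),
                guarantees=frozenset({"temp_ok"}))   # tighter behavior

print(refines(impl, spec))   # True: weaker assumptions, stronger guarantees
```

In a real design flow the behavior sets are infinite (signal trajectories), so refinement is established symbolically rather than by set inclusion on finite labels, but the ordering being checked is the same.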

Afternoon talks, Session 1

Detecting Structural Changes in Dynamic Community Network
Presenter: François Meyer (University of Colorado at Boulder & INRIA)
We describe existing distances between graphs and study their ability to reveal organizational changes. We propose a novel distance that can detect changes occurring on a graph at multiple scales, and we develop a fast randomized algorithm to compute an approximation to this novel graph distance.
We apply this novel distance to the analysis of a dynamic community graph. We detect the time at which the graph dynamics switches from a normal evolution, where balanced communities grow at the same rate, to an abnormal behavior, where communities start merging.
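The change-detection setup can be illustrated with the crudest possible graph distance, edge edit distance (size of the symmetric difference of edge sets): on a toy dynamic graph where two communities grow steadily and then suddenly merge, the distance between consecutive snapshots spikes at the merge. This stands in for, and is much weaker than, the multiscale distance of the talk.

```python
def edge_distance(g1, g2):
    """Toy graph distance: number of edge insertions/deletions needed
    to turn g1 into g2 (symmetric difference of edge sets)."""
    return len(g1 ^ g2)

# dynamic graph: two communities grow at the same rate, then merge at t = 5
snapshots = []
edges = {(0, 1), (10, 11)}
for t in range(8):
    edges.add((0, 2 + t))        # community A grows by one node
    edges.add((10, 12 + t))      # community B grows by one node
    if t == 5:
        edges |= {(0, 10), (1, 11), (2, 12)}   # abnormal merge event
    snapshots.append(frozenset(edges))

# distance between consecutive snapshots; the spike marks the change point
dists = [edge_distance(a, b) for a, b in zip(snapshots, snapshots[1:])]
change_point = max(range(len(dists)), key=dists.__getitem__)
print(dists, change_point)   # distances [2,2,2,2,5,2,2], spike between t=4 and t=5
```

The talk's contribution is precisely a distance that reacts to such structural events at multiple scales (not just raw edge counts) while remaining cheap to approximate.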
This is work in collaboration with Nathan Monnig and Peter Wills.
Stochastic Geometry Modeling for Large-Scale Networks
Presenter: Marco Di Renzo (CNRS, L2S)
Graph Algorithms for Decision Support in Molecular Construction
Presenter: Dominique Barth (Université de Versailles St Quentin, DAVID)
The objective of this research project is to propose, and evaluate on real chemical data, new algorithms for predicting a reaction chain for the synthesis of a new molecule. These algorithms first select, from a database of reactions, a set of molecules sufficiently similar to the target one. This measure of similarity is based on a specific metric comparing the graphs that model the structure of molecules at a high level of granularity. The second algorithmic phase consists in solving an editing problem to predict the target reaction chain from the existing reaction chains in the database, selected using the similarity of the molecules.
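The first phase (retrieving similar molecules from a database) can be caricatured with a Jaccard similarity over labeled bond sets, a crude structural fingerprint. The molecules, bond encoding, and similarity below are hypothetical stand-ins for the project's specific graph metric, only meant to show the shape of the nearest-molecule query.

```python
def jaccard(a, b):
    """Similarity of two sets in [0, 1]."""
    return len(a & b) / len(a | b) if (a | b) else 1.0

def bond_set(bonds):
    """Encode a molecule graph as a set of labeled bonds: each bond is
    an unordered atom pair plus its bond order (crude fingerprint)."""
    return frozenset(frozenset(pair) | {order} for pair, order in bonds)

# hypothetical mini-database of molecules, bonds given as ((atom1, atom2), order)
ethanol  = bond_set([(("C1", "C2"), 1), (("C2", "O"), 1)])
methanol = bond_set([(("C2", "O"), 1)])
target   = bond_set([(("C1", "C2"), 1), (("C2", "O"), 1), (("C2", "N"), 1)])

db = {"ethanol": ethanol, "methanol": methanol}
best = max(db, key=lambda name: jaccard(db[name], target))
print(best)   # ethanol shares 2 of the target's 3 bonds
```

The project's actual metric compares molecule graphs at a high level of granularity; the point here is only that the retrieval step reduces to a nearest-neighbor query under some graph similarity.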
Making Networks Better, Faster, Stronger, Harder with Information Centric Networks
Presenter: Dario Rossi (Telecom ParisTech, LTCI)
Information Centric Networking (ICN) is a prominent paradigm among the Future Internet (FI) architecture proposals that makes named content a first-class citizen. After briefly introducing the potential of ICN for 5G networks, in this talk we focus on an important aspect of ICN, namely the caching function, which in ICN becomes a built-in primitive as opposed to being offered as an over-the-top patch as in the current Internet. We then (i) introduce the basic caching concepts, (ii) overview our research activity in the area, and finally (iii) cherry-pick a couple of recent representative examples, notably concerning video streaming over ICN and a new, highly scalable hybrid simulation technique for cache networks.
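As background for the caching primitive, here is a minimal LRU cache of the kind an ICN node might apply to named content objects. The content names are invented for illustration, and real ICN caches use a variety of replacement and admission policies; this sketch only shows the basic hit/miss/evict mechanics that cache-network models build on.

```python
from collections import OrderedDict

class LRUCache:
    """Minimal least-recently-used cache keyed by content name
    (illustrative sketch, not a specific ICN implementation)."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.store = OrderedDict()   # insertion order tracks recency

    def get(self, name):
        if name not in self.store:
            return None                      # miss: would be fetched upstream
        self.store.move_to_end(name)         # refresh recency on a hit
        return self.store[name]

    def put(self, name, chunk):
        self.store[name] = chunk
        self.store.move_to_end(name)
        if len(self.store) > self.capacity:
            self.store.popitem(last=False)   # evict least recently used

cache = LRUCache(capacity=2)
cache.put("/video/seg1", b"...")
cache.put("/video/seg2", b"...")
cache.get("/video/seg1")                     # seg1 becomes most recent
cache.put("/video/seg3", b"...")             # evicts seg2, the LRU entry
print(sorted(cache.store))                   # ['/video/seg1', '/video/seg3']
```

Analyzing networks of such caches (hit ratios, interactions between levels) is exactly where exhaustive simulation becomes costly, motivating the hybrid simulation technique mentioned in the abstract.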
D-spaces: Distributed Spaces in Concurrent Epistemic Systems
Presenter: Frank Valencia (LIX)
Epistemic, mobile, and spatial behavior are commonplace in today's distributed systems. The intrinsic epistemic nature of these systems arises from social behavior. Spatial and mobile behavior is exhibited by programs and data moving across (possibly nested) spaces defined by, for example, friend circles, groups, and shared folders. In this talk I shall give a brief description of spatial constraint systems (scs) as semantic structures for specifying spatially distributed mobile multi-agent systems. Roughly speaking, scs can be seen as Scott's information systems with additional structure for specifying agents and their spaces. From a computational point of view, scs can be used to specify partial information holding in a given agent's space (local information). From an epistemic point of view, scs can be used to specify information that a given agent considers true (beliefs). Spatial constraint systems can also specify the mobility of information/processes from one space to another. Scs provide for process/information extrusion, a central concept in formalisms for mobile communication from concurrency theory. From an epistemic point of view, extrusion corresponds to a notion we shall call utterance: a piece of information that an agent communicates to others but that may be inconsistent with the agent's beliefs. Finally, scs can also capture distributed information, understood as information that agents may conclude if they were to combine their local information. Spatial constraint systems have been used as semantic structures for spatial concurrent constraint programming (sccp) and to reason about modal logics. In this talk I shall focus on the issue of distributed information and briefly describe the applications of sccp.

Afternoon talks, Session 2

Temporal Relation Extraction from Clinical Narratives
Presenter: Aurélie Névéol (CNRS, LIMSI) / Xavier Tannier (Université Paris Sud, LIMSI)
References to phenomena occurring in the world, and their temporal characterization, can be found in a variety of natural language utterances. For this reason, temporal analysis is a key issue in natural language processing. This project aims to analyze documents in the field of medicine, namely clinical narratives, with a strong temporal and chronological perspective. We develop methods and tools to automatically extract significant medical events, linked to relevant temporal information, from patients' Electronic Health Records. We aim to create a timeline of the patient's medical history by merging the information extracted from multiple documents concerning the patient. This work addresses the complex problem of temporal analysis of clinical documents with an original approach leveraging information from multiple documents. It is guided by linguistic principles and by the application to retrospective analysis of patient care, which should facilitate event normalization.
Shallow survey of Deep Learning
Presenter: Marc Schoenauer (INRIA)
This talk will briefly introduce Deep Learning, as well as some of its recent spectacular applications.
Control of Nonlinear Switched Systems
Presenter: Julien Alexandre dit Sandretto (ENSTA ParisTech, U2IS)
We present a control synthesis algorithm for nonlinear switched systems, based on an existing state-space bisection procedure and extended to nonlinear systems with the help of validated simulation. The use of validated simulation also makes it possible to take bounded perturbations and varying parameters into account. The whole approach is entirely guaranteed and the induced controllers are correct-by-design.
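The two ingredients, state-space bisection and validated enclosures, can be shown on a scalar toy problem: prove that f(x) = x(1 - x) maps every point of [0, 1] into a safe interval. Naive interval arithmetic overestimates the image, so the box is bisected until each piece is proved safe. This is only the generic bisection-plus-enclosure pattern, not the talk's algorithm, which uses validated ODE simulation on switched dynamics.

```python
def interval_mul(a, b):
    """Guaranteed enclosure of the product of two intervals."""
    lo1, hi1 = a
    lo2, hi2 = b
    prods = [lo1 * lo2, lo1 * hi2, hi1 * lo2, hi1 * hi2]
    return (min(prods), max(prods))

def f_enclosure(box):
    """Interval extension of f(x) = x * (1 - x): a guaranteed, but
    possibly pessimistic, enclosure of the image of the box."""
    lo, hi = box
    return interval_mul((lo, hi), (1 - hi, 1 - lo))

def verified_in(box, safe, depth=0, max_depth=30):
    """Prove by bisection that f(box) lies inside `safe` everywhere."""
    lo_f, hi_f = f_enclosure(box)
    if safe[0] <= lo_f and hi_f <= safe[1]:
        return True                       # enclosure fits: proved on this box
    if depth == max_depth:
        return False                      # cannot conclude within the budget
    mid = (box[0] + box[1]) / 2
    return (verified_in((box[0], mid), safe, depth + 1, max_depth)
            and verified_in((mid, box[1]), safe, depth + 1, max_depth))

# true image of [0,1] under f is [0, 0.25], so [0, 0.26] is provable
print(verified_in((0.0, 1.0), (0.0, 0.26)))   # True
```

The same loop with a safe bound of 0.24 (below the true maximum 0.25) can never succeed, and the bisection correctly fails to certify it; in the talk's setting the certificate instead selects switching modes that keep the reachable set inside the safe region.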
Ideal Codes, Isogenies, and Cryptographic Applications
Presenter: Benjamin Smith (INRIA & LIX)
Maximal Monotone Operators, Stochastic Approximation, and Some Applications
Presenter: Walid Hachem (CNRS, LIGM)
The proximal gradient algorithm finds the minimizers of a sum $F+G$ of two proper closed convex functions, one of them being differentiable. In this work, we introduce a stochastic version of the proximal gradient algorithm. The iterations involve an iid sequence of random functions whose expectations coincide with $F$ and $G$ respectively. The aim is to provide a convergence analysis in an adaptive context where the step size of the algorithm is constant. We prove that, in the Cesàro mean, the probability that the iterates are away from the sought minimizers is small when the number of iterations tends to infinity and in the limit of small step sizes. The ergodic behavior is studied as well. Finally, the algorithm is extended to the context of random maximal monotone operators.
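A one-dimensional toy instance makes the scheme concrete: take $F(x) = (x-3)^2/2$ (smooth, observed only through a noisy gradient) and $G(x) = \lambda |x|$, whose proximal operator is the soft-threshold. With a constant step size, the Cesàro average of the iterates settles near the true minimizer $x^* = 3 - \lambda$. The problem, noise level, and step size below are illustrative choices, not those of the paper.

```python
import random

def soft_threshold(x, t):
    """Proximal operator of t * |.| (the l1 penalty)."""
    return max(abs(x) - t, 0.0) * (1.0 if x >= 0 else -1.0)

random.seed(1)
lam, gamma, n_iter = 1.0, 0.01, 50_000   # penalty, constant step, iterations
x, cesaro_sum = 0.0, 0.0
for k in range(n_iter):
    # unbiased noisy gradient of F(x) = (x - 3)^2 / 2
    noisy_grad = (x - 3.0) + random.gauss(0.0, 0.5)
    # stochastic proximal gradient step with constant step size gamma
    x = soft_threshold(x - gamma * noisy_grad, gamma * lam)
    cesaro_sum += x

x_bar = cesaro_sum / n_iter              # Cesàro (running) average of iterates
print(x_bar)   # near the true minimizer x* = 3 - lam = 2
```

The individual iterates keep fluctuating because the step size is constant; it is the Cesàro average that concentrates near the minimizer, which is exactly the mode of convergence the abstract describes.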
Work done with Pascal Bianchi and Adil Salim (Telecom ParisTech).

For any questions or issues: contact at digicosme.fr